Chapter 13. Memory, Learning, and Development
By Jeneen Interlandi The human brain’s memory-storage capacity is an order of magnitude greater than previously thought, researchers at the Salk Institute for Biological Studies reported last week. The findings, recently detailed in eLife, are significant not only for what they say about storage space but more importantly because they nudge us toward a better understanding of how, exactly, information is encoded in our brains. The question of just how much information our brains can hold is a longstanding one. We know that the human brain is made up of about 100 billion neurons, and that each one makes 1,000 or more connections to other neurons, adding up to some 100 trillion in total. We also know that the strengths of these connections, or synapses, are regulated by experience. When two neurons on either side of a synapse are active simultaneously, that synapse becomes more robust; the dendritic spine (the antenna on the receiving neuron) also becomes larger to support the increased signal strength. These changes in strength and size are believed to be the molecular correlates of memory. The different antenna sizes are often compared with bits of computer code, only instead of 1s and 0s they can assume a range of values. Until last week scientists had no idea how many values, exactly. Based on crude measurements, they had identified just three: small, medium and large. But a curious observation led the Salk team to refine those measurements. In the course of reconstructing a rat hippocampus, an area of the mammalian brain involved in memory storage, they noticed some neurons would form two connections with each other: the axon (or sending cable) of one neuron would connect with two dendritic spines (or receiving antennas) on the same neighboring neuron, suggesting that duplicate messages were being passed from sender to receiver. © 2016 Scientific American
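The comparison to computer bits can be made concrete: a synapse that can take one of N distinguishable strength values stores log2(N) bits of information. A minimal Python sketch; the figure of roughly 26 distinguishable sizes is drawn from reports of the eLife paper and is used here only for illustration:

```python
import math

def bits_per_synapse(n_states: int) -> float:
    """Information stored by a synapse that can take one of
    n_states distinguishable strength values."""
    return math.log2(n_states)

# The older, cruder estimate: three sizes (small, medium, large)
old = bits_per_synapse(3)    # ~1.6 bits
# The Salk team's finer measurement: ~26 distinguishable sizes (illustrative)
new = bits_per_synapse(26)   # ~4.7 bits
print(f"{old:.1f} bits -> {new:.1f} bits per synapse")
```

Going from about 1.6 to about 4.7 bits per synapse, multiplied across some 100 trillion synapses, is what yields the "order of magnitude" jump in estimated storage capacity.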
Keyword: Learning & Memory
Link ID: 21866 - Posted: 02.06.2016
By Jonathan Leo Last week, according to many media accounts, scientists from Harvard Medical School, Boston Children’s Hospital, and the Broad Institute discovered the genetic basis of schizophrenia. The researchers reported in Nature that people with schizophrenia were more likely to have the overactive forms of a gene called complement component 4, or C4, which is involved in pruning synapses during adolescence. However, suggesting a biologic mechanism for a small subset of those diagnosed with schizophrenia is not the same as confirming the genetic theory of schizophrenia. Benedict Carey, science reporter for the New York Times, delved into the details and reported the all-important fact that having the C4 variant would increase a person’s risk by about 25 percent over the 1-percent base rate of schizophrenia—that is, to 1.25 percent. Genes for schizophrenia and depression have been discovered before, and in those cases, the subsequent enthusiastic headlines were shortly followed by retractions and more sober thinking. There are so many open questions (for instance, why do many people with the problematic variant not develop schizophrenia, and why do many people who don’t have the variant develop schizophrenia?) that the same may occur with the C4 discovery. The idea that mental illness is the result of a genetic predisposition is the foundation for modern-day psychiatry and has been the driving force for how research money is allocated, how patients are treated, and how society views people diagnosed with conditions identified in the Diagnostic and Statistical Manual of Mental Disorders, 5th Edition. Schizophrenia holds a unique spot in the annals of mental health research because of its perceived anatomical underpinnings and is often cited as evidence in favor of a genetic predisposition to other conditions.
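Carey's distinction between relative and absolute risk is worth spelling out as arithmetic. A minimal sketch using the figures quoted in the article:

```python
def absolute_risk(base_rate: float, relative_increase: float) -> float:
    """Absolute risk after applying a relative increase to a base rate."""
    return base_rate * (1 + relative_increase)

base = 0.01                        # ~1% population base rate of schizophrenia
risk = absolute_risk(base, 0.25)   # C4 variant: ~25% relative increase
print(f"{risk:.4f}")               # 0.0125, i.e. 1.25%
```

A 25 percent relative increase sounds dramatic in a headline, but on a 1 percent base rate it moves the absolute risk only to 1.25 percent.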
By Susana Martinez-Conde Take a look at the red chips on the two Rubik cubes below. They are actually orange on the left and purple on the right, if you look at them in isolation. They only appear more or less equally red across the images because your brain is interpreting them as red chips lit by either yellow or blue light. This kind of misperception is an example of perceptual constancy, the mechanism that allows you to recognize an object as being the same in different environments, and under very diverse lighting conditions. Constancy illusions are adaptive: consider what would have happened if your ancestors thought a friend became a foe whenever a cloud hid the sun, or if they lost track of their belongings–and even their own children—every time they stepped out of the cave and into the sunlight. Why, they might have even eaten their own kids! You are here because the perceptual systems of your predecessors were resistant to annoying changes in the physical reality–as is your own (adult) perception. There are many indications that constancy effects must have helped us survive (and continue to do so). One such clue is that we are not born with perceptual constancy, but develop it many months after birth. So at first we see all differences, and then we learn to ignore certain types of differences so that we can recognize the same object as unchanging in many varied scenarios. When perceptual constancy arises, we lose the ability to detect multiple contradictions that are nevertheless highly noticeable to young babies. © 2016 Scientific American
Mo Costandi The human brain is immediately recognizable by its cortex (meaning bark in Latin), the prominent outer layer of tissue, with its characteristic pattern of ridges and furrows, which sits atop the deep structures. The cortex is just several millimetres thick, but has a surface area of about two-and-a-half square feet, and is therefore heavily convoluted so it can be packed into the skull. This fleshy landscape begins to form during the second trimester of pregnancy, and continues into the first year of life. It is often assumed to be the result of genetics, like most other aspects of brain development. Forty years ago, however, Harvard researchers put forward the controversial idea that the brain folds up because of physical forces, and a new study now provides the first evidence for this. According to this old model, the brain’s folds form as a result of differential growth, which causes the cortex to grow in size far more quickly than other brain structures, leading it to buckle and fold as its surface area increases, due to the constraints of the skull. To test this, Tuomas Tallinen of the University of Jyväskylä in Finland and his colleagues used magnetic resonance images to create a 3D-printed cast of an unfolded 22-week-old human brain. This was made with a technique called layer-by-layer drop casting, and consisted of a soft polymer core coated with a thin sheet of an absorbent elastomer gel representing the cortex. © 2016 Guardian News and Media Limited
Keyword: Development of the Brain
Link ID: 21857 - Posted: 02.04.2016
Heidi Ledford Difficulty with concentration, memory and other cognitive tasks is often associated with depression. In the past quarter of a century, a wave of drugs has transformed the treatment of depression. But the advances have struggled to come to grips with symptoms that often linger long after people start to feel better: cognitive problems such as memory loss and trouble concentrating. On 3 February, the US Food and Drug Administration (FDA) will convene a meeting of its scientific advisers to discuss whether such cognitive impairments are components of the disorder that drugs might be able to target — or just a result of depressed mood. The discussion will help the agency to decide whether two companies that sell the antidepressant vortioxetine should be allowed to label it as a treatment for the cognitive effects. A ‘yes’ could spur drug developers to invest in ways to test cognitive function during their antidepressant trials. Psychiatrists have long noted that some people with depression also struggle to concentrate and to make decisions. The question has been whether such difficulties are merely an offshoot of altered mood and would thus clear up without specific treatment, says Diego Pizzagalli, a neuroscientist at McLean Hospital, an affiliate of Harvard Medical School in Belmont, Massachusetts. But some patients who report improved mood after treatment still struggle with cognitive deficits — so psychiatrists sometimes prescribe concentration-enhancing drugs that are approved to treat attention deficit hyperactivity disorder to people with depression. © 2016 Nature Publishing Group
Fears over surveillance seem to figure large in the bird world, too. Ravens hide their food more quickly if they think they are being watched, even when no other bird is in sight. It’s the strongest evidence yet that ravens have a “theory of mind” – that they can attribute mental states such as knowledge to others. Many studies have shown that certain primates and birds behave differently in the presence of peers who might want to steal their food. While some researchers think this shows a theory of mind, others say they might just be reacting to visual cues, rather than having a mental representation of what others can see and know. Thomas Bugnyar and colleagues at the University of Vienna, Austria, devised an experiment to rule out the possibility that birds are responding to another’s cues. The setup involved two rooms separated by a wooden wall, with windows and peepholes that could be covered. First, a raven was given food with another raven in the next room, with the window open or covered, to see how quickly it cached its prize. With the window open, the birds hid their food more quickly and avoided going back to conceal it further. Individual ravens were then trained to use the peephole to see where humans were putting food in the other room. The idea here was to allow the bird to realise it could be seen through the peephole. © Copyright Reed Business Information Ltd.
By Jonathan Webb Science reporter, BBC News Scientists have reproduced the wrinkled shape of a human brain using a simple gel model with two layers. They made a solid replica of a foetal brain, still smooth and unfolded, and coated it with a second layer which expanded when dunked into a solvent. That expansion produced a network of furrows that was remarkably similar to the pattern seen in a real human brain. This suggests that brain folds are caused by physics: the outer part grows faster than the rest, and crumples. Such straightforward, mechanical buckling is one of several proposed explanations for the distinctive twists and turns of the brain's outermost blanket of cells, called the "cortex". Alternatively, researchers have suggested that biochemical signals might trigger expansion and contraction in particular parts of the sheet, or that the folds arise because of stronger connections between specific areas. "There have been several hypotheses, but the challenge has been that they are difficult to test experimentally," said Tuomas Tallinen, a soft matter physicist at the University of Jyväskylä in Finland and a co-author of the study, which appears in Nature Physics. "I think it's very significant... that we can actually recreate the folding process using this quite simple, physical model." Humans are one of just a few animals - among them whales, pigs and some other primates - that possess these iconic undulations. In other creatures, and early in development, the cortex is smooth. The replica in the study was based on an MRI brain scan from a 22-week-old foetus - the stage just before folds usually appear. © 2016 BBC.
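The buckling mechanism the researchers invoke has a classic back-of-the-envelope form: a thin, growing layer on a softer substrate wrinkles at a wavelength set by the layer's thickness and the stiffness ratio. The sketch below uses the standard film-on-substrate wrinkling estimate, not the model from the Nature Physics paper itself, and the thickness and stiffness values are assumed for illustration:

```python
import math

def wrinkle_wavelength(t: float, stiffness_ratio: float) -> float:
    """Classic estimate for the wavelength of wrinkles that form when a
    thin film bonded to a softer substrate is compressed:
        lambda ~ 2 * pi * t * (E_film / (3 * E_substrate)) ** (1/3)
    t: film (cortex) thickness; stiffness_ratio: E_film / E_substrate."""
    return 2 * math.pi * t * (stiffness_ratio / 3) ** (1 / 3)

# Assumed values: cortex ~3 mm thick, with gray and white matter of
# comparable stiffness (ratio ~1), as in the two-layer gel model.
wavelength_mm = wrinkle_wavelength(3.0, 1.0)
print(f"{wavelength_mm:.1f} mm")   # ~13 mm
```

With these assumptions the estimate comes out at roughly a centimetre, the right order of magnitude for human gyri, which is part of why the purely mechanical explanation is plausible.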
Keyword: Development of the Brain
Link ID: 21848 - Posted: 02.02.2016
By CATHERINE SAINT LOUIS The images pouring out of Brazil are haunting: struggling newborns with misshapen heads, cradled by mothers who desperately want to know whether their babies will ever walk or talk. There are thousands of these children in Brazil, and scientists fear thousands more might come as the Zika virus leaps across Latin America and the Caribbean. But the striking deformity at the center of the epidemic, microcephaly, is not new: It has pained families across the globe and mystified experts for decades. For parents, having a child with microcephaly can mean a life of uncertainty. The diagnosis usually comes halfway through pregnancy, if at all; the cause may never be determined — Zika virus is only suspected in the Brazilian cases, while many other factors are well documented. And no one can say what the future might hold for a particular child with microcephaly. For doctors, the diagnosis means an ailment with no treatment, no cure and no clear prognosis. If the condition surges, it will significantly burden a generation of new parents for decades. Dr. Hannah M. Tully, a neurologist at Seattle Children’s Hospital, sees the pain regularly, particularly among expectant parents who have just been told that an ultrasound showed their child to be microcephalic: “a terrible situation with which to be confronted in a pregnancy,” she said. An estimated 25,000 babies receive a microcephaly diagnosis each year in the United States. Microcephaly simply means that the baby’s head is abnormally small — sometimes just because the parents themselves have unusually small heads. “By itself, it doesn’t necessarily mean you have a neurological problem,” said Dr. Marc C. Patterson, a pediatric neurologist at the Mayo Clinic Children’s Center in Rochester, Minn. © 2016 The New York Times Company
Keyword: Development of the Brain
Link ID: 21844 - Posted: 02.01.2016
By Lisa Rapaport Mothers who are obese during pregnancy have almost twice the odds of having a child with autism as women who weigh less, a U.S. study suggests. When women are both obese and have diabetes, the autism risk for their child is at least quadrupled, researchers reported online January 29 in Pediatrics. "In terms of absolute risk, compared to common pediatric diseases such as obesity and asthma, the rate of autism spectrum disorder (ASD) in the U.S. population is relatively low, however, the personal, family and societal impact of ASD is enormous," said senior study author Dr. Xiaobin Wang, a public health and pediatrics researcher at Johns Hopkins University in Baltimore. About one in 68 children have ASD, according to the U.S. Centers for Disease Control and Prevention, or about 1.5 percent of U.S. children. The study findings suggest the risk rises closer to about 3 percent of babies born to women who are obese or have diabetes, and approaches 5 percent to 6 percent when mothers have the combination of obesity and diabetes. Wang and colleagues analyzed data on 2,734 mother-child pairs followed at Boston Medical Center between 1998 and 2014. Most of the children, 64 percent, weren't diagnosed with any other development disorders, but there were 102 kids who did receive an ASD diagnosis. © 2016 Scientific American
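The jump from relative to absolute risk in the Pediatrics study is simple to compute. A sketch using the approximate rates quoted above (at low baseline rates, an odds ratio roughly tracks a risk ratio):

```python
def absolute_risk(base_rate: float, risk_multiplier: float) -> float:
    """Rough absolute risk from a baseline rate and a risk multiplier.
    (At low baseline rates, an odds ratio approximates a risk ratio.)"""
    return base_rate * risk_multiplier

base = 0.015   # ~1 in 68 US children diagnosed with ASD (CDC)
obesity_alone = absolute_risk(base, 2)      # ~3%
obesity_diabetes = absolute_risk(base, 4)   # ~6%
print(f"obesity alone: {obesity_alone:.1%}, "
      f"obesity + diabetes: {obesity_diabetes:.1%}")
```

Even a quadrupled risk leaves the absolute probability in the single digits, which is consistent with the study authors' framing of ASD as a low-base-rate condition with a large per-family impact.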
By Mitch Leslie Identical twins may be alike in everything from their eye color to their favorite foods, but they can diverge in one important characteristic: their weight. A new study uncovers a molecular mechanism for obesity that might explain why one twin can be extremely overweight even while the other is thin. Heredity influences whether we become obese, but the genes researchers have linked to the condition don’t explain many of the differences in weight among people. Identical twins with nonidentical weights are a prime example. So what accounts for the variation? Changes in the intestinal microbiome—the collection of bacteria living in the gut—are one possibility. Another is epigenetic changes, or alterations in gene activity. These changes occur when molecules latch on to DNA or the proteins it wraps around, turning sets of genes “on” or “off.” Triggered by factors in the environment, epigenetic modifications can be passed down from one generation to the next. This type of transmission happened during the Hunger Winter, a famine that occurred when the Germans cut off food supplies to parts of the Netherlands in the final months of World War II. Mothers who were pregnant during the famine gave birth to children who were prone to obesity decades later, suggesting that the mothers’ diets had a lasting impact on their kids’ metabolism. However, which epigenetic changes in people promote obesity remains unclear. © 2016 American Association for the Advancement of Science
By BENEDICT CAREY Scientists reported on Wednesday that they had taken a significant step toward understanding the cause of schizophrenia, in a landmark study that provides the first rigorously tested insight into the biology behind any common psychiatric disorder. More than two million Americans have a diagnosis of schizophrenia, which is characterized by delusional thinking and hallucinations. The drugs available to treat it blunt some of its symptoms but do not touch the underlying cause. The finding, published in the journal Nature, will not lead to new treatments soon, experts said, nor to widely available testing for individual risk. But the results provide researchers with their first biological handle on an ancient disorder whose cause has confounded modern science for generations. The finding also helps explain some other mysteries, including why the disorder often begins in adolescence or young adulthood. “They did a phenomenal job,” said David B. Goldstein, a professor of genetics at Columbia University who has been critical of previous large-scale projects focused on the genetics of psychiatric disorders. “This paper gives us a foothold, something we can work on, and that’s what we’ve been looking for now, for a long, long time.” The researchers pieced together the steps by which genes can increase a person’s risk of developing schizophrenia. That risk, they found, is tied to a natural process called synaptic pruning, in which the brain sheds weak or redundant connections between neurons as it matures. During adolescence and early adulthood, this activity takes place primarily in the section of the brain where thinking and planning skills are centered, known as the prefrontal cortex. People who carry genes that accelerate or intensify that pruning are at higher risk of developing schizophrenia than those who do not, the new study suggests. 
Some researchers had suspected that the pruning must somehow go awry in people with schizophrenia, because previous studies showed that their prefrontal areas tended to have a diminished number of neural connections, compared with those of unaffected people. © 2016 The New York Times Company
By Ellen Hendriksen This topic comes by request on the Savvy Psychologist Facebook page from listener Anita M. of Detroit. Anita works with foster kids and, too often, sees disadvantaged kids who have been on a cocktail of psychiatric medications from as early as age 6. She asks, does such early use alter a child’s brain or body? And have the effects of lifelong psychiatric medication been studied? Childhood mental illness (and resulting medication) is equally overblown and under-recognized. Approximately 21% of American kids - that’s 1 in 5 - will battle a diagnosable mental illness before they reach the age of 17, whether or not they actually get treatment. The problem is anything but simple. Some childhood illnesses - ADHD and autism, for example - often get misused as “grab-bag” diagnoses when something’s wrong but no one knows what. This leads to overdiagnosis and sometimes, overmedicating. Other illnesses, like substance abuse, get overlooked or written off as rebellion or experimentation, leading to underdiagnosis and kids slipping through the cracks. But the most common problem is inconsistent diagnosis. For example, a 2008 study found that fewer than half of individuals diagnosed with bipolar disorder actually had the illness, while 5% of those diagnosed with something completely different actually had bipolar disorder. But let’s get back to Anita’s questions: Does early psychotropic medication alter a child’s brain? The short answer is yes, but the long answer might be different than you think. © 2016 Scientific American
Nell Greenfieldboyce The state of New Jersey has been trying to help jurors better assess the reliability of eyewitness testimony, but a recent study suggests that the effort may be having unintended consequences. That's because a new set of instructions read to jurors by a judge seems to make them skeptical of all eyewitness testimony — even testimony that should be considered reasonably reliable. Back in 2012, New Jersey's Supreme Court did something groundbreaking. It said that in cases that involve eyewitness testimony, judges must give jurors a special set of instructions. The instructions are basically a tutorial on what scientific research has learned about eyewitness testimony and the factors that can make it more dependable or less so. "The hope with this was that jurors would then be able to tell what eyewitness testimony was trustworthy, what sort wasn't, and at the end of the day it would lead to better decisions, better court outcomes, better justice," says psychologist David Yokum. Yokum was a graduate student at the University of Arizona, doing research on decision-making, when he and two colleagues, Athan Papailiou and Christopher Robertson, decided to test the effect of these new jury instructions, using videos of a mock trial that they showed to volunteers. © 2016 npr
Keyword: Learning & Memory
Link ID: 21828 - Posted: 01.27.2016
James Gorman Spotted hyenas are the animals that got Sarah Benson-Amram thinking about how smart carnivores are and in what ways. Dr. Benson-Amram, a researcher at the University of Wyoming in Laramie, did research for her dissertation on hyenas in the wild under Kay E. Holekamp of Michigan State University. Hyenas have very complicated social structures and they require intelligence to function in their clans, or groups. But the researchers also tested the animals on a kind of intelligence very different from figuring out who ranks the highest: They put out metal boxes that the animals had to open by sliding a bolt in order to get at meat inside. Only 15 percent of the hyenas solved the problem in the wild, but in captivity, the animals showed a success rate of 80 percent. Dr. Benson-Amram and Dr. Holekamp decided to test other carnivores, comparing species and families. They and other researchers presented animals in several different zoos with a metal puzzle box with a treat inside and recorded the animals’ efforts. They tested 140 animals in 39 species that were part of nine families. They reported their findings on Monday in the Proceedings of the National Academy of Sciences. They compared the success rates of different families with absolute brain size, relative brain size, and the size of the social groups that the species form in the wild. Just having a bigger brain did not make a difference, but the relative size of the brain, compared with the size of the body, was the best indication of which animals were able to solve the problem of opening the box. © 2016 The New York Times Company
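One common way to express "relative brain size" is an encephalization quotient: observed brain mass divided by the mass expected for an animal of that body size. The sketch below uses Jerison's classic allometric constants (a coefficient of 0.12 and an exponent of 2/3); the masses are illustrative and not taken from the PNAS study:

```python
def encephalization_quotient(brain_g: float, body_g: float) -> float:
    """Jerison's encephalization quotient: observed brain mass over the
    mass expected from the classic mammalian allometry
        expected = 0.12 * body_mass ** (2/3)  (masses in grams)."""
    return brain_g / (0.12 * body_g ** (2 / 3))

# Illustrative masses only
eq_human = encephalization_quotient(1350, 65000)   # ~7: far bigger brain
                                                   # than body size predicts
eq_small_carnivore = encephalization_quotient(25, 4000)   # <1: slightly less
print(f"human EQ ~ {eq_human:.1f}, small carnivore EQ ~ {eq_small_carnivore:.1f}")
```

A measure like this, rather than raw brain mass, is the kind of body-size-corrected quantity that best predicted puzzle-box success in the study.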
Ian Sample Science editor Genetically modified (GM) monkeys that develop symptoms of autism have been created to help scientists discover treatments for the condition. The macaques carry a genetic fault that causes a rare disorder in humans called MeCP2 duplication syndrome. This produces a wide range of medical conditions, some of which mirror those seen in autism, such as difficulties with social interactions. Researchers say groups of the GM monkeys could be used to identify brain circuits involved in common autistic behaviours and to test new treatments designed to alleviate the symptoms. Because the monkeys pass the genetic defects on to their offspring, scientists can breed large populations of the animals for medical research. A group of 200 monkeys has been established at the scientists’ lab in China. The research, described in the journal Nature, paves the way for more varieties of GM monkeys that develop different mental and psychiatric problems which are almost impossible to study in other animals. “The first cohort of transgenic monkeys shows very similar behaviour to human autism, including increased anxiety, but most importantly, defects in social interactions,” said Zilong Qiu who led the research at the Institute of Neuroscience in Shanghai. © 2016 Guardian News and Media Limited
Alison Abbott For the second time in four months, researchers have reported autopsy results that suggest Alzheimer’s disease might occasionally be transmitted to people during certain medical treatments — although scientists say that neither set of findings is conclusive. The latest autopsies, described in the Swiss Medical Weekly on 26 January, were conducted on the brains of seven people who died of the rare, brain-wasting Creutzfeldt–Jakob disease (CJD). Decades before their deaths, the individuals had all received surgical grafts of dura mater — the membrane that covers the brain and spinal cord. These grafts had been prepared from human cadavers and were contaminated with the prion protein that causes CJD. But in addition to the damage caused by the prions, five of the brains displayed some of the pathological signs that are associated with Alzheimer’s disease, researchers from Switzerland and Austria report. Plaques formed from amyloid-β protein were discovered in the grey matter and blood vessels. The individuals, aged between 28 and 63, were unusually young to have developed such plaques. A set of 21 controls, who had not had surgical grafts of dura mater but died of sporadic CJD at similar ages, did not have this amyloid signature. According to the authors, it is possible that the transplanted dura mater was contaminated with small ‘seeds’ of amyloid-β protein — which some scientists think could be a trigger for Alzheimer’s — along with the prion protein that gave the recipients CJD. © 2016 Nature Publishing Group
By Esther Landhuis Amid gloomy reports of an impending epidemic of Alzheimer’s and other dementias, emerging research offers a promising twist. Recent studies in North America, the U.K. and Europe suggest that dementia risk among seniors in some high-income countries has dropped steadily over the past 25 years. If the trend is driven by midlife factors such as building “brain reserve” and maintaining heart health, as some experts suspect, this could lend credence to staying mentally engaged and taking cholesterol-lowering drugs as preventive measures. At first glance, the overall message seems somewhat confusing. Higher life expectancy and falling birth rates are driving up the global elderly population. “And if there are more 85-year-olds, it’s almost certain there will be more cases of age-related diseases,” says Ken Langa, professor of internal medicine at the University of Michigan. According to the World Alzheimer Report 2015, 46.8 million people around the globe suffered from dementia last year, and that number is expected to double every 20 years. Looking more closely, though, new epidemiological studies reveal a surprisingly hopeful trend. Analyses conducted over the last decade in the U.S., Canada, England, the Netherlands, Sweden and Denmark suggest that “a 75- to 85-year-old has a lower risk of having Alzheimer’s today than 15 or 20 years ago,” says Langa, who discussed the research on falling dementia rates in a 2015 Alzheimer’s Research & Therapy commentary. © 2016 Scientific American
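The report's "expected to double every 20 years" is a plain exponential projection, easy to make explicit. A sketch from the 2015 baseline:

```python
def projected_cases(start_millions: float, years: float,
                    doubling_years: float = 20) -> float:
    """Exponential growth: cases double every `doubling_years` years."""
    return start_millions * 2 ** (years / doubling_years)

base_2015 = 46.8   # World Alzheimer Report 2015: millions with dementia
in_20_years = projected_cases(base_2015, 20)   # ~94 million
in_40_years = projected_cases(base_2015, 40)   # ~187 million
print(f"+20y: {in_20_years:.0f}M, +40y: {in_40_years:.0f}M")
```

Note that this projection holds per-person risk fixed; the hopeful twist in the article is that falling age-specific risk could pull the real numbers below such a curve even as the elderly population grows.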
Link ID: 21821 - Posted: 01.26.2016
by Graham McDougall, Jr., behavioral scientist at U. of Alabama Chemo brain is a mental cloudiness reported by about 30 percent of cancer patients who receive chemotherapy. Symptoms typically include impairments in attention, concentration, executive function, memory and visuospatial skills. Since the 1990s researchers have tried to understand this phenomenon, particularly in breast cancer patients. But the exact cause of chemo brain remains unclear. Some studies indicate that chemotherapy may trigger a variety of related neurological symptoms. One study, which examined the effects of chemotherapy in 42 breast cancer patients who underwent a neuropsychological evaluation before and after treatment, found that almost three times more patients displayed signs of cognitive dysfunction after treatment as compared with before (21 versus 61 percent). A 2012 review of 17 studies considering 807 breast cancer patients found that cognitive changes after chemotherapy were pervasive. Other research indicates that the degree of mental fogginess that a patient experiences may be directly related to how much chemotherapy that person receives: higher doses lead to greater dysfunction. There are several possible mechanisms to explain the cognitive changes associated with chemotherapy treatments. The drugs may have direct neurotoxic effects on the brain or may indirectly trigger immunological responses that may cause an inflammatory reaction in the brain. Chemotherapy, however, is not the only possible culprit. Research also shows that cancer itself may cause changes to the brain. In addition, it is possible that the observed cognitive decline may simply be part of the natural aging process, especially considering that many cancer patients are older than 50 years. © 2016 Scientific American,
By David Shultz A rat navigating a maze has to rank somewhere near the top of science tropes. Now, scientists report that they’ve developed an analogous test for humans—one that involves driving through a virtual landscape in a simulated car. The advance, they say, may provide a more sensitive measure for detecting early signs of Alzheimer’s disease. “I think it’s a very well-done study,” says Keith Vossel, a translational neuroscientist at the University of California, San Francisco (UCSF), who was not involved with the work. In the rodent version of the so-called Morris water maze test, researchers fill a large cylindrical container with water and place a platform just above the waterline. A scientist then places a rat into the tank, and the rodent must swim to the platform to avoid drowning. The experimenter then raises the water level just above the height of the platform and adds a compound to the water to make it opaque. The trial is repeated, but now the rat must find the platform without seeing it, using only its memory of where the safe zone exists relative to the tank’s walls and the surrounding environment. In subsequent trials, researchers place the rat at different starting points along the tank’s edge, but the platform stays put. In essence, the task requires the rat to move to a specific but invisible location within a circular arena from different starting points. © 2016 American Association for the Advancement of Science.
Link ID: 21803 - Posted: 01.20.2016
By Emily Underwood Roughly half of Americans use marijuana at some point in their lives, and many start as teenagers. Although some studies suggest the drug could harm the maturing adolescent brain, the true risk is controversial. Now, in the first study of its kind, scientists have analyzed long-term marijuana use in teens, comparing IQ changes in twin siblings who either used or abstained from marijuana for 10 years. After taking environmental factors into account, the scientists found no measurable link between marijuana use and lower IQ. “This is a very well-conducted study … and a welcome addition to the literature,” says Valerie Curran, a psychopharmacologist at the University College London. She and her colleagues reached “broadly the same conclusions” in a separate, nontwin study of more than 2,000 British teenagers, published earlier this month in the Journal of Psychopharmacology, she says. But, warning that the study has important limitations, George Patton, a psychiatric epidemiologist at the University of Melbourne in Australia, adds that it in no way proves that marijuana—particularly heavy or chronic use—is safe for teenagers. Most studies that linked marijuana to cognitive deficits, such as memory loss and low IQ, looked at a single “snapshot” in time, says statistician Nicholas Jackson of the University of Southern California in Los Angeles, lead author of the new work. That makes it impossible to tell which came first: drug use or poor cognitive performance. “It's a classic chicken-egg scenario,” he says. © 2016 American Association for the Advancement of Science.