Most Recent Links
By Kate Darby Rauch When Marian Diamond was growing up in Southern California, she got her first glimpse of a real brain at Los Angeles County Hospital with her dad, a physician. She was 15. Looking back now, at age 90, Diamond, a Berkeley resident, points to that moment as the start of something profound — a curiosity, wonderment, drive. “It just blew my mind, the fact that a cell could create an idea,” Diamond said in a recent interview, reflecting on her first encounter with that sinewy purple-tinged mass. She didn’t know that this was the start of a distinguished legacy that would stretch for decades, touching millions. But today, she’d be one of the first to scientifically equate that adolescent thrill with her life’s work. Because she helped prove a link. Brains, we now know, thanks in large part to research by Diamond, thrive on challenge, newness, discovery. With this enrichment, brain cells are stimulated and grow. This week, Diamond, a UC Berkeley emeritus professor of integrative biology and the first woman to earn a PhD in anatomy at Cal, is being honored by the Berkeley City Council, which is designating March 14 as Marian Diamond Day. And on March 22, KQED TV will air a new documentary film about her life’s work, My Love Affair With the Brain. © Berkeleyside All Rights Reserved.
Keyword: Development of the Brain
Link ID: 23366 - Posted: 03.16.2017
By DENISE GRADY Three women suffered severe, permanent eye damage after stem cells were injected into their eyes, in an unproven treatment at a loosely regulated clinic in Florida, doctors reported in an article published Wednesday in The New England Journal of Medicine. One, 72, went completely blind from the injections, and the others, 78 and 88, lost much of their eyesight. Before the procedure, all had some visual impairment but could see well enough to drive. The cases expose gaps in the ability of government health agencies to protect consumers from unproven treatments offered by entrepreneurs who promote the supposed healing power of stem cells. The women had macular degeneration, an eye disease that causes vision loss, and they paid $5,000 each to receive stem-cell injections in 2015 at a private clinic in Sunrise, Fla. The clinic was part of a company then called Bioheart, now called U.S. Stem Cell. Staff members there used liposuction to suck fat out of the women’s bellies, and then extracted stem cells from the fat to inject into the women’s eyes. The disastrous results were described in detail in the journal article, by doctors who were not connected to U.S. Stem Cell and treated the patients within days of the injections. An accompanying article by scientists from the Food and Drug Administration warned that stem cells from fat “are being used in practice on the basis of minimal clinical evidence of safety or efficacy, sometimes with the claims that they constitute revolutionary treatments for various conditions.” © 2017 The New York Times Company
By Anna Azvolinsky Delivering a CRISPR/Cas9–based therapy directly to the eye via a viral vector can prevent retinal degeneration in a mouse model of retinitis pigmentosa, a team led by researchers at the National Eye Institute reported in Nature Communications today (March 14). Retinitis pigmentosa, which affects around one in 4,000 people, causes retinal degeneration that eventually leads to blindness. The inherited disorder has been mapped to more than 60 genes (and more than 3,000 mutations), presenting a challenge for researchers working toward a gene therapy. The results of this latest study suggest that a broader, gene-editing–based therapeutic approach could be used to target many of the genetic defects underlying retinitis pigmentosa. “Given the lack of effective therapies for retinal degeneration, particularly the lack of therapies applicable to a broad range of different genetic varieties of this disease, this study represents a very exciting and important advance in our field,” Joseph Corbo, a neuropathologist at the Washington University School of Medicine in St. Louis who was not involved in the work, wrote in an email to The Scientist. This combination of “CRISPR technology with an adeno-associated virus vector, a system tried and true for delivering genetic information to the retina, may represent the first step in a global treatment approach for rod-mediated degenerative disease,” Shannon Boye, whose University of Florida lab develops gene replacement strategies for eye disorders, wrote in an email to The Scientist. © 1986-2017 The Scientist
Link ID: 23364 - Posted: 03.16.2017
By Andy Coghlan A woman in her 80s has become the first person to be successfully treated with induced pluripotent stem (iPS) cells. A sliver of laboratory-made retinal cells has protected her eyesight, fighting her age-related macular degeneration – a common form of progressive blindness. Such stem cells can be coaxed to form many other types of cell. Unlike other types of stem cell, such as those found in an embryo, induced pluripotent ones can be made from adult non-stem cells – a discovery that earned a Nobel prize in 2012. Now, more than a decade after they were created, these stem cells have helped someone. Masayo Takahashi at the RIKEN Laboratory for Retinal Regeneration in Kobe, Japan, and her team took skin cells from the woman and turned them into iPS cells. They then encouraged these to form retinal pigment epithelial cells, which are important for supporting and nourishing the retina cells that capture light for vision. The researchers made a sliver of cells measuring just 1 by 3 millimetres. Before transplanting this into the woman’s eye in 2014, they first removed diseased tissue on her retina that was gradually destroying her sight. They then inserted the small patch of cells they had created, hoping they would become a part of her eye and stop her eyesight from degenerating. © Copyright Reed Business Information Ltd.
By Mitch Leslie It sounds like a crazy way to improve your health—spend some time on a platform that vibrates at about the same frequency as the lowest string on a double bass. But recent research indicates that the procedure, known as whole-body vibration, may be helpful in illnesses from cerebral palsy to chronic obstructive pulmonary disease. Now, a new study of obese mice reveals that whole-body vibration provides metabolic benefits similar to those of walking on a treadmill, suggesting it may be useful for treating obesity and type II diabetes. “I think it’s very promising,” says exercise physiologist Lee Brown of California State University, Fullerton, who wasn’t connected to the study. Although the effects are small, he says, researchers should follow up to determine whether they can duplicate them in humans. Plenty of gyms feature whole-body vibration machines, and many athletes swear the activity improves their performance. The jiggling does seem to spur muscles to work harder, possibly triggering some of the same effects as exercise. But researchers still don’t know how the two compare, especially when it comes to people who are ill. So biomedical engineer Meghan McGee-Lawrence of the Medical College of Georgia in Augusta and colleagues decided to perform a head-to-head comparison of exercise and whole-body vibration. The researchers tested mutant mice whose resistance to the appetite-controlling hormone leptin results in obesity and diabetes. © 2017 American Association for the Advancement of Science.
Link ID: 23362 - Posted: 03.16.2017
By Christof Koch We moderns take it for granted that consciousness is intimately tied up with the brain. But this assumption did not always hold. For much of recorded history, the heart was considered the seat of reason, emotion, valor and mind. Indeed, the first step in mummification in ancient Egypt was to scoop out the brain through the nostrils and discard it, whereas the heart, the liver and other internal organs were carefully extracted and preserved. The pharaoh would then have access to everything he needed in his afterlife. Everything except for his brain! Several millennia later Aristotle, one of the greatest of all biologists, taxonomists, embryologists and the first evolutionist, had this to say: “And of course, the brain is not responsible for any of the sensations at all. The correct view [is] that the seat and source of sensation is the region of the heart.” He argued consistently that the primary function of the wet and cold brain is to cool the warm blood coming from the heart. Another set of historical texts is no more insightful on this question. The Old and the New Testaments are filled with references to the heart but entirely devoid of any mentions of the brain. Debate about what the brain does grew ever more intense over ensuing millennia. The modern embodiment of these arguments seeks to identify the precise areas within the three-pound cranial mass where consciousness arises. What follows is an attempt to size up the past and present of this transmillennial journey. The field has scored successes in delineating a brain region that keeps the neural engine humming. Switched on, you are awake and conscious. In another setting, your body is asleep, yet you still have experiences—you dream. In a third position, you are deeply asleep, effectively off-line. © 2017 Scientific American
Link ID: 23361 - Posted: 03.16.2017
Ian Sample, Science editor Researchers have overcome one of the major stumbling blocks in artificial intelligence with a program that can learn one task after another using skills it acquires on the way. Developed by Google’s AI company, DeepMind, the program has taken on a range of different tasks and performed almost as well as a human. Crucially, and uniquely, the AI does not forget how it solved past problems, and uses the knowledge to tackle new ones. The AI is not capable of the general intelligence that humans draw on when they are faced with new challenges; its use of past lessons is more limited. But the work shows a way around a problem that had to be solved if researchers are ever to build so-called artificial general intelligence (AGI) machines that match human intelligence. “If we’re going to have computer programs that are more intelligent and more useful, then they will have to have this ability to learn sequentially,” said James Kirkpatrick at DeepMind. The ability to remember old skills and apply them to new tasks comes naturally to humans. A regular rollerblader might find ice skating a breeze because one skill helps the other. But recreating this ability in computers has proved a huge challenge for AI researchers. AI programs are typically one-trick ponies that excel at one task, and one task only.
By Nicole Mortillaro, CBC News Have you ever witnessed an event with a friend, only to find that you each had a different account of what occurred? This is known as perception bias. Our views and beliefs can cloud the way we perceive things — and perception bias can take on many forms. New research published in the Journal of Personality and Social Psychology found that people tend to perceive young black men as larger, stronger and more threatening than white men of the same size. This, the authors say, could place them at risk in situations with police. The research was prompted by recent police shootings of black men in the United States — particularly those involving descriptions of men that didn't correspond with reality. Take, for example, the case of Dontre Hamilton. In 2014, the unarmed Hamilton was shot 14 times and killed by police in Milwaukee. The officer involved testified that he believed he would have been easily overpowered by Hamilton, who he described as having a muscular build. But the autopsy report found that Hamilton was just five foot seven and weighed 169 pounds. Looking at the Hamilton case, as well as many other examples, the researchers sought to determine whether there were psychologically driven preconceived notions about black men relative to white men. ©2017 CBC/Radio-Canada.
Laurel Hamers Mistakes can be learning opportunities, but the brain needs time for lessons to sink in. When facing a fast and furious stream of decisions, even the momentary distraction of noting an error can decrease accuracy on the next choice, researchers report in the March 15 Journal of Neuroscience. “We have a brain region that monitors and says ‘you messed up’ so that we can correct our behavior,” says psychologist George Buzzell, now at the University of Maryland in College Park. But sometimes, that monitoring system can backfire, distracting us from the task at hand and causing us to make another error. “There does seem to be a little bit of time for people, after mistakes, where you're sort of offline,” says Jason Moser, a psychologist at Michigan State University in East Lansing, who wasn’t part of the study. To test people’s response to making mistakes, Buzzell and colleagues at George Mason University in Fairfax, Va., monitored 23 participants’ brain activity while they worked through a challenging task. Concentric circles flashed briefly on a screen, and participants had to respond with one hand if the two circles were the same color and the other hand if the circles were subtly different shades. After making a mistake, participants generally answered the next question correctly if they had a second or so to recover. But when the next challenge came very quickly after an error, as little as 0.2 seconds, accuracy dropped by about 10 percent. Electrical activity recorded from the visual cortex showed that participants paid less attention to the next trial if they had just made a mistake than if they had responded correctly. © Society for Science & the Public 2000 - 2017
By MATT RICHTEL Amid an opioid epidemic, the rise of deadly synthetic drugs and the widening legalization of marijuana, a curious bright spot has emerged in the youth drug culture: American teenagers are growing less likely to try or regularly use drugs, including alcohol. With minor fits and starts, the trend has been building for a decade, with no clear understanding as to why. Some experts theorize that falling cigarette-smoking rates are cutting into a key gateway to drugs, or that antidrug education campaigns, long a largely failed enterprise, have finally taken hold. But researchers are starting to ponder an intriguing question: Are teenagers using drugs less in part because they are constantly stimulated and entertained by their computers and phones? The possibility is worth exploring, they say, because use of smartphones and tablets has exploded over the same period that drug use has declined. This correlation does not mean that one phenomenon is causing the other, but scientists say interactive media appears to play to similar impulses as drug experimentation, including sensation-seeking and the desire for independence. Or it might be that gadgets simply absorb a lot of time that could be used for other pursuits, including partying. Nora Volkow, director of the National Institute on Drug Abuse, says she plans to begin research on the topic in the next few months, and will convene a group of scholars in April to discuss it. The possibility that smartphones were contributing to a decline in drug use by teenagers, Dr. Volkow said, was the first question she asked when she saw the agency’s most recent survey results. The survey, “Monitoring the Future,” an annual government-funded report measuring drug use by teenagers, found that past-year use of illicit drugs other than marijuana was at the lowest level in the 40-year history of the project for eighth, 10th and 12th graders. © 2017 The New York Times Company
Keyword: Drug Abuse
Link ID: 23357 - Posted: 03.15.2017
Heidi Ledford Like a zombie that keeps on kicking, legal battles over mutant mice used for Alzheimer’s research are haunting the field once again — four years after the last round of lawsuits. In the latest case, the University of South Florida (USF) in Tampa has sued the US National Institutes of Health (NIH) for authorizing the distribution of a particular type of mouse used in the field. The first pre-trial hearing in the case is set to begin in a federal court on 21 March. The university holds a patent on the mouse, but the NIH has contracted the Jackson Laboratory, a non-profit organization in Bar Harbor, Maine, to supply the animals to researchers. The USF is now claiming that it deserves some of the money that went to the contractor. If the suit, filed in December 2015, is successful, it could set a precedent for other universities, cautions Robert Cook-Deegan, an intellectual-property scholar at the Washington DC centre of Arizona State University in Tempe. And that would threaten the affordability of and access to lab animals used in research. “It feels greedy to me,” Cook-Deegan says. “If other universities start doing this, all it does is push up the cost of research tools.” The mice, on which the USF filed a patent in 1997, express mutated forms of two genes. These modifications help researchers to study how amyloid plaques develop in the brain, and enable them to investigate behavioural changes that manifest before those plaques appear. © 2017 Macmillan Publishers Limited,
Link ID: 23356 - Posted: 03.15.2017
By Warren Cornwall The number of years someone spends behind bars can hinge on whether they were clearly aware that they were committing a crime. But how is a judge or jury to know for sure? A new study suggests brain scans can distinguish between hardcore criminal intent and simple reckless behavior, but the approach is far from being ready for the courtroom. The study is unusual because it looks directly at the brains of people while they are engaged in illicit activity, says Liane Young, a Boston College psychologist who was not involved in the work. Earlier research, including work by her, has instead generally looked at the brains of people only observing immoral activity. Researchers led by Read Montague, a neuroscientist at Virginia Tech Carilion Research Institute in Roanoke and at University College London, used functional magnetic resonance imaging (fMRI), which can measure brain activity based on blood flow. They analyzed the brains of 40 people—a mix of men and women mostly in their 20s and 30s—as they went through scenarios that simulated trying to smuggle something through a security checkpoint. In some cases, the people knew for certain they had contraband in a suitcase. In other cases, they chose from between two and five suitcases, with only one containing contraband (and thus they weren’t sure they were carrying contraband). The risk of getting caught also varied based on how many of the 10 security checkpoints had a guard stationed there. The results showed distinctive patterns of brain activity for when the person knew for certain the suitcase had contraband and when they only knew there was a chance of it, the team reports today in the Proceedings of the National Academy of Sciences. But there was an unexpected twist. Those differing brain patterns only showed up when people were first shown how many security checkpoints were guarded, and then offered the suitcases.
In that case, a computer analysis of the fMRI images correctly classified people as knowing or reckless between 71% and 80% of the time. © 2017 American Association for the Advancement of Science
Jon Hamilton An orangutan named Rocky is helping scientists figure out when early humans might have uttered the first word. Rocky, who is 12 and lives at the Indianapolis Zoo, has shown that he can control his vocal cords much the way people do. He can learn new vocal sounds and even match the pitch of sounds made by a person. "Rocky, and probably other great apes, can do things with their vocal apparatus that, for decades, people have asserted was impossible," says Rob Shumaker, the zoo's director, who has studied orangutans for more than 30 years. Rocky's abilities suggest that our human ancestors could have begun speaking 10 million years ago, about the time humans and great apes diverged, Shumaker says. Until now, many scientists thought that speech required changes in the brain and vocal apparatus that evolved more recently, during the past 2 million years. The vocal abilities of orangutans might have gone undetected had it not been for Rocky, an ape with an unusual past and a rare relationship with people. Rocky was separated from his mother soon after he was born, and spent his early years raised largely by people, and working in show business. "He was certainly the most visible orangutan in entertainment at the time," says Shumaker. "TV commercials, things like that."
By Catherine Offord A few years ago, UK composer and technology reporter LJ Rich participated in a music technology competition as part of a project with the BBC. The 24-hour event brought together various musicians, and entailed staying awake into the wee hours trying to solve technical problems related to music. Late into the night, during a break from work, Rich thought of a way to keep people’s spirits up. “At about four in the morning, I remember playing different tastes to people on a piano in the room we were working in,” she says. For instance, “to great amusement, during breakfast I played people the taste of eggs.” It didn’t take long before Rich learned, for the first time, that food’s association with music was not as universally appreciated as she had assumed. “You realize everybody else doesn’t perceive the world that way,” she says. “For me, it was quite a surprise to find that people didn’t realize that certain foods had different keys.” Rich had long known she had absolute pitch—the ability to identify a musical note, such as B flat, without any reference. But that night, she learned she also has what’s known as synesthesia, a little-understood mode of perception that links senses such as taste and hearing in unusual ways, and is thought to be present in around 4 percent of the general population. It’s a difficult phenomenon to get to the bottom of. Like Rich, many synesthetes are unaware their perception is atypical; what’s more, detecting synesthesia usually relies on self-reported experiences—an obstacle for standardized testing. But a growing body of evidence suggests that Rich is far from being alone in possessing both absolute pitch and synesthesia. © 1986-2017 The Scientist
Link ID: 23353 - Posted: 03.14.2017
There is widespread interest among teachers in the use of neuroscientific research findings in educational practice. However, misconceptions and myths, supposedly based on sound neuroscience, are also prevalent in our schools. We wish to draw attention to this problem by focusing on an educational practice that claims a basis in neuroscience yet lacks sufficient evidence, and so, we believe, should not be promoted or supported. Generally known as “learning styles”, it is the belief that individuals can benefit from receiving information in their preferred format, based on a self-report questionnaire. This belief has much intuitive appeal because individuals are better at some things than others and ultimately there may be a brain basis for these differences. Learning styles promises to optimise education by tailoring materials to match the individual’s preferred mode of sensory information processing. There are, however, a number of problems with the learning styles approach. First, there is no coherent framework of preferred learning styles. Usually, individuals are categorised into one of three preferred styles of auditory, visual or kinesthetic learners based on self-reports. One study found that there were more than 70 different models of learning styles including, among others, “left v right brain,” “holistic v serialists,” “verbalisers v visualisers” and so on. The second problem is that categorising individuals can lead to the assumption of a fixed or rigid learning style, which can impair motivation to apply oneself or adapt. Finally, and most damning, systematic studies of the effectiveness of learning styles have consistently found either no evidence or very weak evidence to support the hypothesis that matching or “meshing” material in the appropriate format to an individual’s learning style is selectively more effective for educational attainment.
Students will improve if they think about how they learn but not because material is matched to their supposed learning style.
Keyword: Learning & Memory
Link ID: 23352 - Posted: 03.14.2017
An international team of researchers has conducted the first study of its kind to look at the genomic underpinnings of obesity in continental Africans and African-Americans. They discovered that approximately 1 percent of West Africans, African-Americans and others of African ancestry carry a genomic variant that increases their risk of obesity, a finding that provides insight into why obesity clusters in families. Researchers at the National Human Genome Research Institute (NHGRI), part of the National Institutes of Health, and their African collaborators published their findings March 13, 2017, in the journal Obesity. People with genomic differences in the semaphorin-4D (SEMA4D) gene were about six pounds heavier than those without the genomic variant, according to the study. Most of the genomic studies conducted on obesity to date have been in people of European ancestry, despite an increased risk of obesity in people of African ancestry. Obesity is a global health problem, contributing to premature death and morbidity by increasing a person’s risk of developing diabetes, hypertension, heart disease and some cancers. While obesity mostly results from lifestyle and cultural factors, including excess calorie intake and inadequate levels of physical activity, it has a strong genomic component. The burden of obesity is, however, not the same across U.S. ethnic groups, with African-Americans having the highest age-adjusted rates of obesity, said Charles N. Rotimi, Ph.D., chief of NHGRI’s Metabolic, Cardiovascular and Inflammatory Disease Genomics Branch and director of the Center for Research on Genomics and Global Health (CRGGH) at NIH. CRGGH examines the socio-cultural and genomic factors at work in health disparities — the negative health outcomes that impact certain groups of people — so they can be translated into policies that reduce or eliminate healthcare inequalities in the United States and globally.
Richard A. Friedman Jet lag makes everyone miserable. But it makes some people mentally ill. There’s a psychiatric hospital not far from Heathrow Airport that is known for treating bipolar and schizophrenic travelers, some of whom are occasionally found wandering aimlessly through the terminals. A study from the 1980s of 186 of those patients found that those who’d traveled from the west had a higher incidence of mania, while those who’d traveled from the east had a higher incidence of depression. I saw the same thing in one of my patients who suffered from manic depression. When he got depressed after a vacation to Europe, we assumed he was just disappointed about returning to work. But then he had a fun trip out West and returned home in what’s called a hypomanic state: He was expansive, a fount of creative ideas. It was clear that his changes in mood weren’t caused by the vacation blues, but by something else. The problem turned out to be a disruption in his circadian rhythm. He didn’t need drugs; he needed the right doses of sleep and sunlight at the right time. It turns out that that prescription could treat much of what ails us. Clinicians have long known that there is a strong link between sleep, sunlight and mood. Problems sleeping are often a warning sign or a cause of impending depression, and can make people with bipolar disorder manic. Some 15 years ago, Dr. Francesco Benedetti, a psychiatrist in Milan, and colleagues noticed that hospitalized bipolar patients who were assigned to rooms with views of the east were discharged earlier than those with rooms facing the west — presumably because the early morning light had an antidepressant effect. The notion that we can manipulate sleep to treat mental illness has also been around for many years. Back in the late 1960s, a German psychiatrist heard about a woman in Tübingen who was hospitalized for depression and claimed that she normally kept her symptoms in check by taking all-night bike rides. 
He subsequently demonstrated in a group of depressed patients that a night of complete sleep deprivation produced an immediate, significant improvement in mood in about 60 percent of the group. © 2017 The New York Times Company
By Knvul Sheikh As we get older, we start to think a little bit more slowly, we are less able to multitask and our ability to remember things gets a little wobblier. This cognitive transformation is linked to a steady, widespread thinning of the cortex, the brain's outermost layer. Yet the change is not inevitable. So-called super agers retain their good memory and thicker cortex as they age, a recent study suggests. Researchers believe that studying what makes super agers different could help unlock the secrets to healthy brain aging and improve our understanding of what happens when that process goes awry. “Looking at successful aging could provide us with biomarkers for predicting resilience and for things that might go wrong in people with age-related diseases like Alzheimer's and dementia,” says study co-author Alexandra Touroutoglou, a neuroscientist at Harvard Medical School. Touroutoglou and her team gave standard recall tests to a group of 40 participants between the ages of 60 and 80 and 41 participants aged 18 to 35. Among the older participants, 17 performed as well as or better than adults four to five decades younger. When the researchers looked at MRI scans of the super agers' brains, they found that their brains not only functioned more like young brains, they also looked very similar. Two brain networks in particular seemed to be protected from shrinking: the default mode network, which helps to store and recall new information, and the salience network, which is associated with directing attention and identifying important details. In fact, the thicker these regions were, the better the super agers' memory was. © 2017 Scientific American,
Is there life after death for our brains? It depends. Loretta Norton, a doctoral student at Western University in Canada, was curious, so she and her collaborators asked critically ill patients and their families if they could record brain activity in the half hours before and after life support was removed. They ended up recording four patients with electroencephalography, better known as EEG, which uses small electrodes attached to a person’s head to measure electrical activity in the brain. In three patients, the EEG showed brain activity stopping up to 10 minutes before the person’s heart stopped beating. But in a fourth, the EEG picked up so-called delta wave bursts up to 10 minutes after the person’s heart stopped. Delta waves are associated with deep sleep, also known as slow-wave sleep. In living people, neuroscientists consider slow-wave sleep to be a key process in consolidating memories. The study also raises questions about the exact moment when death occurs. Here’s Neuroskeptic: Another interesting finding was that the actual moment at which the heart stopped was not associated with any abrupt change in the EEG. The authors found no evidence of the large “delta blip” (the so-called “death wave“), an electrical phenomena which has been observed in rats following decapitation. With only four patients, it’s difficult to draw any sort of broad conclusion from this study. But it does suggest that death may be a gradual process as opposed to a distinct moment in time. © 1996-2017 WGBH Educational Foundation
Link ID: 23348 - Posted: 03.13.2017
By Michael Price The objects and people children play with as early as toddlerhood may provide clues to their eventual sexual orientation, reveals the largest study of its kind. The investigation, which tracked more than 4500 kids over the first 15 years of their lives, seeks to answer one of the most controversial questions in the social sciences, but experts are mixed on the findings. “Within its paradigm, it’s one of the better studies I’ve seen,” says Anne Fausto-Sterling, professor emerita of biology and gender studies at Brown University. The fact that it looks at development over time and relies on parents’ observations is a big improvement over previous studies that attempted to answer similar questions based on respondents’ own, often unreliable, memories, she says. “That being said … they’re still not answering questions of how these preferences for toys or different kinds of behaviors develop in the first place.” The new study builds largely on research done in the 1970s by American sex and gender researcher Richard Green, who spent decades investigating sexuality. He was influential in the development of the term “gender identity disorder” to describe stress and confusion over one’s sex and gender, though the term—and Green’s work more broadly—has come under fire from many psychologists and social scientists today who say it’s wrong to label someone’s gender and sexuality “disordered.” In the decades since, other studies have reported that whether a child plays along traditional gender lines can predict their later sexual orientation. But these have largely been criticized for their small sample sizes, for drawing from children who exhibit what the authors call “extreme” gender nonconformity, and for various other methodological shortcomings. © 2017 American Association for the Advancement of Science