Most Recent Links
By Lydia Pyne | On August 3, 1908, the first near-complete Neanderthal skeleton was discovered in a cave near the village of La Chapelle-aux-Saints in south central France, during a survey of the region’s Paleolithic archaeological sites. For decades prior, prehistorians had collected bits and pieces of curious but not-quite-human fossils from museums and excavations alike—the odd skull here, a scrap of tooth there. In 1863, the mélange of bones was finally given its own species designation, Homo neanderthalensis. Forty-five years later, the La Chapelle discovery was the first Neanderthal specimen found in an original archaeological context and the first to be expertly excavated and carefully studied. Because the body was arranged in a flexed, fetal position and carefully placed in the floor of the cave, excavators argued that the fossil—nicknamed the Old Man—had been purposefully buried by his Neanderthal contemporaries. More than any other single individual, the Old Man of La Chapelle has shaped the way that science and popular culture have thought about Neanderthals. But why? What is it about this Neanderthal’s story that is so special? In short, the Old Man was the right fossil found at the right time. He was—and still is—offered as a key bit of evidence in debates about evolution and human origins. He quickly became a scientific touchstone, an archetype for how science and popular culture create celebrity fossils. I explore the stories of similarly spectacular paleoanthropological finds in my new book Seven Skeletons: The Evolution of the World’s Most Famous Human Fossils. © 1986-2016 The Scientist
Link ID: 22585 - Posted: 08.23.2016
Sara Reardon Neuroscientists have invented a way to watch the ebb and flow of the brain's chemical messengers in real time. They were able to see the surge of neurotransmitters as mice were conditioned — similarly to Pavlov's famous dogs — to salivate in response to a sound. The study, presented at the American Chemical Society’s meeting in Philadelphia, Pennsylvania, on 22 August, uses a technique that could help to disentangle the complex language of neurotransmitters. Ultimately, it could lead to a better understanding of brain circuitry. The brain’s electrical surges are easy to track. But detecting the chemicals that drive this activity — the neurotransmitters that travel between brain cells and lead them to fire — is much harder. “There’s a hidden signalling network in the brain, and we need tools to uncover it,” says Michael Strano, a chemical engineer at the Massachusetts Institute of Technology in Cambridge. In many parts of the brain, neurotransmitters can exist at undetectably low levels. Typically, researchers monitor them by sucking fluid out from between neurons and analysing the contents in the lab. But that technique cannot measure activity in real time. Another option is to insert a metal probe into the space between neurons to measure how neurotransmitters react chemically when they touch metal. But the probe is unable to distinguish between structurally similar molecules, such as dopamine, which is involved in pleasure and reward, and noradrenaline, which is involved in alertness. © 2016 Macmillan Publishers Limited
Laura Sanders Fractions of a second after food hits the mouth, a specialized group of energizing nerve cells in mice shuts down. After the eating stops, the nerve cells spring back into action, scientists report August 18 in Current Biology. This quick response to eating offers researchers new clues about how the brain drives appetite and may also provide insight into narcolepsy. These nerve cells have intrigued scientists for years. They produce a molecule called orexin (also known as hypocretin), thought to have a role in appetite. But their bigger claim to fame came when scientists found that these cells were largely missing from the brains of people with narcolepsy. People with narcolepsy are more likely to be overweight than other people, and this new study may help explain why, says neuroscientist Jerome Siegel of UCLA. These cells may have more subtle roles in regulating food intake in people without narcolepsy, he adds. Results from earlier studies hinted that orexin-producing nerve cells are appetite stimulators. But the new results suggest the opposite. These cells actually work to keep extra weight off. “Orexin cells are a natural obesity defense mechanism,” says study coauthor Denis Burdakov of the Francis Crick Institute in London. “If they are lost, animals and humans gain weight.” Mice were allowed to eat normally while researchers eavesdropped on the behavior of their orexin nerve cells. Within milliseconds of eating, orexin nerve cells shut down and stopped sending signals. © Society for Science & the Public 2000 - 2016
Link ID: 22583 - Posted: 08.22.2016
By KATHERINE KINZLER You may not be surprised to learn that food preference is a social matter. What we choose to eat depends on more than just what tastes good or is healthful. People in different cultures eat different things, and within a culture, what you eat can signal something about who you are. More surprising is that the sociality of food selection, it turns out, runs deep in human nature. In research published this month in the Proceedings of the National Academy of Sciences, my colleagues and I showed that even 1-year-old babies understand that people’s food preferences depend on their social or cultural group. Interestingly, we found that babies’ thinking about food preferences isn’t really about food per se. It’s more about the people eating foods, and the relationship between food choice and social groups. While it’s hard to know what babies think before they can talk, developmental psychologists have long capitalized on the fact that babies’ visual gaze is guided by their interest. Babies tend to look longer at something that is novel or surprising. Do something bizarre the next time you meet a baby, and you’ll notice her looking intently. Using this method, the psychologists Zoe Liberman, Amanda Woodward, Kathleen Sullivan and I conducted a series of studies. Led by Professor Liberman, we brought more than 200 1-year-olds (and their parents) into a developmental psychology lab, and showed them videos of people visibly expressing like or dislike of foods. For instance, one group of babies saw a video of a person who ate a food and expressed that she loved it. Next they saw a video of a second person who tried the same food and also loved it. This second event was not terribly surprising to the babies: The two people agreed, after all. Accordingly, the babies did not look for very long at this second video; it was what they expected. © 2016 The New York Times Company
Laura Sanders For some people, fentanyl can be a life-saver, easing profound pain. But outside of a doctor’s office, the powerful opioid drug is also a covert killer. In the last several years, clandestine drugmakers have begun experimenting with this ingredient, baking it into drugs sold on the streets, most notably heroin. Fentanyl and closely related compounds have “literally invaded the entire heroin supply,” says medical toxicologist Lewis Nelson of New York University Langone Medical Center. Fentanyl is showing up in other drugs, too. In San Francisco’s Bay Area in March, high doses of fentanyl were laced into counterfeit versions of the pain pill Norco. In January, fentanyl was found in illegal pills sold as oxycodone in New Jersey. And in late 2015, fentanyl turned up in fake Xanax pills in California. This ubiquitous recipe-tinkering makes it impossible for users to know whether they’re about to take drugs mixed with fentanyl. And that uncertainty has proved deadly. Fentanyl-related deaths are rising sharply in multiple areas. National numbers are hard to come by, but in many regions around the United States, fentanyl-related fatalities have soared in recent years. Maryland is one of the hardest-hit states. From 2007 to 2012, the number of fentanyl-related deaths hovered around 30 per year. By 2015, that number had grown to 340. A similar rise is obvious in Connecticut, where in 2012, there were 14 fentanyl-related deaths. In 2015, that number was 188. © Society for Science & the Public 2000 - 2016.
By Andrea Anderson When we bed down in a new locale, our sleep often suffers. A recent study finds that this so-called first-night effect may be the result of partial wakefulness in one side of the brain—as if the brain is keeping watch. Researchers at Brown University and the Georgia Institute of Technology used neuroimaging and a brain wave–tracking approach called polysomnography to record activity in four brain networks in 11 individuals as they slept on two nights about a week apart. The subjects nodded off at their normal bedtimes, and their brains were scanned for about two hours—the length of a sleep cycle. As participants slept, right hemisphere regions showed consistent slow-wave activity regardless of the night. Yet average slow-wave activity was shallower in their left hemisphere during the first night—an asymmetry that was enhanced in those who took longer to fall asleep. The results, published in May in Current Biology, suggest systems in one side of the brain remain active as people venture into unfamiliar sleep situations—an apparent survival strategy reminiscent of the unihemispheric sleep reported in certain animals. Because the results represent just one sleep cycle, however, it is unclear whether the left side of the brain is always tasked with maintaining attentiveness, explains the study's senior author Yuka Sasaki, a cognitive, linguistic and psychological sciences researcher at Brown. It is possible the right hemisphere takes over guard dog duties at some point in the night. © 2016 Scientific American
Link ID: 22580 - Posted: 08.22.2016
By Virginia Morell Scientists have long worried whether animals can respond to the planet’s changing climate. Now, a new study reports that at least one species of songbird—and likely many more—already knows how to prep its chicks for a warming world. They do so by emitting special calls to the embryos inside their eggs, which can hear and learn external sounds. This is the first time scientists have found animals using sound to affect the growth, development, behavior, and reproductive success of their offspring, and adds to a growing body of research revealing that birds can “doctor” their eggs. “The study is novel, surprising, and fascinating, and is sure to lead to much more work on parent-embryo communication,” says Robert Magrath, a behavioral ecologist at the Australian National University in Canberra who was not involved in the study. The idea that the zebra finch (Taeniopygia guttata) parents were “talking to their eggs” occurred to Mylene Mariette, a behavioral ecologist at Deakin University in Waurn Ponds, Australia, while recording the birds’ sounds at an outdoor aviary. She noticed that sometimes when a parent was alone, it would make a rapid, high-pitched series of calls while sitting on the eggs. Mariette and her co-author, Katherine Buchanan, recorded the incubation calls of 61 female and 61 male finches inside the aviary. They found that parents of both sexes uttered these calls only during the end of the incubation period and when the maximum daily temperature rose above 26°C (78.8°F). © 2016 American Association for the Advancement of Science
By Diana Kwon When glial cells were discovered in the 1800s, they were thought to be passive, supporting structures—the “glue,” as their Greek name implies, that holds neurons together in the brain and throughout the nervous system. In recent years, however, neuroscientists have discovered that far from being passive, these small cells play an astonishing variety of roles in both the development and functioning of the brain. Some of the latest discoveries suggest that glia play complex roles in regulating appetite and metabolism, making them a possible target for treating obesity. Signs that glia might play such roles were first identified in the 1980s. Neuroscientist Pierre Magistretti and his colleagues found evidence that neurotransmitters could promote the release of glucose reserves stored in astrocytes, a star-shaped type of glial cell. Other studies revealed that obesity leads to increased activation of glial cells in the hypothalamus—the key area of the brain for controlling metabolic processes. This was despite the fact that, for a long time, “neurons were considered the only players in the control of energy metabolism,” says Cristina García-Cáceres, a neurobiologist at the Helmholtz Diabetes Center in Germany. Two recent studies add new evidence that glia play a key role in metabolism. In one study, published last week in Cell, García-Cáceres, together with Matthias Tschöp, the director of the Helmholtz Diabetes Center, and colleagues, reported that insulin acts on astrocytes to regulate sugar intake in the brain. © 2016 Scientific American
Researchers may have discovered a method of detecting changes in the eye which could identify Parkinson's disease before its symptoms develop. Scientists at University College London (UCL) say their early animal tests could lead to a cheap and non-invasive way to spot the disease. Parkinson's affects 1 in 500 people and is the second most common neurodegenerative disease worldwide. The charity Parkinson's UK welcomed the research as a "significant step". The researchers examined rats and found that changes could be seen at the back of their eyes before visible symptoms occurred. Professor Francesca Cordeiro who led the research said it was a "potentially revolutionary breakthrough in the early diagnosis and treatment of one of the world's most debilitating diseases". "These tests mean we might be able to intervene much earlier and more effectively treat people with this devastating condition." Symptoms of Parkinson's include tremors and muscle stiffness, slowness of movement and a reduced quality of life. These symptoms usually only emerge after brain cells have been damaged. But there is currently no brain scan, or blood test, that can definitively diagnose Parkinson's disease. Parkinson's does not directly cause people to die, but symptoms do get worse over time. © 2016 BBC
Link ID: 22577 - Posted: 08.20.2016
By Emily Underwood In 2010, neurobiologist Beth Stevens had completed a remarkable rise from laboratory technician to star researcher. Then 40, she was in her second year as a principal investigator at Boston Children’s Hospital with a joint faculty position at Harvard Medical School. She had a sleek, newly built lab and a team of eager postdoctoral investigators. Her credentials were impeccable, with high-profile collaborators and her name on an impressive number of papers in well-respected journals. But like many young researchers, Stevens feared she was on the brink of scientific failure. Rather than choosing a small, manageable project, she had set her sights on tackling an ambitious, unifying hypothesis linking the brain and the immune system to explain both normal brain development and disease. Although the preliminary data she’d gathered as a postdoc at Stanford University in Palo Alto, California, were promising, their implications were still murky. “I thought, ‘What if my model is just a model, and I let all these people down?’” she says. Stevens, along with her mentor at Stanford, Ben Barres, had proposed that brain cells called microglia prune neuronal connections during embryonic and later development in response to a signal from a branch of the immune system known as the classical complement pathway. If a glitch in the complement system causes microglia to prune too many or too few connections, called synapses, they’d hypothesized, it could lead to both developmental and degenerative disorders. © 2016 American Association for the Advancement of Science.
Meghan Rosen Zika may harm grown-up brains. The virus, which can cause brain damage in infants infected in the womb, kills stem cells and stunts their numbers in the brains of adult mice, researchers report August 18 in Cell Stem Cell. Though scientists have considered Zika primarily a threat to unborn babies, the new findings suggest that the virus may cause unknown — and potentially long-term — damage to adults as well. In adults, Zika has been linked to Guillain-Barré syndrome, a rare neurological disorder (SN: 4/2/16, p. 29). But for most people, infection is typically mild: a headache, fever and rash lasting up to a week, or no symptoms at all. In pregnant women, though, the virus can lodge in the brain of a fetus and kill off newly developing cells (SN: 4/13/16). If Zika targets newborn brain cells, adults may be at risk, too, reasoned neuroscientist Joseph Gleeson of Rockefeller University in New York City and colleagues. Parts of the forebrain and the hippocampus, which plays a crucial role in learning and memory, continue to generate nerve cells in adult brains. In mice infected with Zika, the virus hit these brain regions hard. Nerve cells died and the regions generated one-fifth to one-half as many new cells compared with those of uninfected mice. The results might not translate to humans; the mice were genetically engineered to have weak immune systems, making them susceptible to Zika. But Zika could potentially harm immunocompromised people and perhaps even healthy people in a similar way, the authors write. © Society for Science & the Public 2000 - 2016.
Keyword: Development of the Brain
Link ID: 22575 - Posted: 08.20.2016
By Nicholas Bakalar Taking antipsychotic medicines during pregnancy does not increase the risk for birth defects, a large new study has found. Antipsychotics are used to treat schizophrenia, bipolar disorder, depression and other psychiatric disorders. Previous studies of their use during pregnancy have been small and have had mixed results. This study, in JAMA Psychiatry, reviewed records of 1,341,715 pregnant women, of whom 9,258 filled prescriptions for the newer atypical antipsychotics like quetiapine (Seroquel) or aripiprazole (Abilify), and 733 for older typical antipsychotics such as haloperidol (Haldol). All prescriptions were filled in the first trimester of pregnancy. After controlling for race, number of pregnancies, smoking, alcohol use, psychiatric conditions, additional medications and other variables, there was no difference in the risk for birth defects between those who took the drugs and those who did not. One possible exception was a marginal increase in risk with one drug, risperidone (Risperdal), which the authors said will require further study. “These findings suggest that the use of antipsychotics during the first trimester does not seem to increase congenital malformation,” or birth defects, said the lead author, Krista F. Huybrechts, an assistant professor of medicine at Harvard. But, she added, “we only looked at congenital malformation, not other possible negative outcomes for women and their children.” © 2016 The New York Times Company
SINCE nobody really knows how brains work, those researching them must often resort to analogies. A common one is that a brain is a sort of squishy, imprecise, biological version of a digital computer. But analogies work both ways, and computer scientists have a long history of trying to improve their creations by taking ideas from biology. The trendy and rapidly developing branch of artificial intelligence known as “deep learning”, for instance, takes much of its inspiration from the way biological brains are put together. The general idea of building computers to resemble brains is called neuromorphic computing, a term coined by Carver Mead, a pioneering computer scientist, in the late 1980s. There are many attractions. Brains may be slow and error-prone, but they are also robust, adaptable and frugal. They excel at processing the sort of noisy, uncertain data that are common in the real world but which tend to give conventional electronic computers, with their prescriptive arithmetical approach, indigestion. The latest development in this area came on August 3rd, when a group of researchers led by Evangelos Eleftheriou at IBM’s research laboratory in Zurich announced, in a paper published in Nature Nanotechnology, that they had built a working, artificial version of a neuron. Neurons are the spindly, highly interconnected cells that do most of the heavy lifting in real brains. The idea of making artificial versions of them is not new. Dr Mead himself has experimented with using specially tuned transistors, the tiny electronic switches that form the basis of computers, to mimic some of their behaviour. © The Economist Newspaper Limited 2016.
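The neuromorphic idea the article describes can be illustrated with a standard textbook abstraction: the leaky integrate-and-fire (LIF) neuron, which accumulates incoming signals, gradually "leaks" charge, and emits a spike when a threshold is crossed. This is only a toy sketch of the general concept; the parameters below are arbitrary illustrations, not a model of the phase-change device built at IBM's Zurich laboratory.

```python
# Toy leaky integrate-and-fire (LIF) neuron, a common abstraction in
# neuromorphic computing. Parameters are arbitrary illustrations.

def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Integrate a sequence of input currents over discrete time steps.

    At each step the membrane potential decays by the leak factor,
    the new input is added, and the neuron fires (emits 1) and resets
    whenever the potential reaches the threshold.
    """
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)   # spike
            potential = 0.0    # reset after firing
        else:
            spikes.append(0)
    return spikes

# A steady sub-threshold input still drives the neuron to fire
# periodically, because charge accumulates faster than it leaks away.
print(lif_neuron([0.4] * 10))  # → [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
```

The appeal for hardware designers is that such a unit computes with sparse, event-driven spikes rather than continuous arithmetic, which is part of what makes brains so frugal with energy.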
By Jef Akst ANDRZEJ KRAUZEAs a psychiatrist at Western University in London, Ontario, Lena Palaniyappan regularly sees patients with schizophrenia, the chronic mental disorder that drastically affects how a person thinks, feels, and behaves. The disorder can be devastating, often involving hallucinations and delusions. But one thing Palaniyappan and other mental health professionals have noticed is that, unlike those with degenerative neurological disorders such as Alzheimer’s disease, Huntington’s, or Parkinson’s, sometimes schizophrenia patients eventually start to improve. “In the clinic we do actually see patients with schizophrenia having a very relentless progress in early years,” Palaniyappan says. “But a lot of them do get better over the years, or they don’t progress as [quickly].” So far, most research has focused on the neurological decline associated with schizophrenia—typically involving a loss of brain tissue. Palaniyappan and his colleagues wondered whether there might be “something happening in the brain [that] helps them come to a state of stability.” To get at this question, he and his colleagues performed MRI scans to assess the cortical thickness of 98 schizophrenia patients at various stages of illness. Sure enough, the researchers noted that, while patients who were less than two years removed from their diagnosis had significantly thinner tissue than healthy controls, those patients who’d had the disease for longer tended to show less deviation in some brain regions, suggesting some sort of cortical amelioration (Psychol Med, doi:10.1017/S0033291716000994, 2016). “Some brain regions are regaining or normalizing while other brain regions continue to show deficits,” Palaniyappan says. © 1986-2016 The Scientist
By Jessica Hamzelou JACK NICHOLSON has a lot to answer for. One of the knock-on effects of hit 1975 movie One Flew Over the Cuckoo’s Nest was a public backlash against electroconvulsive therapy (ECT). The treatment, used since the 1930s for a wide range of mental health conditions, delivers a jolt of electricity to the brain big enough to trigger a seizure. The film’s brutal depiction of ECT and lobbying helped it fall out of favour in the 1980s and 1990s. But ECT may now be undergoing a revival, led by psychiatrists who champion it because of its success rate. “It’s the most effective treatment we have in psychiatry,” says George Kirov at Cardiff University, UK, who oversees ECT treatments in the area. A report from the UK Royal College of Psychiatrists last September showed that three-quarters of people with mental health problems felt improvement after having ECT. And psychiatrists say that a similar percentage of people who have schizophrenia that doesn’t respond to drug treatment find ECT effective. “I’ve never seen an ECT treatment that doesn’t work,” says Helen Farrell, a psychiatrist at the Beth Israel Deaconess Medical Center in Boston. “People have such a skewed view of electroconvulsive therapy. It is seen as primitive and horrific.” Mounting evidence has convinced the US Food and Drug Administration (FDA) to consider reclassifying ECT devices to make the technology more accessible for people with depression or bipolar disorder. The public will still take some convincing, however. In a 2005 survey in Switzerland, for example, 56 per cent were against ECT, while just 1 per cent said they were in favour. © Copyright Reed Business Information Ltd.
Link ID: 22571 - Posted: 08.18.2016
Dean Burnett A lot of people, when they travel by car, ship, plane or whatever, end up feeling sick. They’re fine before they get into the vehicle, they’re typically fine when they get out. But whilst in transit, they feel sick. Particularly, it seems, in self-driving cars. Why? One theory is that it’s due to a weird glitch that means your brain gets confused and thinks it’s being poisoned. This may seem surprising; not even the shoddiest low-budget airline would get away with pumping toxins into the passengers (airline food doesn’t count, and that joke is out of date). So where does the brain get this idea that it’s being poisoned? Despite being a very “mobile” species, humans have evolved for certain types of movement. Specifically, walking, or running. Walking has a specific set of neurological processes tied into it, so we’ve had millions of years to adapt to it. Think of all the things going on in your body when you’re walking, and how the brain would pick up on these. There’s the steady thud-thud-thud and pressure on your feet and lower legs. There’s all the signals from your muscles and the movement of your body, meaning the motor cortex (which controls conscious movement of muscles) and proprioception (the sense of the arrangement of your body in space, hence you can know, for example, where your arm is behind your back without looking at it directly) are all supplying particular signals. © 2016 Guardian News and Media Limited
Angus Chen Once people realized that opioid drugs could cause addiction and deadly overdoses, they tried to use newer forms of opioids to treat the addiction to its parent. Morphine, about 10 times the strength of opium, was used to curb opium cravings in the early 19th century. Codeine, too, was touted as a nonaddictive drug for pain relief, as was heroin. Those attempts were doomed to failure because all opioid drugs interact with the brain in the same way. They dock to a specific neural receptor, the mu-opioid receptor, which controls the effects of pleasure, pain relief and need. Now scientists are trying to create opioid painkillers that give relief from pain without triggering the euphoria, dependence and life-threatening respiratory suppression that causes deadly overdoses. That wasn't thought possible until 2000, when a scientist named Laura Bohn found out something about a protein called beta-arrestin, which sticks to the opioid receptor when something like morphine activates it. When she gave morphine to mice that couldn't make beta-arrestin, they were still numb to pain, but a lot of the negative side effects of the drug were missing. They didn't build tolerance to the drug. At certain dosages, they had less withdrawal. Their breathing was more regular, and they weren't as constipated as normal mice on morphine. Before that experiment, scientists thought the mu-opioid receptor was a simple switch that flicked all the effects of opioids on or off together. Now it seems they could be untied. © 2016 npr
By Melinda Wenner Moyer The science of sleep is woefully incomplete, not least because research on the topic has long ignored half of the population. For decades, sleep studies mostly enrolled men. Now, as sleep researchers are making a more concerted effort to study women, they are uncovering important differences between the sexes. Hormones are a major factor. Estrogen, progesterone and testosterone can influence the chemical systems in the brain that regulate sleep and arousal. Moreover, recent studies indicate that during times of hormonal change—such as puberty, pregnancy and menopause—women are at an increased risk for sleep disorders such as obstructive sleep apnea, restless legs syndrome and insomnia. Women also tend to report that they have more trouble sleeping before and during their menstrual periods. And when women do sleep poorly, they may have a harder time focusing than sleep-deprived men do. In one recent study, researchers shifted the sleep-wake cycles of 16 men and 18 women for 10 days. Volunteers were put on a 28-hour daily cycle involving nearly 19 hours of awake time followed by a little more than nine hours of sleep. During the sleep-shifted period, the women in the group performed much less accurately than the men on cognitive tests. The findings, published in April of this year in the Proceedings of the National Academy of Sciences USA, may help explain why women are more likely than men to get injured working graveyard shifts. In addition, a study conducted in 2015 in teenagers reported that weekday sleep deprivation affects cognitive ability more in girls than in boys. © 2016 Scientific American
By Karen Weintraub There’s been lots of coverage lately about meeting exercise recommendations by completing small chunks of exercise throughout the day rather than one, continuous session. Does the same hold true for meeting sleep recommendations? No. Unfortunately, sleep does not work that way. Substituting periodic naps for one consolidated night of sleep creates severe sleep deprivation, said Dr. Daniel Buysse, a sleep expert and professor of psychiatry at the University of Pittsburgh. He and his colleagues once did an experiment in which volunteers agreed to alternate 30 minutes of sleep with 60 minutes of wakefulness for two and a half days straight. They ended up sleep deprived, he said, because sound sleep is not equally likely at all times of day. People have a better chance of falling quickly into deep, restful sleep at night than midday, even if they feel as though they could fall asleep at any time. “Our biological clocks do not allow us to sleep as well during the day as at night,” he said. “All sleep is not necessarily equal.” That’s why night workers get less sleep on average than people who work other shifts – and suffer health consequences as a result, he said. But it’s always a good idea to make up for lost sleep, regardless of the time of day, said Dr. Ruth Benca, a professor of psychiatry and director of the Center for Sleep Medicine and Sleep Research at the University of Wisconsin-Madison. People used to think that it was better to pull an all-nighter than to break it up with a short nap, but that isn’t true, she said. On the other hand, it may be helpful, she said, to take an afternoon nap to compensate for a short night of sleep, bringing a six-and-a-half hour night up to seven, for instance. “If you have to stay awake for a prolonged period, you can mitigate that a little bit by taking some naps, but you can’t live your life like that,” Dr. Benca said. 
“Any sleep is better than no sleep, and more sleep is better than less sleep.” © 2016 The New York Times Company
Link ID: 22567 - Posted: 08.18.2016
By Robin Wylie Scientists have been searching for a genetic explanation for athletic ability for decades. So far their efforts have focused largely on genes related to physical attributes, such as muscular function and aerobic efficiency. But geneticists have also started to investigate the neurological basis behind what makes someone excel in sports—and new findings implicate dopamine, a neurotransmitter responsible for the feelings of reward and pleasure. Dopamine is also involved in a host of other mental functions, including the ability to deal with stress and endure pain. Consequently, the new research supports the idea that the mental—not just the physical—is what sets elite athletes above the rest. In an effort to piece together what makes a great athlete great, researchers at the University of Parma in Italy collected DNA from 50 elite athletes (ones who had achieved top scores at an Olympic Games or other international competition) and 100 nonprofessional athletes (ones who played sports regularly, but below competitive level). They then compared four genes across the two groups that had previously been suggested as linked to athletic ability: one related to muscle development, one involved with transporting dopamine in the brain, another that regulates levels of cerebral serotonin and one involved in breaking down neurotransmitters. The researchers found a significant genetic difference between the two groups in only one of the genes: the one involved in transporting dopamine. Two particular variants of this gene (called the dopamine active transporter, or DAT) were significantly more common among the elite athletes than in the control group. One variant was almost five times more prevalent in the elite group (occurring in 24 percent of the elites versus 5 percent of the rest); the other variant was approximately 1.7 times more prevalent (51 percent versus 30 percent). The results were published in the Journal of Biosciences. © 2016 Scientific American