Most Recent Links
By NANCY L. SEGAL and SATOSHI KANAZAWA
In 1973, the biologist Robert Trivers and the computer scientist Dan Willard made a striking prediction about parents and their offspring. According to the principles of evolutionary theory, they argued, the male-to-female ratio of offspring should not be 50-50 (as chance would dictate), but rather should vary as a function of how good (or bad) the conditions are in which the parents find themselves. Are the parents’ resources plentiful — or scarce?
The Trivers-Willard hypothesis holds that when their conditions are good, parents will have more male offspring: Males with more resources are likely to gain access to more females, thereby increasing the frequency with which their genes (and thus their parents’ genes) are preserved in future generations. Conversely, male offspring that lack resources are likely to lose out to males that have more resources, so in bad conditions it pays for parents to “invest” more in daughters, who will have more opportunities to mate. It follows, as a kind of corollary, that when parents have plentiful resources they will devote those resources more to their sons, whereas when resources are scarce, parents will devote them more to their daughters. In short: If things are good, you have more boys, and give them more stuff. If things are bad, you have more girls, and give more of your stuff to them.
Is this hypothesis correct? In new research of ours, to be published in the April issue of The Journal of Experimental Child Psychology, we suggest that in the case of breast-feeding, at least, it appears to be. In recent years, evidence has emerged suggesting that in various mammalian species, breast milk — which is, of course, a resource that can be given to children — is tailored for the sex of each offspring.
For example, macaque monkey mothers produce richer milk (with higher gross energy and fat content) for sons than for daughters, but also provide greater quantities of milk and higher concentrations of calcium for daughters than for sons. © 2017 The New York Times Company
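The Trivers-Willard logic above can be put into a toy simulation. This is purely illustrative: the mapping from parental condition to the probability of a son (0.4 to 0.6) is an assumption made for the sketch, not a figure from the study.

```python
import random

random.seed(0)

# Toy Trivers-Willard sketch: assume the probability of a son rises linearly
# with parental condition (0 = scarce resources, 1 = plentiful resources).
# The 0.4-0.6 range is an illustrative assumption, not an empirical estimate.
def offspring_sex(condition):
    p_son = 0.4 + 0.2 * condition
    return "M" if random.random() < p_son else "F"

def fraction_sons(condition, n=100_000):
    return sum(offspring_sex(condition) == "M" for _ in range(n)) / n

rich = fraction_sons(1.0)   # well-resourced parents
poor = fraction_sons(0.0)   # resource-scarce parents
print(rich > 0.5 > poor)    # sons predominate only in good conditions
```

Under these assumed probabilities, sons make up roughly 60 percent of births in good conditions and 40 percent in poor ones, the qualitative pattern the hypothesis predicts.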
By Carl Bialik
A woman has never come closer to the presidency than Hillary Clinton did in winning the popular vote in November. Yet as women march in Washington on Saturday, many of them to protest the presidency of Donald Trump, an important obstacle to the first woman president remains: the hidden, internalized bias many people hold against career advancement by women. And perhaps surprisingly, there is evidence that women hold more of this bias, on average, than men do.
There has been lots of discussion of the role that overt sexism played both in Trump’s campaign and at the ballot box. A YouGov survey conducted two weeks before the election, for example, found that Trump voters had much higher levels of sexism, on average, than Clinton voters, as measured by their level of agreement with statements such as “women seek to gain power by getting control over men.” An analysis of the survey found that sexism played a big role in explaining people’s votes, after controlling for other factors, including gender and political ideology. Other research has reached similar conclusions.
Two recent studies of voters, however, suggest that another, subtler form of bias may also have been a factor in the election. These studies looked at what’s known as “implicit bias,” the unconscious tendency to associate certain qualities with certain groups — in this case, the tendency to associate men with careers and women with family. Researchers have found that this kind of bias is stronger on average in women than in men, and, among women, it is particularly strong among political conservatives. And at least according to one study, this unconscious bias was especially strong among one group in 2016: women who supported Trump.
Link ID: 23134 - Posted: 01.23.2017
Robert McCrum
Stroke, or “brain attack”, is the third biggest killer in the western world, after cancer and heart failure. The life-changing effects associated with this simple, Anglo-Saxon word are readily explained: a stroke occurs when the blood supply to the brain is disrupted by a blood vessel either bursting or blocking, so that the part of the brain supplied by this blood vessel dies. The brain is a much more complex organ than the heart. While strokes are a common feature of everyday life, precisely how and why they occur is far from straightforward.
Each year in the UK, there will be about 50,000 brain attacks. One-third of those affected will die; one-third will be left severely disabled; and about one-third will make some kind of recovery. In the time it takes to read this article, approximately nine people in Britain, from across all age groups, will have suffered a stroke. Or will they?
For the brain is not only super-sensitive territory – as the human animal’s command HQ – it is also top secret. Despite extraordinary progress in MRI scans, the brain remains essentially mysterious and the symptoms of its dysfunction can be hard to diagnose with certainty. An elderly man presenting himself at A&E with unsteady gait and a slurring of his words could be suffering a stroke – or he might just be intoxicated. Treat him for the former, and you’ll save his life; treat him as a drunk, and he might die. © 2017 Guardian News and Media Limited
Link ID: 23133 - Posted: 01.23.2017
By JANE E. BRODY Susan Sills, a Brooklyn artist who until recently made life-size cutouts on plywood using a power saw, long suspected she might be at risk for developing Parkinson’s disease. Both her mother and grandfather had this neurological movement disorder, and she knew that it sometimes runs in families. So she was not surprised when at age 72 she first noticed hand tremors and a neurologist confirmed that she had the disease. But to watch her in action three years later, it would be hard for a layperson to tell. She stands straight, walks briskly, speaks in clarion tones and maintains a schedule that could tire someone half her age. Having wisely put the power saw aside, Ms. Sills now makes intricately designed art jewelry. She is also a docent at the Brooklyn Museum, participates in a cooperative art gallery and assists her husband’s business by entertaining customers. Ms. Sills attributes her energy and well-being partly to the medication she takes but primarily to the hours she spends working out with a physical therapist and personal trainer, who have helped her develop an exercise regimen that, while not a cure, can alleviate Parkinson’s symptoms and slow progression of the disease. “The exercises opened me up,” said Ms. Sills, allowing such symptoms as small steps, slow movements and tiny, cramped handwriting to subside. “The earlier people begin exercising after a Parkinson’s diagnosis, and the higher the intensity of exercise they achieve, the better they are,” Marilyn Moffat, a physical therapist on the faculty of New York University, said. “Many different activities have been shown to be beneficial, including cycling, boxing, dancing and walking forward and backward on a treadmill. If someone doesn’t like one activity, there are others that can have equally good results.” © 2017 The New York Times Company
Link ID: 23132 - Posted: 01.23.2017
By NICHOLAS ST. FLEUR
The tale of the Tasmanian tiger was tragic. Once numerous across Tasmania, the doglike marsupial was branded a sheep killer by colonists in the 1830s and hunted to extinction. The last of its kind, Benjamin, died in a zoo in 1936, and with it many of the secrets of the animals’ lives were lost. The striped creature, which is also known as the thylacine, was hardly studied when it was alive, depriving scientists of an understanding of the behavior of an important predator from Australia’s recent biological past.
Now, for the first time, researchers have performed neural scans on the extinct carnivore’s brain, revealing insights that had been lost since the species went extinct. “Part of the myth about them is what exactly did they eat, how did they hunt and were they social?” said Dr. Gregory Berns, a neuroscientist at Emory University and lead author on the study, which was published Wednesday in the journal PLOS One. “These are questions nobody really knows the answers to.”
Dr. Berns’s main research pertains to dogs and the inner workings of the canine brain, but after learning more about Tasmanian tigers, he became fascinated by the beasts. With their slender bodies, long snouts and sharp teeth, Tasmanian tigers looked as if they could be related to dogs, wolves or coyotes. But actually they are separated by more than 150 million years of evolution. It is a classic example of convergent evolution, in which two organisms that are not closely related develop similar features because of the environment they adapted to and the ecological role they played. To better understand thylacines, Dr. Berns spent two years tracking down two preserved Tasmanian tiger brains, one at the Smithsonian Institution and the other at the Australian Museum. Their brains, like those of all marsupials, are very different from the brains of placental mammals.
The biggest difference is that they lack a corpus callosum, which is the part of the brain that connects the left and right hemispheres. © 2017 The New York Times Company
By Jordan Axt
Imagine playing a game where you’re seated in front of four decks of cards. On the back of two decks are pictures of puppies; on the other two are pictures of spiders. Each deck has some cards that win points and others that lose points. In general, the puppy decks are “good” in that they win you more points than they lose, while the spider decks are “bad” in that they lose you more points than they win. You repeatedly select cards in hopes of winning as many points as possible.
This game seems pretty easy — and it is. Most players favor the puppy decks from the start and quickly learn to continue favoring them because they produce more points. However, if the pictures on the decks are reversed, the game becomes a little harder. People may have a tougher time initially favoring spider decks because it’s difficult to learn that something people fear, like spiders, brings positive outcomes and something people enjoy, like puppies, brings negative outcomes.
Performance on this learning task is best when one’s attitudes and motivations are aligned. For instance, when puppies earn you more points than spiders, people’s preference for puppies can lead them to select more puppies initially, and a motivation to earn as many points as possible leads them to select more and more puppies over time. But when spiders earn you more points than puppies, people have to overcome their initial aversion to spiders in order to perform well. © 2017 Scientific American
Link ID: 23130 - Posted: 01.21.2017
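The learning dynamics in the deck task above can be sketched with a simple value-tracking agent. All numbers here (deck payoffs, noise, exploration rate) are hypothetical stand-ins, since the excerpt does not give the study's actual point values.

```python
import random

random.seed(42)

# Hypothetical payoffs: "good" decks win more than they lose on average,
# "bad" decks the reverse. Individual cards are noisy, as in the game.
MEAN_PAYOFF = {"puppy_1": 5, "puppy_2": 5, "spider_1": -5, "spider_2": -5}

def draw_card(deck):
    return MEAN_PAYOFF[deck] + random.gauss(0, 10)

# Epsilon-greedy learner: keep a running-average payoff per deck,
# usually pick the best-looking deck, occasionally explore at random.
value = {d: 0.0 for d in MEAN_PAYOFF}
picks = {d: 0 for d in MEAN_PAYOFF}

for _ in range(5000):
    if random.random() < 0.1:
        deck = random.choice(list(MEAN_PAYOFF))
    else:
        deck = max(value, key=value.get)
    picks[deck] += 1
    reward = draw_card(deck)
    value[deck] += (reward - value[deck]) / picks[deck]  # incremental mean

best = max(value, key=value.get)
print(best.startswith("puppy"))  # the agent ends up favoring a "good" deck
```

Note the contrast with the human result: if the payoff mapping is reversed (spider decks become profitable), this value-tracker is unaffected, because unlike people it brings no prior attitude toward puppies or spiders to the task.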
Joseph Palamar
On Nov. 30 the FDA approved a Phase III clinical trial to confirm the effectiveness of treating post-traumatic stress disorder (PTSD) with MDMA, also known as Ecstasy. This news appeared in headlines throughout the world, as it represents an important – yet somewhat unorthodox – advance in PTSD treatment. However, the media have largely been referring to Ecstasy – the street name for this drug – as the treatment in this trial, rather than MDMA (3,4-methylenedioxymethamphetamine). This can lead to misunderstanding, as recreational Ecstasy use is a highly stigmatized behavior. Using this terminology may further misconceptions about the study drug and its uses.
While Ecstasy is in fact a common street name for MDMA, what we call Ecstasy has changed dramatically since it became a prevalent recreational drug. Ecstasy now has a very different meaning – socially and pharmacologically. It is understandable why the media have referred to this drug as Ecstasy rather than MDMA. Not only has much of the public at least heard of Ecstasy (and would not recognize MDMA), but this also increases shock value and readership.
But referring to a therapeutic drug by its street name (such as Ecstasy) is misleading – especially since MDMA is known to be among the most popular illicit drugs used at nightclubs and dance festivals. This leads some to assume that street drugs are being promoted and provided to patients, perhaps in a reckless manner. © 2010–2017, The Conversation US, Inc.
Claudia Dreifus
Geneticists tell us that somewhere between 1 and 5 percent of the genome of modern Europeans and Asians consists of DNA inherited from Neanderthals, our prehistoric cousins. At Vanderbilt University, John Anthony Capra, an evolutionary genomics professor, has been combining high-powered computation and a medical records databank to learn what a Neanderthal heritage — even a fractional one — might mean for people today. We spoke for two hours when Dr. Capra, 35, recently passed through New York City. An edited and condensed version of the conversation follows.
Q. Let’s begin with an indiscreet question. How did contemporary people come to have Neanderthal DNA on their genomes?
A. We hypothesize that roughly 50,000 years ago, when the ancestors of modern humans migrated out of Africa and into Eurasia, they encountered Neanderthals. Matings must have occurred then. And later. One reason we deduce this is because the descendants of those who remained in Africa — present day Africans — don’t have Neanderthal DNA.
Q. What does that mean for people who have it?
A. At my lab, we’ve been doing genetic testing on the blood samples of 28,000 patients at Vanderbilt and eight other medical centers across the country. Computers help us pinpoint where on the human genome this Neanderthal DNA is, and we run that against information from the patients’ anonymized medical records. We’re looking for associations. What we’ve been finding is that Neanderthal DNA has a subtle influence on risk for disease. It affects our immune system and how we respond to different immune challenges. It affects our skin. You’re slightly more prone to a condition where you can get scaly lesions after extreme sun exposure. There’s an increased risk for blood clots and tobacco addiction. To our surprise, it appears that some Neanderthal DNA can increase the risk for depression; however, there are other Neanderthal bits that decrease the risk.
Roughly 1 to 2 percent of one’s risk for depression is determined by Neanderthal DNA. It all depends on where on the genome it’s located. © 2017 The New York Times Company
Children who have their tonsils removed to treat chronic throat infections or breathing problems during sleep may get more short-term symptom relief than similar children who don’t get tonsillectomies, two recent studies suggest. Over time, however, the benefits of surgery for chronic streptococcal throat infections appear to go away. Three years after tonsillectomies, children who had these procedures had about the same number of throat infections as those who didn’t get their tonsils taken out, one of the studies in the journal Pediatrics found. “Tonsillectomy, while very common and generally safe, is not completely without risk,” said Sivakumar Chinnadurai, senior author of the strep throat study and a researcher at Vanderbilt University Medical Center in Nashville. “The recognition of risks, and the knowledge that some patients’ infection rate improves over time has led to [strep] infection being a much less common indication for tonsillectomy than it was in the past,” Chinnadurai added by email. “While tonsillectomy remains one of the most common surgeries performed in the United States, the main indication for children has switched to obstructed breathing.” To assess the potential for tonsillectomies to help young people with chronic strep infections, Chinnadurai and colleagues examined data from seven studies of children who had experienced at least three strep infections in the previous one to three years. © 1996-2017 The Washington Post
Link ID: 23127 - Posted: 01.21.2017
NEUROSCIENCE, like many other sciences, has a bottomless appetite for data. Flashy enterprises such as the BRAIN Initiative, announced by Barack Obama in 2013, or the Human Brain Project, approved by the European Union in the same year, aim to analyse the way that thousands or even millions of nerve cells interact in a real brain. The hope is that the torrents of data these schemes generate will contain some crucial nuggets that let neuroscientists get closer to understanding how exactly the brain does what it does. But a paper just published in PLOS Computational Biology questions whether more information is the same thing as more understanding. It does so by way of neuroscience’s favourite analogy: comparing the brain to a computer. Like brains, computers process information by shuffling electricity around complicated circuits. Unlike the workings of brains, though, those of computers are understood on every level. Eric Jonas of the University of California, Berkeley, and Konrad Kording of Northwestern University, in Chicago, who both have backgrounds in neuroscience and electronic engineering, reasoned that a computer was therefore a good way to test the analytical toolkit used by modern neuroscience. Their idea was to see whether applying those techniques to a microprocessor produced information that matched what they already knew to be true about how the chip works. © The Economist Newspaper Limited 2017.
Keyword: Brain imaging
Link ID: 23126 - Posted: 01.21.2017
By GRETCHEN REYNOLDS Being nearsighted is far more common than it once was. The prevalence of myopia, the condition’s medical name, in Americans has soared by 66 percent since the early 1970s, according to a 2009 study by the National Eye Institute; in China and other East Asian countries, as many as 90 percent of recent high school graduates are thought to be nearsighted. Myopia results when eyeballs are longer than normal, changing the angle at which light enters the eye and therefore the ability to focus on distant objects. The disorder involves a complex interplay of genetics and environment and usually begins before adolescence, when the eye is growing, but it can worsen in early adulthood. Some experts connect the elevated rates of myopia to the many hours young people stare at computers and other screens. But a recent study published in JAMA Ophthalmology suggests that a greater factor may be a side effect of all that screen-watching — it’s keeping children inside. This new study joins a growing body of research indicating that a lack of direct sunlight may reshape the human eye and impair vision. Researchers at King’s College London, the London School of Hygiene and Tropical Medicine and other institutions gave vision exams to more than 3,100 older European men and women and interviewed them at length about their education, careers and how often they remembered being outside during various stages of their lives. This biographical information was then cross-referenced with historical data about sunlight, originally compiled for research on skin cancer and other conditions. © 2017 The New York Times Company
By R. Douglas Fields With American restrictions on travel lifting, interest in Cuba has skyrocketed, especially among scientists considering developing collaborations and student exchange programs with their Caribbean neighbors. But few researchers in the United States know how science and higher education are conducted in communist Cuba. Undark met with Dr. Mitchell Valdés-Sosa, director of the Cuban Neuroscience Center, in his office in Havana to learn how someone becomes a neuroscientist in Cuba, and to discuss what the future may hold for scientific collaborations between the two nations. It is helpful to appreciate some of the ways that higher education and research operate differently in communist Cuba. In contrast to the local institutional and individual control of decisions in the U.S., the central government in Cuba makes career and educational decisions for its citizens. Scientific research is directed by authorities to meet the needs of the developing country, and Ph.D. dissertation proposals must satisfy this goal for approval. Much of the graduate education takes place in biotechnology companies and research centers that are authorized by the government — a situation resembling internships in the U.S. Development, production, and marketing of products from biomedical research and education are all carried out in the same center, and the sales of these products provide financial support to the institution. Copyright 2017 Undark
Link ID: 23124 - Posted: 01.19.2017
By Nicole Kobie Getting drunk could make it harder to enter your password – even if your brainwaves are your login. Brainwave authentication is one of many biometric measures touted as an alternative to passwords. The idea is for a person to authenticate their identity with electroencephalogram (EEG) readings. For example, instead of demanding a passcode, a computer could display a series of words on a screen and measure the user’s response via an EEG headset. EEG signatures are unique and are more complex than a standard password, making them difficult to hack. But while research suggests that EEG readings can authenticate someone’s identity with accuracy rates around 94 per cent, there could be confounding factors – including whether you’ve had a few too many drinks. Tommy Chin, a security researcher at cybersecurity consultancy firm Grimm, and Peter Muller, a graduate student at the Rochester Institute of Technology, decided to test this theory experimentally, by analysing people’s brainwaves before and after drinking shots of Fireball, a cinnamon-flavoured whisky. “Brainwaves can be easily manipulated by external influences such as drugs [like] opioids, caffeine, and alcohol,” Chin says. “This manipulation makes it a significant challenge to verify the authenticity of the user because they drank an immense amount of alcohol or caffeinated drink.” © Copyright Reed Business Information Ltd.
By Avi Selk “Oh Long Johnson,” a cat once said, back in the primordial history of Internet memes. “Oh Don Piano. Why I eyes ya.” Or so said the captions — appended to the gibberish of a perturbed house cat on “America's Funniest Home Videos” in 1999 and rediscovered in the YouTube era, when millions of people heard something vaguely human echo in a distant species. It was weird. And hilarious. And just maybe, profound. As the “Oh Long Johnson” craze was fading a few years ago, a wave of scientific discoveries about apes and monkeys began upending old assumptions about the origins of language. Only humans could willfully control their vocal tracts, went the established wisdom. Until Koko the gorilla coughed on command. Surely, then, our vowels were ours alone. But this month, researchers picked up British ohs in the babble of baboons. Study after study is dismantling a hypothesis that has stood for decades: that the seeds of language did not exist before modern humans, who got all the way to Shakespeare from scratch. And if so much of what we thought we knew about the uniqueness of human speech was wrong, some think it's time to take a second look at talking pet tricks. “It's humbling to understand that humans, in the end, are just another species of primate,” said Marcus Perlman, who led the Koko study in 2015. © 1996-2017 The Washington Post
About 11 per cent of Canadians aged 15 to 24 experienced depression at some point in their lives, and fewer than half of them sought professional help for a mental health condition over the previous year, according to Statistics Canada. The information was released Wednesday in the agency's Health Reports, and is based on data from the 2012 Canadian Community Health Survey Mental Health. The report was based on 4,031 respondents aged 15 to 24, which when extrapolated represents more than 4.4 million young people. Canadians 15 to 24 years old had a higher rate of depression than any other age group. Suicide is the second leading cause of death (after accidents), accounting for nearly a quarter of deaths in the 15-24 category, Statistics Canada said. An estimated 14 per cent of respondents reported having had suicidal thoughts at some point in their lives. The figure includes six per cent having that thought in the past 12 months. As well, 3.5 per cent had attempted suicide, according to the data. Report author Leanne Findlay said the findings confirm people with depression or suicidal thoughts are increasingly likely to seek professional help. Young people in the study were more likely to turn to friends or family, and when they did, generally felt they received a lot or some help. Factors such as perceived ability to deal with stress and "negative social interactions" — for instance, feeling others were angry with you — were related to depression and suicidal thoughts. Symptoms of depression include feeling sad or having trouble sleeping that last two weeks or more, Findlay said. "Knowledge of these risk and protective factors may facilitate early intervention," Findlay concluded. ©2017 CBC/Radio-Canada.
Tina Rosenberg
It has been nearly 30 years since the first needle exchange program opened in the United States, in Tacoma, Wash., in 1988. It was a health measure to prevent injecting drug users from sharing needles, and therefore spreading H.I.V. and hepatitis. The idea was controversial, to say the least. Many people felt — and still feel — that it enables drug use and sends a message that drug use is O.K. and can be done safely.
Today the evidence is overwhelming that needle exchange prevents disease, increases use of drug treatment by winning users’ trust and bringing them into the health system, and does not increase drug use. Its utility has won over some critics. When Vice President-elect Mike Pence was governor of Indiana, he authorized needle exchange programs as an emergency response to an H.I.V. outbreak. “I do not support needle exchange as antidrug policy, but this is a public health emergency,” he said at a news conference in 2015. Needle exchange saved New York City from a generalized H.I.V. epidemic. In 1990, more than half of injecting drug users had H.I.V. Then in 1992, needle exchange began — and by 2001, H.I.V. prevalence had fallen to 13 percent.
America has another epidemic now: overdose deaths from opioids, heroin and fentanyl, a synthetic opioid so powerful that a few grains can kill. A thousand people died of overdose in the city last year — three times the number who were killed in homicides. Nationally, drug overdose has passed firearms and car accidents as the leading cause of injury deaths. If there is a way to save people from overdose death without creating harm, we should do it. Yet there is a potent weapon that we’re ignoring: the supervised injection room. According to a report by the London-based group Harm Reduction International, 90 supervised injection sites exist around the world: in Canada, Australia and eight countries in Europe. Scotland and Ireland plan to open sites this year.
In the United States, state officials in New York, California and Maryland, and city officials in Seattle (where a task force recommended two sites), San Francisco, New York City, Ithaca, N.Y., and elsewhere, are discussing such facilities. © 2017 The New York Times Company
Keyword: Drug Abuse
Link ID: 23120 - Posted: 01.18.2017
Thorsten Rudroff
An estimated 400,000 Americans are currently living with multiple sclerosis, an autoimmune disease in which the body’s immune cells attack a fatty substance called myelin in the nerves. Common symptoms are gait and balance disorders, cognitive dysfunction, fatigue, pain and muscle spasticity.
Colorado has the highest proportion of people living with MS in the United States. It is estimated that one in 550 people living in the state has MS, compared to one in 750 nationally. The reason for this is unknown, but could be related to several factors, such as vitamin D deficiency or environment.
Currently available therapies do not sufficiently relieve MS symptoms. As a result many people with the condition are trying alternative therapies, like cannabis. Based on several studies, the American Academy of Neurology states that there is strong evidence that cannabis is effective for treatment of pain and spasticity. Although there are many anecdotal reports indicating cannabis’ beneficial effects for treatment of MS symptoms such as fatigue, muscle weakness, anxiety and sleep deprivation, they have not been scientifically verified. This is because clinical trials – where patients are given cannabis – are difficult to do because of how the substance is regulated at the federal level. To learn more, my Integrative Neurophysiology Laboratory at Colorado State University is studying people with MS in the state who are already using medical cannabis as a treatment to investigate what MS symptoms the drug can effectively treat. © 2010–2017, The Conversation US, Inc.
By Helen Briggs, BBC News
Babies build knowledge about the language they hear even in the first few months of life, research shows. If you move countries and forget your birth language, you retain this hidden ability, according to a study. Dutch-speaking adults adopted from South Korea exceeded expectations at Korean pronunciation when retrained after losing their birth language. Scientists say parents should talk to babies as much as possible in early life.
Dr Jiyoun Choi of Hanyang University in Seoul led the research. The study is the first to show that the early experience of adopted children in their birth language gives them an advantage decades later even if they think it is forgotten, she said. “This finding indicates that useful language knowledge is laid down in [the] very early months of life, which can be retained without further input of the language and revealed via re-learning,” she told BBC News.
In the study, adults aged about 30 who had been adopted as babies by Dutch-speaking families were asked to pronounce Korean consonants after a short training course. Korean consonants are unlike those spoken in Dutch. The participants were compared with a group of adults who had not been exposed to the Korean language as children and then rated by native Korean speakers. Both groups performed to the same level before training, but after training the international adoptees exceeded expectations. There was no difference between children who were adopted under six months of age - before they could speak - and those who were adopted after 17 months, when they had learned to talk. This suggests that the language knowledge retained is abstract in nature, rather than dependent on the amount of experience. © 2017 BBC
By Lisa Rapaport
Researchers examined data on high school soccer players from 2005 to 2014 and found non-concussion injury rates declined for boys and were little changed for girls. But concussions increased in both male and female players. The significant rise in concussion rates "could be mainly due to a better recognition of concussion by medical and coaching staff," study leader Dr. Morteza Khodaee, a sports medicine researcher at the University of Colorado School of Medicine, said in an email.
The research team looked at injuries per athletic exposure (AE), counting each practice or competition as one exposure. Overall, there were 6,154 injuries during 2.98 million athletic exposures, for an injury rate of 2.06 per 1,000 AEs, the study found. That included about 1.8 million athletic exposures among girls and 1.5 million among boys. Girls were 27 percent more likely to sustain soccer injuries than boys, the authors reported online December 28 in the British Journal of Sports Medicine. Injuries were 42 percent more common in competitions than during practice.
"The majority of injuries during competitions occurred during the second half indicating a potential accumulated effect of fatigue," the authors reported. "It is well known that the risk of injury is higher in competition compared with practice," Khodaee said. "This is most likely due to more intense, full contact and potentially riskier play that occurs in competition." Still, while injury rates were significantly higher in competition, more than one third of all injuries occurred in practice. © 2017 Scientific American
Keyword: Brain Injury/Concussion
Link ID: 23117 - Posted: 01.18.2017
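The headline injury rate in the soccer study above reduces to simple arithmetic on the reported totals. Because both inputs are rounded in the excerpt, the result only approximately matches the published 2.06 per 1,000 AEs.

```python
# Rate per 1,000 athletic exposures (AEs), from the totals quoted above.
injuries = 6_154        # total reported injuries, 2005-2014
exposures = 2_980_000   # total athletic exposures (practices + competitions)

rate_per_1000 = injuries / exposures * 1000
print(round(rate_per_1000, 2))  # 2.07 from these rounded inputs; the study reports 2.06
```

The small discrepancy comes from "2.98 million" being a rounded exposure count, not the study's exact denominator.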
By Catherine Offord In the early 20th century, Danish biologist Johannes Schmidt solved a puzzle that had confounded European fisherman for generations. Freshwater eels—popular for centuries on menus across northern Europe—were abundant in rivers and creeks, but only as adults, never as babies. So where were they coming from? In 1922, after nearly two decades of research, Schmidt published the answer: the Sargasso Sea, the center of a massive, swirling gyre in the North Atlantic Ocean. Now regarded as some of the world’s most impressive animal migrators, European eels (Anguilla anguilla) journey westward across the Atlantic to spawning sites in the Sargasso; their eggs hatch into larvae that are carried back across the ocean by the Gulf Stream, arriving two or three years later to repopulate European waterways. For decades, researchers have assumed that adults made the journey in one short and rapid migration, leaving European coastlines in autumn and arriving in the Sargasso Sea, ready to spawn, the following spring. But this assumption rests on surprisingly little evidence, says behavioral ecologist David Righton of the UK Centre for Environment, Fisheries, and Aquaculture Science. “Since Johannes Schmidt identified this spawning area in the Sargasso Sea, people have been wondering about that great journey and trying to figure out how to follow the eels,” says Righton, whose work on epic marine migrations includes appropriately titled projects such as CODYSSEY and EELIAD. “But the technology hasn’t been available. . . . They just slip away into the darkness, really, in autumn, and no one knows what happens to them.” © 1986-2017 The Scientist
Keyword: Animal Migration
Link ID: 23116 - Posted: 01.18.2017