Most Recent Links
Few genes have made the headlines as much as FOXP2. The first gene associated with language disorders, it was later implicated in the evolution of human speech. Girls make more of the FOXP2 protein, which may help explain their precociousness in learning to talk. Now, neuroscientists have figured out how one of its molecular partners helps Foxp2 exert its effects.
The findings may eventually lead to new therapies for inherited speech disorders, says Richard Huganir, the neurobiologist at Johns Hopkins University School of Medicine in Baltimore, Maryland, who led the work. Foxp2 controls the activity of a gene called Srpx2, he notes, which helps some of the brain's nerve cells beef up their connections to other nerve cells. By establishing what SRPX2 does, researchers can look for defective copies of it in people suffering from problems talking or learning to talk.
Until 2001, scientists were not sure how genes influenced language. Then Simon Fisher, a neurogeneticist now at the Max Planck Institute for Psycholinguistics in Nijmegen, the Netherlands, and his colleagues fingered FOXP2 as the culprit in a family with several members who had trouble with pronunciation, putting words together, and understanding speech. These people cannot move their tongue and lips precisely enough to talk clearly, so even family members often can't figure out what they are saying. It “opened a molecular window on the neural basis of speech and language,” Fisher says.
Photo credit: Yoichi Araki, Ph.D.
By NICHOLAS ST. FLEUR The tale of the Tasmanian tiger was tragic. Once numerous across Tasmania, the doglike marsupial was branded a sheep killer by colonists in the 1830s and hunted to extinction. The last of its kind, Benjamin, died in a zoo in 1936, and with it, many secrets of the animals’ lives were lost. The striped creature, which is also known as the thylacine, was hardly studied when it was alive, depriving scientists of an understanding of the behavior of an important predator from Australia’s recent biological past. Now, for the first time, researchers have performed neural scans on the extinct carnivore’s brain, revealing insights that had been lost since the species went extinct. “Part of the myth about them is what exactly did they eat, how did they hunt and were they social?” said Dr. Gregory Berns, a neuroscientist at Emory University and lead author on the study, which was published Wednesday in the journal PLOS One. “These are questions nobody really knows the answers to.” Dr. Berns’s main research pertains to dogs and the inner workings of the canine brain, but after learning more about Tasmanian tigers, he became fascinated by the beasts. With their slender bodies, long snouts and sharp teeth, Tasmanian tigers looked as if they could be related to dogs, wolves or coyotes. But actually they are separated by more than 150 million years of evolution. It is a classic example of convergent evolution, in which two organisms that are not closely related develop similar features because of the environment they adapted to and the ecological role they played. To better understand thylacines, Dr. Berns spent two years tracking down two preserved Tasmanian tiger brains, one at the Smithsonian Institution and the other at the Australian Museum. Their brains, like those of all marsupials, are very different from the brains of placental mammals.
The biggest difference is that they lack a corpus callosum, which is the part of the brain that connects the left and right hemispheres. © 2017 The New York Times Company
By Jordan Axt Imagine playing a game where you’re seated in front of four decks of cards. On the back of two decks are pictures of puppies; on the other two are pictures of spiders. Each deck has some cards that win points and others that lose points. In general, the puppy decks are “good” in that they win you more points than they lose, while the spider decks are “bad” in that they lose you more points than they win. You repeatedly select cards in hopes of winning as many points as possible. This game seems pretty easy — and it is. Most players favor the puppy decks from the start and quickly learn to continue favoring them because they produce more points. However, if the pictures on the decks are reversed, the game becomes a little harder. People may have a tougher time initially favoring spider decks because it’s difficult to learn that something people fear, like spiders, brings positive outcomes and something people enjoy, like puppies, brings negative outcomes. Performance on this learning task is best when one’s attitudes and motivations are aligned. For instance, when puppies earn you more points than spiders, people’s preference for puppies can lead them to select more puppies initially, and a motivation to earn as many points as possible leads people to select more and more puppies over time. But when spiders earn you more points than puppies, people have to overcome their initial aversion to spiders in order to perform well. © 2017 Scientific American
Link ID: 23130 - Posted: 01.21.2017
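The deck-learning task described above can be sketched as a small simulation. The payoff sizes, win probabilities, exploration rate, and the simple value-update rule below are illustrative assumptions, not the study's actual parameters; the point is only to show how a learner that tracks each deck's average payoff comes to favor the "good" decks.

```python
import random

def draw(deck_is_good, rng):
    # Assumed payoffs: good decks pay +10 with 70% probability, -10 otherwise;
    # bad decks are the reverse, so good decks win more than they lose.
    p_win = 0.7 if deck_is_good else 0.3
    return 10 if rng.random() < p_win else -10

def simulate(trials=1000, seed=1):
    rng = random.Random(seed)
    decks = [True, True, False, False]  # two good decks, two bad decks
    values = [0.0] * 4                  # running payoff estimate per deck
    picks_good = 0
    for _ in range(trials):
        # Epsilon-greedy choice: mostly pick the deck with the best estimate,
        # occasionally explore at random.
        if rng.random() < 0.1:
            choice = rng.randrange(4)
        else:
            choice = max(range(4), key=lambda d: values[d])
        payoff = draw(decks[choice], rng)
        # Simple incremental learning rule: nudge the estimate toward the payoff.
        values[choice] += 0.1 * (payoff - values[choice])
        if decks[choice]:
            picks_good += 1
    return picks_good / trials

print(simulate())  # fraction of picks from good decks, typically well above 0.5
```

Swapping which decks are "good" (the reversed spider condition in the article) changes nothing for a neutral learner like this one; the harder human performance in that condition reflects prior attitudes the simulation deliberately lacks.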
Joseph Palamar On Nov. 30 the FDA approved a Phase III clinical trial to confirm the effectiveness of treating post-traumatic stress disorder (PTSD) with MDMA, also known as Ecstasy. This news appeared in headlines throughout the world, as it represents an important – yet somewhat unorthodox – advance in PTSD treatment. However, the media have largely been referring to Ecstasy – the street name for this drug – as the treatment in this trial, rather than MDMA (3,4-methylenedioxymethamphetamine). This can lead to misunderstanding, as recreational Ecstasy use is a highly stigmatized behavior. Using this terminology may further misconceptions about the study drug and its uses. While Ecstasy is in fact a common street name for MDMA, what we call Ecstasy has changed dramatically since it became a prevalent recreational drug. Ecstasy now has a very different meaning – socially and pharmacologically. It is understandable why the media have referred to this drug as Ecstasy rather than MDMA. Not only has much of the public at least heard of Ecstasy (and would not recognize MDMA), but this also increases shock value and readership. But referring to a therapeutic drug by its street name (such as Ecstasy) is misleading – especially since MDMA is known to be among the most popular illicit drugs used at nightclubs and dance festivals. This leads some to assume that street drugs are being promoted and provided to patients, perhaps in a reckless manner. © 2010–2017, The Conversation US, Inc.
Claudia Dreifus Geneticists tell us that somewhere between 1 and 5 percent of the genome of modern Europeans and Asians consists of DNA inherited from Neanderthals, our prehistoric cousins. At Vanderbilt University, John Anthony Capra, an evolutionary genomics professor, has been combining high-powered computation and a medical records databank to learn what a Neanderthal heritage — even a fractional one — might mean for people today. We spoke for two hours when Dr. Capra, 35, recently passed through New York City. An edited and condensed version of the conversation follows. Q. Let’s begin with an indiscreet question. How did contemporary people come to have Neanderthal DNA on their genomes? A. We hypothesize that roughly 50,000 years ago, when the ancestors of modern humans migrated out of Africa and into Eurasia, they encountered Neanderthals. Matings must have occurred then. And later. One reason we deduce this is because the descendants of those who remained in Africa — present day Africans — don’t have Neanderthal DNA. What does that mean for people who have it? At my lab, we’ve been doing genetic testing on the blood samples of 28,000 patients at Vanderbilt and eight other medical centers across the country. Computers help us pinpoint where on the human genome this Neanderthal DNA is, and we run that against information from the patients’ anonymized medical records. We’re looking for associations. What we’ve been finding is that Neanderthal DNA has a subtle influence on risk for disease. It affects our immune system and how we respond to different immune challenges. It affects our skin. You’re slightly more prone to a condition where you can get scaly lesions after extreme sun exposure. There’s an increased risk for blood clots and tobacco addiction. To our surprise, it appears that some Neanderthal DNA can increase the risk for depression; however, there are other Neanderthal bits that decrease the risk. 
Roughly 1 to 2 percent of one’s risk for depression is determined by Neanderthal DNA. It all depends on where on the genome it’s located. © 2017 The New York Times Company
Children who have their tonsils removed to treat chronic throat infections or breathing problems during sleep may get more short-term symptom relief than similar children who don’t get tonsillectomies, two recent studies suggest. Over time, however, the benefits of surgery for chronic streptococcal throat infections appear to go away. Three years after tonsillectomies, children who had these procedures had about the same number of throat infections as those who didn’t get their tonsils taken out, one of the studies in the journal Pediatrics found. “Tonsillectomy, while very common and generally safe, is not completely without risk,” said Sivakumar Chinnadurai, senior author of the strep throat study and a researcher at Vanderbilt University Medical Center in Nashville. “The recognition of risks, and the knowledge that some patients’ infection rate improves over time has led to [strep] infection being a much less common indication for tonsillectomy than it was in the past,” Chinnadurai added by email. “While tonsillectomy remains one of the most common surgeries performed in the United States, the main indication for children has switched to obstructed breathing.” To assess the potential for tonsillectomies to help young people with chronic strep infections, Chinnadurai and colleagues examined data from seven studies of children who had experienced at least three strep infections in the previous one to three years. © 1996-2017 The Washington Post
Link ID: 23127 - Posted: 01.21.2017
NEUROSCIENCE, like many other sciences, has a bottomless appetite for data. Flashy enterprises such as the BRAIN Initiative, announced by Barack Obama in 2013, or the Human Brain Project, approved by the European Union in the same year, aim to analyse the way that thousands or even millions of nerve cells interact in a real brain. The hope is that the torrents of data these schemes generate will contain some crucial nuggets that let neuroscientists get closer to understanding how exactly the brain does what it does. But a paper just published in PLOS Computational Biology questions whether more information is the same thing as more understanding. It does so by way of neuroscience’s favourite analogy: comparing the brain to a computer. Like brains, computers process information by shuffling electricity around complicated circuits. Unlike the workings of brains, though, those of computers are understood on every level. Eric Jonas of the University of California, Berkeley, and Konrad Kording of Northwestern University, in Chicago, who both have backgrounds in neuroscience and electronic engineering, reasoned that a computer was therefore a good way to test the analytical toolkit used by modern neuroscience. Their idea was to see whether applying those techniques to a microprocessor produced information that matched what they already knew to be true about how the chip works. © The Economist Newspaper Limited 2017.
Keyword: Brain imaging
Link ID: 23126 - Posted: 01.21.2017
By GRETCHEN REYNOLDS Being nearsighted is far more common than it once was. The prevalence of myopia, the condition’s medical name, in Americans has soared by 66 percent since the early 1970s, according to a 2009 study by the National Eye Institute; in China and other East Asian countries, as many as 90 percent of recent high school graduates are thought to be nearsighted. Myopia results when eyeballs are longer than normal, changing the angle at which light enters the eye and therefore the ability to focus on distant objects. The disorder involves a complex interplay of genetics and environment and usually begins before adolescence, when the eye is growing, but it can worsen in early adulthood. Some experts connect the elevated rates of myopia to the many hours young people stare at computers and other screens. But a recent study published in JAMA Ophthalmology suggests that a greater factor may be a side effect of all that screen-watching — it’s keeping children inside. This new study joins a growing body of research indicating that a lack of direct sunlight may reshape the human eye and impair vision. Researchers at King’s College London, the London School of Hygiene and Tropical Medicine and other institutions gave vision exams to more than 3,100 older European men and women and interviewed them at length about their education, careers and how often they remembered being outside during various stages of their lives. This biographical information was then cross-referenced with historical data about sunlight, originally compiled for research on skin cancer and other conditions. © 2017 The New York Times Company
By R. Douglas Fields With American restrictions on travel lifting, interest in Cuba has skyrocketed, especially among scientists considering developing collaborations and student exchange programs with their Caribbean neighbors. But few researchers in the United States know how science and higher education are conducted in communist Cuba. Undark met with Dr. Mitchell Valdés-Sosa, director of the Cuban Neuroscience Center, in his office in Havana to learn how someone becomes a neuroscientist in Cuba, and to discuss what the future may hold for scientific collaborations between the two nations. It is helpful to appreciate some of the ways that higher education and research operate differently in communist Cuba. In contrast to the local institutional and individual control of decisions in the U.S., the central government in Cuba makes career and educational decisions for its citizens. Scientific research is directed by authorities to meet the needs of the developing country, and Ph.D. dissertation proposals must satisfy this goal for approval. Much of the graduate education takes place in biotechnology companies and research centers that are authorized by the government — a situation resembling internships in the U.S. Development, production, and marketing of products from biomedical research and education are all carried out in the same center, and the sales of these products provide financial support to the institution. Copyright 2017 Undark
Link ID: 23124 - Posted: 01.19.2017
By Nicole Kobie Getting drunk could make it harder to enter your password – even if your brainwaves are your login. Brainwave authentication is one of many biometric measures touted as an alternative to passwords. The idea is for a person to authenticate their identity with electroencephalogram (EEG) readings. For example, instead of demanding a passcode, a computer could display a series of words on a screen and measure the user’s response via an EEG headset. EEG signatures are unique and are more complex than a standard password, making them difficult to hack. But while research suggests that EEG readings can authenticate someone’s identity with accuracy rates around 94 per cent, there could be confounding factors – including whether you’ve had a few too many drinks. Tommy Chin, a security researcher at cybersecurity consultancy firm Grimm, and Peter Muller, a graduate student at the Rochester Institute of Technology, decided to test this theory experimentally, by analysing people’s brainwaves before and after drinking shots of Fireball, a cinnamon-flavoured whisky. “Brainwaves can be easily manipulated by external influences such as drugs [like] opioids, caffeine, and alcohol,” Chin says. “This manipulation makes it a significant challenge to verify the authenticity of the user because they drank an immense amount of alcohol or caffeinated drink.” © Copyright Reed Business Information Ltd.
By Avi Selk “Oh Long Johnson,” a cat once said, back in the primordial history of Internet memes. “Oh Don Piano. Why I eyes ya.” Or so said the captions — appended to the gibberish of a perturbed house cat on “America's Funniest Home Videos” in 1999 and rediscovered in the YouTube era, when millions of people heard something vaguely human echo in a distant species. It was weird. And hilarious. And just maybe, profound. As the “Oh Long Johnson” craze was fading a few years ago, a wave of scientific discoveries about apes and monkeys began upending old assumptions about the origins of language. Only humans could willfully control their vocal tracts, went the established wisdom. Until Koko the gorilla coughed on command. Surely, then, our vowels were ours alone. But this month, researchers picked up British ohs in the babble of baboons. Study after study is dismantling a hypothesis that has stood for decades: that the seeds of language did not exist before modern humans, who got all the way to Shakespeare from scratch. And if so much of what we thought we knew about the uniqueness of human speech was wrong, some think it's time to take a second look at talking pet tricks. “It's humbling to understand that humans, in the end, are just another species of primate,” said Marcus Perlman, who led the Koko study in 2015. © 1996-2017 The Washington Post
About 11 per cent of Canadians aged 15 to 24 experienced depression at some point in their lives, and fewer than half of them sought professional help for a mental health condition over the previous year, according to Statistics Canada. The information was released Wednesday in the agency's Health Reports, and is based on data from the 2012 Canadian Community Health Survey Mental Health. The report was based on 4,031 respondents aged 15 to 24, which when extrapolated represents more than 4.4 million young people. Canadians 15 to 24 years old had a higher rate of depression than any other age group. Suicide is the second leading cause of death (after accidents), accounting for nearly a quarter of deaths in the 15-24 category, Statistics Canada said. An estimated 14 per cent of respondents reported having had suicidal thoughts at some point in their lives, including six per cent who had such thoughts in the past 12 months. As well, 3.5 per cent had attempted suicide, according to the data. Report author Leanne Findlay said the findings confirm people with depression or suicidal thoughts are increasingly likely to seek professional help. Young people in the study were more likely to turn to friends or family, and when they did, generally felt they received a lot or some help. Factors such as perceived ability to deal with stress and "negative social interactions" — for instance, feeling others were angry with you — were related to depression and suicidal thoughts. Symptoms of depression include feeling sad or having trouble sleeping lasting two weeks or more, Findlay said. "Knowledge of these risk and protective factors may facilitate early intervention," Findlay concluded. ©2017 CBC/Radio-Canada.
Tina Rosenberg It has been nearly 30 years since the first needle exchange program opened in the United States, in Tacoma, Wash., in 1988. It was a health measure to prevent injecting drug users from sharing needles, and therefore spreading H.I.V. and hepatitis. The idea was controversial, to say the least. Many people felt — and still feel — that it enables drug use and sends a message that drug use is O.K. and can be done safely. Today the evidence is overwhelming that needle exchange prevents disease, increases use of drug treatment by winning users’ trust and bringing them into the health system, and does not increase drug use. Its utility has won over some critics. When Vice President-elect Mike Pence was governor of Indiana, he authorized needle exchange programs as an emergency response to an H.I.V. outbreak. “I do not support needle exchange as antidrug policy, but this is a public health emergency,” he said at a news conference in 2015. Needle exchange saved New York City from a generalized H.I.V. epidemic. In 1990, more than half of injecting drug users had H.I.V. Then in 1992, needle exchange began — and by 2001, H.I.V. prevalence had fallen to 13 percent. America has another epidemic now: overdose deaths from opioids, heroin and fentanyl, a synthetic opioid so powerful that a few grains can kill. A thousand people died of overdose in the city last year — three times the number who were killed in homicides. Nationally, drug overdose has passed firearms and car accidents as the leading cause of injury deaths. If there is a way to save people from overdose death without creating harm, we should do it. Yet there is a potent weapon that we’re ignoring: the supervised injection room. According to a report by the London-based group Harm Reduction International, 90 supervised injection sites exist around the world: in Canada, Australia and eight countries in Europe. Scotland and Ireland plan to open sites this year.
In the United States, state officials in New York, California and Maryland, and city officials in Seattle (where a task force recommended two sites), San Francisco, New York City, Ithaca, N.Y., and elsewhere, are discussing such facilities. © 2017 The New York Times Company
Keyword: Drug Abuse
Link ID: 23120 - Posted: 01.18.2017
Thorsten Rudroff An estimated 400,000 Americans are currently living with multiple sclerosis, an autoimmune disease in which the body’s immune cells attack a fatty substance called myelin in the nerves. Common symptoms are gait and balance disorders, cognitive dysfunction, fatigue, pain and muscle spasticity. Colorado has the highest proportion of people living with MS in the United States. It is estimated that one in 550 people living in the state has MS, compared to one in 750 nationally. The reason for this is unknown, but could be related to several factors, such as vitamin D deficiency or environment. Currently available therapies do not sufficiently relieve MS symptoms. As a result, many people with the condition are trying alternative therapies, like cannabis. Based on several studies, the American Academy of Neurology states that there is strong evidence that cannabis is effective for treatment of pain and spasticity. Although there are many anecdotal reports indicating cannabis’ beneficial effects for treatment of MS symptoms such as fatigue, muscle weakness, anxiety and sleep deprivation, they have not been scientifically verified. This is because clinical trials – where patients are given cannabis – are difficult to do because of how the substance is regulated at the federal level. To learn more, my Integrative Neurophysiology Laboratory at Colorado State University is studying people with MS in the state who are already using medical cannabis as a treatment to investigate what MS symptoms the drug can effectively treat. © 2010–2017, The Conversation US, Inc.
By Helen Briggs BBC News Babies build knowledge about the language they hear even in the first few months of life, research shows. If you move countries and forget your birth language, you retain this hidden ability, according to a study. Dutch-speaking adults adopted from South Korea exceeded expectations at Korean pronunciation when retrained after losing their birth language. Scientists say parents should talk to babies as much as possible in early life. Dr Jiyoun Choi of Hanyang University in Seoul led the research. The study is the first to show that the early experience of adopted children in their birth language gives them an advantage decades later even if they think it is forgotten, she said. “This finding indicates that useful language knowledge is laid down in [the] very early months of life, which can be retained without further input of the language and revealed via re-learning,” she told BBC News. In the study, adults aged about 30 who had been adopted as babies by Dutch-speaking families were asked to pronounce Korean consonants after a short training course. Korean consonants are unlike those spoken in Dutch. The participants were compared with a group of adults who had not been exposed to the Korean language as children and then rated by native Korean speakers. Both groups performed to the same level before training, but after training the international adoptees exceeded expectations. There was no difference between children who were adopted under six months of age - before they could speak - and those who were adopted after 17 months, when they had learned to talk. This suggests that the language knowledge retained is abstract in nature, rather than dependent on the amount of experience. © 2017 BBC
By Lisa Rapaport Researchers examined data on high school soccer players from 2005 to 2014 and found non-concussion injury rates declined for boys and were little changed for girls. But concussions increased in both male and female players. The significant rise in concussion rates "could be mainly due to a better recognition of concussion by medical and coaching staff," study leader Dr. Morteza Khodaee, a sports medicine researcher at the University of Colorado School of Medicine, said in an email. The research team looked at injuries per athletic exposure (AE), which includes both practices and competitions, for U.S. high school athletes. Overall, there were 6,154 injuries during 2.98 million athletic exposures, for an injury rate of 2.06 per 1,000 AEs, the study found. The data covered about 1.8 million athletic exposures among girls and 1.5 million among boys. Girls were 27 percent more likely to sustain soccer injuries than boys, the authors reported online December 28 in the British Journal of Sports Medicine. Injuries were 42 percent more common in competitions than during practice. "The majority of injuries during competitions occurred during the second half indicating a potential accumulated effect of fatigue," the authors reported. "It is well known that the risk of injury is higher in competition compared with practice," Khodaee said. "This is most likely due to more intense, full contact and potentially riskier play that occurs in competition." Still, while injury rates were significantly higher in competition, more than one third of all injuries occurred in practice. © 2017 Scientific American
Keyword: Brain Injury/Concussion
Link ID: 23117 - Posted: 01.18.2017
By Catherine Offord In the early 20th century, Danish biologist Johannes Schmidt solved a puzzle that had confounded European fishermen for generations. Freshwater eels—popular for centuries on menus across northern Europe—were abundant in rivers and creeks, but only as adults, never as babies. So where were they coming from? In 1922, after nearly two decades of research, Schmidt published the answer: the Sargasso Sea, the center of a massive, swirling gyre in the North Atlantic Ocean. Now regarded as some of the world’s most impressive animal migrators, European eels (Anguilla anguilla) journey westward across the Atlantic to spawning sites in the Sargasso; their eggs hatch into larvae that are carried back across the ocean by the Gulf Stream, arriving two or three years later to repopulate European waterways. For decades, researchers have assumed that adults made the journey in one short and rapid migration, leaving European coastlines in autumn and arriving in the Sargasso Sea, ready to spawn, the following spring. But this assumption rests on surprisingly little evidence, says behavioral ecologist David Righton of the UK Centre for Environment, Fisheries, and Aquaculture Science. “Since Johannes Schmidt identified this spawning area in the Sargasso Sea, people have been wondering about that great journey and trying to figure out how to follow the eels,” says Righton, whose work on epic marine migrations includes appropriately titled projects such as CODYSSEY and EELIAD. “But the technology hasn’t been available. . . . They just slip away into the darkness, really, in autumn, and no one knows what happens to them.” © 1986-2017 The Scientist
Keyword: Animal Migration
Link ID: 23116 - Posted: 01.18.2017
By Rachael Lallensack A video game is helping researchers learn more about how tiny European starlings keep predators at bay. Their massive flocks, consisting of hundreds to thousands of birds, fly together in a mesmerizing, pulsating pattern called a murmuration. For a long time, researchers have suspected that the bigger the flock, the harder it is for predators like falcons and hawks to take down any one member, something known as “confusion effect.” Now, researchers have analyzed that effect—in human hunters. Using the first 3D computer program to simulate a murmuration, scientists tested how well 25 players, acting as flying predators, could target and pursue virtual starlings, whose movements were simulated based on data from real starling flocks (see video above). The team’s findings reaffirmed the confusion effect: The larger the simulated flocks, the harder it was for the “predators” to single out and catch individual prey, the researchers report this week in Royal Society Open Science. So maybe sometimes, it’s not so bad to get lost in a crowd. © 2017 American Association for the Advancement of Science.
By Kevin Pelphrey In September, the Florida State University football team made a visit to a Tallahassee middle school that would become famous. At lunchtime, student-athlete Travis Rudolph noticed sixth grader Bo Paske eating alone, so he joined Bo for the meal. Bo, who has autism, often sat by himself in the lunchroom. The world took note of the athlete’s gesture after his mother’s Facebook post about it went viral. “This is one day I didn’t have to worry if my sweet boy ate lunch alone, because he sat across from someone who is a hero in many eyes,” she wrote. This story touched people because it calls to mind something universal: the sting of social exclusion. We have all known children who often eat, or play, alone. And all of us have felt left out at one time or another. But although this experience may be universal, a new generation of children is experiencing a wave of inclusiveness. Technology of various types, often thought of as an isolating influence, can actually abet people’s good intentions or help those with autism learn to fit in. One new app called Sit With Us, invented by 16-year-old Natalie Hampton, helps vulnerable children who have difficulty finding a welcoming group in the lunchroom. Its motto is inspiring: “The first step to a warmer, more inclusive community can begin with LUNCH.” Sit With Us allows students to designate themselves as ‘ambassadors’ and to signal to anyone seeking company that they’re invited to join the ambassador’s table. © 2017 Scientific American
Ian Sample Science editor Tempting as it may be, it would be wrong to claim that with each generation humans are becoming more stupid. As scientists are often so keen to point out, it is a bit more complicated than that. A study from Iceland is the latest to raise the prospect of a downwards spiral into imbecility. The research from deCODE, a genetics firm in Reykjavik, finds that groups of genes that predispose people to spend more years in education became a little rarer in the country from 1910 to 1975. The scientists used a database of more than 100,000 Icelanders to see how dozens of gene variants that affect educational attainment appeared in the population over time. They found a shallow decline over the 65-year period, implying a downturn in the natural inclination to rack up qualifications. But the genes involved in education affected fertility too. Those who carried more “education genes” tended to have fewer children than others. This led the scientists to propose that the genes had become rarer in the population because, for all their qualifications, better educated people had contributed less than others to the Icelandic gene pool. Spending longer in education and the career opportunities that provides is not the sole reason that better educated people tend to start families later and have fewer children, the study suggests. Many people who carried lots of genes for prolonged education left the system early and yet still had fewer children than the others. “It isn’t the case that education, or the career opportunities it provides, prevents you from having more children,” said Kari Stefansson, who led the study. “If you are genetically predisposed to have a lot of education, you are also predisposed to have fewer children.” © 2017 Guardian News and Media Limited
Eating disorders, including anorexia and bulimia, affect a small but substantial number of women in their 40s and 50s, UK research suggests. The study, involving more than 5,000 women, found just over 3% reported having an eating disorder. Some said they had experienced it since their teens, others developed it for the first time in their middle age. Julie Spinks, from Beaconsfield, is 48. She was not involved in the study, but can relate first-hand to its findings. She developed anorexia for the first time when she was 44. "It was a complete shock at the time," she recalls. "I knew that I was restricting my food but I didn't ever think I had anorexia. "I'd been really unhappy at work and had very low self-esteem. To begin with I just thought I had lost my appetite. "I felt depressed, like I was not worth feeding or existing. I wanted to disappear and fade away." Julie started to lose weight quite quickly and began to exercise as well. She realised something was very wrong one day after she had been to the gym. "I'd run for about an hour and burnt off about 500 calories. I remember thinking that's about the same as a chocolate bar. That's when I started to link food and exercise." Julie still did not recognise she had anorexia though. "I thought anorexia was something that happened to other people. It didn't occur to me that I might have it." After a breakdown at work she went for a mental health assessment. Her doctors then diagnosed her with anorexia and depression. Julie was given antidepressants and began therapy sessions to help with her eating disorder. © 2017 BBC.
Keyword: Anorexia & Bulimia
Link ID: 23112 - Posted: 01.17.2017