Chapter 16.




By Nathaniel P. Morris In the 20th century, the deinstitutionalization of mental health care took patients out of long-term psychiatric facilities with the aim that they might return to the community and lead more fulfilling lives. But in our rush to shut down America’s asylums, we failed to set up adequate outpatient services for the mentally ill, who now often fend for themselves on the streets or behind bars. According to recent surveys, the number of state psychiatric beds has fallen from over 550,000 in 1955 to fewer than 38,000 in 2016. Meanwhile, research conducted by the Treatment Advocacy Center estimates that over 355,000 inmates in America’s prisons and jails suffered from severe mental illness in 2012. Last year, a report by the Department of Housing and Urban Development found that over 100,000 Americans who experienced homelessness also suffered from severe mental illness. Mental health advocates point to a number of failures, such as limited funding for outpatient care and a lack of political foresight, that may have led to this situation. Yet emerging community-based approaches to mental health care are providing hope for the severely mentally ill—as well as some constraints. Court-ordered care for patients with severe mental illness, known as assisted outpatient treatment or AOT, is spreading nationwide. In December, President Obama signed into law the landmark 21st Century Cures Act, bipartisan legislation that bolsters funding for medical research and reshapes approval processes for drugs and medical devices. The law also supports a number of mental health reforms, including millions in federal incentives for states to develop AOT. © 2017 Scientific American

Keyword: Depression; Schizophrenia
Link ID: 23144 - Posted: 01.25.2017

By Helen Briggs BBC News The idea that dogs are more intelligent than cats has been called into question. Japanese scientists say cats are as good as dogs at certain memory tests, suggesting they may be just as smart. A study - involving 49 domestic cats - shows felines can recall memories of pleasant experiences, such as eating a favourite snack. Dogs show this type of recollection - a unique memory of a specific event known as episodic memory. Humans often consciously try to reconstruct past events that have taken place in their lives, such as what they ate for breakfast, their first day in a new job or a family wedding. These memories are linked with an individual take on events, so they are unique to that person. Saho Takagi, a psychologist at Kyoto University, said cats, as well as dogs, used memories of a single past experience, which may imply they have episodic memory similar to that of humans. "Episodic memory is viewed as being related to introspective function of the mind; our study may imply a type of consciousness in cats," she told BBC News. "An interesting speculation is that they may enjoy actively recalling memories of their experience like humans." The Japanese team tested 49 domestic cats on their ability to remember which bowl they had already eaten out of and which remained untouched, after a 15-minute interval. © 2017 BBC

Keyword: Learning & Memory; Evolution
Link ID: 23143 - Posted: 01.25.2017

By Rachael Lallensack Jet lag can put anyone off their game, even Major League Baseball (MLB) players. Long-distance travel can affect specific—and at times, crucial—baseball skills such as pitching and base running, a new study finds. In fact, jet lag's effects can even cancel out the home field advantage for some teams returning from away games. Jet lag is known for its fatigue-inducing effects, most of which stem from a mismatch between a person’s internal clock and the time zone he or she is in, something called “circadian misalignment.” This misalignment is especially strong when a person’s day is shorter than it should be—which happens whenever people travel east—previous research has shown. Just how that affects sports teams has long been debated. A 2009 study of MLB, for example, found that jet lag did decrease a team’s likelihood of winning, if only slightly. But no prior study has ever been able to pinpoint exact areas of game play where the effects of jet lag hit hardest—data that could help coaches and trainers better prepare players for games following travel. To figure out how that might happen, “adopted” Chicago Cubs fan and study author Ravi Allada, a neurobiologist at Northwestern University in Evanston, Illinois, looked at 20 years’ worth of MLB data from 1992 to 2011. He and his team narrowed their data set from 46,535 games to the 4,919 games in which players traveled at least two time zones. Then, they broke down offensive and defensive stats from each of those games, including home runs allowed, stolen bases, and sacrifice flies. Finally, they compared how the numbers changed for teams that had traveled east versus those that had traveled west. © 2017 American Association for the Advancement of Science.

Keyword: Biological Rhythms
Link ID: 23140 - Posted: 01.24.2017

By Ingfei Chen Learning Morse code, with its tappity-tap rhythms of dots and dashes, could take far less effort—and attention—than one might think. The trick is a wearable computer that engages the sensory powers of touch, according to a recent pilot study. The results suggest that mobile devices may be able to teach us manual skills, almost subconsciously, as we go about our everyday routines. Ph.D. student Caitlyn Seim and computer science professor Thad Starner of the Georgia Institute of Technology tinker with haptics, the integration of vibrations or other tactile cues with computing gadgets. Last September at the 20th International Symposium on Wearable Computers in Heidelberg, Germany, they announced that they had programmed Google Glass to passively teach its wearers Morse code—with preliminary signs of success. For the study, 12 participants wore the smart glasses while engrossed in an online game on a PC. During multiple hour-long sessions, half the players heard Google Glass's built-in speaker repeatedly spelling out words and felt taps behind the right ear (from a bone-conduction transducer built into the frames) for the dots and dashes corresponding to each letter. The other six participants heard only the audio, without the corresponding vibrations. After each run of game playing, all the players were asked to tap out letters in Morse code using a finger on the touch pad of the smart glasses; for example, if they tapped “dot-dot,” an “i” would pop up on the visual display. The brief testing essentially prompted them to try to learn the code. After four one-hour sessions, the group that had received tactile cues could tap a pangram (a sentence using the entire alphabet) with 94 percent accuracy. The audio-only group eventually achieved 47 percent accuracy, learning solely from their trial-and-error inputs. © 2017 Scientific American

Keyword: Learning & Memory
Link ID: 23138 - Posted: 01.24.2017

By NANCY L. SEGAL and SATOSHI KANAZAWA In 1973, the biologist Robert Trivers and the computer scientist Dan Willard made a striking prediction about parents and their offspring. According to the principles of evolutionary theory, they argued, the male-to-female ratio of offspring should not be 50-50 (as chance would dictate), but rather should vary as a function of how good (or bad) the conditions are in which the parents find themselves. Are the parents’ resources plentiful — or scarce? The Trivers-Willard hypothesis holds that when their conditions are good, parents will have more male offspring: Males with more resources are likely to gain access to more females, thereby increasing the frequency with which their genes (and thus their parents’ genes) are preserved in future generations. Conversely, male offspring that lack resources are likely to lose out to males that have more resources, so in bad conditions it pays for parents to “invest” more in daughters, which will have more opportunities to mate. It follows, as a kind of corollary, that when parents have plentiful resources they will devote those resources more to their sons, whereas when resources are scarce, parents will devote them more to their daughters. In short: If things are good, you have more boys, and give them more stuff. If things are bad, you have more girls, and give more of your stuff to them. Is this hypothesis correct? In new research of ours, to be published in the April issue of The Journal of Experimental Child Psychology, we suggest that in the case of breast-feeding, at least, it appears to be. In recent years, evidence has emerged suggesting that in various mammalian species, breast milk — which is, of course, a resource that can be given to children — is tailored for the sex of each offspring. 
For example, macaque monkey mothers produce richer milk (with higher gross energy and fat content) for sons than for daughters, but also provide greater quantities of milk and higher concentrations of calcium for daughters than for sons. © 2017 The New York Times Company

Keyword: Sexual Behavior; Evolution
Link ID: 23135 - Posted: 01.23.2017

By Carl Bialik A woman has never come closer to the presidency than Hillary Clinton did in winning the popular vote in November. Yet as women march in Washington on Saturday, many of them to protest the presidency of Donald Trump, an important obstacle to the first woman president remains: the hidden, internalized bias many people hold against career advancement by women. And perhaps surprisingly, there is evidence that women hold more of this bias, on average, than men do. There has been lots of discussion of the role that overt sexism played in both Trump’s campaign and at the ballot box. A YouGov survey conducted two weeks before the election, for example, found that Trump voters had much higher levels of sexism, on average, than Clinton voters, as measured by their level of agreement with statements such as “women seek to gain power by getting control over men.” An analysis of the survey found that sexism played a big role in explaining people’s votes, after controlling for other factors, including gender and political ideology. Other research has reached similar conclusions. Two recent studies of voters, however, suggest that another, subtler form of bias may also have been a factor in the election. These studies looked at what’s known as “implicit bias,” the unconscious tendency to associate certain qualities with certain groups — in this case, the tendency to associate men with careers and women with family. Researchers have found that this kind of bias is stronger on average in women than in men, and, among women, it is particularly strong among political conservatives. And at least according to one study, this unconscious bias was especially strong among one group in 2016: women who supported Trump.

Keyword: Attention
Link ID: 23134 - Posted: 01.23.2017

Robert McCrum Stroke, or “brain attack”, is the third biggest killer in the western world, after cancer and heart failure. The life-changing effects associated with this simple, Anglo-Saxon word are readily explained: a stroke occurs when the blood supply to the brain is disrupted by a blood vessel either bursting or blocking, so that the part of the brain supplied by this blood vessel dies. The brain is a much more complex organ than the heart. While strokes are a common feature of everyday life, precisely how and why they occur is far from straightforward. Each year in the UK, there will be about 50,000 brain attacks. One-third of those affected will die; one-third will be left severely disabled; and about one-third will make some kind of recovery. In the time it takes to read this article, approximately nine people in Britain, from across all age groups, will have suffered a stroke. Or did they? For the brain is not only super-sensitive territory – as the human animal’s command HQ – it is also top secret. Despite extraordinary progress in MRI scans, the brain remains essentially mysterious and the symptoms of its dysfunction can be hard to diagnose with certainty. An elderly man presenting himself at A&E with unsteady gait and a slurring of his words could be suffering a stroke – or he might just be intoxicated. Treat him for the former, and you’ll save his life; treat him as a drunk, and he might die. © 2017 Guardian News and Media Limited

Keyword: Stroke
Link ID: 23133 - Posted: 01.23.2017

By JANE E. BRODY Susan Sills, a Brooklyn artist who until recently made life-size cutouts on plywood using a power saw, long suspected she might be at risk for developing Parkinson’s disease. Both her mother and grandfather had this neurological movement disorder, and she knew that it sometimes runs in families. So she was not surprised when at age 72 she first noticed hand tremors and a neurologist confirmed that she had the disease. But to watch her in action three years later, it would be hard for a layperson to tell. She stands straight, walks briskly, speaks in clarion tones and maintains a schedule that could tire someone half her age. Having wisely put the power saw aside, Ms. Sills now makes intricately designed art jewelry. She is also a docent at the Brooklyn Museum, participates in a cooperative art gallery and assists her husband’s business by entertaining customers. Ms. Sills attributes her energy and well-being partly to the medication she takes but primarily to the hours she spends working out with a physical therapist and personal trainer, who have helped her develop an exercise regimen that, while not a cure, can alleviate Parkinson’s symptoms and slow progression of the disease. “The exercises opened me up,” said Ms. Sills, allowing such symptoms as small steps, slow movements and tiny, cramped handwriting to subside. “The earlier people begin exercising after a Parkinson’s diagnosis, and the higher the intensity of exercise they achieve, the better they are,” Marilyn Moffat, a physical therapist on the faculty of New York University, said. “Many different activities have been shown to be beneficial, including cycling, boxing, dancing and walking forward and backward on a treadmill. If someone doesn’t like one activity, there are others that can have equally good results.” © 2017 The New York Times Company

Keyword: Parkinsons
Link ID: 23132 - Posted: 01.23.2017

By NICHOLAS ST. FLEUR The tale of the Tasmanian tiger was tragic. Once numerous across Tasmania, the doglike marsupial was branded a sheep killer by colonists in the 1830s and hunted to extinction. The last of its kind, Benjamin, died in a zoo in 1936, and with it many secrets about the animals’ lives were lost. The striped creature, which is also known as the thylacine, was hardly studied when it was alive, depriving scientists of an understanding of the behavior of an important predator from Australia’s recent biological past. Now, for the first time, researchers have performed neural scans on the extinct carnivore’s brain, revealing insights that had been lost since the species went extinct. “Part of the myth about them is what exactly did they eat, how did they hunt and were they social?” said Dr. Gregory Berns, a neuroscientist at Emory University and lead author on the study, which was published Wednesday in the journal PLOS One. “These are questions nobody really knows the answers to.” Dr. Berns’s main research pertains to dogs and the inner workings of the canine brain, but after learning more about Tasmanian tigers, he became fascinated by the beasts. With their slender bodies, long snouts and sharp teeth, Tasmanian tigers looked as if they could be related to dogs, wolves or coyotes. But actually they are separated by more than 150 million years of evolution. It is a classic example of convergent evolution, in which two organisms that are not closely related develop similar features because of the environment they adapted to and the ecological role they played. To better understand thylacines, Dr. Berns spent two years tracking down two preserved Tasmanian tiger brains, one at the Smithsonian Institution and the other at the Australian Museum. Their brains, like those of all marsupials, are very different from the brains of placental mammals. 
The biggest difference is that they lack a corpus callosum, which is the part of the brain that connects the left and right hemispheres. © 2017 The New York Times Company

Keyword: Evolution; Brain imaging
Link ID: 23131 - Posted: 01.21.2017

By Jordan Axt Imagine playing a game where you’re seated in front of four decks of cards. On the back of two decks are pictures of puppies; on the other two are pictures of spiders. Each deck has some cards that win points and others that lose points. In general, the puppy decks are “good” in that they win you more points than they lose while the spider decks are “bad” in that they lose you more points than they win. You repeatedly select cards in hopes of winning as many points as possible. This game seems pretty easy—and it is. Most players favor the puppy decks from the start and quickly learn to continue favoring them because they produce more points. However, if the pictures on the decks are reversed, the game becomes a little harder. People may have a tougher time initially favoring spider decks because it’s difficult to learn that something people fear like spiders brings positive outcomes and something people enjoy like puppies brings negative outcomes. Performance on this learning task is best when one’s attitudes and motivations are aligned. For instance, when puppies earn you more points than spiders, people’s preference for puppies leads them to select more puppies initially, and a motivation to earn as many points as possible leads people to select more and more puppies over time. But when spiders earn you more points than puppies, people have to overcome their initial aversion to spiders in order to perform well. © 2017 Scientific American

Keyword: Attention
Link ID: 23130 - Posted: 01.21.2017

Claudia Dreifus Geneticists tell us that somewhere between 1 and 5 percent of the genome of modern Europeans and Asians consists of DNA inherited from Neanderthals, our prehistoric cousins. At Vanderbilt University, John Anthony Capra, an evolutionary genomics professor, has been combining high-powered computation and a medical records databank to learn what a Neanderthal heritage — even a fractional one — might mean for people today. We spoke for two hours when Dr. Capra, 35, recently passed through New York City. An edited and condensed version of the conversation follows. Q. Let’s begin with an indiscreet question. How did contemporary people come to have Neanderthal DNA on their genomes? A. We hypothesize that roughly 50,000 years ago, when the ancestors of modern humans migrated out of Africa and into Eurasia, they encountered Neanderthals. Matings must have occurred then. And later. One reason we deduce this is because the descendants of those who remained in Africa — present day Africans — don’t have Neanderthal DNA. What does that mean for people who have it? At my lab, we’ve been doing genetic testing on the blood samples of 28,000 patients at Vanderbilt and eight other medical centers across the country. Computers help us pinpoint where on the human genome this Neanderthal DNA is, and we run that against information from the patients’ anonymized medical records. We’re looking for associations. What we’ve been finding is that Neanderthal DNA has a subtle influence on risk for disease. It affects our immune system and how we respond to different immune challenges. It affects our skin. You’re slightly more prone to a condition where you can get scaly lesions after extreme sun exposure. There’s an increased risk for blood clots and tobacco addiction. To our surprise, it appears that some Neanderthal DNA can increase the risk for depression; however, there are other Neanderthal bits that decrease the risk. 
Roughly 1 to 2 percent of one’s risk for depression is determined by Neanderthal DNA. It all depends on where on the genome it’s located. © 2017 The New York Times Company

Keyword: Evolution; Depression
Link ID: 23128 - Posted: 01.21.2017

Children who have their tonsils removed to treat chronic throat infections or breathing problems during sleep may get more short-term symptom relief than similar children who don’t get tonsillectomies, two recent studies suggest. Over time, however, the benefits of surgery for chronic streptococcal throat infections appear to go away. Three years after tonsillectomies, children who had these procedures had about the same number of throat infections as those who didn’t get their tonsils taken out, one of the studies in the journal Pediatrics found. “Tonsillectomy, while very common and generally safe, is not completely without risk,” said Sivakumar Chinnadurai, senior author of the strep throat study and a researcher at Vanderbilt University Medical Center in Nashville. “The recognition of risks, and the knowledge that some patients’ infection rate improves over time has led to [strep] infection being a much less common indication for tonsillectomy than it was in the past,” Chinnadurai added by email. “While tonsillectomy remains one of the most common surgeries performed in the United States, the main indication for children has switched to obstructed breathing.” To assess the potential for tonsillectomies to help young people with chronic strep infections, Chinnadurai and colleagues examined data from seven studies of children who had experienced at least three strep infections in the previous one to three years. © 1996-2017 The Washington Post

Keyword: Sleep
Link ID: 23127 - Posted: 01.21.2017

By R. Douglas Fields With American restrictions on travel lifting, interest in Cuba has skyrocketed, especially among scientists considering developing collaborations and student exchange programs with their Caribbean neighbors. But few researchers in the United States know how science and higher education are conducted in communist Cuba. Undark met with Dr. Mitchell Valdés-Sosa, director of the Cuban Neuroscience Center, in his office in Havana to learn how someone becomes a neuroscientist in Cuba, and to discuss what the future may hold for scientific collaborations between the two nations. It is helpful to appreciate some of the ways that higher education and research operate differently in communist Cuba. In contrast to the local institutional and individual control of decisions in the U.S., the central government in Cuba makes career and educational decisions for its citizens. Scientific research is directed by authorities to meet the needs of the developing country, and Ph.D. dissertation proposals must satisfy this goal for approval. Much of the graduate education takes place in biotechnology companies and research centers that are authorized by the government — a situation resembling internships in the U.S. Development, production, and marketing of products from biomedical research and education are all carried out in the same center, and the sales of these products provide financial support to the institution. Copyright 2017 Undark

Keyword: Miscellaneous
Link ID: 23124 - Posted: 01.19.2017

By Avi Selk “Oh Long Johnson,” a cat once said, back in the primordial history of Internet memes. “Oh Don Piano. Why I eyes ya.” Or so said the captions — appended to the gibberish of a perturbed house cat on “America's Funniest Home Videos” in 1999 and rediscovered in the YouTube era, when millions of people heard something vaguely human echo in a distant species. It was weird. And hilarious. And just maybe, profound. As the “Oh Long Johnson” craze was fading a few years ago, a wave of scientific discoveries about apes and monkeys began upending old assumptions about the origins of language. Only humans could willfully control their vocal tracts, went the established wisdom. Until Koko the gorilla coughed on command. Surely, then, our vowels were ours alone. But this month, researchers picked up British ohs in the babble of baboons. Study after study is dismantling a hypothesis that has stood for decades: that the seeds of language did not exist before modern humans, who got all the way to Shakespeare from scratch. And if so much of what we thought we knew about the uniqueness of human speech was wrong, some think it's time to take a second look at talking pet tricks. “It's humbling to understand that humans, in the end, are just another species of primate,” said Marcus Perlman, who led the Koko study in 2015. © 1996-2017 The Washington Post

Keyword: Language; Evolution
Link ID: 23122 - Posted: 01.19.2017

Tina Rosenberg It has been nearly 30 years since the first needle exchange program opened in the United States, in Tacoma, Wash., in 1988. It was a health measure to prevent injecting drug users from sharing needles, and therefore spreading H.I.V. and hepatitis. The idea was controversial, to say the least. Many people felt — and still feel — that it enables drug use and sends a message that drug use is O.K. and can be done safely. Today the evidence is overwhelming that needle exchange prevents disease, increases use of drug treatment by winning users’ trust and bringing them into the health system, and does not increase drug use. Its utility has won over some critics. When Vice President-elect Mike Pence was governor of Indiana, he authorized needle exchange programs as an emergency response to an H.I.V. outbreak. “I do not support needle exchange as antidrug policy, but this is a public health emergency,” he said at a news conference in 2015. Needle exchange saved New York City from a generalized H.I.V. epidemic. In 1990, more than half of injecting drug users had H.I.V. Then in 1992, needle exchange began — and by 2001, H.I.V. prevalence had fallen to 13 percent. America has another epidemic now: overdose deaths from opioids, heroin and fentanyl, a synthetic opioid so powerful that a few grains can kill. A thousand people died of overdose in the city last year — three times the number who were killed in homicides. Nationally, drug overdose has passed firearms and car accidents as the leading cause of injury deaths. If there is a way to save people from overdose death without creating harm, we should do it. Yet there is a potent weapon that we’re ignoring: the supervised injection room. According to a report by the London-based group Harm Reduction International, 90 supervised injection sites exist around the world: in Canada, Australia and eight countries in Europe. Scotland and Ireland plan to open sites this year. 
In the United States, state officials in New York, California and Maryland, and city officials in Seattle (where a task force recommended two sites), San Francisco, New York City, Ithaca, N.Y., and elsewhere, are discussing such facilities. © 2017 The New York Times Company

Keyword: Drug Abuse
Link ID: 23120 - Posted: 01.18.2017

By Lisa Rapaport Researchers examined data on high school soccer players from 2005 to 2014 and found non-concussion injury rates declined for boys and were little changed for girls. But concussions increased in both male and female players. The significant rise in concussion rates "could be mainly due to a better recognition of concussion by medical and coaching staff," study leader Dr. Morteza Khodaee, a sports medicine researcher at the University of Colorado School of Medicine, said in an email. The research team looked at injuries per minute of athletic exposure (AE), which includes both practices and competitions, for U.S. high school athletes. Overall, there were 6,154 injuries during 2.98 million athletic exposures, for an injury rate of 2.06 per 1,000 AEs, the study found. That included about 1.8 million soccer injuries among girls and 1.5 million among boys. Girls were 27 percent more likely to sustain soccer injuries than boys, the authors reported online December 28 in the British Journal of Sports Medicine. Injuries were 42 percent more common in competitions than during practice. "The majority of injuries during competitions occurred during the second half indicating a potential accumulated effect of fatigue," the authors reported. "It is well known that the risk of injury is higher in competition compared with practice," Khodaee said. "This is most likely due to more intense, full contact and potentially riskier play that occurs in competition." Still, while injury rates were significantly higher in competition, more than one third of all injuries occurred in practice. © 2017 Scientific American

Keyword: Brain Injury/Concussion
Link ID: 23117 - Posted: 01.18.2017

By Catherine Offord In the early 20th century, Danish biologist Johannes Schmidt solved a puzzle that had confounded European fishermen for generations. Freshwater eels—popular for centuries on menus across northern Europe—were abundant in rivers and creeks, but only as adults, never as babies. So where were they coming from? In 1922, after nearly two decades of research, Schmidt published the answer: the Sargasso Sea, the center of a massive, swirling gyre in the North Atlantic Ocean. Now regarded as some of the world’s most impressive animal migrators, European eels (Anguilla anguilla) journey westward across the Atlantic to spawning sites in the Sargasso; their eggs hatch into larvae that are carried back across the ocean by the Gulf Stream, arriving two or three years later to repopulate European waterways. For decades, researchers have assumed that adults made the journey in one short and rapid migration, leaving European coastlines in autumn and arriving in the Sargasso Sea, ready to spawn, the following spring. But this assumption rests on surprisingly little evidence, says behavioral ecologist David Righton of the UK Centre for Environment, Fisheries, and Aquaculture Science. “Since Johannes Schmidt identified this spawning area in the Sargasso Sea, people have been wondering about that great journey and trying to figure out how to follow the eels,” says Righton, whose work on epic marine migrations includes appropriately titled projects such as CODYSSEY and EELIAD. “But the technology hasn’t been available. . . . They just slip away into the darkness, really, in autumn, and no one knows what happens to them.” © 1986-2017 The Scientist

Keyword: Animal Migration
Link ID: 23116 - Posted: 01.18.2017

Ian Sample Science editor Tempting as it may be, it would be wrong to claim that with each generation humans are becoming more stupid. As scientists are often so keen to point out, it is a bit more complicated than that. A study from Iceland is the latest to raise the prospect of a downwards spiral into imbecility. The research from deCODE, a genetics firm in Reykjavik, finds that groups of genes that predispose people to spend more years in education became a little rarer in the country from 1910 to 1975. The scientists used a database of more than 100,000 Icelanders to see how dozens of gene variants that affect educational attainment appeared in the population over time. They found a shallow decline over the 65-year period, implying a downturn in the natural inclination to rack up qualifications. But the genes involved in education affected fertility too. Those who carried more “education genes” tended to have fewer children than others. This led the scientists to propose that the genes had become rarer in the population because, for all their qualifications, better educated people had contributed less than others to the Icelandic gene pool. Spending longer in education and the career opportunities that provides is not the sole reason that better educated people tend to start families later and have fewer children, the study suggests. Many people who carried lots of genes for prolonged education left the system early and yet still had fewer children than the others. “It isn’t the case that education, or the career opportunities it provides, prevents you from having more children,” said Kari Stefansson, who led the study. “If you are genetically predisposed to have a lot of education, you are also predisposed to have fewer children.” © 2017 Guardian News and Media Limited

Keyword: Intelligence; Genes & Behavior
Link ID: 23113 - Posted: 01.17.2017

Eating disorders, including anorexia and bulimia, affect a small but substantial number of women in their 40s and 50s, UK research suggests. The study, involving more than 5,000 women, found just over 3% reported having an eating disorder. Some said they had experienced it since their teens, others developed it for the first time in their middle age. Julie Spinks, from Beaconsfield, is 48. She was not involved in the study, but can relate first-hand to its findings. She developed anorexia for the first time when she was 44. "It was a complete shock at the time," she recalls. "I knew that I was restricting my food but I didn't ever think I had anorexia. "I'd been really unhappy at work and had very low self-esteem. To begin with I just thought I had lost my appetite. "I felt depressed, like I was not worth feeding or existing. I wanted to disappear and fade away." Julie started to lose weight quite quickly and began to exercise as well. She realised something was very wrong one day after she had been to the gym. "I'd run for about an hour and burnt off about 500 calories. I remember thinking that's about the same as a chocolate bar. That's when I started to link food and exercise." Julie still did not recognise she had anorexia though. "I thought anorexia was something that happened to other people. It didn't occur to me that I might have it." After a breakdown at work she went for a mental health assessment. Her doctors then diagnosed her with anorexia and depression. Julie was given antidepressants and began therapy sessions to help with her eating disorder. © 2017 BBC.

Keyword: Anorexia & Bulimia
Link ID: 23112 - Posted: 01.17.2017

By Alice Klein Who needs men? A female shark separated from her long-term mate has developed the ability to have babies on her own. Leonie the zebra shark (Stegostoma fasciatum) met her male partner at an aquarium in Townsville, Australia, in 1999. They had more than two dozen offspring together before he was moved to another tank in 2012. From then on, Leonie did not have any male contact. But in early 2016, she had three baby sharks. Intrigued, Christine Dudgeon at the University of Queensland in Brisbane, Australia, and her colleagues began fishing for answers. One possibility was that Leonie had been storing sperm from her ex and using it to fertilise her eggs. But genetic testing showed that the babies only carried DNA from their mum, indicating they had been conceived via asexual reproduction. Some vertebrate species have the ability to reproduce asexually even though they normally reproduce sexually. These include certain sharks, turkeys, Komodo dragons, snakes and rays. However, most reports have been in females who have never had male partners. There are very few reports of asexual reproduction occurring in females with previous sexual histories, says Dudgeon. An eagle ray and a boa constrictor, both in captivity, are the only other female animals that have been documented switching from sexual to asexual reproduction. © Copyright Reed Business Information Ltd.

Keyword: Sexual Behavior
Link ID: 23110 - Posted: 01.17.2017