Chapter 16.
By PAM BELLUCK

BALTIMORE — Leave it to the youngest person in the lab to think of the Big Idea. Xuyu Qian, 23, a third-year graduate student at Johns Hopkins, was chatting in late January with Hongjun Song, a neurologist. Dr. Song was wondering how to test their three-dimensional model of a brain — well, not a brain, exactly, but an “organoid,” essentially a tiny ball of brain cells, grown from stem cells and mimicking early brain development.

“We need a disease,” Dr. Song said.

Mr. Qian tossed out something he’d seen in the headlines: “Why don’t we check out this Zika virus?”

Within a few weeks — a nanosecond compared with typical scientific research time — that suggestion led to one of the most significant findings in efforts to answer a central question: How does the Zika virus cause brain damage, including the abnormally small heads in babies born to infected mothers? The answer could spur discoveries to prevent such devastating neurological problems.

And time is of the essence. One year after the virus was first confirmed in Latin America, with the raging crisis likely to reach the United States this summer, no treatment or vaccine exists.

“We can’t wait,” said Dr. Song, at the university’s Institute for Cell Engineering, where he and his wife and research partner, Dr. Guo-Li Ming, provided a pipette-and-petri-dish-level tour. “To translate our work for the clinic, to the public, normally it takes years. This is a case where we can make a difference right away.”

The laboratory’s initial breakthrough, published in March with researchers at two other universities, showed that the Zika virus attacked and killed so-called neural progenitor cells, which form early in fetal development and generate neurons in the brain.

© 2016 The New York Times Company
By Maia Szalavitz Both the FDA and the CDC have recently taken steps to address an epidemic of opioid overdose and addiction, which is now killing some 29,000 Americans each year. But these regulatory efforts will fail unless we acknowledge that the problem is actually driven by illicit—not medical—drug use. You’ve probably read that 80 percent of heroin users started with prescription medications—and you may have seen billboards that compare giving pain medication to children to giving them heroin. You have probably also heard and seen media stories of people with addiction who blame their problem on medical use. But the simple reality is this: According to the large, annually repeated and representative National Survey on Drug Use and Health, 75 percent of all opioid misuse starts with people using medication that wasn’t prescribed for them—obtained from a friend, family member or dealer. And 90 percent of all addictions—no matter what the drug—start in the adolescent and young adult years. Typically, young people who misuse prescription opioids are heavy users of alcohol and other drugs. This type of drug use, not medical treatment with opioids, is by far the greatest risk factor for opioid addiction, according to a study by Richard Miech of the University of Michigan and his colleagues. For this research, the authors analyzed data from the nationally representative Monitoring the Future survey, which includes thousands of students. While medical use of opioids among students who were strongly opposed to alcohol and other drugs did raise later risk for misuse, the overall risk for this group remained small and their actual misuse occurred less than five times a year. In other words, it wasn’t actually addiction. Given that these teens had generally rejected experimenting with drugs, an increased risk of misuse associated with medical care makes sense since they’d otherwise have no source of exposure. © 2016 Scientific American
Keyword: Drug Abuse
Link ID: 22202 - Posted: 05.11.2016
Aaron E. Carroll People get hooked on cigarettes, and enjoy them for that matter, because of the nicotine buzz. The nicotine doesn’t give them cancer and lung disease, though. It’s the tar and other chemicals that do the real harm. A robust debate is going on among public health officials over whether electronic cigarettes, or e-cigarettes, can alleviate the harms of smoking tobacco, or whether they should be treated as negatively as conventional cigarettes. In other countries, such as Britain, officials are more in favor of e-cigarettes, encouraging smokers to switch from conventional to electronic. Last week, the Food and Drug Administration issued new rules on e-cigarettes, banning their sale to anyone under 18 and requiring that adults under the age of 26 show a photo identification to buy them. Electronic cigarettes carry the promise of delivering the nicotine without the dangerous additives. The use of e-cigarettes by youth has increased sharply in recent years. In 2011, about 1.5 percent of high school students reported using them in the last month. In 2014, more than 12 percent of students did. That means that nearly 2.5 million American middle and high school students used them in the past month. The problem is that nicotine is generally considered less safe for children and adolescents than for adults. Poisoning is possible. It’s thought that nicotine may interfere with brain development. Most worrisome, it’s believed that becoming addicted to nicotine in any form makes smoking more likely later in life. E-cigarettes are perceived to be less harmful than conventional cigarettes, and they are thought to be useful aids to quitting. These perceptions, however, are not always fully grounded in evidence. © 2016 The New York Times Company
Keyword: Drug Abuse
Link ID: 22201 - Posted: 05.11.2016
Nancy Shute

A body mass index under 25 is deemed normal and healthy, and a higher BMI that's "overweight" or "obese" is not. But that might be changing, at least when it comes to risk of death.

The body mass index, or BMI, associated with the lowest risk of death has increased since the 1970s, a study finds, from 23.7, in the "normal" weight category, to 27, which is deemed "overweight." That means a person who is 5-foot-8 could weigh 180 pounds and be in that epidemiological sweet spot, according to the NIH's online BMI calculator. The results were published Tuesday in JAMA, the journal of the American Medical Association.

The researchers came to that conclusion by looking at data from three studies of people in Copenhagen, one from the 1970s, one from the 1990s and one from 2003-2013. More than 100,000 people were involved. Because Denmark has an excellent national health registry, they were able to pinpoint the cause of death for every single one of those people.

The risk of death for people who are obese, with a BMI of 30 or greater, also declined, to the point that it was on a par with that of some people of so-called "normal" weight. So being fatter, at least a bit, may be healthier.

"I was surprised as a scientist to see how clear the result was," Borge Nordestgaard, a clinical professor and chief physician at Copenhagen University Hospital and senior author of the study, told Shots.

So he and his colleagues sliced and diced the data to see what could account for the shift. They looked at age, sex, smoking, cancer and heart disease. The most relevant factor was the decline in smoking since the 1970s. But even when they restricted the analysis to nonsmokers who had never had cancer or heart disease, the lowest risk of death was still associated with a higher BMI over time.

© 2016 npr
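The figures in the article above are easy to check: BMI is weight in kilograms divided by the square of height in meters. A minimal sketch (the unit-conversion constants are the standard ones; the function name is just illustrative):

```python
def bmi(weight_lb: float, height_in: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    kg = weight_lb * 0.453592   # pounds to kilograms
    m = height_in * 0.0254      # inches to meters
    return kg / (m * m)

# A 5-foot-8 person (68 inches) weighing 180 pounds:
print(round(bmi(180, 68), 1))  # ~27.4, close to the study's lowest-risk BMI of 27
```

Note that 180 pounds at 5-foot-8 actually computes to a BMI of about 27.4, slightly above the study's reported sweet spot of 27, which is consistent with the article's hedged "could weigh 180 pounds."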
Link ID: 22200 - Posted: 05.11.2016
By Dan Kiefer I’m on the heavy bag, throwing left jabs, ignoring the relentless blare of Kanye’s “Drive Slow, Homie” played at a volume that would raise the dead. I punch to a one-two count: left jab, right cross. I’m working as hard as I’ve ever worked, and even in this unheated gym I sweat as if it’s a sauna. Finally, the bell rings. It feels as if I’ve been at it for an hour; actually, three minutes have passed. The ensuing one-minute break seems to last four seconds. Let’s be clear: Boxing, even when the opponent is only a heavy bag, is a brutal sport. But brutality is needed, even welcome, when you’re facing a progressive, incurable neurological disease. I have Parkinson’s disease, and it causes my body to just freeze up. Weirdly enough, boxing helps me get unstuck. All 12 of us in this class bear the unmistakable signs of Parkinson’s disease. I spot a dapper, cheerful white-haired fellow shaking like a leaf (tremor). Next, a balding, heavyset guy stumbling forward awkwardly on his toes (dystonia, or muscle cramping). Then I see myself in a mirror: a man in a white T-shirt, khaki shorts and Nike running shoes, standing still, seemingly paralyzed. I’m in the midst of a Parkinson’s freeze (an extreme form of bradykinesia, or slow movement). Although Parkinson’s is generally thought of as an old-person’s disease, I was diagnosed with a young-onset version 18 years ago, at age 35. Since then, I’ve taken every sort of medication known to science. I’ve had brain surgery — two tiny electrodes were implanted deep in my brain to stimulate an area affected by Parkinson’s — which unquestionably have helped treat some of my symptoms. But medicine and surgery have not cured my freezing and falling, my gait and balance issues that worsen as my disease progresses: When walking across a busy street, I may suddenly, inexplicably come to a full stop as the light is about to change. Even the slightest downhill slope of a path causes me to fall forward.
Link ID: 22198 - Posted: 05.10.2016
Chris Woolston A story about epigenetics in the 2 May issue of The New Yorker has been sharply criticized for inaccurately describing how genes are regulated. The article by Siddhartha Mukherjee — a physician, cancer researcher and award-winning author at Columbia University in New York — examines how environmental factors can change the activity of genes without altering the DNA sequence. Jerry Coyne, an evolutionary ecologist at the University of Chicago in Illinois, posted two widely discussed blog posts calling the piece “superficial and misleading”, largely because it ignored key aspects of gene regulation. Other researchers quoted in the blog posts called the piece “horribly damaging” and “a truly painful read”. Mukherjee responded by publishing a point-by-point rebuttal online. Speaking to Nature, he says he now realizes that he erred by omitting key areas of the science, but that he didn’t mean to mislead. “I sincerely thought that I had done it justice,” he says. Mukherjee’s article, ‘Same But Different’, takes a personal view of epigenetics — a term whose definition is highly contentious in the field. The story features his mother and aunt, identical twins who have distinct personalities. Mukherjee, who won a Pulitzer Prize in 2011 for his best-selling book The Emperor of All Maladies: A Biography of Cancer (Scribner, 2010), writes that identical twins differ because: “Chance events — injuries, infections, infatuations; the haunting trill of that particular nocturne — impinge on one twin and not on the other. Genes are turned on and off in response to these events, as epigenetic marks are gradually layered above genes, etching the genome with its own scars, calluses, and freckles.” The article is drawn from a book by Mukherjee that is due out later this month, called The Gene: An Intimate History (Scribner, 2016). © 2016 Nature Publishing Group
by Julia Belluz and Javier Zarracina

"I'm going to make you work hard," a blonde and perfectly muscled fitness instructor screamed at me in a recent spinning class, "so you can have that second drink at happy hour!"

At the end of the 45-minute workout, my body was dripping with sweat. I felt like I had worked really, really hard. And according to my bike, I had burned more than 700 calories. Surely I had earned an extra margarita.

The spinning instructor was echoing a message we've been getting for years: As long as you get on that bike or treadmill, you can keep indulging — and still lose weight. It's been reinforced by fitness gurus, celebrities, food and beverage companies like PepsiCo and Coca-Cola, and even public-health officials, doctors, and the first lady of the United States. Countless gym memberships, fitness tracking devices, sports drinks, and workout videos have been sold on this promise.

There's just one problem: This message is not only wrong, it's leading us astray in our fight against obesity. To find out why, I read through more than 60 studies on exercise and weight loss. I also spoke to nine leading exercise, nutrition, and obesity researchers. Here's what I learned.

1) An evolutionary clue to how our bodies burn calories

When anthropologist Herman Pontzer set off from Hunter College in New York to Tanzania to study one of the few remaining hunter-gatherer tribes on the planet, he expected to find a group of calorie burning machines. Unlike Westerners, who increasingly spend their waking hours glued to chairs, the Hadza are on the move most of the time. Men typically go off and hunt — chasing and killing animals, climbing trees in search of wild honey. Women forage for plants, dig up tubers, and comb bushes for berries. "They're on the high end of physical activity for any population that's been looked at ever," Pontzer said.

© 2016 Vox Media, Inc
Link ID: 22196 - Posted: 05.09.2016
By DAN BARRY IDIOT. Imbecile. Cretin. Feebleminded. Moron. Retarded. Offensive now but once quite acceptable, these terms figured in the research for a lengthy article I wrote in 2014 about 32 men who spent decades eviscerating turkeys in a meat-processing plant in Iowa — all for $65 a month, along with food and lodging in an ancient former schoolhouse on a hill. These were men with intellectual disability, which meant they had significant limitations in reasoning, learning and problem solving, as well as in adaptive behavior. But even though “intellectual disability” has been the preferred term for more than a decade, it gave my editors and me pause. We wondered whether readers would instantly understand what the phrase meant. What’s more, advocates and academicians were recommending that I suppress my journalistic instinct to tighten the language. I was told that it was improper to call these men “intellectually disabled,” instead of “men with intellectual disability.” Their disability does not define them; they are human beings with a disability. This linguistic preference is part of society’s long struggle to find the proper terminology for people with intellectual disability, and reflects the discomfort the subject creates among many in the so-called non-disabled world. It speaks to a continuing sense of otherness; to perceptions of what is normal, and not. “It often doesn’t matter what the word is,” said Michael Wehmeyer, the director and senior scientist at the Beach Center on Disability at the University of Kansas. “It’s that people associate that word with what their perceptions of these people are — as broken, or as defective, or as something else.” For many years, the preferred term was, simply, idiot. When Massachusetts established a commission on idiocy in the mid-1840s, it appointed Dr. Samuel G. Howe, an abolitionist and early disability rights advocate, as its chairman. 
The commission argued for the establishment of schools to help this segment of society, but made clear that it regarded idiocy “as an outward sign of an inward malady.” © 2016 The New York Times Company
Keyword: Development of the Brain
Link ID: 22195 - Posted: 05.09.2016
By Aleszu Bajak In its May 2 issue, The New Yorker magazine published a report titled “Same But Different,” with the subhead: “How epigenetics can blur the line between nature and nurture.” The piece was written by Siddhartha Mukherjee, a physician and author of the Pulitzer prize-winning book “The Emperor of all Maladies: A Biography of Cancer.” In his New Yorker story, Mukherjee, with deft language and colorful anecdotes, examines a topic that is very much du jour in science writing: Epigenetics. Google defines epigenetics as “the study of changes in organisms caused by modification of gene expression, rather than alteration of the genetic code itself.” Merriam Webster’s definition is similar — but not exactly the same: “The study of heritable changes in gene function that do not involve changes in DNA sequence.” The slight variation in definition is telling in itself — and it’s really that “heritable” part that has sparked intense interest not just among scientists, but in the popular mind. Steven Henikoff, a molecular biologist at the Fred Hutchinson Cancer Research Center in Seattle, called Siddhartha Mukherjee’s lyrical take on epigenetics “baloney.” It’s the idea that external factors like diet, or stress or even lifestyle choices can impact not just your own genes, but the genetic information you pass down to all of your descendants. Spend your life smoking cigarettes and eating fatty foods, the thinking goes, and you’ll not just make yourself sick, you’ll predispose your offspring — and their offspring, and their offspring — to associated diseases as well. It’s heady stuff, but much of it remains speculative and poorly supported, which is where Mukherjee may have run into trouble. The publication of his story — an excerpt from his forthcoming book “The Gene: An Intimate History” — was met with swift criticism from biologists working in epigenetics and the broader field of gene regulation. 
They argue that Mukherjee played fast and loose with his description of epigenetic processes and misled readers by casting aside decades of research into how genes are regulated during development. Copyright 2016 Undark
Link ID: 22194 - Posted: 05.09.2016
By John Horgan

Scientists trying to explain consciousness are entitled to be difficult, but what’s philosophers’ excuse? Don’t they have a moral duty to be comprehensible to non-specialists?

I recently attended “The Science of Consciousness,” the legendary inquest held every two years in Tucson, Arizona. I reported on the first meeting in 1994 and wanted to see how it’s evolved since then. This year’s shindig lasted from April 26 to April 30 and featured hundreds of presenters, eminent and obscure. I arrived on the afternoon of April 27 and stayed through the closing “End-of-Consciousness Party.” The only event I regret missing is a chat between philosopher David Chalmers, who loosed his “hard problem of consciousness” meme here in Tucson in 1994, and Deepak Chopra, the New Age mogul and a sponsor of this year’s meeting.

I feel obliged to post something fast, because conference organizer and quantum-consciousness advocate Stuart Hameroff complained that most reporters “come for free, drink our booze and don’t write anything.” Hameroff also generously allowed me to give a talk, “The Quest to Solve Consciousness: A Skeptic’s View,” even though I teased him in my 1994 article for Scientific American, calling him an “aging hipster.”

What follows is a highly subjective account of my first day at the meeting. I’d call this a “stream-of-consciousness report on consciousness,” but that would be pretentious. I'm just trying to answer this question: What is it like to be a skeptical journalist at a consciousness conference? I’ll post on the rest of the meeting soon. -- John Horgan

DAY 1, WEDNESDAY, APRIL 27. THE HORROR

A bullet-headed former New York fireman picks me up at the Tucson airport. Driving to the Loews Ventana Canyon Resort, he argues strenuously that President Trump will make us great again. As we approach the resort, he backpedals a bit, no doubt worried about his tip. I tip him well, to show how tolerant I am.
Everyone’s entitled to an irrational belief or two. © 2016 Scientific American
Link ID: 22193 - Posted: 05.09.2016
By Jane E. Brody Truth to tell, sometimes I don’t follow my own advice, and when I suffer the consequences, I rediscover why I offer it. I’ve long recommended drinking plenty of water, perhaps a glass with every meal and another glass or two between meals. If not plain water, which is best, then coffee or tea without sugar (but not alcoholic or sugary drinks) will do. I dined out recently after an especially active day that included about five miles of walking, 40 minutes of lap swimming and a 90-minute museum visit. I drank only half a glass of water and no other beverage with my meal. It did seem odd that I had no need to use the facilities afterward, not even after a long trip home. But I didn’t focus on why until the next day when, after a fitful night, I awoke exhausted, did another long walk and swim, and cycled to an appointment four miles away. I arrived parched, begging for water. After downing about 12 ounces, I was a new person. I no longer felt like a lead balloon. It seems mild dehydration was my problem, and the experience prompted me to take a closer look at the body’s need for water under a variety of circumstances. Although millions of Americans carry water bottles wherever they go and beverage companies like Coke and Pepsi would have you believe that every life can be improved by the drinks they sell, the truth is serious dehydration is not common among ordinary healthy people. But there are exceptions, and they include people like me in the Medicare generation, athletes who participate in particularly challenging events like marathons, and infants and small children with serious diarrhea. Let’s start with some facts. Water is the single most important substance we consume. You can survive for about two months without food, but you would die in about seven days without water. Water makes up about 75 percent of an infant’s weight and 55 percent of an older person’s weight. © 2016 The New York Times Company
Link ID: 22192 - Posted: 05.09.2016
Why You Can’t Lose Weight on a Diet

By SANDRA AAMODT

Six years after dropping an average of 129 pounds on the TV program “The Biggest Loser,” a new study reports, the participants were burning about 500 fewer calories a day than other people their age and size. This helps explain why they had regained 70 percent of their lost weight since the show’s finale. The diet industry reacted defensively, arguing that the participants had lost weight too fast or ate the wrong kinds of food — that diets do work, if you pick the right one. But this study is just the latest example of research showing that in the long run dieting is rarely effective, doesn’t reliably improve health and does more harm than good. There is a better way to eat.

The root of the problem is not willpower but neuroscience. Metabolic suppression is one of several powerful tools that the brain uses to keep the body within a certain weight range, called the set point. The range, which varies from person to person, is determined by genes and life experience. When dieters’ weight drops below it, they not only burn fewer calories but also produce more hunger-inducing hormones and find eating more rewarding. The brain’s weight-regulation system considers your set point to be the correct weight for you, whether or not your doctor agrees.

If someone starts at 120 pounds and drops to 80, her brain rightfully declares a starvation state of emergency, using every method available to get that weight back up to normal. The same thing happens to someone who starts at 300 pounds and diets down to 200, as the “Biggest Loser” participants discovered. This coordinated brain response is a major reason that dieters find weight loss so hard to achieve and maintain. For example, men with severe obesity have only one chance in 1,290 of reaching the normal weight range within a year; severely obese women have one chance in 677. A vast majority of those who beat the odds are likely to end up gaining the weight back over the next five years.

In private, even the diet industry agrees that weight loss is rarely sustained. A report for members of the industry stated: “In 2002, 231 million Europeans attempted some form of diet. Of these only 1 percent will achieve permanent weight loss.”

© 2016 The New York Times Company
Link ID: 22188 - Posted: 05.07.2016
By Linda Zajac For nearly 65 million years, bats and tiger moths have been locked in an aerial arms race: Bats echolocate to detect and capture tiger moths, and tiger moths evade them with flight maneuvers and their own ultrasonic sounds. Scientists have long wondered why certain species emit these high-frequency clicks that sound like rapid squeaks from a creaky floorboard. Does the sound jam bat sonar or does it warn bats that the moths are toxic? To find out, scientists collected two types of tiger moths: red-headed moths (pictured above) and Martin’s lichen moths. They then removed the soundmaking organs from some of the insects. In a grassy field in Arizona they set up infrared video cameras, ultrasonic microphones, and ultraviolet lights, the last of which they used to attract bats. In darkness, they released one tiger moth at a time and recorded the moth-bat interactions. They found that the moths rarely produced ultrasonic clicks fast enough to jam bat sonar. They also discovered that without sound organs, 64% of the red-headed moths and 94% of the Martin’s lichen moths were captured and spit out. Together, these findings reported late last month in PLOS ONE suggest that instead of jamming sonar like some tiger moths, these species act tough, flexing their soundmaking organs to warn predators of their toxin. © 2016 American Association for the Advancement of Science
By Virginia Morell After defeating other males in boxing matches and winning a territorial roost—and a bevy of females—a male Seba’s short-tailed bat (Carollia perspicillata, pictured) might think his battles for reproductive rights are over. But the defeated males of this neotropical species have a trick up their sleeve: clandestine matings with willing females. The tactic works, and now researchers know why. Scientists studied bats in a captive colony in Switzerland, removing alpha males from their harems for 3 days, and examining their sperm—as well as that of their rivals. A previous study showed that the sneaky males have faster, longer lived sperm, which gives them a leg-up on the alpha male. Researchers had suspected this was because the sneakers produced this supersperm to compete. But the new study finds that after the 3 days of abstinence, the alpha male’s sperm is as agile and vigorous as that of his rivals. Thus, the team reports today in the Journal of Experimental Biology, the sneaky males aren’t generating special sperm—they just mate less, so their sperm is in better shape when it comes time to race to the egg. © 2016 American Association for the Advancement of Science.
By Gretchen Reynolds Young rats prone to obesity are much less likely to fulfill that unhappy destiny if they run during adolescence than if they do not, according to a provocative new animal study of exercise and weight. They also were metabolically healthier, and had different gut microbes, than rats that keep the weight off by cutting back on food, the study found. The experiment was done in rodents, not people, but it does raise interesting questions about just what role exercise may play in keeping obesity at bay. For some time, many scientists, dieting gurus and I have been pointing out that exercise by itself tends to be ineffective for weight loss. Study after study has found that if overweight people start working out but do not also reduce their caloric intake, they shed little if any poundage and may gain weight. The problem, most scientists agree, is that exercise increases appetite, especially in people who are overweight, and also can cause compensatory inactivity, meaning that people move less over all on days when they exercise. Consequently, they wind up burning fewer daily calories, while also eating more. You do the math. But those discouraging studies involved weight loss. There has been much less examination of whether exercise might help to prevent weight gain in the first place and, if it does, how it compares to calorie restriction for that purpose. So for the new study, which was published last week in Medicine & Science in Sports & Exercise, researchers at the University of Missouri in Columbia and other schools first gathered rats from a strain that has an inborn tendency to become obese, starting in adolescence. (Adolescence is also when many young people begin to add weight.) © 2016 The New York Times Company
Link ID: 22178 - Posted: 05.04.2016
By Helen Briggs BBC News The Labrador retriever, known as one of the greediest breeds of dog, is hard-wired to overeat, research suggests. The dog is more likely to become obese than other breeds partly because of its genes, scientists at Cambridge University say. The gene affected is thought to be important in controlling how the brain recognises hunger and the feeling of being full after eating. The research could help in the understanding of human obesity. "About a quarter of pet Labradors carry this gene [difference]," lead researcher Dr Eleanor Raffan told the BBC. "Although obesity is the consequence of eating more than you need and more than you burn off in exercise, actually there's some real hard-wired biology behind our drive to eat," she added. Lifestyle factors Canine obesity mirrors the human obesity epidemic, with lifestyle factors such as lack of exercise and high-calorie food both implicated - as well as genetics. As many as two in three dogs (34-59%) in rich countries are now overweight. The Labrador has the highest levels of obesity and has been shown to be more obsessed with food than other breeds. Researchers screened more than 300 Labradors kept as pets or assistance dogs for known obesity genes in the study, published in the journal Cell Metabolism. The international team found that a change in a gene known as POMC was strongly linked with weight, obesity and appetite in Labradors and Flat-Coated retrievers. In both breeds, for each copy of the gene carried, the dog was on average 2kg heavier. Other breeds of dog - from the Shih Tzu to the Great Dane - were also screened, but the genetic difference was not found. However, the variation was more common in Labradors working as assistance dogs, which the researchers say might be because these dogs are easier to train by rewarding with food. © 2016 BBC.
By Sarah Kaplan The ancient Greeks spoke of a mythological society composed entirely of warrior women. The medieval traveler John Mandeville wrote of a place whose female rulers "never would suffer man to dwell amongst them." "Paradise Island," home of Wonder Woman, was a feminist utopia where no one with a Y chromosome was allowed. Sadly, those places only exist in fiction. But something like them does exist in the real world. It's in a wetland in rural Ohio. And it's full of salamanders. "They’re pretty incredible," said Robert Denton, a biologist at Ohio State who studies an unusual group of salamander species that literally don't need men. These creatures – all female – reproduce by cloning themselves. To keep their gene pool diverse, they sometimes "steal" sperm left behind on trees and leaves by male salamanders of other species and incorporate that DNA into their offspring. Most sexually reproducing organisms have two sets of chromosomes to make up their genome – one from each parent. But one of these strange salamanders can have between two and five times that much genetic material lying in wait within her cells. It's as if they have multiple genomes to fall back on, and that's made them incredibly successful. "Polyploid" salamanders have been around some 6 million years, Denton said — far longer than most other animal species that reproduce asexually. Since a lack of diversity means having a smaller arsenal of genetic variation to fall back on when living conditions change, these groups usually go extinct relatively quickly. © 1996-2016 The Washington Post
by Susan Milius There’s nothing like a guy doing all the child care to win female favor, even among giant water bugs. Thumbnail-sized Appasus water bugs have become an exemplar species for studying paternal care. After mating, females lay eggs on a male’s back and leave him to swim around for weeks tending his glued-on load. For an A. major water bug, lab tests show an egg burden can have the sweet side of attracting more females, researchers in Japan report May 4 in Royal Society Open Science. Given a choice of two males, females strongly favored, and laid more eggs on, the one already hauling around 10 eggs rather than the male that researchers had scraped eggless. Females still favored a well-egged male even when researchers offered two males that a female had already considered, but with their egg-carrying roles switched from the previous encounter. That formerly spurned suitor this time triumphed. A similar preference, though not as clear-cut, showed up in the slightly smaller and lighter A. japonicus giant water bug. “We conclude that sexual selection plays an important role in the maintenance of elaborate paternal care,” says study coauthor Shin-ya Ohba of Nagasaki University. © Society for Science & the Public 2000 - 2016
By Emily Benson Baby birds are sometimes known to shove their siblings out of the nest to gain their parents’ undivided attention, but barn owl chicks appear to be more altruistic. Scientists recorded the hissing calls of hungry and full barn owl nestlings (Tyto alba, pictured), then played the sounds back to single chicks settled in nests stocked with mice. The young owls that heard the squawks of their hungry kin delayed eating each rodent by an average of half an hour; those that heard cries indicating their invisible nest-mate was full ate the mice more quickly. The findings suggest that barn owl chicks give hungrier siblings a chance to eat first even when the nest is full of food, the researchers will report in an upcoming issue of Behavioral Ecology and Sociobiology. So is it true altruism? Maybe not. Nestlings may share food in exchange for help with grooming or to get the first crack at a later meal, the team says, suggesting a possible ulterior motive. © 2016 American Association for the Advancement of Science
Laura Sanders Iron, says aging expert Naftali Raz, is like the Force. It can be good or bad, depending on the context. When that context is the human brain, though, scientists wrangle over whether iron is a dark force for evil or a bright source of support. Some iron is absolutely essential for the brain. On that, scientists agree. But recent studies suggest to some researchers that too much iron, and the chemical reactions that ensue, can be dangerous or deadly, especially to nerve cells in the vulnerable brain area that deteriorates with Parkinson’s disease. Yet other work raises the possibility that those cells die because of lack of iron, rather than too much. “There are a lot of surprises in this field,” says iron biologist Nancy Andrews of Duke University. The idea that too much iron is dangerous captivates many researchers, including analytical neurochemist Dominic Hare of the University of Technology Sydney. “All of life is a chemical reaction,” he says, “so the start of disease is a chemical reaction as well.” And as Raz points out, reactions involving iron are both life-sustaining and dangerous. “Iron is absolutely necessary for conducting the very fundamental business in every cell,” says Raz, of Wayne State University in Detroit. It helps produce energy-storing ATP molecules. And that’s a dirty job, throwing off dangerous free radicals that can cause cellular mayhem as energy is made. But those free radicals are not the most worrisome aspect of iron, Hare believes. “The reaction that is much more dangerous is the reaction you get when iron and dopamine come together,” he says. © Society for Science & the Public 2000 - 2016.