Chapter 16.
Ewen Callaway A dozen volunteers watched Alfred Hitchcock for science while lying motionless in a magnetic-resonance scanner. Another participant, a man who has lived in a vegetative state for 16 years, showed brain activity remarkably similar to that of the healthy volunteers — suggesting that plot structure had an impact on him. The study is published in this week's Proceedings of the National Academy of Sciences. The film, a 1961 episode of the TV show Alfred Hitchcock Presents condensed to 8 minutes, is a study in suspense. In it, a 5-year-old totes a partially loaded revolver — which she thinks is a toy — around her suburban neighbourhood, shouting “bang” each time she aims at someone and squeezes the trigger. While the study participants watched the film, researchers monitored their brain activity by functional magnetic resonance imaging (fMRI). All 12 healthy participants showed similar patterns of activity, particularly in parts of the brain that have been linked to higher cognition (frontal and parietal regions) as well as in regions involved in processing sensory information (the auditory and visual cortices). One behaviourally non-responsive person, a 20-year-old woman, showed such patterns of brain activity only in sensory areas. But another, a 34-year-old man who has been in a vegetative state since he was 18, showed patterns of activity in both the executive and sensory brain areas similar to those of the healthy participants. “It was actually indistinguishable from a healthy participant watching the movie,” says Adrian Owen, a neuroscientist at the University of Western Ontario in London, Canada (see: 'Neuroscience: The mind reader'). © 2014 Nature Publishing Group
Link ID: 20080 - Posted: 09.16.2014
By ANNA FELS THE idea of putting a mind-altering drug in the drinking water is the stuff of sci-fi, terrorist plots and totalitarian governments. Considering the outcry that occurred when putting fluoride in the water was first proposed, one can only imagine the furor that would ensue if such a thing were ever suggested. The debate, however, is moot. It’s a done deal. Mother Nature has already put a psychotropic drug in the drinking water, and that drug is lithium. Although this fact has been largely ignored for over half a century, it appears to have important medical implications. Lithium is a naturally occurring element, not a molecule like most medications, and it is present in the United States, depending on the geographic area, at concentrations that can range widely, from undetectable to around 0.170 milligrams per liter. This amount is less than a thousandth of the minimum daily dose given for bipolar disorders and for depression that doesn’t respond to antidepressants. Although it seems strange that the microscopic amounts of lithium found in groundwater could have any substantial medical impact, the more scientists look for such effects, the more they seem to discover. Evidence is slowly accumulating that relatively tiny doses of lithium can have beneficial effects. They appear to decrease suicide rates significantly and may even promote brain health and improve mood. Yet despite the studies demonstrating the benefits of relatively high natural lithium levels present in the drinking water of certain communities, few seem to be aware of its potential. The New York Times Company
Link ID: 20077 - Posted: 09.15.2014
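The "less than a thousandth" comparison in the lithium piece above can be sanity-checked with simple arithmetic. The 0.170 mg/L figure comes from the article; the minimum therapeutic daily dose of elemental lithium used below is an illustrative assumption, not a figure from the article.

```python
# Back-of-the-envelope check of the "less than a thousandth" claim.
# MIN_DAILY_DOSE_MG is an assumed figure for illustration only.

MAX_GROUNDWATER_MG_PER_L = 0.170   # upper end of the US groundwater range (from the article)
MIN_DAILY_DOSE_MG = 170.0          # assumed minimum therapeutic daily dose, elemental lithium

# Lithium ingested from one litre of the most lithium-rich groundwater,
# as a fraction of that assumed minimum dose:
ratio = MAX_GROUNDWATER_MG_PER_L / MIN_DAILY_DOSE_MG
print(f"one litre supplies {ratio:.4%} of the assumed minimum dose")
```

Under these assumptions a litre of even the richest groundwater supplies about one-thousandth of a therapeutic dose, which is consistent with the article's claim.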
By Abby Phillip Most long-time, pack-a-day smokers who took part in a small study were able to quit smoking after six months, and researchers believe the hallucinogenic substance found in "magic mushrooms" could be the reason why. The study of the 15 participants, published this week in the Journal of Psychopharmacology, is the first to look at the feasibility of using the psychedelic drug psilocybin to aid in a smoking cessation treatment program. Existing treatments, from quitting cold turkey to prescription medications like varenicline (Chantix), work for some people, but not the majority of smokers. With varenicline, which mimics the effect of nicotine in the body, only about 35 percent of participants in a clinical trial were still abstaining from smoking six months later. Nearly half of all adult smokers reported that they tried to quit in 2010, according to the Centers for Disease Control and Prevention, yet 480,000 deaths are attributed to the addiction every year. Researchers at Johns Hopkins University recruited a group of long-time, heavy smokers — an average of 19 cigarettes a day for an average of 31 years — to participate in the study. They were treated with cognitive behavioral therapy for 15 weeks, and they were given a dose of the hallucinogen psilocybin at the five-week mark, when they had agreed to stop smoking. Although it was a small study, the results were promising. Twelve of the 15 participants (80 percent) had quit smoking six months after being treated with the drug.
Keyword: Drug Abuse
Link ID: 20076 - Posted: 09.15.2014
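The psilocybin story above turns on a comparison of two proportions: 12 of 15 participants abstinent at six months versus roughly 35 percent for varenicline. A quick sketch makes both the headline number and the small-sample caveat concrete; the counts are from the article, while the Wilson score interval is a standard small-sample method added here purely to illustrate the uncertainty in a 15-person study.

```python
from math import sqrt

# Quit rates: 12 of 15 psilocybin participants abstinent at six months,
# vs ~35% for varenicline in a (much larger) clinical trial.
quit_count, n = 12, 15
p = quit_count / n  # observed quit rate

# Wilson 95% score interval for a binomial proportion -- chosen because it
# behaves reasonably at small n; shown only to illustrate the uncertainty.
z = 1.96
centre = p + z * z / (2 * n)
spread = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
denom = 1 + z * z / n
low, high = (centre - spread) / denom, (centre + spread) / denom

print(f"psilocybin: {p:.0%}, 95% CI roughly {low:.0%}-{high:.0%}")  # → psilocybin: 80%, 95% CI roughly 55%-93%
print("varenicline benchmark: 35%")
```

Even the lower end of the interval sits above the 35 percent varenicline benchmark, which is why the authors could call the results promising while still warning that no firm conclusions follow from 15 people.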
By Tara Parker-Pope The most reliable workers are those who get seven to eight hours of sleep each night, a new study shows. Researchers from Finland analyzed the sleep habits and missed work days among 3,760 men and women over about seven years. The workers ranged in age from 30 to 64 at the start of the study. The researchers found that the use of sick days was associated with the workers' sleep habits. Not surprisingly, they found that people who did not get enough sleep because of insomnia or other sleep problems were more likely to miss work. But notably, getting a lot of extra sleep was also associated with missed work. The workers who were most likely to take extra sick days were those who slept five hours or less or 10 hours or more. Short sleepers and long sleepers missed about five to nine more days of work than so-called optimal sleepers, workers who managed seven to eight hours of sleep each night. The workers who used the fewest sick days were women who slept an average of 7 hours 38 minutes a night and men who slept an average of 7 hours 46 minutes. The study results were published in the September issue of the medical journal Sleep. © 2014 The New York Times Company
Link ID: 20074 - Posted: 09.15.2014
By KEN BELSON The National Football League, which for years disputed evidence that its players had a high rate of severe brain damage, has stated in federal court documents that it expects nearly a third of retired players to develop long-term cognitive problems and that the conditions are likely to emerge at “notably younger ages” than in the general population. The findings are based on data prepared by actuaries hired by the league and provided to the United States District Court judge presiding over the settlement between the N.F.L. and 5,000 former players who sued the league, alleging that it had hidden the dangers of concussions from them. “Thus, our assumptions result in prevalence rates by age group that are materially higher than those expected in the general population,” said the report, prepared by the Segal Group for the N.F.L. “Furthermore, the model forecasts that players will develop these diagnoses at notably younger ages than the general population.” The statements are the league’s most unvarnished admission yet that the sport’s professional participants sustain severe brain injuries at far higher rates than the general population. They also appear to confirm what scientists have said for years: that playing football increases the risk of developing neurological conditions like chronic traumatic encephalopathy, a degenerative brain disease that can be identified only in an autopsy. “This statement clears up all the confusion and doubt manufactured over the years questioning the link between brain trauma and long-term neurological impairment,” said Chris Nowinski, the executive director of the Sports Legacy Institute, who has for many years pressured the league to acknowledge the connection between football and brain diseases. © 2014 The New York Times Company
Keyword: Brain Injury/Concussion
Link ID: 20073 - Posted: 09.13.2014
By Smitha Mundasad Health reporter, BBC News Giving young people Botox treatment may restrict their emotional growth, experts warn. Writing in the Journal of Aesthetic Nursing, clinicians say there is a growing trend for under-25s to seek the wrinkle-smoothing injections. But the research suggests "frozen faces" could stop young people from learning how to express emotions fully. A leading body of UK plastic surgeons says injecting teenagers for cosmetic reasons is "morally wrong". Botox and other versions of the toxin work by temporarily paralysing muscles in the upper face to reduce wrinkling when people frown. Nurse practitioner Helen Collier, who carried out the research, says reality TV shows and celebrity culture are driving young people to idealise the "inexpressive frozen face." But she points to a well-known psychological theory, the facial feedback hypothesis, that suggests adolescents learn how best to relate to people by mimicking their facial expressions. She says: "As a human being our ability to demonstrate a wide range of emotions is very dependent on facial expressions. "Emotions such as empathy and sympathy help us to survive and grow into confident and communicative adults." But she warns that a "growing generation of blank-faced" young people could be harming their ability to correctly convey their feelings. "If you wipe those expressions out, this might stunt their emotional and social development," she says. The research calls for practitioners to use assessment tools to decide whether there are clear clinical reasons for Botox treatment. BBC © 2014
Link ID: 20070 - Posted: 09.13.2014
Corie Lok Tami Morehouse's vision was not great as a child, but as a teenager she noticed it slipping even further. The words she was trying to read began disappearing into the page and eventually everything faded to a dull, grey haze. The culprit was a form of Leber's congenital amaurosis (LCA), a group of genetic disorders in which light-sensing cells in the retina die off, usually resulting in total blindness by the time people reach their thirties or forties. But Morehouse got a reprieve. In 2009, at the age of 44, the social worker from Ashtabula, Ohio, became the oldest participant in a ground-breaking clinical trial to test a gene therapy for LCA. Now, she says, she can see her children's eyes, and the colours of the sunset seem brighter than before. Morehouse calls these improvements life-changing, but they are minor compared with the changes in some of the younger trial participants. Corey Haas was eight years old when he was treated in 2008 — the youngest person to receive the therapy. He went from using a white cane to riding a bicycle and playing softball. Morehouse often wonders what she would be able to see now if she had been closer to Haas's age when she had the therapy. “I was born a little too soon,” she says. Visual impairment affects some 285 million people worldwide, about 39 million of whom are considered blind, according to a 2010 estimate from the World Health Organization. Roughly 80% of visual impairment is preventable or curable, including operable conditions such as cataracts that account for much of the blindness in the developing world. But retinal-degeneration disorders — including age-related macular degeneration, the leading cause of blindness in the developed world — have no cure. © 2014 Nature Publishing Group
Link ID: 20064 - Posted: 09.11.2014
By JOSHUA A. KRISCH PHILADELPHIA — McBaine, a bouncy black and white springer spaniel, perks up and begins his hunt at the Penn Vet Working Dog Center. His nose skims 12 tiny arms that protrude from the edges of a table-size wheel, each holding samples of blood plasma, only one of which is spiked with a drop of cancerous tissue. The dog makes one focused revolution around the wheel before halting, steely-eyed and confident, in front of sample No. 11. A trainer tosses him his reward, a tennis ball, which he giddily chases around the room, sliding across the floor and bumping into walls like a clumsy puppy. McBaine is one of four highly trained cancer detection dogs at the center, which trains purebreds to put their superior sense of smell to work in search of the early signs of ovarian cancer. Now, Penn Vet, part of the University of Pennsylvania’s School of Veterinary Medicine, is teaming with the university’s chemistry and physics departments to isolate cancer chemicals that only dogs can smell. They hope this will lead to the manufacture of nanotechnology sensors that are capable of detecting bits of cancerous tissue 1/100,000th the thickness of a sheet of paper. “We don’t ever anticipate our dogs walking through a clinic,” said the veterinarian Dr. Cindy Otto, the founder and executive director of the Working Dog Center. “But we do hope that they will help refine chemical and nanosensing techniques for cancer detection.” Since 2004, research has begun to accumulate suggesting that dogs may be able to smell the subtle chemical differences between healthy and cancerous tissue, including bladder cancer, melanoma and cancers of the lung, breast and prostate. But scientists debate whether the research will result in useful medical applications. © 2014 The New York Times Company
Keyword: Chemical Senses (Smell & Taste)
Link ID: 20063 - Posted: 09.11.2014
By Sarah Zielinski The marshmallow test is pretty simple: Give a child a treat, such as a marshmallow, and promise that if he doesn’t eat it right away, he’ll soon be rewarded with a second one. The experiment was devised by Stanford psychologist Walter Mischel in the late 1960s as a measure of self-control. When he later checked back in with kids he had tested as preschoolers, those who had been able to wait for the second treat appeared to be doing better in life. They tended to have fewer behavioral or drug-abuse problems, for example, than those who had given in to temptation. Most attempts to perform this experiment on animals haven’t worked out so well. Many animals haven’t been willing to wait at all. Dogs, primates, and some birds have done a bit better, managing to wait at least a couple of minutes before eating the first treat. The best any animal has managed has been 10 minutes—a record set earlier this year by a couple of crows. The African grey parrot is a species known for its intelligence. Animal psychologist Irene Pepperberg, now at Harvard, spent 30 years studying one of these parrots, Alex, and showed that the bird had an extraordinary vocabulary and capacity for learning. Alex even learned to add numerals before his death in 2007. Could an African grey pass the marshmallow test? Adrienne E. Koepke of Hunter College and Suzanne L. Gray of Harvard University tried the experiment on Pepperberg’s current star African grey, a 19-year-old named Griffin. In their test, a researcher took two treats, one of which Griffin liked slightly better, and put them into cups. Then she placed the cup with the less preferred food in front of Griffin and told him, “wait.” She took the other cup and either stood a few feet away or left the room. After a random amount of time, from 10 seconds to 15 minutes, she would return. If the food was still in the cup, Griffin got the nut he was waiting for. 
Koepke and colleagues presented their findings last month at the Animal Behavior Society meeting at Princeton. © 2014 The Slate Group LLC.
By Amy Nordrum If you were one of millions of children who completed the Drug Abuse Resistance Education program, or D.A.R.E., between 1983 and 2009, you may be surprised to learn that scientists have repeatedly shown that the program did not work. Despite being the nation’s most popular substance-abuse prevention program, D.A.R.E. did not make you less likely to become a drug addict or even to refuse that first beer from your friends. But over the past few years prevention scientists have helped D.A.R.E. America, the nonprofit organization that administers the program, replace the old curriculum with a course based on a few concepts that should make the training more effective for today’s students. The new course, called keepin’ it REAL, differs in both form and content from the former D.A.R.E.—replacing long, drug-fact-laden lectures with interactive lessons that present stories meant to help kids make smart decisions. Beginning in 2009 D.A.R.E. administrators required middle schools across the country that teach the program to switch over to the 10-week, researcher-designed curriculum for seventh graders. By 2013, they had ordered elementary schools to start teaching a version of those lessons to fifth and sixth graders, too. “It’s not an antidrug program,” says Michelle Miller-Day, co-developer of the new curriculum and a communications researcher at Chapman University. “It’s about things like being honest and safe and responsible.” Even so, keepin’ it REAL has reduced substance abuse and maintained antidrug attitudes over time among students in early trials—an achievement that largely eluded the former iteration of the program. D.A.R.E.’s original curriculum was not shaped by prevention specialists but by police officers and teachers in Los Angeles. They started D.A.R.E. in 1983 to curb the use of drugs, alcohol and tobacco among teens and to improve community–police relations. Fueled by word of mouth, the program quickly spread to 75 percent of U.S. schools.
© 2014 Scientific American
Keyword: Drug Abuse
Link ID: 20060 - Posted: 09.11.2014
By GARY GUTTING Sam Harris is a neuroscientist and prominent “new atheist,” who along with others like Richard Dawkins, Daniel Dennett and Christopher Hitchens helped put criticism of religion at the forefront of public debate in recent years. In two previous books, “The End of Faith” and “Letter to a Christian Nation,” Harris argued that theistic religion has no place in a world of science. In his latest book, “Waking Up,” his thought takes a new direction. While still rejecting theism, Harris nonetheless makes a case for the value of “spirituality,” which he bases on his experiences in meditation. I interviewed him recently about the book and some of the arguments he makes in it. Gary Gutting: A common basis for atheism is naturalism — the view that only science can give a reliable account of what’s in the world. But in “Waking Up” you say that consciousness resists scientific description, which seems to imply that it’s a reality beyond the grasp of science. Have you moved away from an atheistic view? Sam Harris: I don’t actually argue that consciousness is “a reality” beyond the grasp of science. I just think that it is conceptually irreducible — that is, I don’t think we can fully understand it in terms of unconscious information processing. Consciousness is “subjective”— not in the pejorative sense of being unscientific, biased or merely personal, but in the sense that it is intrinsically first-person, experiential and qualitative. The only thing in this universe that suggests the reality of consciousness is consciousness itself. Many philosophers have made this argument in one way or another — Thomas Nagel, John Searle, David Chalmers. And while I don’t agree with everything they say about consciousness, I agree with them on this point. © 2014 The New York Times Company
Link ID: 20056 - Posted: 09.10.2014
By SOMINI SENGUPTA A coalition of political figures from around the world, including Kofi Annan, the former United Nations secretary general, and several former European and Latin American presidents, is urging governments to decriminalize a variety of illegal drugs and set up regulated drug markets within their own countries. The proposal by the group, the Global Commission on Drug Policy, goes beyond its previous call to abandon the nearly half-century-old American-led war on drugs. As part of a report scheduled to be released on Tuesday, the group goes much further than its 2011 recommendation to legalize cannabis. The former Brazilian president Fernando Henrique Cardoso, a member of the commission, said the group was calling for the legal regulation of “as many of the drugs that are currently illegal as possible, with the understanding that some drugs may remain too dangerous to decriminalize.” The proposal comes at a time when several countries pummeled by drug violence, particularly in Latin America, are rewriting their own drug laws, and when even the United States is allowing state legislatures to gingerly regulate cannabis use. The United Nations is scheduled to hold a summit meeting in 2016 to evaluate global drug laws. The commission includes former presidents like Mr. Cardoso of Brazil, Ernesto Zedillo of Mexico and Ruth Dreifuss of Switzerland, along with George P. Shultz, a former secretary of state in the Reagan administration, among others. The group stops short of calling on countries to legalize all drugs right away. It calls instead for countries to continue to pursue violent criminal gangs, to stop incarcerating users and to offer treatment for addicts. © 2014 The New York Times Company
Keyword: Drug Abuse
Link ID: 20052 - Posted: 09.10.2014
By Mo Costandi The nerve endings in your fingertips can perform complex neural computations that were thought to be carried out by the brain, according to new research published in the journal Nature Neuroscience. The processing of both touch and visual information involves computations that extract the geometrical features of objects we touch and see, such as the edge orientation. Most of this processing takes place in the brain, which contains cells that are sensitive to the orientation of edges on the things we touch and see, and which pass this information onto cells in neighbouring regions, that encode other features. The brain has outsourced some aspects of visual processing, such as motion detection, to the retina, and the new research shows that something similar happens in the touch processing pathway. Delegating basic functions to the sense organs in this way could be an evolutionary mechanism that enables the brain to perform other, more sophisticated information processing tasks more efficiently. Your fingertips are among the most sensitive parts of your body. They are densely packed with thousands of nerve endings, which produce complex patterns of nervous impulses that convey information about the size, shape and texture of objects, and your ability to identify objects by touch and manipulate them depends upon the continuous influx of this information. © 2014 Guardian News and Media Limited
Keyword: Pain & Touch
Link ID: 20051 - Posted: 09.09.2014
By Jena McGregor We've all heard the conventional wisdom for better managing our time and organizing our professional and personal lives. Don't try to multitask. Turn the email and Facebook alerts off to help stay focused. Make separate to-do lists for tasks that require a few minutes, a few hours and long-term planning. But what's grounded in real evidence and what's not? In his new book The Organized Mind, Daniel Levitin — a McGill University professor of psychology and behavioral neuroscience — explores how having a basic understanding of the way the brain works can help us think about organizing our homes, our businesses, our time and even our schools in an age of information overload. We spoke with Levitin about why multitasking never works, what images of good leaders' brains actually look like, and why email and Twitter are so incredibly addicting. The following transcript of our conversation has been edited for length and clarity. Q. What was your goal in writing this book? A. Neuroscientists have learned a lot in the last 10 or 15 years about how the brain organizes information, and why we pay attention to some things and forget others. But most of this information hasn't trickled down to the average reader. There are a lot of books about how to get organized and a lot of books about how to be better and more productive at business, but I don't know of one that grounds any of these in the science.
Link ID: 20049 - Posted: 09.09.2014
By Maggie Fox, Erika Edwards and Judy Silverman Here’s how you might be able to turn autism around in a baby: Carefully watch her cues, and push just a little harder with that game of peek-a-boo or “This little piggy.” But don’t push too hard — kids with autism are super-sensitive. That’s what Sally Rogers of the University of California, Davis has found in an intense experiment with the parents of infants who showed clear signs of autism. It’s one of the most hopeful signs yet that if you diagnose autism very early, you can help children rewire their brains and reverse the symptoms. It was a small study, and it’s very hard to find infants who are likely to have autism, which is usually diagnosed in the toddler years. But the findings, published in the Journal of Autism and Developmental Disorders, offer some hope to parents worried about their babies. “With only seven infants in the treatment group, no conclusions can be drawn,” they wrote. However, the effects were striking. Six out of the seven children in the study had normal learning and language skills by the time they were 2 to 3. Isobel was one of them. “She is 3 years old now and she is a 100 percent typical, normally developing child,” her mother, Megan, told NBC News. The family doesn’t want their last name used for privacy reasons. “We don’t have to do the therapy any more. It literally rewired her brain.” Autism is a very common diagnosis for children in the U.S. The latest survey by the Centers for Disease Control and Prevention shows a startling 30 percent jump among 8-year-olds diagnosed with the disorder in a two-year period, to one in every 68 children.
Link ID: 20047 - Posted: 09.09.2014
by Richard Farrell Conventional thinking has long held that pelvic bones in whales and dolphins, evolutionary throwbacks to ancestors that once walked on land, are vestigial and will disappear millions of years from now. But researchers from the University of Southern California and the Natural History Museum of Los Angeles County (NHM) have upended that assumption. The scientists argue in a paper just published in the journal Evolution that cetacean (whale and dolphin) pelvic bones certainly do have a purpose and that they're specifically targeted, by selection, for mating. The muscles that control a cetacean's penis are attached to the creature's pelvic bones. Matthew Dean, assistant professor at the USC Dornsife College of Letters, Arts and Sciences, and Jim Dines, collections manager of mammalogy at NHM, wanted to find out if pelvic bones could be evolutionarily advantageous by impacting the overall amount of control an individual creature has with its penis. The pair spent four years examining whale and dolphin pelvic bones, using a 3D laser scanner to study the shape and size of the samples in extreme detail. Then they gathered as much data as they could find -- reaching back to whaler days -- on whale testis size relative to body mass. The testis data was important because in nature, species in "promiscuous," competitive mating environments (where females mate with multiple males) develop larger testes, relative to their body mass, in order to outdo the competition. © 2014 Discovery Communications, LLC.
By BENEDICT CAREY Imagine that on Day 1 of a difficult course, before you studied a single thing, you got hold of the final exam. The motherlode itself, full text, right there in your email inbox — attached mistakenly by the teacher, perhaps, or poached by a campus hacker. No answer key, no notes or guidelines. Just the questions. Would that help you study more effectively? Of course it would. You would read the questions carefully. You would know exactly what to focus on in your notes. Your ears would perk up anytime the teacher mentioned something relevant to a specific question. You would search the textbook for its discussion of each question. If you were thorough, you would have memorized the answer to every item before the course ended. On the day of that final, you would be the first to finish, sauntering out with an A+ in your pocket. And you would be cheating. But what if, instead, you took a test on Day 1 that was just as comprehensive as the final but not a replica? You would bomb the thing, for sure. You might not understand a single question. And yet as disorienting as that experience might feel, it would alter how you subsequently tuned into the course itself — and could sharply improve your overall performance. This is the idea behind pretesting, one of the most exciting developments in learning science. Across a variety of experiments, psychologists have found that, in some circumstances, wrong answers on a pretest aren’t merely useless guesses. Rather, the attempts themselves change how we think about and store the information contained in the questions. On some kinds of tests, particularly multiple-choice, we benefit from answering incorrectly by, in effect, priming our brain for what’s coming later. That is: The (bombed) pretest drives home the information in a way that studying as usual does not. We fail, but we fail forward. © 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 20043 - Posted: 09.08.2014
by Laura Beil The obesity crisis has given prehistoric dining a stardom not known since Fred Flintstone introduced the Bronto Burger. Last year, “Paleo diet” topped the list of most-Googled weight loss searches, as modern Stone Age dieters sought the advice of bestsellers like The Paleo Solution or The Primal Blueprint, which encourages followers to “honor your primal genes.” The assumption is that America has a weight problem because human metabolism runs on ancient genes that are ill equipped for contemporary eating habits. In this line of thinking, a diet true to the hunter-gatherers we once were — heavy on protein, light on carbs — will make us skinny again. While the fad has attracted skepticism from those who don’t buy the idea whole hog, there’s still plenty of acceptance for one common premise about the evolution of obesity: Our bodies want to stockpile fat. For most of human history, the theory goes, hunter-gatherers ate heartily when they managed to slay a fleeing mastodon. Otherwise, prehistoric life meant prolonged stretches of near starvation, surviving only on inner reserves of adipose. Today, modern humans mostly hunt and gather at the drive-thru, but our Pleistocene genes haven’t stopped fretting over the coming famine. The idea that evolution favored calorie-hoarding genes has long shaped popular and scientific thinking. Called the “thrifty gene” hypothesis, it has arguably been the dominant theory for evolutionary origins of obesity, and by extension diabetes. (Insulin resistance and diabetes so commonly accompany obesity that doctors have coined the term “diabesity.”) However, it’s not that difficult to find scientists who call the rise of the thrifty gene theory a feat of enthusiasm over evidence. Greg Gibson, director of the Center for Integrative Genomics at Georgia Tech in Atlanta, calls the data “somewhere between scant and nonexistent — a great example of crowd mentality in science.” © Society for Science & the Public 2000 - 2014
Link ID: 20042 - Posted: 09.06.2014
By Jeffrey Mervis Embattled U.K. biomedical researchers are drawing some comfort from a new survey showing that a sizable majority of the public continues to support the use of animals in research. But there’s another twist that should interest social scientists as well: The government’s decision this year to field two almost identical surveys on the topic offers fresh evidence that the way you ask a question affects how people answer it. Since 1999, the U.K. Department for Business, Innovation & Skills (BIS) has been funding a survey of 1000 adults about their attitudes toward animal experimentation. But this year the government asked the London-based pollsters, Ipsos MORI, to carry out a new survey, changing the wording of several questions. (The company also collected additional information, including public attitudes toward different animal species and current rules regarding their use.) For example, the phrase “animal experimentation” was replaced by “animal research” because the latter is “less inflammatory,” notes Ipsos MORI Research Manager Jerry Latter. In addition, says Emma Brown, a BIS spokeswoman, the word research “more accurately reflects the range of procedures that animals may be involved in, including the breeding of genetically modified animals.” But government officials also value the information about long-term trends in public attitudes that can be gleaned from the current survey. So they told the company to conduct one last round—the 10th in the series—at the same time they deployed the new survey. Each survey went to a representative, but different, sample of U.K. adults. © 2014 American Association for the Advancement of Science
Keyword: Animal Rights
Link ID: 20041 - Posted: 09.06.2014
Ewen Callaway Caffeine's buzz is so nice it evolved twice. The coffee genome has now been published, and it reveals that the coffee plant makes caffeine using a different set of genes from those found in tea, cacao and other perk-you-up plants. Coffee plants are grown across some 11 million hectares of land, with more than two billion cups of the beverage drunk every day. It is brewed from the fermented, roasted and ground berries of Coffea canephora and Coffea arabica, known as robusta and arabica, respectively. An international team of scientists has now identified more than 25,000 protein-making genes in the robusta coffee genome. The species accounts for about one-third of the coffee produced, much of it for instant-coffee brands such as Nescafe. Arabica contains less caffeine, but its lower acidity and bitterness make it more flavourful to many coffee drinkers. However, the robusta species was selected for sequencing because its genome is simpler than arabica's. Caffeine evolved long before sleep-deprived humans became addicted to it, probably to defend the coffee plant against predators and for other benefits. For example, coffee leaves contain the highest levels of caffeine of any part of the plant, and when they fall on the soil they stop other plants from growing nearby. “Caffeine also habituates pollinators and makes them want to come back for more, which is what it does to us, too,” says Victor Albert, a genome scientist at the University of Buffalo in New York, who co-led the sequencing effort. The results were published on 4 September in Science. © 2014 Nature Publishing Group