Chapter 16


By Dan Jones The vast majority of people think we have free will and are the authors of our own life stories. But if neuroscientists were one day able to predict our every action based on brain scans, would people abandon this belief in droves? A new study concludes that such knowledge would not by itself be enough to shake our confidence in our own volition. Many neuroscientists, such as the late Francis Crick, have argued that our sense of free will is no more than the behaviour of a vast assembly of nerve cells. This is tied to the idea of determinism, which has it that every effect is preceded by a cause, with cause and effect connected by physical laws. This is why the behaviour of physical systems can be predicted – even the brain, in principle. As author Sam Harris puts it: "If determinism is true, the future is set – and this includes all our future states of mind and our subsequent behaviour." If people lost their belief in their own free will, that would have important consequences for how we think about moral responsibility, and even how we behave. For example, numerous studies have shown that when people are led to reject free will they are more likely to cheat, and are also less bothered about punishing other wrongdoers. For those who argue that what we know about neuroscience is incompatible with free will, predicting what our brain is about to do should reveal the illusory nature of free will, and lead people to reject it. Experimental philosopher Eddy Nahmias at Georgia State University in Atlanta dubs this view "willusionism". He recently set out to test it. © Copyright Reed Business Information Ltd.

Keyword: Consciousness
Link ID: 20102 - Posted: 09.22.2014

By CLYDE HABERMAN When it came to pharmacological solutions to life’s despairs, Aldous Huxley was ahead of the curve. In Huxley’s 1932 novel about a dystopian future, the Alphas, Betas and others populating his “Brave New World” have at their disposal a drug called soma. A little bit of it chases the blues away: “A gramme” — Huxley was English, remember, spelling included — “is better than a damn.” With a swallow, negative feelings are dispelled. Prozac, the subject of this week’s video documentary from Retro Report, is hardly soma. But its guiding spirit is not dissimilar: A few milligrams of this drug are preferable to the many damns that lie at the core of some people’s lives. Looking back at Prozac’s introduction by Eli Lilly and Company in 1988, and hopscotching to today, the documentary explores the enormous influence, both chemical and cultural, that Prozac and its brethren have had in treating depression, a concern that gained new resonance with the recent suicide of the comedian Robin Williams. In the late 1980s and the 90s, Prozac was widely viewed as a miracle pill, a life preserver thrown to those who felt themselves drowning in the high waters of mental anguish. It was the star in a class of new pharmaceuticals known as S.S.R.I.s — selective serotonin reuptake inhibitors. Underlying their use is a belief that depression is caused by a shortage of the neurotransmitter serotonin. Pump up the levels of this brain chemical and, voilà, the mood lifts. Indeed, millions have embraced Prozac, and swear by it. Depression left them emotionally paralyzed, they say. Now, for the first time in years, they think clearly and can embrace life. Pharmacological merits aside, the green-and-cream pill was also a marvel of commercial branding, down to its market-tested name. Its chemical name is fluoxetine hydrochloride, not the most felicitous of terms. A company called Interbrand went to work for Eli Lilly and came up with Prozac. “Pro” sounds positive. Professional, too. “Ac”? That could signify action. As for the Z, it suggests a certain strength, perhaps with a faint high-techy quality. © 2014 The New York Times Company

Keyword: Depression
Link ID: 20098 - Posted: 09.22.2014

By Jocelyn Kaiser A virus that shuttles a therapeutic gene into cells has strengthened the muscles, improved the motor skills, and lengthened the lifespan of mice afflicted with two neuromuscular diseases. The approach could one day help people with a range of similar disorders, from muscular dystrophy to amyotrophic lateral sclerosis, or ALS. Many of these diseases involve defective neuromuscular junctions—the interface between neurons and muscle cells where brain signals tell muscles to contract. In one such disease, a form of familial limb-girdle myasthenia, people carry two defective copies of the gene called DOK7, which codes for a protein that’s needed to form such junctions. Their hip and shoulder muscles atrophy over many years, and some eventually have trouble breathing or end up in a wheelchair. Mice similarly missing a properly working Dok7 gene are severely underweight and die within a few weeks. In the new study, researchers led by molecular biologist Yuji Yamanashi of the University of Tokyo first injected young mice engineered to have defective Dok7 with a harmless virus carrying a good copy of the Dok7 gene, which is expressed only in muscle. Within about 7 weeks, the rodents recovered. Their muscle cells cranked out the DOK7 protein, and under a microscope their muscles had larger neuromuscular junctions than those of untreated mice with defective Dok7. What’s more, the mice grew to a healthy body weight and had essentially normal scores on tests of motor skills and muscle strength. © 2014 American Association for the Advancement of Science.

Keyword: Movement Disorders
Link ID: 20096 - Posted: 09.19.2014

By John Horgan On this blog, in my book The End of War and elsewhere (see Further Reading and Viewing), I have knocked the deep roots theory of war, which holds that war stems from an instinct deeply embedded in the genes of our male ancestors. Inter-community killings are rare among chimpanzees and non-existent among bonobos, according to a new report in Nature, undercutting the theory that the roots of war extend back to the common ancestor of humans and chimps. Proponents of this theory—notably primatologist Richard Wrangham—claim it is supported by observations of inter-community killings by chimpanzees, Pan troglodytes, our closest genetic relatives. Skeptics, including anthropologists Robert Sussman and Brian Ferguson, have pointed out that chimpanzee violence might be not an adaptation but a response to environmental circumstances, such as human encroachment. This “human impacts” hypothesis is rejected in the Nature report by a coalition of 30 primatologists, including Wrangham and lead author Michael Wilson. In “Lethal aggression in Pan is better explained by adaptive strategies than human impacts,” Wilson et al. analyze 152 killings in 18 chimpanzee communities and find “little correlation with human impacts.” Given that the primary interest in chimp violence is its alleged support of the deep-roots theory, it might seem odd, at first, that Wilson et al. do not mention human warfare. Actually, this omission is wise, because the Nature report undermines the deep-roots theory of war, and establishes that the “human impact” issue is a red herring. © 2014 Scientific American

Keyword: Aggression
Link ID: 20092 - Posted: 09.18.2014

I’m an epileptic. It’s not how I define myself, but I am writing about epilepsy, so I think pointing out the fact that I am speaking from experience is acceptable. I may not define myself by my epilepsy but it’s a big part of my life. It affects my life on a daily basis. Because of the epilepsy I can’t drive, can’t pull all-nighters or get up really early just in case I have a seizure. It’s frustrating at times, though I will gladly milk the not getting up early thing when I can, eg bin day. But whereas I’ve grown up with it, having been diagnosed when I was 17, most people I’ve met don’t understand it. You mention the fact that you’re epileptic to some people and they look at you like they’re a robot you’ve just asked to explain the concept of love; they adopt a sort of “DOES NOT COMPUTE!” expression. They often don’t know what to say, or do, or even what epilepsy is and often spend the rest of the conversation searching their data banks for information on what to do if I have a seizure, like “Do I … put a spoon in his mouth?” For the record: no, you don’t. If putting a spoon in an epileptic’s mouth helped, then we would be prescribed a constant supply of Fruit Corners. So let me put you at ease. No one expects you to know that much about epilepsy (unless you’re responsible for treating it). There are many different types, with many different causes. Not everyone has seizures and often those who do, when given the correct meds, can live pretty much fit-free lives. © 2014 Guardian News and Media Limited

Keyword: Epilepsy
Link ID: 20091 - Posted: 09.18.2014

By Katy Waldman In the opening chapter of Book 1 of My Struggle, by Karl Ove Knausgaard, the 8-year-old narrator sees a ghost in the waves. He is watching a televised report of a rescue effort at sea—“the sky is overcast, the gray-green swell heavy but calm”—when suddenly, on the surface of the water, “the outline of a face emerges.” We might guess from this anecdote that Karl, our protagonist, is both creative and troubled. His limber mind discerns patterns in chaos, but the patterns are illusions. “The lunatic, the lover, and the poet,” Shakespeare wrote, “have such seething brains, such shaping fantasies.” Their imaginations give “to airy nothing a local habitation and a name.” A seething brain can be a great asset for an artist, but, like Knausgaard’s churning, gray-green swell, it can be dangerous too. Inspired metaphors, paranormal beliefs, conspiracy theories, and delusional episodes may all exist on a single spectrum, recent research suggests. The name for the concept that links them is apophenia. A German scientist named Klaus Conrad coined apophanie (from the Greek apo, away, and phaenein, to show) in 1958. He was describing the acute stage of schizophrenia, during which unrelated details seem saturated in connections and meaning. Unlike an epiphany—a true intuition of the world’s interconnectedness—an apophany is a false realization. Swiss psychologist Peter Brugger introduced the term into English when he penned a chapter in a 2001 book on hauntings and poltergeists. Apophenia, he said, was a weakness of human cognition: the “pervasive tendency … to see order in random configurations,” an “unmotivated seeing of connections,” the experience of “delusion as revelation.” On the phone he unveiled his favorite formulation yet: “the tendency to be overwhelmed by meaningful coincidences.” © 2014 The Slate Group LLC.

Keyword: Attention
Link ID: 20088 - Posted: 09.18.2014

By Virginia Morell Living in a complex social world—one with shifting alliances and competitors—is often cited as the key reason humans, dolphins, and spotted hyenas evolved large brains. Now, researchers say that social complexity also underlies the braininess of parrots, which have big brains relative to their body size. To understand the social lives of these birds, the scientists observed wild populations of monk parakeets (Myiopsitta monachus), a small parrot, in Argentina and captive ones in Florida. They recorded how often the birds (pictured) were seen with other individuals and how they interacted—and then analyzed the parakeets’ social networks. The birds, the researchers report online today in The Auk: Ornithological Advances, prefer to spend time with one specific individual, usually their mate. In the captive populations, the birds also had strong associations with one or two other individuals, numerous more moderate relationships, and only a few that were weak. The scientists also recorded aggressive interactions among the captive birds, revealing that monk parakeets have a dominance hierarchy based on which birds won or lost confrontations. Thus, the parakeets’ society has layers of relationships, similar to those documented in other big-brained animals. Living in such a society requires that the birds recognize and remember others, and whether they are friend or foe—mental tasks that are thought to be linked to the evolution of significant cognitive skills. © 2014 American Association for the Advancement of Science

Keyword: Evolution
Link ID: 20087 - Posted: 09.18.2014

By Daniel A. Yudkin If you’re reading this at a desk, do me a favor. Grab a pen or pencil and hold the end between your teeth so it doesn’t touch your lips. As you read on, stay that way—science suggests you’ll find this article more amusing if you do. Why? Notice that holding a pencil in this manner puts your face in the shape of a smile. And research in psychology says that the things we do—smiling at a joke, giving a gift to a friend, or even running from a bear—influence how we feel. This idea—that actions affect feelings—runs counter to how we generally think about our emotions. Ask average folks how emotions work—about the causal relationship between feelings and behavior—and they’ll say we smile because we’re happy, we run because we’re afraid. But work by such psychologists as Fritz Strack, Antonio Damasio, and Joe LeDoux shows the truth is often the reverse: what we feel is actually the product, not the cause, of what we do. It’s called “somatic feedback.” Only after we act do we deduce, by seeing what we just did, how we feel. This bodes well, at first blush, for anyone trying to change their emotions for the better. All you’d need to do is act like the kind of person you want to be, and that’s who you’ll become. (Call it the Bobby McFerrin philosophy: “Aren’t happy? Don’t worry. Just smile!”) But new research, published in the Journal of Experimental Social Psychology by Aparna Labroo, Anirban Mukhopadhyay, and Ping Dong, suggests there may be limits to our ability to proactively manage our own well-being. The team ran a series of studies examining whether more smiling led to more happiness. One asked people how much smiling they had done that day, and how happy they currently felt. Other studies manipulated the amount of smiling people actually did, either by showing them a series of funny pictures or by replicating a version of the pencil-holding experiment. As expected, across these experiments, the researchers found that the more people smiled, the happier they reported being. © 2014 Scientific American

Keyword: Emotions
Link ID: 20085 - Posted: 09.17.2014

By Douglas Main Researchers have created a blood test that they have used to accurately diagnose depression in a small sample of people, and they hope that with time and funding it could be used on a widespread basis. It is the first blood test—and thus the first “objective” gauge—for any type of mental disorder in adults, says study co-author Eva Redei, a neuroscientist at Northwestern University in Evanston, Ill. Outside experts caution, however, that the results are preliminary, and not close to ready for use in the doctor’s office. Meanwhile, diagnosing depression the “old-fashioned way,” through an interview, works quite well, and should only take 10 to 15 minutes, says Todd Essig, a clinical psychologist in New York. But many doctors are increasingly overburdened and often not reimbursed for taking the time to talk to their patients, he says. The test works by measuring blood levels of nine different types of RNA, a chemical that the body uses to process DNA. Besides accurately diagnosing depression, which affects perhaps 10 percent of American adults and is becoming more common, the technique may also be able to tell who could benefit from talk therapy and who may be vulnerable to the condition in the first place. In a study describing the test, published in the journal Translational Psychiatry, the scientists recruited 32 patients who were diagnosed with depression using a clinical interview, the standard technique. They also got 32 non-depressed patients to participate as a control group. © 2014 Newsweek LLC

Keyword: Depression
Link ID: 20084 - Posted: 09.17.2014

By Neuroskeptic Today, we are thinking – and talking – about the brain more than ever before. It is widely said that neuroscience has much to teach psychiatry, cognitive science, economics, and others. Practical applications of brain science are proposed in the fields of politics, law enforcement and education. The brain is everywhere. This “Neuro Turn” has, however, not always been accompanied by a critical attitude. We ought to be skeptical of any claims regarding the brain because it remains a mystery – we fundamentally do not understand how it works. Yet much neuro-discourse seems to make the assumption that the brain is almost a solved problem already. For example, media stories about neuroscience commonly contain simplistic misunderstandings – such as the tendency to over-interpret neural activation patterns as practical guides to human behavior. For instance, recently we have heard claims that because fMRI finds differences in the brain activity of some violent offenders, their criminal tendencies must be innate and unchangeable – with clear implications for rehabilitation. Neuroscientists are well aware of the faults in lay discourse about the brain – and are increasingly challenging them, e.g. on social media. Unfortunately, the same misunderstandings also exist within neuroscience itself. For example, I argue, much of cognitive neuroscience is actually based on (or only makes sense given) the popular misunderstanding that brain activity has a psychological ‘meaning’. In fact, in the vast majority of cases we simply do not know what a given difference in brain activity means. Thus, many research studies based on finding differences in fMRI activity maps across groups or across conditions are not really helping us to understand the brain at all – but only providing us with a canvas onto which to project our misunderstandings.

Keyword: Brain imaging
Link ID: 20082 - Posted: 09.17.2014

Ewen Callaway A dozen volunteers watched Alfred Hitchcock for science while lying motionless in a magnetic-resonance scanner. Another participant, a man who has lived in a vegetative state for 16 years, showed brain activity remarkably similar to that of the healthy volunteers — suggesting that plot structure had an impact on him. The study is published in this week's Proceedings of the National Academy of Sciences. The film, a 1961 episode of the TV show Alfred Hitchcock Presents that had been condensed to 8 minutes, is a study in suspense. In it, a 5-year-old totes a partially loaded revolver — which she thinks is a toy — around her suburban neighbourhood, shouting “bang” each time she aims at someone and squeezes the trigger. While the study participants watched the film, researchers monitored their brain activity by functional magnetic resonance imaging (fMRI). All 12 healthy participants showed similar patterns of activity, particularly in parts of the brain that have been linked to higher cognition (frontal and parietal regions) as well as in regions involved in processing sensory information (auditory and visual cortices). One behaviourally non-responsive person, a 20-year-old woman, showed patterns of brain activity only in sensory areas. But another person, a 34-year-old man who has been in a vegetative state since he was 18, had patterns of brain activity in the executive and sensory brain areas, similar to those of the healthy subjects. “It was actually indistinguishable from a healthy participant watching the movie,” says Adrian Owen, a neuroscientist at the University of Western Ontario in London, Canada (see: 'Neuroscience: The mind reader'). © 2014 Nature Publishing Group

Keyword: Consciousness
Link ID: 20080 - Posted: 09.16.2014

By ANNA FELS THE idea of putting a mind-altering drug in the drinking water is the stuff of sci-fi, terrorist plots and totalitarian governments. Considering the outcry that occurred when putting fluoride in the water was first proposed, one can only imagine the furor that would ensue if such a thing were ever suggested. The debate, however, is moot. It’s a done deal. Mother Nature has already put a psychotropic drug in the drinking water, and that drug is lithium. Although this fact has been largely ignored for over half a century, it appears to have important medical implications. Lithium is a naturally occurring element, not a molecule like most medications, and it is present in the United States, depending on the geographic area, at concentrations that can range widely, from undetectable to around 0.170 milligrams per liter. This amount is less than a thousandth of the minimum daily dose given for bipolar disorders and for depression that doesn’t respond to antidepressants. Although it seems strange that the microscopic amounts of lithium found in groundwater could have any substantial medical impact, the more scientists look for such effects, the more they seem to discover. Evidence is slowly accumulating that relatively tiny doses of lithium can have beneficial effects. They appear to decrease suicide rates significantly and may even promote brain health and improve mood. Yet despite the studies demonstrating the benefits of relatively high natural lithium levels present in the drinking water of certain communities, few seem to be aware of its potential. Intermittently, stories appear in the scientific journals and media, but they seem to have little traction in the medical community or with the general public. © 2014 The New York Times Company

Keyword: Depression
Link ID: 20077 - Posted: 09.15.2014

By Abby Phillip Most long-time, pack-a-day smokers who took part in a small study were able to quit smoking after six months, and researchers believe the hallucinogenic substance found in "magic mushrooms" could be the reason why. The study of the 15 participants, published this week in the Journal of Psychopharmacology, is the first to look at the feasibility of using the psychedelic drug psilocybin to aid in a smoking cessation treatment program. Existing treatments, from quitting cold turkey to prescription medications like varenicline (Chantix), work for some people, but not the majority of smokers. With varenicline, which mimics the effect of nicotine in the body, only about 35 percent of participants in a clinical trial were still abstaining from smoking six months later. Nearly half of all adult smokers reported that they tried to quit in 2010, according to the Centers for Disease Control and Prevention, yet 480,000 deaths are attributed to the addiction every year. Researchers at Johns Hopkins University recruited a group of long-time, heavy smokers — an average of 19 cigarettes a day for an average of 31 years — to participate in the study. They were treated with cognitive behavioral therapy for 15 weeks, and they were given a dose of the hallucinogen psilocybin at the five-week mark, when they had agreed to stop smoking. Although it was a small study, the results were promising: twelve of the 15 participants (80 percent) had quit smoking six months after being treated with the drug.

Keyword: Drug Abuse
Link ID: 20076 - Posted: 09.15.2014

By Tara Parker-Pope The most reliable workers are those who get seven to eight hours of sleep each night, a new study shows. Researchers from Finland analyzed the sleep habits and missed work days of 3,760 men and women over about seven years. The workers ranged in age from 30 to 64 at the start of the study. The researchers found that the use of sick days was associated with the workers’ sleep habits. Not surprisingly, they found that people who did not get enough sleep because of insomnia or other sleep problems were more likely to miss work. But notably, getting a lot of extra sleep was also associated with missed work. The workers who were most likely to take extra sick days were those who slept five hours or less or 10 hours or more. Short sleepers and long sleepers missed about five to nine more days of work than so-called optimal sleepers, workers who managed seven to eight hours of sleep each night. The workers who used the fewest sick days were women who slept an average of 7 hours 38 minutes a night and men who slept an average of 7 hours 46 minutes. The study results were published in the September issue of the medical journal Sleep. © 2014 The New York Times Company

Keyword: Sleep
Link ID: 20074 - Posted: 09.15.2014

By KEN BELSON The National Football League, which for years disputed evidence that its players had a high rate of severe brain damage, has stated in federal court documents that it expects nearly a third of retired players to develop long-term cognitive problems and that the conditions are likely to emerge at “notably younger ages” than in the general population. The findings are a result of data prepared by actuaries hired by the league and provided to the United States District Court judge presiding over the settlement between the N.F.L. and 5,000 former players who sued the league, alleging that it had hidden the dangers of concussions from them. “Thus, our assumptions result in prevalence rates by age group that are materially higher than those expected in the general population,” said the report, prepared by the Segal Group for the N.F.L. “Furthermore, the model forecasts that players will develop these diagnoses at notably younger ages than the general population.” The statements are the league’s most unvarnished admission yet that the sport’s professional participants sustain severe brain injuries at far higher rates than the general population. They also appear to confirm what scientists have said for years: that playing football increases the risk of developing neurological conditions like chronic traumatic encephalopathy, a degenerative brain disease that can be identified only in an autopsy. “This statement clears up all the confusion and doubt manufactured over the years questioning the link between brain trauma and long-term neurological impairment,” said Chris Nowinski, the executive director of the Sports Legacy Institute, who has for many years pressured the league to acknowledge the connection between football and brain diseases. © 2014 The New York Times Company

Keyword: Brain Injury/Concussion
Link ID: 20073 - Posted: 09.13.2014

By Smitha Mundasad Health reporter, BBC News Giving young people Botox treatment may restrict their emotional growth, experts warn. Writing in the Journal of Aesthetic Nursing, clinicians say there is a growing trend for under-25s to seek the wrinkle-smoothing injections. But the research suggests "frozen faces" could stop young people from learning how to express emotions fully. A leading body of UK plastic surgeons says injecting teenagers for cosmetic reasons is "morally wrong". Botox and other versions of the toxin work by temporarily paralysing muscles in the upper face to reduce wrinkling when people frown. Nurse practitioner Helen Collier, who carried out the research, says reality TV shows and celebrity culture are driving young people to idealise the "inexpressive frozen face." But she points to a well-known psychological theory, the facial feedback hypothesis, that suggests adolescents learn how best to relate to people by mimicking their facial expressions. She says: "As a human being our ability to demonstrate a wide range of emotions is very dependent on facial expressions. "Emotions such as empathy and sympathy help us to survive and grow into confident and communicative adults." But she warns that a "growing generation of blank-faced" young people could be harming their ability to correctly convey their feelings. "If you wipe those expressions out, this might stunt their emotional and social development," she says. The research calls for practitioners to use assessment tools to decide whether there are clear clinical reasons for Botox treatment. BBC © 2014

Keyword: Emotions
Link ID: 20070 - Posted: 09.13.2014

Corie Lok Tami Morehouse's vision was not great as a child, but as a teenager she noticed it slipping even further. The words she was trying to read began disappearing into the page and eventually everything faded to a dull, grey haze. The culprit was a form of Leber's congenital amaurosis (LCA), a group of genetic disorders in which light-sensing cells in the retina die off, usually resulting in total blindness by the time people reach their thirties or forties. But Morehouse got a reprieve. In 2009, at the age of 44, the social worker from Ashtabula, Ohio, became the oldest participant in a ground-breaking clinical trial to test a gene therapy for LCA. Now, she says, she can see her children's eyes, and the colours of the sunset seem brighter than before. Morehouse calls these improvements life-changing, but they are minor compared with the changes in some of the younger trial participants. Corey Haas was eight years old when he was treated in 2008 — the youngest person to receive the therapy. He went from using a white cane to riding a bicycle and playing softball. Morehouse often wonders what she would be able to see now if she had been closer to Haas's age when she had the therapy. “I was born a little too soon,” she says. Visual impairment affects some 285 million people worldwide, about 39 million of whom are considered blind, according to a 2010 estimate from the World Health Organization. Roughly 80% of visual impairment is preventable or curable, including operable conditions such as cataracts that account for much of the blindness in the developing world. But retinal-degeneration disorders — including age-related macular degeneration, the leading cause of blindness in the developed world — have no cure. © 2014 Nature Publishing Group

Keyword: Vision
Link ID: 20064 - Posted: 09.11.2014

By JOSHUA A. KRISCH PHILADELPHIA — McBaine, a bouncy black and white springer spaniel, perks up and begins his hunt at the Penn Vet Working Dog Center. His nose skims 12 tiny arms that protrude from the edges of a table-size wheel, each holding samples of blood plasma, only one of which is spiked with a drop of cancerous tissue. The dog makes one focused revolution around the wheel before halting, steely-eyed and confident, in front of sample No. 11. A trainer tosses him his reward, a tennis ball, which he giddily chases around the room, sliding across the floor and bumping into walls like a clumsy puppy. McBaine is one of four highly trained cancer detection dogs at the center, which trains purebreds to put their superior sense of smell to work in search of the early signs of ovarian cancer. Now, Penn Vet, part of the University of Pennsylvania’s School of Veterinary Medicine, is teaming with the university’s chemistry and physics departments to isolate cancer chemicals that only dogs can smell. They hope this will lead to the manufacture of nanotechnology sensors that are capable of detecting bits of cancerous tissue 1/100,000th the thickness of a sheet of paper. “We don’t ever anticipate our dogs walking through a clinic,” said the veterinarian Dr. Cindy Otto, the founder and executive director of the Working Dog Center. “But we do hope that they will help refine chemical and nanosensing techniques for cancer detection.” Since 2004, research has begun to accumulate suggesting that dogs may be able to smell the subtle chemical differences between healthy and cancerous tissue, including bladder cancer, melanoma and cancers of the lung, breast and prostate. But scientists debate whether the research will result in useful medical applications. © 2014 The New York Times Company

Keyword: Chemical Senses (Smell & Taste)
Link ID: 20063 - Posted: 09.11.2014

By Sarah Zielinski The marshmallow test is pretty simple: Give a child a treat, such as a marshmallow, and promise that if he doesn’t eat it right away, he’ll soon be rewarded with a second one. The experiment was devised by Stanford psychologist Walter Mischel in the late 1960s as a measure of self-control. When he later checked back in with kids he had tested as preschoolers, those who had been able to wait for the second treat appeared to be doing better in life. They tended to have fewer behavioral or drug-abuse problems, for example, than those who had given in to temptation. Most attempts to perform this experiment on animals haven’t worked out so well. Many animals haven’t been willing to wait at all. Dogs, primates, and some birds have done a bit better, managing to wait at least a couple of minutes before eating the first treat. The best any animal has managed has been 10 minutes—a record set earlier this year by a couple of crows. The African grey parrot is a species known for its intelligence. Animal psychologist Irene Pepperberg, now at Harvard, spent 30 years studying one of these parrots, Alex, and showed that the bird had an extraordinary vocabulary and capacity for learning. Alex even learned to add numerals before his death in 2007. Could an African grey pass the marshmallow test? Adrienne E. Koepke of Hunter College and Suzanne L. Gray of Harvard University tried the experiment on Pepperberg’s current star African grey, a 19-year-old named Griffin. In their test, a researcher took two treats, one of which Griffin liked slightly better, and put them into cups. Then she placed the cup with the less preferred food in front of Griffin and told him, “wait.” She took the other cup and either stood a few feet away or left the room. After a random amount of time, from 10 seconds to 15 minutes, she would return. If the food was still in the cup, Griffin got the nut he was waiting for. Koepke and colleagues presented their findings last month at the Animal Behavior Society meeting at Princeton. © 2014 The Slate Group LLC.

Keyword: Intelligence
Link ID: 20061 - Posted: 09.11.2014

By Amy Nordrum If you were one of millions of children who completed the Drug Abuse Resistance Education program, or D.A.R.E., between 1983 and 2009, you may be surprised to learn that scientists have repeatedly shown that the program did not work. Despite being the nation’s most popular substance-abuse prevention program, D.A.R.E. did not make you less likely to become a drug addict or even to refuse that first beer from your friends. But over the past few years prevention scientists have helped D.A.R.E. America, the nonprofit organization that administers the program, replace the old curriculum with a course based on a few concepts that should make the training more effective for today’s students. The new course, called keepin’ it REAL, differs in both form and content from the former D.A.R.E.—replacing long, drug-fact-laden lectures with interactive lessons that present stories meant to help kids make smart decisions. Beginning in 2009 D.A.R.E. administrators required middle schools across the country that teach the program to switch over to the 10-week, researcher-designed curriculum for seventh graders. By 2013, they had ordered elementary schools to start teaching a version of those lessons to fifth and sixth graders, too. "It's not an antidrug program," says Michelle Miller-Day, co-developer of the new curriculum and a communications researcher at Chapman University. “It's about things like being honest and safe and responsible." Even so, keepin’ it REAL has reduced substance abuse and maintained antidrug attitudes over time among students in early trials—an achievement that largely eluded the former iteration of the program. D.A.R.E.’s original curriculum was not shaped by prevention specialists but by police officers and teachers in Los Angeles. They started D.A.R.E. in 1983 to curb the use of drugs, alcohol and tobacco among teens and to improve community–police relations. Fueled by word of mouth, the program quickly spread to 75 percent of U.S. schools. © 2014 Scientific American

Keyword: Drug Abuse
Link ID: 20060 - Posted: 09.11.2014