Chapter 16.
By Jennifer Cutraro and Michael Gonchar

Marijuana is illegal in the United States. Yet 23 states and the District of Columbia permit some form of marijuana consumption for medical purposes, and, as of this year, two states now allow its recreational use. As national policy evolves on this issue, the New York Times editorial board this summer published a six-part series calling for legalization. In this lesson, we pull together those opinion pieces as well as many other Times articles, graphics and videos to offer starting points for science, social studies and English teachers aiming to use the debate as an opportunity for learning, research and discussion.

Like other crops, marijuana is largely cultivated — legally and illegally — in greenhouse-type “grow houses” and on farms. And like other crops, marijuana comes from a plant — cannabis, originally found in the wild and cultivated over thousands of years. Have students research the history of cannabis, from its origins in South and Central Asia to its introduction to the Americas. How have people used the different parts of the plant throughout history? Then, have students work in groups to annotate a map of the world, tracing the history of marijuana cultivation.

Marijuana is best known for its psychoactive properties. But how does marijuana bring about these sensations, and how else does it behave in the body? To answer these questions, students might research how the active compounds in marijuana affect the body at the level of the cell, and draw parallels with how other drugs act in the body. As is the case with many other drugs — from legal, over-the-counter medications to illegal street drugs, like heroin — the active compounds interact with locations on the surfaces of cells called receptors.
Cell surface receptors provide a means for cells to receive information and input from the environment; when a molecule attaches, or binds, to a cell surface receptor, it triggers a series of events inside the cell, like the release of hormones, neurotransmitters or other molecules. A discussion about marijuana’s effects on the body might dovetail nicely with a broader class discussion or review of cell biology, the makeup and function of the cell membrane, and the function of neurotransmitters. © 2014 The New York Times Company
Keyword: Drug Abuse
Link ID: 20118 - Posted: 09.27.2014
Some people don't just work — they text, Snapchat, check Facebook and Tinder, listen to music and work. And a new study reveals those multitaskers have brains that look different than those of people who stick to one task. Researchers at the University of Sussex scanned 75 adults using an fMRI to examine their gray matter. Those who admitted to multitasking with a variety of electronic devices at once had less dense gray matter in their anterior cingulate cortexes (ACC). This region controls executive function, such as working memory, reasoning, planning and execution. There is no way of knowing if people with smaller anterior cingulate cortexes are more likely to multitask or if multitaskers are shrinking their gray matter. It could even show that our brains become more efficient from multitasking, said Dr. Gary Small, director of UCLA’s Memory and Aging Research Center at the Semel Institute for Neuroscience and Human Behavior, who was not involved in the study. “When you exercise the brain … it becomes effective at performing a mental task,” he said. While previous research has shown that multitasking leads to more mistakes, Small said research remains important to our understanding of something we’re all guilty of doing.
Link ID: 20115 - Posted: 09.25.2014
by Greg Laden I heard yesterday that my friend and former advisor Irven DeVore died. He was important, amazing, charming, difficult, harsh, brilliant, fun, annoying. My relationship to him as an advisee and a friend was complex, important to me for many years, and formative. For those who don’t know, he was instrumental in developing several subfields of anthropology, including behavioral biology, primate behavioral studies, hunter-gatherer research, and even ethnoarchaeology. He was a cultural anthropologist who realized during his first field season that a) he was not cut out to be a cultural anthropologist and b) most of the other cultural anthropologists were not either. Soon after, he became Washburn’s student and independently invented the field study of complex social behavior in primates (though some others were heading in that direction at the same time), producing his famous work on the baboons of Kenya’s Nairobi National Park. For many years, what students learned about primate behavior, they learned from that work. Later he and Richard Lee, along with John Yellen, Alison Brooks, Henry Harpending, and others started up the study of Ju/’hoansi Bushmen along the Namibia/Botswana border. One of the outcomes of that work was the famous Wenner-Gren conference and volume called “Man the Hunter.” That volume has two roles in the history of anthropology. First, it launched modern forager studies. Second, it became one of the more maligned books in the field of Anthropology. I have yet to meet a single person who has a strong criticism of that book that is not based on having not read it. For many years, much of what students learned about human foragers, they learned from that work.
Link ID: 20114 - Posted: 09.25.2014
By SAM BORDEN Bellini, a Brazilian soccer star who led the team that won the 1958 World Cup and was honored with a statue outside the Estádio do Maracanã in Rio de Janeiro, had a degenerative brain disease linked to dozens of boxers and American football players when he died in March at age 83. At the time, his death was attributed to complications related to Alzheimer’s disease. But researchers now say he had an advanced case of chronic traumatic encephalopathy, or C.T.E., which is caused by repeated blows to the head and has symptoms similar to those of Alzheimer’s. C.T.E. can be diagnosed only posthumously, and few brains of former soccer players have been examined. Bellini is the second known case, according to Dr. Ann McKee, a neuropathologist at Boston University and the Veterans Affairs Medical Center in Bedford, Mass., who assisted in examining Bellini’s brain. McKee was also involved this year when researchers found C.T.E. in the brain of a 29-year-old man from New Mexico who had played soccer semiprofessionally. McKee said in an interview that she was aware of a third former soccer player who had C.T.E. but that she was not yet authorized to publicly identify the person. As C.T.E. began to gain widespread attention about six years ago, it was often thought of as an American problem. Many of the early cases of the disease, for which there is no known cure, were connected to boxers and American football players. © 2014 The New York Times Company
Keyword: Brain Injury/Concussion
Link ID: 20110 - Posted: 09.24.2014
by Jennifer Viegas Harems -- where a group of females share a single mate -- can be sexual bliss for the male, but the arrangement poses many challenges for him, according to a new study. Male leaders of harems are often overworked and tired, finds the study, published in the latest issue of Royal Society Open Science. Gelada baboons exemplify the problems. "Being a gelada leader male is fairly exhausting," co-author David Pappano told Discovery News. "In order to keep the females within his harem happy, gelada leader males spend a lot of time grooming them." "When bachelors are around, leader males often engage in costly displays -- running around, climbing up a tree, and producing a very loud (ee-yow) display call," added Pappano, who is an NSF postdoctoral research fellow at Princeton University's Department of Ecology and Evolutionary Biology. He co-authored the paper with Jacinta Beehner. © 2014 Discovery Communications, LLC.
Keyword: Sexual Behavior
Link ID: 20108 - Posted: 09.24.2014
By Nicholas Bakalar Average waist circumference — but not body mass index— increased significantly in the United States between 1999 and 2012, a new study reports. Abdominal obesity — a “beer belly” or “beer gut” — is caused by fat around the internal organs. It is one of the indicators of metabolic syndrome, a group of five conditions that raises the risk for heart disease and diabetes. After adjusting for age, the overall mean waist circumference increased to 38.7 inches in 2012 from 37.5 in 1999. The increases were significant for men, women, non-Hispanic whites, non-Hispanic blacks and Mexican-Americans. They were greatest among non-Hispanic whites in their 40s, and non-Hispanic black men in their 30s. “I would encourage people to keep track of their waists,” said the lead author of the study, Dr. Earl S. Ford, a medical officer with the Centers for Disease Control and Prevention. “Standing on the scale every day is all good and well, but you can have a steady weight and still have an expanding waist. And that should be a signal for people to start looking at their diet and physical activity.” In 2012, 54.2 percent of Americans had abdominal obesity (defined as an age-adjusted waist circumference of more than 40 inches for men and more than 34.6 for women) compared with 46.4 percent in 1999. The study was published in JAMA. © 2014 The New York Times Company
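The changes reported above can be checked with quick arithmetic. The following sketch uses only the figures as quoted in this summary, not the original JAMA data:

```python
# Age-adjusted mean waist circumference (inches), as quoted above.
waist_1999, waist_2012 = 37.5, 38.7

# Relative increase in the mean waistline over the study period.
pct_increase = (waist_2012 - waist_1999) / waist_1999 * 100
print(round(pct_increase, 1))  # → 3.2 (about a 3% rise in mean waist size)

# Prevalence of abdominal obesity rose from 46.4% to 54.2% of Americans.
print(round(54.2 - 46.4, 1))  # → 7.8 percentage points
```

A roughly three percent rise in the mean may sound small, but the nearly eight-point jump in the share of people over the obesity threshold shows how a modest shift in an average can move many people across a fixed cutoff.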
Link ID: 20103 - Posted: 09.23.2014
By Dan Jones The vast majority of people think we have free will and are the authors of our own life stories. But if neuroscientists were one day able to predict our every action based on brain scans, would people abandon this belief in droves? A new study concludes that such knowledge would not by itself be enough to shake our confidence in our own volition. Many neuroscientists, such as the late Francis Crick, have argued that our sense of free will is no more than the behaviour of a vast assembly of nerve cells. This is tied to the idea of determinism, which has it that every effect is preceded by a cause, with cause and effect connected by physical laws. This is why the behaviour of physical systems can be predicted – even the brain, in principle. As author Sam Harris puts it: "If determinism is true, the future is set – and this includes all our future states of mind and our subsequent behaviour." If people lost their belief in their own free will, that would have important consequences for how we think about moral responsibility, and even how we behave. For example, numerous studies have shown that when people are led to reject free will they are more likely to cheat, and are also less bothered about punishing other wrongdoers. For those who argue that what we know about neuroscience is incompatible with free will, predicting what our brain is about to do should reveal the illusory nature of free will, and lead people to reject it. Experimental philosopher Eddy Nahmias at Georgia State University in Atlanta dubs this view "willusionism". He recently set out to test it. © Copyright Reed Business Information Ltd.
Link ID: 20102 - Posted: 09.22.2014
By CLYDE HABERMAN When it came to pharmacological solutions to life’s despairs, Aldous Huxley was ahead of the curve. In Huxley’s 1932 novel about a dystopian future, the Alphas, Betas and others populating his “Brave New World” have at their disposal a drug called soma. A little bit of it chases the blues away: “A gramme” — Huxley was English, remember, spelling included — “is better than a damn.” With a swallow, negative feelings are dispelled. Prozac, the subject of this week’s video documentary from Retro Report, is hardly soma. But its guiding spirit is not dissimilar: A few milligrams of this drug are preferable to the many damns that lie at the core of some people’s lives. Looking back at Prozac’s introduction by Eli Lilly and Company in 1988, and hopscotching to today, the documentary explores the enormous influence, both chemical and cultural, that Prozac and its brethren have had in treating depression, a concern that gained new resonance with the recent suicide of the comedian Robin Williams. In the late 1980s and the 90s, Prozac was widely viewed as a miracle pill, a life preserver thrown to those who felt themselves drowning in the high waters of mental anguish. It was the star in a class of new pharmaceuticals known as S.S.R.I.s — selective serotonin reuptake inhibitors. Underlying their use is a belief that depression is caused by a shortage of the neurotransmitter serotonin. Pump up the levels of this brain chemical and, voilà, the mood lifts. Indeed, millions have embraced Prozac, and swear by it. Depression left them emotionally paralyzed, they say. Now, for the first time in years, they think clearly and can embrace life. Pharmacological merits aside, the green-and-cream pill was also a marvel of commercial branding, down to its market-tested name. Its chemical name is fluoxetine hydrochloride, not the most felicitous of terms. A company called Interbrand went to work for Eli Lilly and came up with Prozac. “Pro” sounds positive. 
Professional, too. “Ac”? That could signify action. As for the Z, it suggests a certain strength, perhaps with a faint high-techy quality. © 2014 The New York Times Company
Link ID: 20098 - Posted: 09.22.2014
By Jocelyn Kaiser A virus that shuttles a therapeutic gene into cells has strengthened the muscles, improved the motor skills, and lengthened the lifespan of mice afflicted with two neuromuscular diseases. The approach could one day help people with a range of similar disorders, from muscular dystrophy to amyotrophic lateral sclerosis, or ALS. Many of these diseases involve defective neuromuscular junctions—the interface between neurons and muscle cells where brain signals tell muscles to contract. In one such disease, a form of familial limb-girdle myasthenia, people carry two defective copies of the gene called DOK7, which codes for a protein that’s needed to form such junctions. Their hip and shoulder muscles atrophy over many years, and some eventually have trouble breathing or end up in a wheelchair. Mice similarly missing a properly working Dok7 gene are severely underweight and die within a few weeks. In the new study, researchers led by molecular biologist Yuji Yamanashi of the University of Tokyo first injected young mice engineered to have defective Dok7 with a harmless virus carrying a good copy of the Dok7 gene, which is expressed only in muscle. Within about 7 weeks, the rodents recovered. Their muscle cells cranked out the DOK7 protein, and under a microscope their muscles had larger neuromuscular junctions than those of untreated mice with defective Dok7. What’s more, the mice grew to a healthy body weight and had essentially normal scores on tests of motor skills and muscle strength. © 2014 American Association for the Advancement of Science.
By John Horgan Inter-community killings are rare among chimpanzees and non-existent among bonobos, according to a new report in Nature, undercutting the theory that the roots of war extend back to the common ancestor of humans and chimps. On this blog, in my book The End of War and elsewhere (see Further Reading and Viewing), I have knocked the deep roots theory of war, which holds that war stems from an instinct deeply embedded in the genes of our male ancestors. Proponents of this theory—notably primatologist Richard Wrangham—claim it is supported by observations of inter-community killings by chimpanzees, Pan troglodytes, our closest genetic relatives. Skeptics, including anthropologists Robert Sussman and Brian Ferguson, have pointed out that chimpanzee violence might be not an adaptation but a response to environmental circumstances, such as human encroachment. This “human impacts” hypothesis is rejected in a new report in Nature by a coalition of 30 primatologists, including Wrangham and lead author Michael Wilson. In “Lethal aggression in Pan is better explained by adaptive strategies than human impacts,” Wilson et al. analyze 152 killings in 18 chimpanzee communities and find “little correlation with human impacts.” Given that the primary interest in chimp violence is its alleged support of the deep-roots theory, it might seem odd, at first, that Wilson et al. do not mention human warfare. Actually, this omission is wise, because the Nature report undermines the deep-roots theory of war, and establishes that the “human impact” issue is a red herring. © 2014 Scientific American,
I’m an epileptic. It’s not how I define myself, but I am writing about epilepsy, so I think pointing out the fact that I am speaking from experience is acceptable. I may not define myself by my epilepsy but it’s a big part of my life. It affects my life on a daily basis. Because of the epilepsy I can’t drive, can’t pull all-nighters or get up really early just in case I have a seizure. It’s frustrating at times, though I will gladly milk the not getting up early thing when I can, eg bin day. But whereas I’ve grown up with it, having been diagnosed when I was 17, most people I’ve met don’t understand it. You mention the fact that you’re epileptic to some people and they look at you like they’re a robot you’ve just asked to explain the concept of love; they adopt a sort of “DOES NOT COMPUTE!” expression. They often don’t know what to say, or do, or even what epilepsy is and often spend the rest of the conversation searching their data banks for information on what to do if I have a seizure, like “Do I … put a spoon in his mouth?” For the record: no, you don’t. If putting a spoon in an epileptic’s mouth helped, then we would be prescribed a constant supply of Fruit Corners. So let me put you at ease. No one expects you to know that much about epilepsy (unless you’re responsible for treating it). There are many different types, with many different causes. Not everyone has seizures and often those who do, when given the correct meds, can live pretty much fit-free lives. © 2014 Guardian News and Media Limited
Link ID: 20091 - Posted: 09.18.2014
By Katy Waldman In the opening chapter of Book 1 of My Struggle, by Karl Ove Knausgaard, the 8-year-old narrator sees a ghost in the waves. He is watching a televised report of a rescue effort at sea—“the sky is overcast, the gray-green swell heavy but calm”—when suddenly, on the surface of the water, “the outline of a face emerges.” We might guess from this anecdote that Karl, our protagonist, is both creative and troubled. His limber mind discerns patterns in chaos, but the patterns are illusions. “The lunatic, the lover, and the poet,” Shakespeare wrote, “have such seething brains, such shaping fantasies.” Their imaginations give “to airy nothing a local habitation and a name.” A seething brain can be a great asset for an artist, but, like Knausgaard’s churning, gray-green swell, it can be dangerous too. Inspired metaphors, paranormal beliefs, conspiracy theories, and delusional episodes may all exist on a single spectrum, recent research suggests. The name for the concept that links them is apophenia. A German scientist named Klaus Conrad coined apophanie (from the Greek apo, away, and phaenein, to show) in 1958. He was describing the acute stage of schizophrenia, during which unrelated details seem saturated in connections and meaning. Unlike an epiphany—a true intuition of the world’s interconnectedness—an apophany is a false realization. Swiss psychologist Peter Brugger introduced the term into English when he penned a chapter in a 2001 book on hauntings and poltergeists. Apophenia, he said, was a weakness of human cognition: the “pervasive tendency … to see order in random configurations,” an “unmotivated seeing of connections,” the experience of “delusion as revelation.” On the phone he unveiled his favorite formulation yet: “the tendency to be overwhelmed by meaningful coincidences.” © 2014 The Slate Group LLC.
Link ID: 20088 - Posted: 09.18.2014
By Virginia Morell Living in a complex social world—one with shifting alliances and competitors—is often cited as the key reason humans, dolphins, and spotted hyenas evolved large brains. Now, researchers say that social complexity also underlies the braininess of parrots, which have big brains relative to their body size. To understand the social lives of these birds, the scientists observed wild populations of monk parakeets (Myiopsitta monachus), a small parrot, in Argentina and captive ones in Florida. They recorded how often the birds (pictured) were seen with other individuals and how they interacted—and then analyzed the parakeets’ social networks. The birds, the researchers report online today in The Auk: Ornithological Advances, prefer to spend time with one specific individual, usually their mate. In the captive populations, the birds also had strong associations with one or two other individuals, numerous more moderate relationships, and only a few that were weak. The scientists also recorded aggressive interactions among the captive birds, revealing that monk parakeets have a dominance hierarchy based on which birds won or lost confrontations. Thus, the parakeets’ society has layers of relationships, similar to those documented in other big-brained animals. Living in such a society requires that the birds recognize and remember others, and whether they are friend or foe—mental tasks that are thought to be linked to the evolution of significant cognitive skills. © 2014 American Association for the Advancement of Science
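The core of the researchers’ method — turning repeated sightings of which birds are together into a network of association strengths — can be sketched in a few lines. The sighting data below is invented for illustration, and the pair-count index is a simplification of the association measures such studies actually use:

```python
from collections import Counter
from itertools import combinations

# Hypothetical sighting records: each set lists birds observed together once.
sightings = [
    {"A", "B"}, {"A", "B"}, {"A", "C"},
    {"B", "C"}, {"A", "B", "D"}, {"C", "D"},
]

# Count how often each pair of birds was seen together.
pair_counts = Counter()
for group in sightings:
    for pair in combinations(sorted(group), 2):
        pair_counts[pair] += 1

# Find each bird's strongest associate -- the "preferred partner"
# (usually the mate, in the study's terms).
birds = {b for group in sightings for b in group}
strongest = {}
for bird in birds:
    partners = {}
    for (x, y), n in pair_counts.items():
        if bird in (x, y):
            partners[y if x == bird else x] = n
    strongest[bird] = max(partners, key=partners.get)

print(strongest["A"])  # A and B were sighted together three times -> "B"
```

From a matrix like `pair_counts` one can also read off the layered structure described above: one very strong tie per bird, a few moderate ones, and many weak ones.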
Link ID: 20087 - Posted: 09.18.2014
By Daniel A. Yudkin If you’re reading this at a desk, do me a favor. Grab a pen or pencil and hold the end between your teeth so it doesn’t touch your lips. As you read on, stay that way—science suggests you’ll find this article more amusing if you do. Why? Notice that holding a pencil in this manner puts your face in the shape of a smile. And research in psychology says that the things we do—smiling at a joke, giving a gift to a friend, or even running from a bear—influence how we feel. This idea—that actions affect feelings—runs counter to how we generally think about our emotions. Ask average folks how emotions work—about the causal relationship between feelings and behavior—and they’ll say we smile because we’re happy, we run because we’re afraid. But work by such psychologists as Fritz Strack, Antonio Damasio, and Joe LeDoux shows the truth is often the reverse: what we feel is actually the product, not the cause, of what we do. It’s called “somatic feedback.” Only after we act do we deduce, by seeing what we just did, how we feel. This bodes well, at first blush, for anyone trying to change their emotions for the better. All you’d need to do is act like the kind of person you want to be, and that’s who you’ll become. (Call it the Bobby McFerrin philosophy: “Aren’t happy? Don’t worry. Just smile!”) But new research, published in the Journal of Experimental Social Psychology by Aparna Labroo, Anirban Mukhopadhyay, and Ping Dong suggests there may be limits to our ability to proactively manage our own well-being. The team ran a series of studies examining whether more smiling led to more happiness. One asked people how much smiling they had done that day, and how happy they currently felt. Other studies manipulated the amount of smiling people actually did, either by showing them a series of funny pictures or by replicating a version of the pencil-holding experiment.
As expected, across these experiments, the researchers found that the more people smiled, the happier they reported being. © 2014 Scientific American
Link ID: 20085 - Posted: 09.17.2014
By Douglas Main Researchers have created a blood test that they have used to accurately diagnose depression in a small sample of people, and they hope that with time and funding it could be used on a widespread basis. It is the first blood test—and thus the first “objective” gauge—for any type of mental disorder in adults, says study co-author Eva Redei, a neuroscientist at Northwestern University in Evanston, Ill. Outside experts caution, however, that the results are preliminary, and not close to ready for use in the doctor’s office. Meanwhile, diagnosing depression the “old-fashioned way” through an interview works quite well, and should only take 10 to 15 minutes, says Todd Essig, a clinical psychologist in New York. But many doctors are increasingly overburdened and often not reimbursed for taking the time to talk to their patients, he says. The test works by measuring blood levels of nine different types of RNA, a molecule that cells use to carry out the genetic instructions in DNA. Besides accurately diagnosing depression, which affects perhaps 10 percent of American adults and is becoming more common, the technique may also be able to tell who could benefit from talk therapy and who may be vulnerable to the condition in the first place. In a study describing the test, published in the journal Translational Psychiatry, the scientists recruited 32 patients who were diagnosed with depression using a clinical interview, the standard technique. They also got 32 non-depressed patients to participate as a control group. © 2014 Newsweek LLC
Link ID: 20084 - Posted: 09.17.2014
By Neuroskeptic Today, we are thinking – and talking – about the brain more than ever before. It is widely said that neuroscience has much to teach psychiatry, cognitive science, economics, and others. Practical applications of brain science are proposed in the fields of politics, law enforcement and education. The brain is everywhere. This “Neuro Turn” has, however, not always been accompanied by a critical attitude. We ought to be skeptical of any claims regarding the brain because it remains a mystery – we fundamentally do not understand how it works. Yet much neuro-discourse seems to make the assumption that the brain is almost a solved problem already. For example, media stories about neuroscience commonly contain simplistic misunderstandings – such as the tendency to over-interpret neural activation patterns as practical guides to human behavior. For instance, recently we have heard claims that because fMRI finds differences in the brain activity of some violent offenders, this means that their criminal tendencies are innate and unchangeable – with clear implications for rehabilitation. Neuroscientists are well aware of the faults in lay discourse about the brain – and are increasingly challenging them, for example on social media. Unfortunately, the same misunderstandings also exist within neuroscience itself. For example, I argue, much of cognitive neuroscience is actually based on (or, only makes sense given the assumption that) the popular misunderstanding that brain activity has a psychological ‘meaning’. In fact, we just do not know what a given difference in brain activity means, in the vast majority of cases. Thus, many research studies based on finding differences in fMRI activity maps across groups or across conditions, are not really helping us to understand the brain at all – but only providing us with a canvas onto which to project our misunderstandings.
Keyword: Brain imaging
Link ID: 20082 - Posted: 09.17.2014
Ewen Callaway A dozen volunteers watched Alfred Hitchcock for science while lying motionless in a magnetic-resonance scanner. Another participant, a man who has lived in a vegetative state for 16 years, showed brain activity remarkably similar to that of the healthy volunteers — suggesting that plot structure had an impact on him. The study is published in this week's Proceedings of the National Academy of Sciences. The film, a 1961 episode of the TV show Alfred Hitchcock Presents that had been condensed down to 8 minutes, is a study in suspense. In it, a 5-year-old totes a partially loaded revolver — which she thinks is a toy — around her suburban neighbourhood, shouting “bang” each time she aims at someone and squeezes the trigger. While the study participants watched the film, researchers monitored their brain activity by functional magnetic resonance imaging (fMRI). All 12 healthy participants showed similar patterns of activity, particularly in parts of the brain that have been linked to higher cognition (frontal and parietal regions) as well as in regions involved in processing sensory information (auditory and visual cortices). One behaviourally non-responsive person, a 20-year-old woman, showed patterns of brain activity only in sensory areas. But another person, a 34-year-old man who has been in a vegetative state since he was 18, had patterns of brain activity in the executive and sensory brain areas similar to those of the healthy subjects. “It was actually indistinguishable from a healthy participant watching the movie,” says Adrian Owen, a neuroscientist at the University of Western Ontario in London, Canada (see: 'Neuroscience: The mind reader'). © 2014 Nature Publishing Group
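The key comparison in studies like this — how closely one viewer’s brain activity tracks the group’s — is often quantified as a correlation between activity time courses. The sketch below uses simulated toy signals, not the study’s data or analysis pipeline, to show why a shared stimulus-driven response yields a high correlation while unrelated activity does not:

```python
import math
import random

def pearson(x, y):
    """Pearson correlation between two equal-length time courses."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

random.seed(0)
# Toy "stimulus-driven" signal shared by all healthy viewers of the film.
stimulus = [math.sin(t / 5) for t in range(200)]
healthy_group = [[s + random.gauss(0, 0.3) for s in stimulus] for _ in range(12)]
group_mean = [sum(col) / len(col) for col in zip(*healthy_group)]

# A responsive patient tracks the group mean; an unresponsive one does not.
responsive = [s + random.gauss(0, 0.3) for s in stimulus]
unresponsive = [random.gauss(0, 1) for _ in stimulus]

print(round(pearson(group_mean, responsive), 2))   # high (shared signal)
print(round(pearson(group_mean, unresponsive), 2)) # low (no shared signal)
```

In this toy setup the responsive time course correlates strongly with the group average while the unresponsive one hovers near zero — a rough analogue of the “indistinguishable from a healthy participant” finding quoted above.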
Link ID: 20080 - Posted: 09.16.2014
By ANNA FELS THE idea of putting a mind-altering drug in the drinking water is the stuff of sci-fi, terrorist plots and totalitarian governments. Considering the outcry that occurred when putting fluoride in the water was first proposed, one can only imagine the furor that would ensue if such a thing were ever suggested. The debate, however, is moot. It’s a done deal. Mother Nature has already put a psychotropic drug in the drinking water, and that drug is lithium. Although this fact has been largely ignored for over half a century, it appears to have important medical implications. Lithium is a naturally occurring element, not a molecule like most medications, and it is present in the United States, depending on the geographic area, at concentrations that can range widely, from undetectable to around 0.170 milligrams per liter. This amount is less than a thousandth of the minimum daily dose given for bipolar disorders and for depression that doesn’t respond to antidepressants. Although it seems strange that the microscopic amounts of lithium found in groundwater could have any substantial medical impact, the more scientists look for such effects, the more they seem to discover. Evidence is slowly accumulating that relatively tiny doses of lithium can have beneficial effects. They appear to decrease suicide rates significantly and may even promote brain health and improve mood. Yet despite the studies demonstrating the benefits of relatively high natural lithium levels present in the drinking water of certain communities, few seem to be aware of its potential. Intermittently, stories appear in the scientific journals and media, but they seem to have little traction in the medical community or with the general public. © 2014 The New York Times Company
Link ID: 20077 - Posted: 09.15.2014
By Abby Phillip Most long-time, pack-a-day smokers who took part in a small study were able to quit smoking after six months, and researchers believe the hallucinogenic substance found in "magic mushrooms" could be the reason why. The study of the 15 participants, published this week in the Journal of Psychopharmacology, is the first to look at the feasibility of using the psychedelic drug psilocybin to aid in a smoking cessation treatment program. Existing treatments, from quitting cold turkey to prescription medications like Varenicline (Chantix), work for some people, but not the majority of smokers. With Varenicline, which mimics the effect of nicotine in the body, only about 35 percent of participants in a clinical trial were still abstaining from smoking six months later. Nearly half of all adult smokers reported that they tried to quit in 2010, according to the Centers for Disease Control and Prevention, yet 480,000 deaths are attributed to the addiction every year. Researchers at Johns Hopkins University recruited a group of long-time, heavy smokers — an average of 19 cigarettes a day for an average of 31 years — to participate in the study. They were treated with cognitive behavioral therapy for 15 weeks, and they were given a dose of the hallucinogen psilocybin at the five-week mark, when they had agreed to stop smoking. Although it was a small study, the results were promising. Twelve of the participants had quit smoking six months after being treated with the drug.
Keyword: Drug Abuse
Link ID: 20076 - Posted: 09.15.2014
By Tara Parker-Pope The most reliable workers are those who get seven to eight hours of sleep each night, a new study shows. Researchers from Finland analyzed the sleep habits and missed work days among 3,760 men and women over about seven years. The workers ranged in age from 30 to 64 at the start of the study. The researchers found that the use of sick days was associated with the worker’s sleep habits. Not surprisingly, they found that people who did not get enough sleep because of insomnia or other sleep problems were more likely to miss work. But notably, getting a lot of extra sleep was also associated with missed work. The workers who were most likely to take extra sick days were those who slept five hours or less or 10 hours or more. Short sleepers and long sleepers missed about five to nine more days of work than so-called optimal sleepers, workers who managed seven to eight hours of sleep each night. The workers who used the fewest number of sick days were women who slept an average of 7 hours 38 minutes a night and men who slept an average of 7 hours 46 minutes. The study results were published in the September issue of the medical journal Sleep. © 2014 The New York Times Company
Link ID: 20074 - Posted: 09.15.2014