Chapter 16.
Kate Szell

“I once asked Clara who she was. It was so embarrassing, but she’d had a haircut, so how was I to know?” That’s Rachel. She’s 14 and counts Clara as one of her oldest and best friends. There’s nothing wrong with Rachel’s sight, yet she struggles to recognise others. Why? Rachel is face blind.

Most of us take for granted the fact that we recognise someone after a quick glance at their face. We don’t realise we’re doing something very different when we look at a face compared with when we look at anything else. To get a feeling of how peculiar facial recognition is, try recognising people by looking at their hands instead of their faces. Tricky? That’s exactly how Rachel feels – only she’s not looking at hands, she’s looking straight into someone’s eyes.

Specific areas of the brain process facial information. Damage to those areas gives rise to prosopagnosia, or “face blindness”: an inability or difficulty with recognising faces. While brain damage-induced prosopagnosia is rare, prosopagnosia itself is not. Studies suggest around 2% of the population could have some form of prosopagnosia. These “developmental” prosopagnosics seem to be born without the ability to recognise faces and don’t acquire it, relying instead on all manner of cues, from gait to hairstyles, to tell people apart.

Kirsten Dalrymple from the University of Minnesota is one of a handful of researchers looking into developmental prosopagnosia. Her particular interest is in prosopagnosic children. “Some seem to cope without much of a problem but, for others, it’s a totally different story,” she says. “They can become very socially withdrawn and can also be at risk of walking off with strangers.”

© 2014 Guardian News and Media Limited
Link ID: 20347 - Posted: 11.24.2014
By CLYDE HABERMAN

The notion that a person might embody several personalities, each of them distinct, is hardly new. The ancient Romans had a sense of this and came up with Janus, a two-faced god. In the 1880s, Robert Louis Stevenson wrote “Strange Case of Dr. Jekyll and Mr. Hyde,” a novella that provided us with an enduring metaphor for good and evil corporeally bound. Modern comic books are awash in divided personalities like the Hulk and Two-Face in the Batman series. Even heroic Superman has his alternating personas.

But few instances of the phenomenon captured Americans’ collective imagination quite like “Sybil,” the study of a woman said to have had not two, not three (like the troubled figure in the 1950s’ “Three Faces of Eve”), but 16 different personalities. Alters, psychiatrists call them, short for alternates.

As a mass-market book published in 1973, “Sybil” sold in the millions. Tens of millions watched a 1976 television movie version. The story had enough juice left in it for still another television film in 2007. Sybil Dorsett, a pseudonym, became the paradigm of a psychiatric diagnosis once known as multiple personality disorder. These days, it goes by a more anodyne label: dissociative identity disorder. Either way, the strange case of the woman whose real name was Shirley Ardell Mason made itself felt in psychiatrists’ offices across the country.

Pre-“Sybil,” the diagnosis was rare, with only about 100 cases ever having been reported in medical journals. Less than a decade after “Sybil” made its appearance, in 1980, the American Psychiatric Association formally recognized the disorder, and the numbers soared into the thousands. People went on television to tell the likes of Jerry Springer and Leeza Gibbons about their many alters. One woman insisted that she had more than 300 identities within her (enough, if you will, to fill the rosters of a dozen major-league baseball teams).
Even “Eve,” whose real name is Chris Costner Sizemore, said in the mid-1970s that those famous three faces were surely an undercount. It was more like 22, she said. © 2014 The New York Times Company
Link ID: 20346 - Posted: 11.24.2014
Christopher Stringer

Indeed, skeletal evidence from every inhabited continent suggests that our brains have become smaller in the past 10,000 to 20,000 years. How can we account for this seemingly scary statistic?

Some of the shrinkage is very likely related to the decline in humans' average body size during the past 10,000 years. Brain size is scaled to body size because a larger body requires a larger nervous system to service it. As bodies became smaller, so did brains. A smaller body also suggests a smaller pelvic size in females, so selection would have favored the delivery of smaller-headed babies.

What explains our shrinking body size, though? This decline is possibly related to warmer conditions on the earth in the 10,000 years after the last ice age ended. Colder conditions favor bulkier bodies because they conserve heat better. As we have acclimated to warmer temperatures, the way we live has also generally become less physically demanding, which overall serves to drive down body weights.

Another likely reason for this decline is that brains are energetically expensive and will not be maintained at larger sizes unless it is necessary. The fact that we increasingly store and process information externally—in books, computers and online—means that many of us can probably get by with smaller brains. Some anthropologists have also proposed that larger brains may be less efficient at certain tasks, such as rapid computation, because of longer connection pathways. © 2014 Scientific American
Link ID: 20345 - Posted: 11.24.2014
by Linda Geddes

A tapeworm that usually infects dogs, frogs and cats has made its home inside a man's brain. Sequencing its genome showed that it contains around 10 times more DNA than any other tapeworm sequenced so far, which could explain its ability to invade many different species.

When a 50-year-old Chinese man was admitted to a UK hospital complaining of headaches, seizures, an altered sense of smell and memory flashbacks, his doctors were stumped. Tests for tuberculosis, syphilis, HIV and Lyme disease were negative, and although an MRI scan showed an abnormal region in the right side of his brain, a biopsy found inflammation, but no tumour. Over the next four years, further MRIs recorded the abnormal region moving across the man's brain, until finally his doctors decided to operate. To their immense surprise, they pulled out a 1-centimetre-long ribbon-shaped worm.

It looked like a tapeworm, but was unlike any seen before in the UK, so a sample of its tissue was sent to Hayley Bennett and her colleagues at the Wellcome Trust Sanger Institute in Cambridge, UK. Genetic sequencing identified it as Spirometra erinaceieuropaei, a rare species of tapeworm found in China, South Korea, Japan and Thailand. Just 300 human infections have been reported since 1953, and not all of them in the brain. © Copyright Reed Business Information Ltd.
Keyword: Brain imaging
Link ID: 20344 - Posted: 11.21.2014
By Tara Parker-Pope

Most people who drink to get drunk are not alcoholics, suggesting that more can be done to help heavy drinkers cut back, a new government report concludes. The finding, from a government survey of 138,100 adults, counters the conventional wisdom that every “falling-down drunk” must be addicted to alcohol. Instead, the results from the National Survey on Drug Use and Health show that nine out of 10 people who drink too much are not addicts, and can change their behavior with a little — or perhaps a lot of — prompting.

“Many people tend to equate excessive drinking with alcohol dependence,” said Dr. Robert Brewer, who leads the alcohol program at the Centers for Disease Control and Prevention. “We need to think about other strategies to address these people who are drinking too much but who are not addicted to alcohol.”

Excessive drinking is viewed as a major public health problem that results in 88,000 deaths a year, from causes ranging from alcohol poisoning and liver disease to car accidents and other accidental deaths. Excessive drinking is defined as drinking too much at one time or over the course of a week. For men, it’s having five or more drinks in one sitting or 15 drinks or more during a week. For women, it’s four drinks on one occasion or eight drinks over the course of a week. Underage drinkers and women who drink any amount while pregnant also are defined as “excessive drinkers.”

Surprisingly, about 29 percent of the population meets the definition for excessive drinking, but 90 percent of them do not meet the definition of alcoholism. That’s good news, because it means excessive drinking may be an easier problem to solve than previously believed. © 2014 The New York Times Company
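The thresholds quoted above amount to a simple classification rule. As a rough sketch (the function name and parameters are my own, not the CDC's; this is an illustration of the quoted definitions, not a clinical tool), they could be encoded as:

```python
def is_excessive_drinker(sex, max_drinks_one_sitting, drinks_per_week,
                         underage=False, pregnant=False):
    """Classify drinking as 'excessive' per the thresholds quoted in
    the article. sex is 'M' or 'F'."""
    # Any drinking at all counts for underage or pregnant drinkers.
    if underage or pregnant:
        return max_drinks_one_sitting > 0 or drinks_per_week > 0
    if sex == 'M':
        # Men: 5+ drinks in one sitting, or 15+ drinks in a week.
        return max_drinks_one_sitting >= 5 or drinks_per_week >= 15
    # Women: 4+ drinks on one occasion, or 8+ drinks in a week.
    return max_drinks_one_sitting >= 4 or drinks_per_week >= 8
```

Note that the rule is a disjunction: either the single-occasion (binge) threshold or the weekly total is enough on its own.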
Keyword: Drug Abuse
Link ID: 20342 - Posted: 11.21.2014
By Jyoti Madhusoodanan Eurasian jays are tricky thieves. They eavesdrop on the noises that other birds make while hiding food in order to steal the stash later, new research shows. Scientists trying to figure out if the jays (Garrulus glandarius) could remember sounds and make use of the information placed trays of two materials—either sand or gravel—in a spot hidden from a listening jay’s view. Other avian participants of the same species, which were given a nut, cached the treat in one of the two trays. Fifteen minutes later, the listening bird was permitted to hunt up the stash. When food lay buried in a less noisy material such as sand, jays searched randomly. But if they heard gravel being tossed around as treats were hidden, they headed to the pebbles to pilfer the goods. Previous studies have shown that jays—like crows, ravens, and other bird burglars that belong to the corvid family—can remember where they saw food being hidden and return to the spot to look for the cache. But these new results, published in Animal Cognition this month, provide the first evidence that these corvids can also recollect sounds to locate and steal stashes of food. In their forest homes, where birds are heard more often than they are seen, this sneaky strategy might give eavesdropping jays a better chance at finding hidden feasts.
Link ID: 20339 - Posted: 11.21.2014
By Emily Underwood WASHINGTON, D.C.—Rapid changes unfold in the brain after a person's hand is amputated. Within days—and possibly even hours—neurons that once processed sensations from the palm and fingers start to shift their allegiances, beginning to fire in response to sensations in other body parts, such as the face. But a hand transplant can bring these neurons back into the fold, restoring the sense of touch nearly back to normal, according to a study presented here this week at the annual conference of the Society for Neuroscience. To date, roughly 85 people worldwide have undergone hand replant or transplant surgery, an 8- to 10-hour procedure in which surgeons reattach the bones, muscles, nerves, blood vessels, and soft tissue between the patient's severed wrist and their own hand or one from a donor, often using a needle finer than a human hair. After surgery, studies have shown that it takes about 2 years for the peripheral nerves to regenerate, with sensation slowly creeping through the palm and into the fingertips at a rate of roughly 2 mm per day, says Scott Frey, a cognitive neuroscientist at the University of Missouri, Columbia. Even once the nerves have regrown, the surgically attached hand remains far less sensitive to touch than the original hand once was. One potential explanation is that the brain's sensory "map" of the body—a series of cortical ridges and folds devoted to processing touch in different body parts—loses its ability to respond to the missing hand in the absence of sensory input, Frey says. If that's true, the brain may need to reorganize that sensory map once again in order to fully restore sensation. © 2014 American Association for the Advancement of Science
By Esther Hsieh A little-known fact: the tongue is directly connected to the brain stem. This anatomical feature is now being harnessed by scientists to improve rehabilitation. A team at the University of Wisconsin–Madison recently found that electrically stimulating the tongue can help patients with multiple sclerosis (MS) improve their gait. MS is an incurable disease in which the insulation around the nerves becomes damaged, disrupting the communication between body and brain. One symptom is loss of muscle control. In a study published in the Journal of Neuro-Engineering and Rehabilitation, Wisconsin neuroscientist Yuri Danilov and his team applied painless electrical impulses to the tip of the tongue of MS patients during physical therapy. Over a 14-week trial, patients who got tongue stimulation improved twice as much on variables such as balance and fluidity as did a control group who did the same regimen without stimulation. The tongue has extensive motor and sensory integration with the brain, Danilov explains. The nerves on the tip of the tongue are directly connected to the brain stem, a crucial hub that directs basic bodily processes. Previous research showed that sending electrical pulses through the tongue activated the neural network for balance; such activation may shore up the circuitry weakened by MS. The team is also using tongue stimulation to treat patients with vision loss, stroke damage and Parkinson's. “We have probably discovered a new way for the neurorehabilitation of many neurological disorders,” Danilov says. © 2014 Scientific American
Keyword: Multiple Sclerosis
Link ID: 20332 - Posted: 11.20.2014
By Gretchen Reynolds Exercise seems to be good for the human brain, with many recent studies suggesting that regular exercise improves memory and thinking skills. But an interesting new study asks whether the apparent cognitive benefits from exercise are real or just a placebo effect — that is, if we think we will be “smarter” after exercise, do our brains respond accordingly? The answer has significant implications for any of us hoping to use exercise to keep our minds sharp throughout our lives. In experimental science, the best, most reliable studies randomly divide participants into two groups, one of which receives the drug or other treatment being studied and the other of which is given a placebo, similar in appearance to the drug, but not containing the active ingredient. Placebos are important, because they help scientists to control for people’s expectations. If people believe that a drug, for example, will lead to certain outcomes, their bodies may produce those results, even if the volunteers are taking a look-alike dummy pill. That’s the placebo effect, and its occurrence suggests that the drug or procedure under consideration isn’t as effective as it might seem to be; some of the work is being done by people’s expectations, not by the medicine. Recently, some scientists have begun to question whether the apparently beneficial effects of exercise on thinking might be a placebo effect. While many studies suggest that exercise may have cognitive benefits, those experiments all have had a notable scientific limitation: They have not used placebos. This issue is not some abstruse scientific debate. If the cognitive benefits from exercise are a result of a placebo effect rather than of actual changes in the brain because of the exercise, then those benefits could be ephemeral and unable in the long term to help us remember how to spell ephemeral. © 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 20329 - Posted: 11.20.2014
James Gorman Evidence has been mounting for a while that birds and other animals can count, particularly when the things being counted are items of food. But most of the research is done under controlled conditions. In a recent experiment with New Zealand robins, Alexis Garland and Jason Low at Victoria University of Wellington tested the birds in a natural setting, giving them no training and no rewards, and showed that they knew perfectly well when a scientist had showed them two mealworms in a box, but then delivered only one. The researchers reported the work this fall in the journal Behavioural Processes. The experiment is intriguing to watch, partly because it looks like a child’s magic trick. The apparatus used is a wooden box that has a sliding drawer. After clearly showing a robin that she was dropping two mealworms in a circular well in the box, Dr. Garland would slide in the drawer. It covered the two worms with an identical-looking circular well containing only one worm. When the researcher moved away and the robin flew down and lifted off a cover, it would find only one worm. The robins pecked intensely at the box, behavior they didn’t show if they found the two worms they were expecting. Earlier experiments had also shown the birds to be good at counting, and Dr. Garland said that one reason might be that they are inveterate thieves. Mates, in particular, steal from one another’s food caches, where they hide perishable prey like worms or insects. “If you’ve got a mate that steals 50 or more percent of your food,” she said, you’d better learn how to keep track of how many mealworms you’ve got. © 2014 The New York Times Company
By Bethany Brookshire WASHINGTON – Moldy houses are hard on the lungs, and new results in mice suggest that they could also be bad for the brain. Inhaling mold spores made mice anxious and forgetful, researchers reported November 15 at the annual meeting of the Society for Neuroscience. Cheryl Harding, a psychologist at the City University of New York, and colleagues dripped low doses of spores from the toxic mold Stachybotrys into mouse noses three times per week. After three weeks, the mice didn’t look sick. But they had trouble remembering a fearful place. The mice were also more anxious than normal counterparts. The anxiety and memory deficits went along with decreases in new brain cells in the hippocampus — a part of the brain that plays a role in memory — compared with control mice. Harding and colleagues also found that the behaviors were linked to increased inflammatory proteins in the hippocampus. Exposure to mold’s toxins and structural proteins may trigger an immune response in the brain. The findings, Harding says, may help explain some of the conditions that people living in moldy buildings complain about, such as anxiety and cognitive problems. C. Harding et al. Mold inhalation, brain inflammation, and behavioral dysfunction. Society for Neuroscience Meeting, Washington, DC, November 15, 2014. © Society for Science & the Public 2000 - 2014.
By John Bohannon If you had the choice between hurting yourself or someone else in exchange for money, how altruistic do you think you’d be? In one infamous experiment, people were quite willing to deliver painful shocks to anonymous victims when asked by a scientist. But a new study that forced people into the dilemma of choosing between pain and profit finds that participants cared more about other people’s well-being than their own. It is hailed as the first hard evidence of altruism for the young field of behavioral economics. Human behavior toward others is hard to predict. On the one hand, we stand out in the animal world for our altruism, often making significant sacrifices to help out a stranger in need. And all but the most antisocial people experience psychological distress at witnessing, let alone causing, pain in others. Yet study after study in the field of behavioral economics has demonstrated that we tend to value our own needs and desires above those of others. For example, researchers have found that just thinking about money makes people behave more selfishly. To try to reconcile the angels and devils of our nature, a team led by Molly Crockett, a psychologist at the University of Oxford in the United Kingdom, combined the classic psychological and economics tools for probing altruism: pain and money. Everyone has their own pain threshold, so the first task was a pain calibration. Researchers administered electric shocks with electrodes attached to the wrists of 160 subjects, starting at an almost imperceptible level and amping up until the subject described the pain as intolerable. (For most people, that threshold for pain is similar to holding your wrist under a stream of 50°C water.) © 2014 American Association for the Advancement of Science.
By Kate Baggaley WASHINGTON, D.C. — Adding magnets to football helmets could reduce the risk of concussions, new research suggests. When two players collide, the magnets in their helmets would repel each other, reducing the force of the collision. “All helmet design companies and manufacturers have the same approach, which is to try to disperse the impact energy after the impact’s already occurred,” neuroscientist Raymond Colello said November 15 at the annual meeting of the Society for Neuroscience. The magnets, he says, would put a brake on the impact before it happens. The idea hasn’t been tested yet in helmets with real players, said Judy Cameron, a neuroscientist at the University of Pittsburgh. “But a lot of thought has gone into it, and the data that was shown about the ability of the magnets to actually repel each other looked extremely promising.” On the field, football players can run at nearly 20 miles per hour and can experience up to 150 g’s of force upon impact. Concussions readily occur at impacts greater than 100 g’s. Every year there are 100,000 concussions at all levels of play among the nearly 1.2 million people who play football in the United States. Colello, of Virginia Commonwealth University in Richmond, is testing magnets made in China from the rare-earth element neodymium. They are the most powerful commercially available magnets and weigh about one-third of a pound each (football helmets weigh from 3.5 to 5.5 pounds). When placed one-fourth of an inch away from each other, two magnets with their same poles face-to-face exert nearly 100 pounds of repulsive force. © Society for Science & the Public 2000 - 2014
Keyword: Brain Injury/Concussion
Link ID: 20317 - Posted: 11.17.2014
By Adam Brimelow Health Correspondent, BBC News A Mediterranean diet may be a better way of tackling obesity than calorie counting, leading doctors have said. Writing in the Postgraduate Medical Journal (PMJ), the doctors said a Mediterranean diet quickly reduced the risk of heart attacks and strokes. And they said it may be better than low-fat diets for sustained weight loss. Official NHS advice is to monitor calorie intake to maintain a healthy weight. Last month NHS leaders stressed the need for urgent action to tackle obesity and the health problems that often go with it. The PMJ editorial argues a focus on food intake is the best approach, but it warns crash dieting is harmful. Signatories of the piece included the chair of the Academy of Medical Royal Colleges, Prof Terence Stephenson, and Dr Mahiben Maruthappu, who has a senior role at NHS England. They criticise the weight-loss industry for focusing on calorie restriction rather than "good nutrition". And they make the case for a Mediterranean diet, including fruit and vegetables, nuts and olive oil, citing research suggesting it quickly reduces the risk of heart attacks and strokes, and may be better than low-fat diets for sustained weight loss. The lead author, cardiologist Dr Aseem Malhotra, says the scientific evidence is overwhelming. "What's more responsible is that we tell people to concentrate on eating nutritious foods. "It's going to have an impact on their health very quickly. We know the traditional Mediterranean diet, which is higher in fat, proven from randomised controlled trials, reduces the risk of heart attack and stroke even within months of implementation." The article also says adopting a Mediterranean diet after a heart attack is almost three times as effective at reducing deaths as taking cholesterol-lowering statin medication. BBC © 2014
Link ID: 20316 - Posted: 11.17.2014
By Emma Wilkinson Health reporter, BBC News Taking vitamin B12 and folic acid supplements does not seem to cut the risk of developing dementia in healthy people, say Dutch researchers. In one of the largest studies to date, there was no difference in memory test scores between those who had taken the supplements for two years and those who were given a placebo. The research was published in the journal Neurology. Alzheimer's Research UK said longer trials were needed to be sure. B vitamins have been linked to Alzheimer's for some years, and scientists know that higher levels of a body chemical called homocysteine can raise the risk of both strokes and dementia. Vitamin B12 and folic acid are both known to lower levels of homocysteine. That, along with studies linking low vitamin B12 and folic acid intake with poor memory, had prompted scientists to view the supplements as a way to ward off dementia. Yet in the study of almost 3,000 people - with an average age of 74 - who took 400 micrograms of folic acid and 500 micrograms of vitamin B12 or a placebo every day, researchers found no evidence of a protective effect. All those taking part in the trial had high blood levels of homocysteine, which did drop more in those taking the supplements. But on four different tests of memory and thinking skills taken at the start and end of the study, there was no beneficial effect of the supplements on performance. The researchers did note that the supplements might slightly slow the rate of decline but concluded the small difference they detected could just have been down to chance. Study leader Dr Rosalie Dhonukshe-Rutten, from Wageningen University in the Netherlands, said: "Since homocysteine levels can be lowered with folic acid and vitamin B12 supplements, the hope has been that taking these vitamins could also reduce the risk of memory loss and Alzheimer's disease. BBC © 2014
Link ID: 20313 - Posted: 11.15.2014
Carl Zimmer In the early 1970s, Sarah Blaffer Hrdy, then a graduate student at Harvard, traveled to India to study Hanuman langurs, monkeys that live in troops, each made up of several females and a male. From time to time, Dr. Hrdy observed a male invade a troop, driving off the patriarch. And sometimes the new male performed a particularly disturbing act of violence. He attacked the troop’s infants. There had been earlier reports of infanticide by adult male mammals, but scientists mostly dismissed the behavior as an unimportant pathology. But in 1974, Dr. Hrdy made a provocative counterproposal: infanticide, she said, is the product of mammalian evolution. By killing off babies of other fathers, a male improves his chances of having more of his own offspring. Dr. Hrdy went on to become a professor at the University of California, Davis, and over the years she broadened her analysis, arguing that infanticide might well be a common feature of mammalian life. She spurred generations of scientists to document the behavior in hundreds of species. “She’s the goddess of all this stuff,” said Kit Opie, a primatologist at University College London. Forty years after Dr. Hrdy’s initial proposal, two evolutionary biologists at the University of Cambridge have surveyed the evolution of infanticide across all mammals. In a paper published Thursday in Science, the scientists concluded that only certain conditions favor the evolution of infanticide — the conditions that Dr. Hrdy had originally proposed. “My main comment is, ‘Well done,’” said Dr. Hrdy. She said the study was particularly noteworthy for its scope, ranging from opossum to lions. The authors of the new study, Dieter Lukas and Elise Huchard, started by plowing through the scientific literature, looking for evidence of infanticide in a variety of mammalian species. The researchers ended up with data on 260 species, and in 119 of them — over 45 percent — males had been observed killing unrelated young animals.
© 2014 The New York Times Company
by Helen Thomson Could a futuristic society of humans with the power to control their own biological functions ever become reality? It's not as out there as it sounds, now the technical foundations have been laid. Researchers have created a link between thoughts and cells, allowing people to switch on genes in mice using just their thoughts. "We wanted to be able to use brainwaves to control genes. It's the first time anyone has linked synthetic biology and the mind," says Martin Fussenegger, a bioengineer at ETH Zurich in Basel, Switzerland, who led the team behind the work. They hope to use the technology to help people who are "locked-in" – that is, fully conscious but unable to move or speak – to do things like self-administer pain medication. It might also be able to help people with epilepsy control their seizures. In theory, the technology could be used for non-medical purposes, too. For example, we could give ourselves a hormone burst on demand, much like in the Culture – Iain M. Banks's utopian society, where people are able to secrete hormones and other chemicals to change their mood. Fussenegger's team started by inserting a light-responsive gene into human kidney cells in a dish. The gene is activated, or expressed, when exposed to infrared light. The cells were engineered so that when the gene activated, it caused a cascade of chemical reactions leading to the expression of another gene – the one the team wanted to switch on. © Copyright Reed Business Information Ltd.
Details of the role of glutamate, the brain’s excitatory chemical, in a drug reward pathway have been identified for the first time. This discovery in rodents — published today in Nature Communications — shows that stimulation of glutamate neurons in a specific brain region (the dorsal raphe nucleus) leads to activation of dopamine-containing neurons in the brain’s reward circuit (dopamine reward system). Dopamine is a neurotransmitter present in regions of the brain that regulate movement, emotion, motivation, and feelings of pleasure. Glutamate is a neurotransmitter whose receptors are important for neural communication, memory formation, and learning. The research was conducted at the Intramural Research Program (IRP) of the National Institute on Drug Abuse (NIDA), which is part of the National Institutes of Health. The research focused on the dorsal raphe nucleus, which has long been a brain region of interest to drug abuse researchers, since nerve cells in this area connect to part of the dopamine reward system. Many of the pathways are rich in serotonin, a neurotransmitter linked to mood regulation. Even though electrical stimulation of the dorsal raphe nucleus promotes reward-related behaviors, drugs that increase serotonin have low abuse potential. As a result, this region of the brain has always presented a seeming contradiction, since it is involved in drug reward but is also abundant in serotonin - a chemical not known for a role in drug reinforcement. This has led researchers to theorize that another neurotransmitter may be responsible for the role that the dorsal raphe nucleus plays in reward.
Keyword: Drug Abuse
Link ID: 20308 - Posted: 11.13.2014
By Kate Kelland LONDON (Reuters) - British scientists say they have found the best way yet to analyze the effects of smoking on the brain -- by taking functional magnetic resonance imaging (fMRI) scans of people while they puff on e-cigarettes. In a small pilot study, the researchers used electronic cigarettes, or e-cigarettes, to mimic the behavioral aspects of smoking tobacco cigarettes, and say future studies could help scientists understand why smoking is so addictive. E-cigarettes use battery-powered cartridges to produce a nicotine-laced vapor to inhale -- hence the new term "vaping". Their use has rocketed in recent years, but there is fierce debate about the risks and benefits. Some public health experts say they could help millions quit tobacco cigarettes, while others argue they could "normalize" the habit and lure children into smoking. While that argument rages, tobacco kills some 6 million people a year, and the World Health Organization estimates that could rise beyond 8 million by 2030. Matt Wall, an imaging scientist at Imperial College London who led the study using e-cigarettes, said he was not aiming to pass judgment on their rights or wrongs, but to use them to dig deeper into smoking addiction. The fact that other forms of nicotine replacement therapy, such as patches or gum, have had only limited success in getting hardened smokers to quit suggests they are hooked on more than just nicotine, he noted. © 2014 Scientific American
Emily Anthes

Anna's life began to unravel in 2005 when her husband of 30 years announced that he had fallen in love with another woman. “It had never even occurred to me that my marriage could ever end,” recalls Anna, a retired lawyer then living in Philadelphia, Pennsylvania. “It was pretty shocking.”

Over the course of several months, Anna stopped wanting to get up in the morning. She felt tired all the time, and consumed by negative thoughts. “'I'm worthless.' 'I messed up everything.' 'It's all my fault.'” She needed help, but her first therapist bored her and antidepressants only made her more tired. Then she found Cory Newman, director of the Center for Cognitive Therapy at the University of Pennsylvania, who started her on a different kind of therapy. Anna learned how to obsess less over her setbacks and give herself more credit for her triumphs. “It was so helpful to talk to someone who steered me to more positive ways of thinking,” says Anna, whose name has been changed at her request.

Cognitive therapy, commonly known as cognitive behavioural therapy (CBT), aims to help people to identify and change negative, self-destructive thought patterns. And although it does not work for everyone with depression, data have been accumulating in its favour. “CBT is one of the clear success stories in psychotherapy,” says Stefan Hofmann, a psychologist at Boston University in Massachusetts.

Antidepressant drugs are usually the first-line treatment for depression. They are seen as a quick, inexpensive fix — but clinical trials reveal that only 22–40% of patients emerge from depression with drugs alone. Although there are various approaches to psychotherapy, CBT is the most widely studied; a meta-analysis published this year revealed that, depending on how scientists measure outcomes, between 42% and 66% of patients no longer meet the criteria for depression after therapy. © 2014 Nature Publishing Group
Link ID: 20306 - Posted: 11.13.2014