Most Recent Links
By Ben Thomas

The past several years have brought two parallel revolutions in neuroscience. Researchers have begun using genetically encoded sensors to monitor the behavior of individual neurons, and they’ve been using brief pulses of light to trigger certain types of neurons to activate. These two techniques are known collectively as optogenetics—the science of using light to read and activate genetically specified neurons—but until recently, most researchers have used them separately. Though many had tried, no one had succeeded in combining optogenetic readout and stimulation into one unified system that worked in the brains of living animals.

But now, a team led by Michael Hausser, a neuroscientist at University College London’s Wolfson Institute for Biomedical Research, has succeeded in creating just such a unified optogenetic input/output system. In a paper published this January in the journal Nature Methods [Scientific American is part of the Nature Publishing Group], the team explains how it used the system to record complex signaling codes used by specific sets of neurons and to “play” those codes back by reactivating the same neural firing patterns they recorded, paving the way to getting neural networks in the brains of living animals to recognize and respond to the codes they send. “This is going to be a game-changer,” Hausser says.

Conventional optogenetics starts with genes. Certain genes encode instructions for producing light-sensitive proteins. By introducing these genes into brain cells, researchers are able to trick specific populations of those cells—all the neurons in a given brain region that respond to dopamine, for example—into firing their signals in response to tiny pulses of light.

© 2015 Scientific American
Keyword: Brain imaging
Link ID: 20514 - Posted: 01.23.2015
By Bruce Bower

Alexithymia: An inability to find words to describe one’s own feelings.

Mental health workers regard alexithymia as more akin to a personality trait than to a mental disorder. Many people with psychiatric conditions such as autism spectrum disorder and panic disorder — characterized by physical symptoms with emotional causes — also display alexithymia. Researchers are finding that alexithymia has the same effect on people with and without mental disorders and that it undermines the ability to describe others’ feelings as well as one’s own.

A study appearing online January 21 in Royal Society Open Science found that nine of 21 young women with eating disorders had difficulty recognizing others’ facial emotions and that this characteristic was probably related to alexithymia, not some inherent feature of anorexia or bulimia. The researchers also looked at 21 women who had alexithymia but no psychiatric disorders and found that seven had comparable problems identifying others’ expressions of happiness, fear and other emotions.

Citation: R. Brewer et al. Emotion recognition deficits in eating disorders are explained by co-occurring alexithymia. Royal Society Open Science. Published online January 21, 2015. doi: 10.1098/rsos.140382.

© Society for Science & the Public 2000 - 2015.
by Catherine Brahic

Move over Homo habilis, you're being dethroned. A growing body of evidence – the latest published this week – suggests that our "handy" ancestor was not the first to use stone tools. In fact, the ape-like Australopithecus may have figured out how to be clever with stones before modern humans even evolved.

Humans have a way with flint. Sure, other animals use tools. Chimps smash nuts and dip sticks into ant nests to pull out prey. But humans are unique in their ability to apply both precision and strength to their tools. It all began hundreds of thousands of years ago when a distant ancestor began using sharp stone flakes to scrape meat off skin and bones. So who were those first toolmakers?

In 2010, German researchers working in Ethiopia discovered markings on two animal bones that were about 3.4 million years old. The cut marks had clearly been made using a sharp stone, and they were at a site that was used by Lucy's species, Australopithecus afarensis. The study, led by Shannon McPherron of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, was controversial. The bones were 800,000 years older than the oldest uncontested stone tools, and at the time few seriously thought that australopithecines had been tool users. Plus, McPherron hadn't found the tool itself.

The problem, says McPherron, is that if we just go on tools that have been found, we must conclude that one day somebody made a beautifully flaked Oldowan hand axe, completely out of the blue. That seems unlikely.

© Copyright Reed Business Information Ltd.
Link ID: 20512 - Posted: 01.23.2015
By Susan Chenelle and Audrey Fisch

“Lord of the Flies” has been a classroom staple for decades, perhaps because the issues of bullying and male aggression remain central concerns in the lives of adolescents, even if they aren’t stranded on a desert island. “To Study Aggression, a Fight Club for Flies” zeros in on the issue of male aggression, but in fruit flies, rather than humans. The connections, beyond the titular, are tantalizing.

James Gorman, the science reporter, is focused on research about the neuropeptide tachykinin, produced in the brains of male fruit flies only. When researchers manipulated the neurons, they could decrease aggression in the flies. What does this suggest about the neuroscience of aggression? And what is the relationship between aggression and gender?

Below, we match Mr. Gorman’s article with a passage from Chapter 8 of “Lord of the Flies” in which Jack leads his peers in the hunt of a sow. At this point in the novel, Jack has overthrown Ralph and Piggy’s attempts to establish order and civility among the boys. Jack has won over a majority of the boys, and in this scene the group engages in a collective hunt for food that transforms itself into a kind of orgy of male violence. The gender politics of the scene are striking: The attack on the mother pig calls out for careful analysis. The boys are, for example, “wedded to her in lust” and climactically “heavy and fulfilled upon her” at the moment of her killing. What point is William Golding trying to make, here and elsewhere in the novel, about the nature of these young men and the ways in which they turn to and revel in aggression and violence?

Key Question: What is the relationship between aggression and gender?

© 2015 The New York Times Company
by Linda Geddes

OUR personality literally shapes our world. It helps determine how many friends we have, which jobs we excel in and how we cope with adversity. Now it seems it may even play a role in our health – and not just in terms of any hypochondriac tendencies we harbour, but also how prone our bodies are to getting sick in the first place. It is a provocative idea but one that has been steadily gaining traction.

We think of conscientiousness, for example, as a positive trait because it suggests caution, careful planning and an aversion to potential danger. But could it also be a symptom of underlying weakness in the immune system? That's one interpretation of a study published last month that sought to pick apart the links between personality traits and the immune system. It found that highly conscientious people had lower levels of inflammation, an immune response that helps the body fight infection and recover from injury. Highly extrovert people had higher levels.

This may mean that extroverts are more physically robust – at least while they're young. While this sounds like good news, there's also a downside, since sustained inflammation over a lifetime may leave you vulnerable to diabetes, atherosclerosis and cancer. "The biggest take-home message is that what happens in our health is connected to what happens in our heads and what happens in our lives," says Steven Cole at the University of California in Los Angeles (UCLA), who supervised the research.

© Copyright Reed Business Information Ltd.
Link ID: 20510 - Posted: 01.22.2015
By Elizabeth Pennisi

In the animal kingdom, humans are known for our big brains. But not all brains are created equal, and now we have new clues as to why that is. Researchers have uncovered eight genetic variations that help determine the size of key brain regions. These variants may represent “the genetic essence of humanity,” says Stephan Sanders, a geneticist and pediatrician at the University of California, San Francisco, who was not involved in the study.

These results are among the first to come out of the ENIGMA (Enhancing Neuro Imaging Genetics through Meta-Analysis) collaboration, involving some 300 scientists from 33 countries. They contributed MRI scans of more than 30,000 people, along with genetic and other information, most of which had been collected for other reasons. “This paper represents a herculean effort,” Sanders says. Only by pooling their efforts could the researchers track down subtle genetic influences on brain size that would have eluded discovery in smaller studies. “We were surprised we found anything at all,” says Paul Thompson, a neuroscientist at the University of Southern California in Los Angeles. But in the end, “we were able to identify hot points in the genome that help build the brain.”

For the analyses, Thompson and his colleagues looked for single-letter (nucleotide base) changes in DNA that correspond to the sizes of key brain regions. One region, the hippocampus, stores memories and helps one learn. Another, called the caudate nucleus, makes it possible to ride a bike, play an instrument, or drive a car without really thinking about it. A third is the putamen, which is involved in running, walking, and moving the body as well as in motivation. The researchers did not try to examine the neocortex, the part of the brain that helps us think and is proportionally much bigger in humans than in other animals. The neocortex has crevices on its surface that look so different from one individual to the next that it’s really hard to measure consistently across labs.

© 2015 American Association for the Advancement of Science
By Simon Makin

People with depression process emotional information more negatively than healthy people. They show increased sensitivity to sad faces, for instance, or a weaker response to happy faces. What has been missing is a biological explanation for these biases. Now a study reveals a mechanism: an unusual balance of chemicals in a brain area crucial for the feeling of disappointment.

A team led by Roberto Malinow of the University of California, San Diego, studied the lateral habenula, an evolutionarily ancient region deep in the brain [see diagram on bottom]. Neurons in this region are activated by unexpected negative events, such as a punishment out of the blue or the absence of an anticipated reward. For example, studies have shown that primates trained to expect a reward, such as juice, after a visual cue show heightened activity in the lateral habenula if the reward is withheld. Such findings have led to the idea that this area is a key part of a “disappointment circuit.”

Past studies have also shown that hyperactivity in the lateral habenula is linked with depressionlike behavior in rodents. In people with depression, low levels of serotonin, the brain chemical targeted by antidepressants, are linked with a rise in lateral habenula activity. The region is unusual because it lacks the standard equipment the brain uses to reduce overactivity: opposing sets of neurons that either increase activity by secreting the chemical glutamate or decrease activity by secreting the chemical GABA. The lateral habenula has very few neurons that decrease activity, so Malinow and his colleagues set out to discover how the brain tamps down activity there.

© 2015 Scientific American
Link ID: 20508 - Posted: 01.22.2015
By BENEDICT CAREY

The surge of emotion that makes memories of embarrassment, triumph and disappointment so vivid can also reach back in time, strengthening recall of seemingly mundane things that happened just beforehand and that, in retrospect, are relevant, a new study has found.

The report, published Wednesday in the journal Nature, suggests that the television detective’s standard query — “Do you remember any unusual behavior in the days before the murder?” — is based on solid brain science, at least in some circumstances.

The findings fit into the predominant theory of memory: that it is an adaptive process, continually updating itself according to what knowledge may be important in the future. The new study suggests that human memory has, in effect, a just-in-case file, keeping seemingly trivial sights, sounds and observations in cold storage for a time in case they become useful later on.

But the experiment said nothing about the effect of trauma, which shapes memory in unpredictable ways. Rather, it aimed to mimic the arousals of daily life: The study used mild electric shocks to create apprehension and measured how the emotion affected memory of previously seen photographs.

In earlier work, researchers had found plenty of evidence in animals and humans of this memory effect, called retroactive consolidation. The new study shows that the effect applies selectively to related, relevant information. “The study provides strong evidence for a specific kind of retroactive enhancement,” said Daniel L. Schacter, a professor of psychology at Harvard who was not involved in the research. “The findings go beyond what we’ve found previously in humans.”

© 2015 The New York Times Company
Research suggests that only 20–30% of drug users actually descend into addiction — defined as the persistent seeking and taking of drugs even in the face of dire personal consequences. Why are some people who use drugs able to do so without turning into addicts, while others continue to abuse, even when the repercussions range from jail time to serious health problems? In a comprehensive review in the European Journal of Neuroscience, Barry Everitt outlines the neural correlates and learning-based processes associated with the transition from drug use, to abuse, to addiction.

Drug seeking begins as a goal-directed behavior, with an action (finding and taking drugs) leading to a particular outcome (the drug high). This type of associative learning is mediated by the dorsomedial region of the striatum, the area of the brain that is associated with reward processing, which functions primarily through the neurotransmitter dopamine. In this kind of learning, devaluing the outcome (by decreasing the potency of the drug, for example) tends to decrease the pursuit of the action. When the high is not what it used to be, the motivation to continue seeking it out decreases.

However, in long-term abusers, this devalued outcome does not reduce the action — indeed, researchers have found that in cases of chronic drug use, a parallel associative learning process eventually comes to the fore. This process is one of stimulus–response; the conditioned stimuli in this case are the various environmental cues — the sight of the powdery white stuff, the smell of burning aluminum foil — that users associate with getting high and that compel them to seek out drugs.

© Association for Psychological Science
It is now one hundred years since drugs were first banned - and all through this long century of waging war on drugs, we have been told a story about addiction, by our teachers, and by our governments. This story is so deeply ingrained in our minds that we take it for granted. It seems obvious. It seems manifestly true.

Until I set off three and a half years ago on a 30,000-mile journey for my book 'Chasing The Scream - The First And Last Days of the War on Drugs' to figure out what is really driving the drug war, I believed it too. But what I learned on the road is that almost everything we have been told about addiction is wrong - and there is a very different story waiting for us, if only we are ready to hear it. If we truly absorb this new story, we will have to change a lot more than the drug war. We will have to change ourselves.

I learned it from an extraordinary mixture of people I met on my travels. From the surviving friends of Billie Holiday, who helped me to learn how the founder of the war on drugs stalked and helped to kill her. From a Jewish doctor who was smuggled out of the Budapest ghetto as a baby, only to unlock the secrets of addiction as a grown man. From a transsexual crack dealer in Brooklyn who was conceived when his mother, a crack-addict, was raped by his father, an NYPD officer. From a man who was kept at the bottom of a well for two years by a torturing dictatorship, only to emerge to be elected President of Uruguay and to begin the last days of the war on drugs.

©2015 TheHuffingtonPost.com, Inc.
By Rachel Feltman

Fear is one of our most basic evolutionary instincts, a sudden physical jolt to help us react to danger more quickly. In the modern world, fear often seems excessive -- in the absence of wild animals to flee, we're left screaming over roller coasters and scary movies. But for at least one woman, fear is unobtainable. And while she lives a normal life, her fearlessness is actually a handicap.

The researchers who study her keep her identity closely guarded, using the code name "SM" when publishing papers about her. And until this year, she'd never been interviewed.

"Tell me what fear is," the neuroscientist Daniel Tranel began. "Well, that's what I'm trying to -- to be honest, I truly have no clue," SM said, her voice raspy.

That raspiness is actually a symptom of the condition that stole fear from her. Urbach-Wiethe disease, which is characterized by a hoarse voice, small bumps around the eyes, and calcium deposits in the brain, is rare in its own right -- only 400 people on the planet are known to have it -- but in SM's case, some of those brain deposits happened to take over her amygdalae. These almond-shaped structures deep inside the brain are crucial to human fear response. And in SM's case, they've been totally calcified since she was a young woman. Now in her 40s, her fear center is as good as gone.

"It's a little bit as if you would go to this region and literally scoop it out," Antonio Damasio, another neuroscientist who studies SM, told "Invisibilia" hosts Lulu Miller and Alix Spiegel.
Link ID: 20504 - Posted: 01.21.2015
Oliver Burkeman

One spring morning in Tucson, Arizona, in 1994, an unknown philosopher named David Chalmers got up to give a talk on consciousness, by which he meant the feeling of being inside your head, looking out – or, to use the kind of language that might give a neuroscientist an aneurysm, of having a soul. Though he didn’t realise it at the time, the young Australian academic was about to ignite a war between philosophers and scientists, by drawing attention to a central mystery of human life – perhaps the central mystery of human life – and revealing how embarrassingly far they were from solving it.

The scholars gathered at the University of Arizona – for what would later go down as a landmark conference on the subject – knew they were doing something edgy: in many quarters, consciousness was still taboo, too weird and new agey to take seriously, and some of the scientists in the audience were risking their reputations by attending. Yet the first two talks that day, before Chalmers’s, hadn’t proved thrilling. “Quite honestly, they were totally unintelligible and boring – I had no idea what anyone was talking about,” recalled Stuart Hameroff, the Arizona professor responsible for the event. “As the organiser, I’m looking around, and people are falling asleep, or getting restless.” He grew worried. “But then the third talk, right before the coffee break – that was Dave.”

With his long, straggly hair and fondness for all-body denim, the 27-year-old Chalmers looked like he’d got lost en route to a Metallica concert. “He comes on stage, hair down to his butt, he’s prancing around like Mick Jagger,” Hameroff said. “But then he speaks. And that’s when everyone wakes up.”
Link ID: 20503 - Posted: 01.21.2015
The presence of a romantic partner during painful medical procedures could make women feel worse rather than better, researchers say. A small study found this increase in pain was most pronounced in women who tended to avoid closeness in their relationships. The authors say bringing a loved one along for support may not be the best strategy for every patient. The work appears in the journal Social Cognitive and Affective Neuroscience.

Researchers from University College London, King's College London and the University of Hertfordshire say there has been very little scientific research into the effects of a partner's presence on someone's perception of pain, despite this being common medical advice.

They recruited 39 heterosexual couples and asked them a series of questions to measure how much they sought or avoided closeness and emotional intimacy in relationships. Each female volunteer was then subjected to a series of painful laser pulses while her partner was in and then out of the room. The women were asked to score their level of pain. They also had their brain activity measured using a medical test called an EEG.

The researchers found that certain women were more likely to score high levels of pain while their partner was in the room. These were women who said they preferred to avoid closeness, trusted themselves more than their partners and felt uncomfortable in their relationships.

© 2015 BBC
Keyword: Pain & Touch
Link ID: 20502 - Posted: 01.21.2015
By David Shultz

The most venomous animal on the planet isn’t a snake, a spider, or a scorpion; it’s a snail—a cone snail, to be precise. The Conus genus boasts a large variety of marine snails that have adopted an equally diverse assortment of venoms. Online today in the Proceedings of the National Academy of Sciences, researchers report an especially interesting addition to the animals’ arsenal: insulin. According to the paper, this marks the first time insulin has been discovered as a component of venom.

Not all cone snails incorporate insulin into their venom cocktail, wonderfully known as the nirvana cabal; the hormone was found only in a subset of the animals that hunt with a netting strategy that relies on snaring fish in their large, gaping mouthparts. Unlike the feeding tactics of some cone snails that hunt using speedy venom-tipped “harpoons,” the mouth-netting strategy is a rather slow process. For it to work, the fish either needs to be very unaware of its surroundings or chemically sedated. Scientists speculate that it’s the insulin that provides such sedation.

Snails like Conus geographus (seen above) actually produce multiple variants of the hormone, some of which, like one called Con-Ins G1, are more similar to fish insulin than snail varieties. Con-Ins G1 isn’t an exact match of fish insulin, though; it’s a stripped-down version that the team suspects may be missing bits that would let fish detect the overdose and respond. If they’re correct, the snail’s venom may yield insight into the nuances of how insulin is regulated that may extend to humans.

© 2015 American Association for the Advancement of Science
By Amy Ellis Nutt

Scientists have discovered what a traumatic brain injury, or TBI, suffered by a quarter-million combat veterans of Iraq and Afghanistan looks like, and it’s unlike anything they’ve seen before: a honeycomb pattern of broken connections, primarily in the frontal lobes, our emotional control center and the seat of our personality.

“In some ways it’s a 100-year-old problem,” said Vassilis Koliatsos, a Johns Hopkins pathologist and neuropsychiatrist. He was referring to the shell-shock victims of World War I, tens of thousands of soldiers who returned home physically sound but mentally wounded, haunted by their experiences and unable to fully resume their lives. “When we started shelling each other on the Western Front of World War I, it created a lot of sick people . . . . [In a way,] we’ve gone back to the Western Front and created veterans who come back and do poorly, and we’re back to the Battle of the Somme,” he said. “They have mood changes, commit suicide, substance abuse, just like in World War I, and they really do poorly and can’t function. It’s a huge problem.”

Many of the lingering symptoms of shell shock, or what today is known as neurotrauma, are the same as they were a century ago. Only the nature of the blast has changed, from artillery to improvised explosive devices.

Koliatsos and colleagues, who published their findings in the journal Acta Neuropathologica Communications in November, examined the brains of five recent U.S. combat veterans, all of whom suffered a traumatic brain injury from an IED but died of unrelated causes back home. Their controls included the brains of people with a history of auto accidents and of those with no history of auto accidents or TBI. Koliatsos says he was prompted to do this study because he is both a pathologist and a neuropsychiatrist, and he sees many TBI cases, both in veterans and in young people with sports concussions.
By JOHN MARKOFF

A new laboratory technique enables researchers to see minuscule biological features, such as individual neurons and synapses, at a nearly molecular scale through conventional optical microscopes. In a paper published last week in the journal Science, researchers at M.I.T. said they were able to increase the physical size of cultured cells and tissue by as much as five times while still preserving their structure. The scientists call the new technique expansion microscopy.

The idea of making objects larger to make them more visible is a radical solution to a vexing challenge. By extending the resolving power of conventional microscopes, scientists are able to glimpse such biological mysteries as the protein structures that form ion channels and the outline of the membrane that holds the genome within a cell. The researchers have examined minute neural circuits, gaining new insights into local connections in the brain and a better understanding of larger networks.

The maximum resolving power of conventional optical microscopes is about 200 nanometers, about half the wavelength of visible light. (By contrast, a human hair is about 500 times wider.) In recent decades, scientists have struggled to push past these limits. Last year, three scientists received a Nobel Prize for a technique in which fluorescent molecules are used to extend the resolving power of optical microscopes. But the technique requires specialized equipment and is costly.

With expansion microscopy, Edward S. Boyden, a co-director of the M.I.T. Center for Neurobiological Engineering, and his colleagues were able to observe objects originally measuring just 70 nanometers in cultured cells and brain tissue through an optical microscope.

© 2015 The New York Times Company
Keyword: Brain imaging
Link ID: 20499 - Posted: 01.20.2015
By Christie Aschwanden

Maybe it’s their famously protruding brow ridge or perhaps it’s the now-discredited notion that they were primitive scavengers too dumb to use language or symbolism, but somehow Neanderthals picked up a reputation as brutish, dim and mannerless cretins. Yet the latest research on the history and habits of Neanderthals suggests that such portrayals of them are entirely undeserved. It turns out that Neanderthals were capable hunters who used tools and probably had some semblance of culture, and the DNA record shows that if you trace your ancestry to Europe or Asia, chances are very good that you have some Neanderthal DNA in your own genome.

The bad rap began when the first Neanderthal skull was discovered around 1850 in Germany, says Paola Villa, an archaeologist at the University of Colorado. “The morphological features of these skulls — big eyebrows, no chin — led to the idea that they were very different from us, and therefore inferior,” she says. While the majority of archaeologists no longer believe this, she says, the idea that Neanderthals were inferior, brutish or stupid remains in popular culture.

Neanderthals first appeared in Europe and western Asia between 300,000 and 400,000 years ago. They are our closest (extinct) relative, and their species survived until 30,000 to 40,000 years ago, when they vanish from the fossil record, says Svante Paabo, director of the Max Planck Institute of Evolutionary Anthropology in Leipzig, Germany, and author of “Neanderthal Man: In Search of Lost Genomes.” Why these relatives of ours thrived for so long and then ended their long, successful run about the same time that modern humans began to spread remains a point of debate and speculation.
Link ID: 20498 - Posted: 01.20.2015
by Jennifer Viegas

Researchers eavesdropping on wild chimpanzees determined that the primates communicate about at least two things: their favorite yummy fruits, and the trees where these fruits can be found. Of particular interest to the chimps is the size of trees bearing the fruits that they relish most, such that the chimps yell out that information, according to a new study published in the journal Animal Behaviour. The study is the first to find that information about tree size and available fruit amounts are included in chimp calls, in addition to assessments about food quality.

"Chimpanzees definitely have a very complex communication system that includes a variety of vocalizations, but also facial expressions and gestures," project leader Ammie Kalan of the Max Planck Institute for Evolutionary Anthropology told Discovery News. "How much it resembles human language is still a matter of debate," she added, "but at the very least, research shows that chimpanzees use vocalizations in a sophisticated manner, taking into account their social and environmental surroundings."

Kalan and colleagues Roger Mundry and Christophe Boesch spent over 750 hours observing chimps and analyzing their food calls in the Ivory Coast's Taï Forest. The Wild Chimpanzee Foundation in West Africa is working hard to try and protect this population of chimps, which is one of the last wild populations of our primate cousins.

© 2015 Discovery Communications, LLC
Helen Fisher, a biological anthropologist at Rutgers University, responds:

Several years ago I embarked on a project to see if the seven-year itch really exists. I began by studying worldwide data on marriage and divorce and noticed that although the median duration of marriage was seven years, of the couples who divorced, most did so around their fourth year together (the “mode”). I also found that divorce occurred most frequently among couples at the height of their reproductive and parenting years—for men, ages 25 to 29, and for women, ages 20 to 24 and 25 to 29—and among those with one dependent child.

To try to explain these findings, I began looking at patterns of pair bonding in birds and mammals. Although only about 3 percent of mammals form a monogamous bond to rear their young, about 90 percent of avian species team up. The reason: the individual that sits on the eggs until they hatch will starve unless fed by a mate. A few mammals are in the same predicament. Take the female fox: the vixen produces very thin milk and must feed her young almost constantly, so she relies on her partner to bring her food while she stays in the den to nurse.

But here's the key: although some species of birds and mammals bond for life, more often they stay together only long enough to rear their young through infancy and early toddlerhood. When juvenile robins fly away from the nest or maturing foxes leave the den for the last time, their parents part ways as well.

Humans retain traces of this natural reproductive pattern. In more contemporary hunter-gatherer societies, women tend to bear their children about four years apart. Moreover, in these societies after a child is weaned at around age four, the child often joins a playgroup and is cared for by older siblings and relatives. This care structure allows unhappy couples to break up and find a more suitable partner with whom to have more young.

© 2015 Scientific American
By PAULA SPAN

DEDHAM, Mass. — Jerome Medalie keeps his advance directive hanging in a plastic sleeve in his front hall closet, as his retirement community recommends. That’s where the paramedics will look if someone calls 911. Like many such documents, it declares that if he is terminally ill, he declines cardiopulmonary resuscitation, a ventilator and a feeding tube.

But Mr. Medalie’s directive also specifies something more unusual: If he develops Alzheimer’s disease or another form of dementia, he refuses “ordinary means of nutrition and hydration.” A retired lawyer with a proclivity for precision, he has listed 10 triggering conditions, including “I cannot recognize my loved ones” and “I cannot articulate coherent thoughts and sentences.” If any three such disabilities persist for several weeks, he wants his health care proxy — his wife, Beth Lowd — to ensure that nobody tries to keep him alive by spoon-feeding or offering him liquids.

VSED, short for “voluntarily stopping eating and drinking,” is not unheard-of as an end-of-life strategy, typically used by older adults who hope to hasten their decline from terminal conditions. But now ethicists, lawyers and older adults themselves have begun a quiet debate about whether people who develop dementia can use VSED to end their lives by including such instructions in an advance directive.

Experts know of just a handful of people with directives like Mr. Medalie’s. But dementia rates and numbers have begun a steep ascent, already afflicting an estimated 30 percent of those older than 85. Baby boomers are receiving a firsthand view of the disease’s devastation and burdens as they care for aging parents. They may well prove receptive to the idea that they shouldn’t be kept alive if they develop dementia themselves, predicted Alan Meisel, the director of the University of Pittsburgh’s Center for Bioethics and Health Law.

© 2015 The New York Times Company
Link ID: 20495 - Posted: 01.20.2015