Most Recent Links
By ROBERT F. WORTH In early 2012, a neuropathologist named Daniel Perl was examining a slide of human brain tissue when he saw something odd and unfamiliar in the wormlike squiggles and folds. It looked like brown dust: a distinctive pattern of tiny scars. Perl was intrigued. At 69, he had examined 20,000 brains over a four-decade career, focusing mostly on Alzheimer’s and other degenerative disorders. He had peered through his microscope at countless malformed proteins and twisted axons. He knew as much about the biology of brain disease as just about anyone on earth. But he had never seen anything like this. The brain under Perl’s microscope belonged to an American soldier who had been five feet away when a suicide bomber detonated his belt of explosives in 2009. The soldier survived the blast, thanks to his body armor, but died two years later of an apparent drug overdose after suffering symptoms that have become the hallmark of the recent wars in Iraq and Afghanistan: memory loss, cognitive problems, inability to sleep and profound, often suicidal depression. Nearly 350,000 service members have been given a diagnosis of traumatic brain injury over the past 15 years, many of them from blast exposure. The real number is likely to be much higher, because so many who have enlisted are too proud to report a wound that remains invisible. For years, many scientists have assumed that explosive blasts affect the brain in much the same way as concussions from football or car accidents. Perl himself was a leading researcher on chronic traumatic encephalopathy, or C.T.E., which has caused dementia in N.F.L. players. Several veterans who died after suffering blast wounds have in fact developed C.T.E. But those veterans had other, nonblast injuries too. No one had done a systematic post-mortem study of blast-injured troops. That was exactly what the Pentagon asked Perl to do in 2010, offering him access to the brains they had gathered for research. 
It was a rare opportunity, and Perl left his post as director of neuropathology at the medical school at Mount Sinai to come to Washington. © 2016 The New York Times Company
By Teal Burrell Sociability may be skin deep. The social impairments and high anxiety seen in people with autism or related disorders may be partly due to a disruption in the nerves of the skin that sense touch, a new study in mice suggests. Autism spectrum disorders are primarily thought of as disorders of the brain, generally characterized by repetitive behaviors and deficits in communication skills and social interaction. But a majority of people with autism spectrum disorders also have an altered tactile sense; they are often hypersensitive to light touch and can be overwhelmed by certain textures. “They tend to be very wary of social touch [like a hug or handshake], or if they go outside and feel a gust of wind, it can be very unnerving,” says neuroscientist Lauren Orefice from Harvard Medical School in Boston. An appreciation for this sensory aspect of autism has grown in recent years. The newest version of psychiatry’s bible, the Diagnostic and Statistical Manual of Mental Disorders, includes the sensory abnormalities of autism as core features of the disease. “That was a big nod and a recognition that this is a really important aspect of autism,” says Kevin Pelphrey, a cognitive neuroscientist at The George Washington University in Washington, D.C., who was not involved in the work. The sensation of touch starts in the peripheral nervous system—in receptors at the surface of the skin—and travels along nerves that connect into the central nervous system. Whereas many autism researchers focus on the end of the pathway—the brain—Orefice and colleagues wondered about the first leg of the trip. So the group introduced mutations that silenced genes associated with autism spectrum disorders in mice, adding them in a way that restricted the effects to peripheral nerve cells, they report today in Cell. The team singled out the gene Mecp2, which encodes a protein that regulates the expression of genes that help forge connections between nerve cells. 
© 2016 American Association for the Advancement of Science
By Rita Celli. Jennifer Molson remembers what doctors told her about the high-stakes procedure she would undergo in 2002 as part of an Ottawa study that has yielded some promising results in multiple sclerosis patients. The 41-year-old Ottawa woman was in a wheelchair before the treatment. She now walks, runs and works full time. "I had no feeling from my chest down. I could barely cut my food," Molson remembers. Molson was diagnosed with MS when she was 21, and within five years she needed full-time care. "It was scary. [The procedure] was my last shot at living." MS is among the most common chronic inflammatory diseases of the central nervous system, affecting an estimated two million people worldwide. New Canadian research led by two Ottawa doctors and published in The Lancet medical journal on Thursday suggests the high-risk therapy may stop the disease from progressing. "This is the first treatment to produce this level of disease control or neurological recovery" from MS, said The Lancet in a news release. But The Lancet also highlights the high mortality rate associated with the procedure — one patient out of 24 involved in the clinical trial died from liver failure. "Treatment related risks limit [the therapy's] widespread use," The Lancet concludes. Results 'impressive' Nevertheless, in the journal's accompanying editorial a German doctor calls the results "impressive." ©2016 CBC/Radio-Canada.
By ALAN COWELL LONDON — When Muhammad Ali died last week, the memories spooled back inevitably to the glory days of the man who called himself the Greatest, a champion whose life intertwined with America’s traumas of race, faith and war. It was a chronicle of valor asserted in the most public of arenas scrutinized by an audience that spanned the globe. But there was another narrative, just as striking to some admirers, of a private courage beyond his klieg-lit renown. For the minority afflicted by Parkinson’s disease, Ali’s 30-year struggle with the same illness magnified the broader status he built from his boxing prowess as a black man who embraced radical Islam, refused to fight in Vietnam, earned the opprobrium of the establishment and yet emerged as an icon. “It was his longest bout, and one that ultimately he could not win,” the reporter Patrick Sawer wrote in The Telegraph, referring to Ali’s illness. Yet the affliction “only served to increase the worldwide admiration he had gained before the disease robbed him of his powers.” As a global superstar, Ali touched many lands, and Britain felt a particular bond. Boxing fans recalled his far-flung bouts — the “Rumble in the Jungle” against George Foreman in Zaire, as the Democratic Republic of Congo was then called, in 1974; “The Thrilla in Manila” in the Philippines against Joe Frazier a year later. But in Britain, his two defeats in the 1960s of Henry Cooper, a much-loved British heavyweight who died in 2011, and his feisty appearances in prime-time television interviews left an indelible mark. © 2016 The New York Times Company
By Linda Marsa. Helen Epstein felt deeply isolated and alone. Haunted by her parents’ harrowing experiences in Nazi concentration camps in World War II, she was troubled as a child by images of piles of skeletons and barbed wire, and, in her words, “a floating sense of danger and incipient harm.” But her Czech-born parents’ defense against the horrific memories was to detach. “Their survival strategy in the war was denial and dissociation, and that carried into their behavior afterward,” recalls Epstein, who was born shortly after the war and grew up in Manhattan. “They believed in action over reflection. Introspection was not encouraged, but a full schedule of activities was.” It was only when she was a student at Israel’s Hebrew University in the late 1960s that she realized she was part of a community that shared a cultural and historical legacy that included both pain and fear. “I met dozens of kids of survivors,” she says, “one after the other who shared certain characteristics: preoccupation with a family past and Israel, and who spoke several middle European languages — just like me.” Epstein’s 1979 book about her observations, Children of the Holocaust, gave voice to that sense of alienation and free-floating anxiety. In the years since, mental health professionals have largely attributed the second generation’s moodiness, hypervigilance and depression to learned behavior. It is only now, more than three decades later, that science has the tools to see that this legacy of trauma becomes etched in our DNA — a process known as epigenetics, in which environmental factors trigger genetic changes that may be passed on, just as surely as blue eyes and crooked smiles.
Michael Graziano Ever since Charles Darwin published On the Origin of Species in 1859, evolution has been the grand unifying theory of biology. Yet one of our most important biological traits, consciousness, is rarely studied in the context of evolution. Theories of consciousness come from religion, from philosophy, from cognitive science, but not so much from evolutionary biology. Maybe that’s why so few theories have been able to tackle basic questions such as: What is the adaptive value of consciousness? When did it evolve and what animals have it? The Attention Schema Theory (AST), developed over the past five years, may be able to answer those questions. The theory suggests that consciousness arises as a solution to one of the most fundamental problems facing any nervous system: Too much information constantly flows in to be fully processed. The brain evolved increasingly sophisticated mechanisms for deeply processing a few select signals at the expense of others, and in the AST, consciousness is the ultimate result of that evolutionary sequence. If the theory is right—and that has yet to be determined—then consciousness evolved gradually over the past half billion years and is present in a range of vertebrate species. Even before the evolution of a central brain, nervous systems took advantage of a simple computing trick: competition. Neurons act like candidates in an election, each one shouting and trying to suppress its fellows. At any moment only a few neurons win that intense competition, their signals rising up above the noise and impacting the animal’s behavior. This process is called selective signal enhancement, and without it, a nervous system can do almost nothing. © 2016 by The Atlantic Monthly Group
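The "selective signal enhancement" Graziano describes, in which many neurons mutually suppress one another until only the strongest signals rise above the noise, can be illustrated with a toy winner-take-all iteration. This is only a sketch under assumed dynamics: the power-and-normalize update rule, the `gain` and `steps` parameters, and the input values are all illustrative inventions, not taken from the article or from any specific biological model.

```python
# Toy model of selective signal enhancement: repeated self-excitation
# (raising each unit's activity to a power > 1) combined with divisive
# inhibition (normalizing by the competitors' summed activity). Strong
# inputs survive the competition; weak ones are driven toward zero.
# All parameters here are illustrative assumptions.

def compete(drives, gain=2.0, steps=30):
    """Iterate a winner-take-all competition over a list of input drives."""
    total_drive = sum(drives)
    activity = list(drives)
    for _ in range(steps):
        boosted = [a ** gain for a in activity]   # self-excitation
        norm = sum(boosted) or 1.0                # rivals' summed activity
        # divisive inhibition: each unit keeps only its share of the total
        activity = [a / norm * total_drive for a in boosted]
    return activity

# Five competing "neurons": the strongest input ends up with nearly all
# of the activity, while the weaker signals are suppressed toward zero.
winners = compete([1.0, 0.9, 0.3, 0.2, 0.1])
```

With `gain` closer to 1 the competition is softer and more than one unit can retain appreciable activity, which loosely matches the article's point that "only a few neurons win" rather than exactly one.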
Tina Hesman Saey Gut microbes cause obesity by sending messages via the vagus nerve to pack on pounds, new research in rodents suggests. Bacteria in the intestines produce a molecule called acetate, which works through the brain and nervous system to make rats and mice fat, researchers report in the June 9 Nature. If the results hold up in humans, scientists would understand one mechanism by which gut microbes induce obesity: First, the microbes convert fats in food to a short-chain fatty acid called acetate. Acetate in the blood somehow makes its way to the brain. The brain sends a signal through the vagus nerve to the pancreas to increase insulin production. Insulin tells fat cells to store more energy. Fat builds up, leading to obesity. Acetate also increases levels of a hunger hormone called ghrelin, which could lead animals and people to eat even more, says Yale University endocrinologist Gerald Shulman, who led the study. “This is a tour-de-force paper,” says biochemist Jonathan Schertzer of McMaster University in Hamilton, Canada. Most studies that examine the health effects of intestinal microbes just list which bacteria, viruses, fungi and other microorganisms make up the gut microbiome, Schertzer says. But a catalog of differences between lean and obese individuals doesn’t address what those microbes do, he says. “What’s in a name?” he asks. “When you find a factor that actually influences metabolism, that’s important.” © Society for Science & the Public 2000 - 2016.
By Esther Landhuis About 100 times rarer than Parkinson’s, and often mistaken for it, progressive supranuclear palsy afflicts fewer than 20,000 people in the U.S.—and two thirds do not even know they have it. Yet this little-known brain disorder that killed comic actor Dudley Moore in 2002 is quietly becoming a gateway for research that could lead to powerful therapies for a range of intractable neurodegenerative conditions including Alzheimer’s and chronic traumatic encephalopathy, a disorder linked to concussions and head trauma. All these diseases share a common feature: abnormal buildup of a protein called tau in the brains of patients. Progressive supranuclear palsy has no cure and is hard to diagnose. Although doctors may have heard of the disease, many know little about it. It was not described in medical literature until 1964 but some experts believe one of the earliest accounts of the debilitating illness appeared in an 1857 short story by Charles Dickens and his friend Wilkie Collins: “A cadaverous man of measured speech. A man who seemed as unable to wink, as if his eyelids had been nailed to his forehead. A man whose eyes—two spots of fire—had no more motion than if they had been connected with the back of his skull by screws driven through them, and riveted and bolted outside among his gray hair. He had come in and shut the door, and he now sat down. He did not bend himself to sit as other people do, but seemed to sink bolt upright, as if in water, until the chair stopped him.” © 2016 Scientific American
Most available antidepressants do not help children and teenagers with serious mental health problems and some may be unsafe, experts have warned. A review of clinical trial evidence found that of 14 antidepressant drugs, only one, fluoxetine – marketed as Prozac – was better than a placebo at relieving the symptoms of young people with major depression. Another drug, venlafaxine, was associated with an increased risk of suicidal thoughts and suicide attempts. But the authors stressed that the true effectiveness and safety of antidepressants taken by children and teenagers remained unclear because of the poor design and selective reporting of trials, which were mostly funded by drug companies. They recommended close monitoring of young people on antidepressants, regardless of what drugs they were prescribed, especially at the start of treatment. Professor Peng Xie, a member of the team from Chongqing Medical University in China, said: “The balance of risks and benefits of antidepressants for the treatment of major depression does not seem to offer a clear advantage in children and teenagers, with probably only the exception of fluoxetine.” Major depressive disorder affects around 3% of children aged six to 12 and 6% of teenagers aged 13 to 18. In 2004 the US Food and Drug Administration (FDA) issued a warning against the use of antidepressants in young people up to the age of 24 because of concerns about suicide risk. Yet the number of young people taking the drugs increased between 2005 and 2012, both in the US and UK, said the study authors writing in the Lancet medical journal. In the UK the proportion of children and teenagers aged 19 and under taking antidepressants rose from 0.7% to 1.1%. © 2016 Guardian News and Media Limited
By Amina Zafar. When Susan Robertson's fingers and left arm felt funny while she was Christmas shopping, they were signs of a stroke she experienced at age 36. The stroke survivor is now concerned about her increased risk of dementia. The link between stroke and dementia is stronger than many Canadians realize, the Heart and Stroke Foundation says. The group's annual report, released Thursday, is titled "Mind the connection: preventing stroke and dementia." Stroke happens when blood stops flowing to parts of the brain. Robertson, 41, of Windsor, Ont., said her short-term memory, word-finding and organizational skills were impaired after her 2011 stroke. She's extremely grateful to have recovered the ability to speak and walk after doctors found clots had damaged her brain's left parietal lobe. "I knew what was happening, but I couldn't say it," the occupational nurse recalled. Dementia risk A stroke more than doubles the risk of dementia, said Dr. Rick Swartz, a spokesman for the foundation and a stroke neurologist in Toronto. Raising awareness about the link is not to scare people, but to show how controlling blood pressure, not smoking or quitting if you do, eating a balanced diet and being physically active reduce the risk to individuals and could make a difference at a society level, Swartz said. While aging is a common risk factor in stroke and dementia, evidence in Canada and other developed countries shows younger people are also increasingly affected. ©2016 CBC/Radio-Canada.
By Stephen L. Macknik Every few decades there’s a major new neuroscience discovery that changes everything. I’m not talking about your garden variety discovery. Those happen frequently (this is the golden age of neuroscience after all). But no, what I’m talking about are the holy-moly, scales-falling-from-your-eyes, time-to-rewrite-the-textbooks, game-changing discoveries. Well, one was reported just last month—simultaneously by two separate labs—and it redefines the primary organizational principle of the visual system in the cortex of the brain. This may sound technical, but it concerns how we see light and dark, and the perception of contrast. Since all sensation functions at the pleasure of contrast, these new discoveries impact neuroscience and psychology as a whole. I’ll explain below. The old way of thinking about how the wiring of the visual cortex was organized orbited around the concept of visual-edge orientation. David Hubel (my old mentor) and Torsten Wiesel (my current fellow Brooklynite)—who shared the Nobel Prize in Physiology or Medicine in 1981—arguably made the first major breakthrough concerning how information was organized in the cortex versus earlier stages of visual processing. Before their discovery, the retina (and the whole visual system) was thought to be a kind of neural camera that communicated its image into the brain. The optic nerves connect the eyes’ retinas to the thalamus at the center of the brain—and then the thalamus connects to the visual cortex at the back of the brain through a neural information superhighway called the optic radiations. Scientists knew, even way back then, that the neurons that see a given point of the visual scene lie physically next to the neurons that see the neighboring piece of the visual scene. 
The discovery of this so-called retinotopic map in the primary visual cortex (by Talbot and Marshall) was of course important, but because it matched the retinotopic mapping of the retina and thalamus, it didn’t constitute a new way of thinking. It wasn’t a game-changing discovery. © 2016 Scientific American
By BENEDICT CAREY Jerome S. Bruner, whose theories about perception, child development and learning informed education policy for generations and helped launch the modern study of creative problem solving, known as the cognitive revolution, died on Sunday at his home in Manhattan. He was 100. His death was confirmed by his partner, Eleanor M. Fox. Dr. Bruner was a researcher at Harvard in the 1940s when he became impatient with behaviorism, then a widely held theory, which viewed learning in terms of stimulus and response: the chime of a bell before mealtime and salivation, in Ivan Pavlov’s famous dog experiments. Dr. Bruner believed that behaviorism, rooted in animal experiments, ignored many dimensions of human mental experience. In one 1947 experiment, he found that children from low-income households perceived a coin to be larger than it actually was — their desires apparently shaping not only their thinking but also the physical dimensions of what they saw. In subsequent work, he argued that the mind is not a passive learner — not a stimulus-response machine — but an active one, bringing a full complement of motives, instincts and intentions to shape comprehension, as well as perception. His writings — in particular the book “A Study of Thinking” (1956), written with Jacqueline J. Goodnow and George A. Austin — inspired a generation of psychologists and helped break the hold of behaviorism on the field. To build a more complete theory, he and the experimentalist George A. Miller, a Harvard colleague, founded the Center for Cognitive Studies, which supported investigation into the inner workings of human thought. Much later, this shift in focus from behavior to information processing came to be known as the cognitive revolution. © 2016 The New York Times Company
By Rachel Feltman Archerfish are already stars of the animal kingdom for their stunning spit-takes. They shoot high-powered water jets from their mouths to stun prey, making them one of just a few fish species known to use tools. But by training Toxotes chatareus to direct those jets of spit at certain individuals, scientists have shown that the little guys have another impressive skill: They seem to be able to distinguish one human face from another, something never before witnessed in fish and spotted just a few times in non-human animals. The results, published Tuesday in the Nature journal Scientific Reports, could help us understand how humans got so good at telling each other apart. Or how most people got to be good at that, anyway. I'm terrible at it. It's generally accepted that the fusiform gyrus, a brain structure located in the neocortex, allows humans to tell one another apart with a speed and accuracy that other species can't manage. But there's some debate over whether human faces are so innately complex — and that distinguishing them is more difficult than other tricks of memory or pattern recognition — that this region of the brain is a necessary facilitator of the skill that evolved especially for it. Birds, which have been shown to distinguish humans from one another, have the same structure. But some researchers still think that facial recognition might be something that humans learn — it's not an innate skill — and that the fusiform gyrus is just the spot where we happen to process all the necessary information.
Jean Fain When Sandra Aamodt talks about dieting, people listen ... or, they stick their fingers in their ears and go la, la, la. Aamodt's neuroscientific take on why diets backfire is that divisive. Aamodt is a neuroscientist, book author and former editor of a leading brain research journal. She also has become a prominent evangelist of the message that traditional diets just don't work and often leave the dieter worse off than before. And she's an enthusiastic proponent of mindful eating. "I define it as eating with attention and joy, without judgment," Aamodt said in an interview. "That includes attention to hunger and fullness, to the experience of eating and to its effects on our bodies." Even if you've never heard of her, you likely will soon. Her new book, Why Diets Make Us Fat, is bound to change the weight-loss conversation, if not dismantle Biggest Loser-sized dreams. I am a therapist specializing in eating issues, and among my clients, Aamodt has already shifted the focus from weight loss to self-care. Most clients are reluctant to accept her central argument: That our body weight tends to settle at "set points" — that 10- to 15-pound range the brain maintains despite repeated efforts to lower it. However, once they see how the set-point theory reflects their dieting experience, they realize that although they don't have the final say on their weight (their brain does), they do have real influence — through exercise and other health-affirming activities — over their health and well-being. © 2016 npr
By Anahad O'Connor The federal government’s decision to update food labels last month marked a sea change for consumers: For the first time, beginning in 2018, nutrition labels will be required to list a breakdown of both the total sugars and the added sugars in packaged foods. But is sugar really that bad for you? And is the sugar added to foods really more harmful than the sugars found naturally in foods? We spoke with some top scientists who study sugar and its effects on metabolic health to help answer some common questions about sugar. Here’s what they had to say. Why are food labels being revised? The shift came after years of urging by many nutrition experts, who say that excess sugar is a primary cause of obesity and heart disease, the leading killer of Americans. Many in the food industry opposed the emphasis on added sugars, arguing that the focus should be on calories rather than sugar. They say that highlighting added sugar on labels is unscientific, and that the sugar that occurs naturally in foods like fruits and vegetables is essentially no different than the sugar commonly added to packaged foods. But scientists say it is not that simple. So, is added sugar different from the naturally occurring sugar in food? It depends. Most sugars are essentially combinations of two molecules, glucose and fructose, in different ratios. The sugar in a fresh apple, for instance, is generally the same as the table sugar that might be added to homemade apple pie. Both are known technically as sucrose, and they are broken down in the intestine into glucose and fructose. Glucose can be metabolized by any cell in the body. But fructose is handled almost exclusively by the liver. “Once you get to that point, the liver doesn’t know whether it came from fruit or not,” said Kimber Stanhope, a researcher at the University of California, Davis, who studies the effects of sugar on health. © 2016 The New York Times Company
By Virginia Morell Sex is never simple—even among lizards. Unlike mammals, the sex of central bearded dragons, large lizards found in eastern Australia, is determined by their chromosomes and the environment. If the eggs are incubated in high temperatures, male embryos turn into females. Such sex-reversed lizards still retain the chromosomal makeup of a male, but they develop into functional superfemales, whose output of eggs exceeds that of the regular females. Now, a new study predicts that—in some cases—these superfemales may be able to drive regular ones to extinction. That’s because superfemales not only produce more eggs, but they’re also exceptionally bold. Looking at the shape, physiology, and behavior of 20 sex-reversed females, 55 males, and 40 regular females, scientists found that the sex-reversed dragons were physically similar to regular males: They had a male dragon’s long tail and high body temperature. They were also behaviorally similar, acting like bold, active males—even as they produced viable eggs. Indeed, the scientists report in the current issue of the Proceedings of the Royal Society B that these sex-reversed females were behaviorally more malelike than the genetic males. Because of these advantages, this third sex could reproductively outcompete normal females, the scientists say, possibly causing some populations to lose the female sex chromosome. (Females are the heterogametic sex, like human males.) In such a population, the dragons’ sex would then be determined solely by temperature instead of genetics—something that’s occurred in the lab within a single generation. Could it happen in the wild? The scientists are still investigating. © 2016 American Association for the Advancement of Science
By JAMES GORMAN This summer’s science horror blockbuster is a remake: Return of the Leaping Electric Eel! If you have any kind of phobia of slimy, snakelike creatures that can rise from the water and use their bodies like Tasers, this story — and the accompanying video — may not be for you. The original tale (there was, alas, no video) dates to 1800 when the great explorer Alexander von Humboldt was in South America and enlisted local fishermen to catch some of these eels for the new (at the time) study of electricity. He wrote that the men herded horses and mules into a shallow pond and let the eels attack by pressing themselves against the horses. The horses and mules tried to escape, but the fishermen kept them in the water until the eels used up their power. Two horses died, probably from falling and drowning. Or so Humboldt said. Though the story was widely retold, no other report of this kind of fishing-with-horses phenomenon surfaced for more than 200 years, according to Kenneth Catania, a scientist with a passion for studying the eel species in question, Electrophorus electricus. In 2014, he reported on how the eels freeze their prey. They use rapid pulses of more than 600 volts generated by modified muscle cells and sent through the water. These volleys of shocks cause the muscles of prey to tense at once, stopping all movement. The eels’ bodies function like Tasers, Dr. Catania wrote. But they can also project high-voltage pulses in the water in isolated couplets rather than full volleys for a different effect. The pairs of shocks don’t freeze the prey, but cause their bodies to twitch. That movement reveals the prey’s location, and then the eels send out a rapid volley to immobilize then swallow it. Dr. Catania noticed another kind of behavior, however. 
He was using a metal-handled net — wearing rubber gloves — while working with eels in an aquarium, and the eels would fling themselves up the handle of the net, pressing themselves to the metal and generating rapid electric shocks. © 2016 The New York Times Company
By Sarah DeWeerdt, Spectrum Brains from people with autism show patterns of gene expression similar to those from people with schizophrenia, according to a new analysis. The findings, published May 24 in Translational Psychiatry, deepen the connections between the two conditions, says study leader Dan Arking, associate professor of genetic medicine at Johns Hopkins University in Baltimore, Maryland. People who have either autism or schizophrenia share features such as language problems and difficulty understanding other people’s thoughts and feelings. They also have genetic risk factors in common. “And now I think we can show that they share overlap in gene expression,” Arking says. The study builds on previous work, in which Arking’s team characterized gene expression in postmortem brain tissue from 32 individuals with autism and 40 controls. In the new analysis, the researchers made use of that dataset as well as one from the Stanley Medical Research Institute that looked at 31 people with schizophrenia, 25 with bipolar disorder and 26 controls. They found 106 genes expressed at lower levels in autism and schizophrenia brains than in controls. These genes are involved in the development of neurons, especially the formation of the long projections that carry nerve signals and the development of the junctions, or synapses, between one cell and the next. The results are consistent with those from previous studies indicating a role for genes involved in brain development in both conditions. “On the one hand, it’s exciting because it tells us that there’s a lot of overlap,” says Jeremy Willsey, assistant professor of psychiatry at the University of California, San Francisco, who was not involved in the work. “On the other hand, these are fairly general things that are overlapping.” © 2016 Scientific American
By Sandra G. Boodman Richard McGhee and his family believed the worst was behind them. McGhee, a retired case officer at the Defense Intelligence Agency who lives near Annapolis, had spent six months battling leukemia as part of a clinical trial at MD Anderson Cancer Center in Houston. The experimental chemotherapy regimen he was given had worked spectacularly, driving his blood cancer into a complete remission. But less than nine months after his treatment ended, McGhee abruptly fell apart. He became moody, confused and delusional — even childish — a jarring contrast with the even-keeled, highly competent person he had been. He developed tremors in his arms, had trouble walking and became incontinent. “I was really a mess,” he recalled. Doctors suspected he had developed a rapidly progressive and fatal dementia, possibly a particularly aggressive form of Alzheimer’s disease. If that was the case, his family was told, his life span would be measured in months. Luckily, the cause of McGhee’s precipitous decline proved to be much more treatable — and prosaic — than doctors initially feared. “It’s really a pleasure to see somebody get better so rapidly,” said Michael A. Williams, a professor of neurology and neurosurgery at the University of Washington School of Medicine in Seattle. Until recently, Williams was affiliated with Baltimore’s Sinai Hospital, where he treated McGhee in 2010. “This was a diagnosis waiting to be found.”
By Clare Wilson We’ve all been there: after a tough mental slog your brain feels as knackered as your body does after a hard workout. Now we may have pinpointed one of the brain regions worn out by a mentally taxing day – and it seems to also affect our willpower, so perhaps we should avoid making important decisions when mentally fatigued. Several previous studies have suggested that our willpower is a finite resource, and if it gets depleted in one way – like finishing a difficult task – we find it harder to make other good choices, like resisting a slice of cake. In a small trial, Bastien Blain at INSERM in Paris and his colleagues asked volunteers to spend six hours doing tricky memory tasks, while periodically choosing either a small sum of cash now, or a larger amount after a delay. As the day progressed, people became more likely to act on impulse and to pick an immediate reward. This didn’t happen in the groups that spent time doing easier memory tasks, reading or gaming. For those engaged in difficult work, fMRI brain scans showed a decrease in activity in the middle frontal gyrus, a brain area involved in decision-making. “That suggests this region is becoming less excitable, which could be impairing people’s ability to resist temptation,” says Blain. It’s involved in decisions like ‘Shall I have a beer with my friends tonight, or shall I save money to buy a bike next month,’ he says. Previous research has shown that children with more willpower in a similar type of choice test involving marshmallows end up as more successful adults, by some measures. “Better impulse control predicts your eventual wealth and health,” says Blain. The idea that willpower can be depleted is contentious as some researchers have failed to replicate others’ findings. © Copyright Reed Business Information Ltd.