Chapter 14. Attention and Consciousness
By Paul Oswell

“Cool” is a bit of a moving target. Sixty years ago it was James Dean, nonchalantly smoking a cigarette as he sat on a motorbike, glaring down 1950s conformity with brooding disapproval. Five years ago it was Zooey Deschanel holding a cupcake. In a phone interview with Steve Quartz, the co-author of the recently published Cool: How the Brain’s Hidden Quest for Cool Drives Our Economy and Shapes Our World, we skirted around a working definition. Defining cool turns out to be tricky even for someone who has just written an entire book examining the neurological processes behind it. Quartz’s most succinct definition was that cool is “the sweet spot between being innovative and unconventional, but not weird”.

Quartz is the director of the Social Cognitive Neuroscience Laboratory at the California Institute of Technology. So when asked to describe what the lab does, he did not deliver a “cool” answer, but rather a precise one: it is, he said, “concerned with all the dimensions of decision making, from simple gambles and risk assessment right up to very complex reasoning and the nature of moral behaviour”. He wrote the book with his colleague Anette Asp, with whom he has long done research on “neuroeconomics” and “neuromarketing”. Those fields use imaging techniques to look at the ways our brains process the emotions and responses we have to brands and products. The results, as Quartz and Asp posit in the book, reflect primal instincts we have around ideas of status. Their technique gives a much more accurate picture of what the kids are into these days than traditional marketing focus groups have ever been able to provide.

© 2015 Guardian News and Media Limited
By Helen Thomson

Giving people the illusion of teleporting around a room has revealed how the brain constructs our sense of self. The findings may aid treatments for schizophrenia and asomatognosia – a rare condition characterised by a lack of awareness of a part of one's body.

As we go about our daily lives, we experience our body as a physical entity with a specific location. For instance, when you sit at a desk you are aware of your body and its rough position with respect to objects around you. These experiences are thought to form a fundamental aspect of self-consciousness. Arvid Guterstam, a neuroscientist at the Karolinska Institute in Stockholm, Sweden, and his colleagues wondered how the brain produces these experiences.

To find out, Guterstam's team had 15 people lie in an fMRI brain scanner while wearing a head-mounted display. This was connected to a camera on a dummy body lying elsewhere in the room, enabling the participants to see the room – and themselves inside the scanner – from the dummy's perspective. A member of the team then stroked the participant's body and the dummy's body at the same time. This induced the out-of-body experience of owning the dummy body and being at its location. The experiment was repeated with the dummy body positioned in different parts of the room, allowing the person to be perceptually teleported between the different locations, says Guterstam. All that was needed to break the illusion was to touch the participant's and the dummy's bodies at different times.

© Copyright Reed Business Information Ltd.
By Monya Baker

An ambitious effort to replicate 100 research findings in psychology ended last week — and the data look worrying. Results posted online on 24 April, which have not yet been peer-reviewed, suggest that key findings from only 39 of the published studies could be reproduced. But the situation is more nuanced than the top-line numbers suggest. Of the 61 non-replicated studies, scientists classed 24 as producing findings at least “moderately similar” to those of the original experiments, even though they did not meet pre-established criteria, such as statistical significance, that would count as a successful replication.

The results should convince everyone that psychology has a replicability problem, says Hal Pashler, a cognitive psychologist at the University of California, San Diego, and an author of one of the papers whose findings were successfully repeated. “A lot of working scientists assume that if it’s published, it’s right,” he says. “This makes it hard to dismiss that there are still a lot of false positives in the literature.”

But Daniele Fanelli, who studies bias and scientific misconduct at Stanford University in California, says the results suggest that the reproducibility of findings in psychology does not necessarily lag behind that in other sciences. There is plenty of room for improvement, he adds, but earlier studies have suggested that reproducibility rates in cancer biology and drug discovery could be even lower [1, 2]. “From my expectations, these are not bad at all,” Fanelli says. “Though I have spoken to psychologists who are quite disappointed.”

© 2015 Nature Publishing Group
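The headline figures above can be recomputed directly from the counts in this excerpt (100 studies, 39 successful replications, 24 of the 61 failures judged at least “moderately similar”). A minimal Python sketch; the counts come from the article, the variable names are ours:

```python
# Recompute the headline replication rates from the counts quoted above.
total_studies = 100
replicated = 39          # met the pre-established replication criteria
moderately_similar = 24  # non-replications judged "moderately similar"

non_replicated = total_studies - replicated            # 61
strict_rate = replicated / total_studies               # criteria met
lenient_rate = (replicated + moderately_similar) / total_studies

print(f"strict replication rate:  {strict_rate:.0%}")   # 39%
print(f"lenient replication rate: {lenient_rate:.0%}")  # 63%
```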
By Jeffrey Ely, Alexander Frankel and Emir Kamenica

Imagine the following situation. After a grueling day at work, you plop down in front of your TV, ready to relax. Your TiVo has recorded all of the day’s March Madness games. You’ve sequestered yourself away from any news about who won or lost. Which game to watch? Suddenly, your spouse pops in and tells you to stay away from Villanova versus Lafayette, which was a blowout, and to watch Baylor versus Georgia State, a nail-biter.

Is this recommendation appreciated? Hardly. Baylor versus Georgia State was exciting because the unexpected happened: It was a back-and-forth affair in which Georgia State, the underdog, clinched the upset only in the final moments. But if you know in advance that it’s a nail-biter, you will expect the unexpected, ruining the surprise. It’s a lesson that the filmmaker M. Night Shyamalan, for one, seems to have missed. Once it’s common knowledge that your movie will have a dramatic, unexpected plot twist at the end, then your movie no longer has a dramatic, unexpected plot twist at the end. To be thrilling, you must occasionally be boring.

This is one of several lessons that came out of our recent study of drama-based entertainment using the tools of information economics — the results of which were published in the Journal of Political Economy in February. When we recognize that the capacity to surprise an audience is a scarce resource (“You can’t fool all of the people all of the time”), it becomes natural to use economic theory to optimize that resource.
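One way to make "surprise as a scarce resource" concrete (a loose illustration in the spirit of the authors' information-economics framing, not their exact model) is to track a viewer's belief that one side will win and measure surprise as how far each development moves that belief. With made-up probability paths, a nail-biter accumulates far more belief movement than a blowout:

```python
# Toy illustration only, not the published model. Beliefs are the viewer's
# probability that the favourite wins; "surprise" is how far each new
# development moves that belief. Both probability paths are invented.

def total_surprise(win_probs):
    """Sum of absolute belief movements across a game."""
    return sum(abs(b - a) for a, b in zip(win_probs, win_probs[1:]))

blowout    = [0.50, 0.70, 0.85, 0.95, 0.99, 1.00]  # favourite pulls away early
nail_biter = [0.50, 0.30, 0.65, 0.40, 0.75, 0.00]  # underdog clinches at the end

print(f"blowout total surprise:    {total_surprise(blowout):.2f}")     # 0.50
print(f"nail-biter total surprise: {total_surprise(nail_biter):.2f}")  # 1.90
```

On this toy measure, learning in advance that the game is a nail-biter effectively flattens the belief path, which is why the spoiler drains the drama.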
By Amy Coats

Those split-second decisions, made almost without thinking. When to put your foot on the pedal when you’re at the red light. When to check how those sausages are doing. Remembering to grab your lunch from the fridge seconds before you leave the house. Or – too often – 20 minutes after. And those carefully considered ones. Do I just finish this paragraph before I make a cup of tea? Or do I wait until the boss is clear of the kitchen?

Timing – that is, our perception and estimation of time – is key in determining how we behave and in the decisions we make. New findings suggest that time in the brain is relative, not absolute. This means that your brain ‘encodes’ your sense of time depending on what happens to you, and not by the second, minute or hour. And this in turn determines how you behave.

Alas, you could be forgiven for feeling that the units of time common to everyone worldwide, except perhaps the odd Amazonian tribe, are pretty well ingrained. My partner and I will often make a quick bet on what time it is before we check our phone (all sigh!/rejoice! [delete as appropriate], the dwindling watch-less generation). And we’re both pretty good at getting to within 5 or 10 minutes, even if we haven’t known the exact time all day. He’s normally better at it, perhaps because he’s male? Perhaps it tends to fly/drag for me because I’m having more/less fun? Perhaps that’s another story.

In the 2004 reality TV show Shattered, contestants who had been sleep-deprived for over 140 hours went head-to-head to predict when an arbitrary amount of time had passed – in this case, one minute and seven seconds. With the pressure of £100,000 prize money at stake, Dermot O’Leary grimacing nearby, a studio audience rustling in the darkness, and no cues except their ‘inner clock’, contestants were almost unbelievably close. The loser, Jonathan, was 0.4 seconds out, while Jimmy, the winner, was just one tenth of a second out.

© 2015 Guardian News and Media Limited
Julian Baggini is that happy thing – a philosopher who recognises that readers go glassy-eyed if presented with high-octane philosophical discourse. And yet, as his latest book, Freedom Regained: The Possibility of Free Will, makes clear, it is in all our interests to consider crucial aspects of what it means to be human. Indeed, in this increasingly complex world, maybe more so than ever.

Freedom is one of the great, emotive political watchwords. The emancipation of slaves and women has inspired political movements on a grand scale. But, latterly, the concept of freedom has defected from the public realm to the personal. How responsible are we as individuals for the actions we take? To what degree are we truly autonomous agents? The argument that environmental circumstances are crucial determinants of our actions – the “Officer Krupke” argument (from the West Side Story song: “Gee, Officer Krupke, we’re very upset/We never had the love that every child ought to get”) – has for some time carried weight, not least in the defence of violent crime. Defective genes are also a common part of the artillery in the argument against the possibility of free choice. Excessive testosterone and low resting heart rates, for example, both statistically bias a person towards violence.

And now neuroscience brings us the unnerving news that while even the most sane, genetically well endowed and law-abiding of us believe we make free choices, the evidence of brain scans suggests otherwise. Neuroscience reveals the seemingly novel fact that “we are not the authors of our thoughts and actions in the way people generally suppose”. I say “seemingly novel”, for it is no news that many of our apparently willed choices have unconscious determinants, which are at variance with our known wishes and desires. The whole of psychoanalysis is predicated on that principle but, as anyone who can drive a car will attest, often routine physical actions take their source from an internalised history rather than any conscious decision-making. The neural finding that has made waves, however, is that scans indicate the brain’s chemistry consistently determines a decision prior to our consciously making that decision. So when I deliberate over a menu and finally choose a mushroom risotto over a rare steak, my brain has anticipated this before I am aware of my choice.

© 2015 Guardian News and Media Limited
By Rebecca Harrington

Kraft Macaroni & Cheese—that favorite food of kids, packaged in the nostalgic blue box—will soon be free of yellow dye. Kraft announced Monday that it will remove artificial food coloring, notably Yellow No. 5 and Yellow No. 6 dyes, from its iconic product by January 2016. Instead, the pasta will maintain its bright yellow color by using natural ingredients: paprika, turmeric and annatto (the latter of which is derived from achiote tree seeds).

The company said it decided to pull the dyes in response to growing consumer pressure for more natural foods. But claims that the dyes may be linked to attention-deficit hyperactivity disorder (ADHD) in children have also risen recently, as they did years ago, putting food dyes under sharp focus once again. On its Web site Kraft says synthetic colors are not harmful, and that its motivation for removing them is that consumers want more foods with no artificial colors.

The U.S. Food and Drug Administration maintains artificial food dyes are safe, but some research studies have found the dyes can contribute to hyperactive behavior in children. Food dyes have been controversial since pediatrician Benjamin Feingold published findings in the 1970s that suggested a link between artificial colors and hyperactive behavior, but scientists, consumers and the government have not yet reached a consensus on the extent of this risk or the correct path to address it. After a 2007 study in the U.K. showed that artificial colors and/or the common preservative sodium benzoate increased hyperactivity in children, the European Union started requiring food labels indicating that a product contains any one of six dyes that had been investigated. The label states the product "may have an adverse effect on activity and attention in children."

© 2015 Scientific American
By Katie Collins

Sarah-Jayne Blakemore is just as fascinated by the links between neuroscience and education as she is outraged by the pseudoscience that often intrudes upon this territory. Neuroscience in education has really been flourishing in recent years, she says on stage at WIRED Health 2015, but some theories about neuroscience have already infiltrated schools, and not necessarily in a good way. Some products that claim to have a positive effect on cognition may well produce improvements in the classroom, but at the same time they promote completely inaccurate science.

Blakemore points specifically to the Brain Gym educational model, which claims to improve memory, concentration and information retention. There are no problems with the exercises themselves, she says, but the claims made about the brain are baseless. For a start, she said, Brain Gym claims that children can push "brain buttons" on their bodies that will stimulate blood flow to the brain. Another physical exercise is claimed to increase and improve connectivity between the two sides of the brain. "This makes no sense -- they are in communication anyway," says Blakemore.

Teachers like Brain Gym because it does what it says and results in improvements in the classroom, but it could just as easily be placebo or novelty causing the effects. One thing Blakemore is sure of? "They're nothing to do with brain buttons or coordinating the two brain hemispheres."
By Jerry Adler, Smithsonian Magazine

In London, Benjamin Franklin once opened a bottle of fortified wine from Virginia and poured out, along with the refreshment, three drowned flies, two of which revived after a few hours and flew away. Ever the visionary, he wondered about the possibility of incarcerating himself in a wine barrel for future resurrection, “to see and observe the state of America a hundred years hence.” Alas, he wrote to a friend in 1773, “we live in an age too early . . . to see such an art brought in our time to its perfection.”

If Franklin were alive today he would find a kindred spirit in Ken Hayworth, a neuroscientist who also wants to be around in 100 years but recognizes that, at 43, he’s not likely to make it on his own. Nor does he expect to get there preserved in alcohol or a freezer; despite the claims made by advocates of cryonics, he says, the ability to revivify a frozen body “isn’t really on the horizon.” So Hayworth is hoping for what he considers the next best thing. He wishes to upload his mind—his memories, skills and personality—to a computer that can be programmed to emulate the processes of his brain, making him, or a simulacrum, effectively immortal (as long as someone keeps the power on).

Hayworth’s dream, which he is pursuing as president of the Brain Preservation Foundation, is one version of the “technological singularity.” It envisions a future of “substrate-independent minds,” in which human and machine consciousness will merge, transcending biological limits of time, space and memory. “This new substrate won’t be dependent on an oxygen atmosphere,” says Randal Koene, who works on the same problem at his organization, Carboncopies.org. “It can go on a journey of 1,000 years, it can process more information at a higher speed, it can see in the X-ray spectrum if we build it that way.”
By Tara Haelle

When it comes to treating attention-deficit hyperactivity disorder (ADHD), a lot of kids are getting the meds they need—but they may be missing out on other treatments. Despite clinical guidelines urging that behavioral therapy always be used alongside medication, less than half of the children with ADHD received therapy as part of treatment in 2009 and 2010, according to the first nationally representative study of ADHD treatment in U.S. children.

The findings, published online March 31 in The Journal of Pediatrics, come from data collected during that period on 9,459 children, aged four to 17, with diagnosed ADHD—just before the American Academy of Pediatrics (AAP) issued its clinical practice guidelines on treatments of the condition in 2011. They provide a baseline for comparison when the next report is issued in 2017. Medication alone was the most common treatment for children with ADHD: 74 percent had taken medication in the previous week whereas 44 percent had received behavioral therapy in the past year. Just under a third of children of all ages had received both medication and behavioral therapy, the AAP-recommended treatment for all ages.

“It’s not at all surprising that medication is the most common treatment,” says Heidi Feldman, a professor of developmental and behavioral pediatrics at Stanford University School of Medicine who served on the AAP clinical practice guidelines committee. “It works very effectively to reduce the core symptoms of the condition,” she adds, “and stimulants are relatively safe if used properly. The limitation of stimulant medications for ADHD is that studies do not show a long-term functional benefit from medication use.”

© 2015 Scientific American
By Alan Schwarz

Fading fast at 11 p.m., Elizabeth texted her dealer and waited just 30 minutes for him to reach her third-floor New York apartment. She handed him a wad of twenties and fifties, received a tattered envelope of pills, and returned to her computer. Her PowerPoint needed another four hours. Investors in her health-technology start-up wanted re-crunched numbers, a presentation begged for bullet points and emails from global developers would keep arriving well past midnight.

She gulped down one pill — pale orange, like baby aspirin — and then, reconsidering, took one of the pinks, too. “O.K., now I can work,” Elizabeth exhaled. Several minutes later, she felt her brain snap to attention. She pushed her glasses up her nose and churned until 7 a.m. Only then did she sleep for 90 minutes, before arriving at her office at 9.

The pills were versions of the drug Adderall, an amphetamine-based stimulant prescribed for attention deficit hyperactivity disorder that many college students have long used illicitly while studying. Now, experts say, stimulant abuse is graduating into the work force. In interviews, dozens of people in a wide spectrum of professions said they and co-workers misused stimulants like Adderall, Vyvanse and Concerta to improve work performance. Most spoke on the condition of anonymity for fear of losing their jobs or access to the medication.

Doctors and medical ethicists expressed concern for misusers’ health, as stimulants can cause anxiety, addiction and hallucinations when taken in high doses. But they also worried about added pressure in the workplace — where use by some pressures others to join the trend.

© 2015 The New York Times Company
By Anil Ananthaswamy

Hold that thought. When it comes to consciousness, the brain may be doing just that. It now seems that conscious perception requires brain activity to hold steady for hundreds of milliseconds. This signature in the pattern of brainwaves can be used to distinguish between levels of impaired consciousness in people with brain injury.

The new study by Aaron Schurger at the Swiss Federal Institute of Technology in Lausanne doesn't explain the so-called "hard problem of consciousness" – how roughly a kilogram of nerve cells is responsible for the miasma of sensations, thoughts and emotions that make up our mental experience. However, it does chip away at it, and support the idea that it may one day be explained in terms of how the brain processes information.

Neuroscientists think that consciousness requires neurons to fire in such a way that they produce a stable pattern of brain activity. The exact pattern will depend on what the sensory information is, but once information has been processed, the idea is that the brain should hold a pattern steady for a short period of time – almost as if it needs a moment to read out the information.

In 2009, Schurger tested this theory by scanning 12 people's brains with fMRI machines. The volunteers were shown two images simultaneously, one for each eye. One eye saw a red-on-green line drawing and the other eye saw green-on-red. This confusion caused the volunteers to sometimes consciously perceive the drawing and sometimes not.

© Copyright Reed Business Information Ltd.
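A rough way to operationalise "holding a pattern steady" (our sketch, not the study's actual analysis pipeline) is to correlate the activity pattern at each time point with the pattern a short lag later; steady epochs show up as correlations near 1. A synthetic Python/NumPy demo:

```python
import numpy as np

def pattern_stability(activity, lag=1):
    """Correlate each time point's activity pattern with the pattern
    `lag` steps later. `activity` is (timepoints x channels); values
    near 1 mean the pattern is holding steady."""
    T = activity.shape[0]
    return np.array([np.corrcoef(activity[t], activity[t + lag])[0, 1]
                     for t in range(T - lag)])

# Synthetic demo: random activity with one "held" pattern in the middle.
rng = np.random.default_rng(0)
activity = rng.normal(size=(40, 200))
activity[15:25] = activity[15] + 0.1 * rng.normal(size=(10, 200))  # steady epoch

print(np.round(pattern_stability(activity), 2))
# Correlations hover near 0 except during the steady epoch,
# where they jump to roughly 0.99.
```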
By Shereen Lehman

(Reuters Health) – Children exposed to tobacco smoke at home are up to three times more likely to have attention-deficit hyperactivity disorder (ADHD) than unexposed kids, according to a new study from Spain. The association was stronger for kids with one or more hours of secondhand smoke exposure every day, the authors found. And the results held when researchers accounted for parents' mental health and other factors.

"We showed a significant and substantial dose-response association between (secondhand smoke) exposure in the home and a higher frequency of global mental problems," the authors write in Tobacco Control, online March 25. According to the Centers for Disease Control and Prevention, two of every five children in the US are exposed to secondhand smoke regularly.

Alicia Padron of the University of Miami Miller School of Medicine in Florida and colleagues in Spain analyzed data from the 2011 to 2012 Spanish National Health Interview Survey, in which parents of 2,357 children ages four to 12 reported the amount of time their children were exposed to secondhand smoke every day. The parents also filled out questionnaires designed to evaluate their children's mental health. According to the results, about 8 percent of the kids had a probable mental disorder. About 7 percent of the kids were exposed to secondhand smoke for less than one hour per day, and 4.5 percent were exposed for an hour or more each day.

© 2015 Scientific American
By Lawrence Berger

A cognitive scientist and a German philosopher walk into the woods and come upon a tree in bloom: What does each one see? And why does it matter? While that may sound like the set-up to a joke making the rounds at a philosophy conference, I pose it here sincerely, as a way to explore the implications of two distinct strains of thought — that of cognitive science and that of phenomenology, in particular, the thought of Martin Heidegger, who offers a most compelling vision of the ultimate significance of our being here, and what it means to be fully human.

When we feel that someone is really listening to us, we feel more alive, we feel our true selves coming to the surface — this is the sense in which worldly presence matters.

It can be argued that cognitive scientists tend to ignore the importance of what many consider to be essential features of human existence, preferring to see us as information processors rather than full-blooded human beings immersed in worlds of significance. In general, their intent is to explain human activity and life as we experience it on the basis of physical and physiological processes, the implicit assumption being that this is the domain of what is ultimately real. Since virtually everything that matters to us as human beings can be traced back to life as it is experienced, such thinking is bound to be unsettling. For instance, an article in The Times last year by Michael S. A. Graziano, a professor of psychology and neuroscience at Princeton, about whether we humans are “really conscious,” argued, among other things, that “we don’t actually have inner feelings in the way most of us think we do.”

© 2015 The New York Times Company
By Michael Slezak

What were we talking about? Oh yes, brain-training programmes may be useful for helping inattentive people focus on tasks in their daily life. At least, that's the implication of an analysis looking at one particular programme. It's the latest salvo in a field where battle lines have been drawn between those who believe there is no compelling scientific evidence that training the brain to do a specific task better can offer wider cognitive improvements, and those who think it can work in some cases.

The party line is that brain training improves only that which it exercises, says Jared Horvath from the University of Melbourne in Australia. "What this means is, if the training programme uses a working memory game, you get better at working memory games and little else." But an analysis by Megan Spencer-Smith of Monash University in Melbourne, Australia, and Torkel Klingberg at the Karolinska Institute in Stockholm, Sweden, claims to show that there are benefits for daily life – at least for people with attention deficit hyperactivity disorder or other problems related to attentiveness.

They focused on a programme called Cogmed, which Klingberg helped develop, and combined the results of several smaller studies. Cogmed is designed to improve how much verbal or visual information you can temporarily remember and work with.

© Copyright Reed Business Information Ltd.
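Combining several smaller studies, as this analysis did, is conventionally done by weighting each study's effect size by its precision. A minimal fixed-effect (inverse-variance) sketch in Python; the effect sizes and standard errors below are invented for illustration, not taken from the Cogmed analysis:

```python
import math

# Each study contributes (effect size, standard error); the numbers
# below are invented for illustration.
studies = [
    (0.45, 0.20),
    (0.30, 0.15),
    (0.60, 0.25),
]

weights = [1 / se ** 2 for _, se in studies]  # inverse-variance weights
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"pooled effect: {pooled:.2f} (SE {pooled_se:.2f})")  # ~0.40 (SE 0.11)
```

Precise studies dominate the pooled estimate, which is why a meta-analysis can reach a firmer conclusion than any of its small constituent trials.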
By Nicholas Weiler

Where did the thief go? You might get a more accurate answer if you ask the question in German. How did she get away? Now you might want to switch to English. Speakers of the two languages put different emphasis on actions and their consequences, influencing the way they think about the world, according to a new study. The work also finds that bilinguals may get the best of both worldviews, as their thinking can be more flexible.

Cognitive scientists have debated whether your native language shapes how you think since the 1940s. The idea has seen a revival in recent decades, as a growing number of studies suggested that language can prompt speakers to pay attention to certain features of the world. Russian speakers are faster to distinguish shades of blue than English speakers, for example. And Japanese speakers tend to group objects by material rather than shape, whereas Koreans focus on how tightly objects fit together. Still, skeptics argue that such results are laboratory artifacts, or at best reflect cultural differences between speakers that are unrelated to language.

In the new study, researchers turned to people who speak multiple languages. By studying bilinguals, “we’re taking that classic debate and turning it on its head,” says psycholinguist Panos Athanasopoulos of Lancaster University in the United Kingdom. Rather than ask whether speakers of different languages have different minds, he says, “we ask, ‘Can two different minds exist within one person?’ ” Athanasopoulos and colleagues were interested in a particular difference in how English and German speakers treat events.

© 2015 American Association for the Advancement of Science
By Brian Owens

Our choice between two moral options might be swayed by tracking our gaze, and asking for a decision at the right moment. People asked to choose between two written moral statements tend to glance more often towards the option they favour, experimental psychologists say. More surprisingly, the scientists also claim it’s possible to influence a moral choice: asking for an immediate decision as soon as someone happens to gaze at one statement primes them to choose that option.

It’s well known that people tend to look more towards the option they are going to choose when they are choosing food from a menu, says Philip Pärnamets, a cognitive scientist from Lund University in Sweden. He wanted to see if that applied to moral reasoning as well. “Moral decisions have long been considered separately from general decision-making,” he says. “I wanted to integrate them.”

In a paper published today in the Proceedings of the National Academy of Sciences [1], Pärnamets and his colleagues explain how they presented volunteers with a series of moral statements, such as 'murder is sometimes justified,' 'masturbating with the aid of a willing animal is acceptable' and 'paying taxes is a good thing.' Then the psychologists tracked the volunteers’ gaze as two options appeared on a screen. Once the tracker had determined that a person had spent at least 750 milliseconds looking at one answer and 250 milliseconds at the other, the screen changed to prompt them to make a decision. Almost 60% of the time, they chose the most viewed option — indicating, says Pärnamets, that eye gaze tracks an unfolding moral decision.

© 2015 Nature Publishing Group
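The gaze-contingent rule described above (at least 750 milliseconds of dwell on one option and 250 on the other before the prompt fires) is simple to sketch. This is our reconstruction of the logic as reported, not the researchers' code; the 50-millisecond sampling interval is an assumption:

```python
def should_prompt(dwell_ms, primary=750, secondary=250):
    """Return the option to prompt on, or None.

    dwell_ms maps the two option labels to accumulated gaze time (ms).
    Fires once one option has at least 750 ms of dwell and the other
    at least 250 ms, per the rule described in the article."""
    (opt_a, t_a), (opt_b, t_b) = dwell_ms.items()
    if t_a >= primary and t_b >= secondary:
        return opt_a
    if t_b >= primary and t_a >= secondary:
        return opt_b
    return None

# Simulated gaze samples arriving every 50 ms (our assumed sampling rate).
dwell = {"agree": 0, "disagree": 0}
for fixation in ["agree"] * 12 + ["disagree"] * 6 + ["agree"] * 4:
    dwell[fixation] += 50
    target = should_prompt(dwell)
    if target:
        print(f"prompt now; most-viewed option: {target}")
        break
```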
By Esther Landhuis

As we age, we seem to get worse at ignoring irrelevant stimuli. It's what makes restaurant conversations challenging—having to converse while also shutting out surrounding chatter. New research bears out the aging brain's distractibility but also suggests that training may help us tune out interference.

Scientists at Brown University recruited seniors and twentysomethings for a visual experiment. Presented with a sequence of letters and numbers, participants were asked to report back only the numbers—all the while disregarding a series of meaningless dots. Sometimes the dots moved randomly, but other times they traveled in a clear direction, making them harder to ignore. Older participants ended up accidentally learning the dots' patterns, based on the accuracy of their answers when asked which way the dots were moving, whereas young adults seemed able to suppress that information and focus on the numbers, the researchers reported last November in Current Biology.

In a separate study published in Neuron, scientists at the University of California, San Francisco, showed they could train aging brains to become less distractible. Their regimen helped aging rats as well as older people. The researchers played three different sounds and rewarded trainees for identifying a target tone while ignoring distracter frequencies. As the subjects improved, the task grew more challenging—the distracting tone became harder to discriminate from the target.

© 2015 Scientific American
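The regimen described, where the distractor tone becomes harder to discriminate as performance improves, is a classic adaptive staircase. A minimal sketch; the step size, starting gap and 1-up/1-down rule are our assumptions, not the study's parameters:

```python
def update_gap(gap_hz, correct, step_hz=10, min_gap_hz=5):
    """Simple 1-up/1-down staircase: shrink the target/distractor
    frequency gap after a correct answer, widen it after an error.
    All parameter values here are illustrative assumptions."""
    if correct:
        return max(min_gap_hz, gap_hz - step_hz)
    return gap_hz + step_hz

gap = 100  # initial gap between target and distractor tones, in Hz
for correct in [True, True, True, False, True, True]:
    gap = update_gap(gap, correct)
    print(f"next trial gap: {gap} Hz")  # 90, 80, 70, 80, 70, 60
```

A staircase like this keeps each trainee near their own discrimination threshold, which is what makes the task stay challenging as performance improves.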
By Robin Tricoles

The first time it happened, I was 8. I was tucked in bed reading my favorite book when my tongue swelled up to the size of a cow’s, like the giant tongues I had seen in the glass display case at the neighborhood deli. At the same time, the far wall of my bedroom began to recede, becoming a tiny white rectangle floating somewhere in the distance. In the book I was holding, the typeface grew vast on the page. I was intrigued, I remember, but not afraid. Over the next six years, the same thing happened to me dozens of times.

Forty years later, while working as a science writer, I stumbled on a scientific paper describing almost exactly what I had experienced. The paper attributed those otherworldly sensations to something called Alice in Wonderland syndrome, or its close cousin, Alice in Wonderland-like syndrome. People with Alice in Wonderland syndrome (AWS) perceive parts of their body to be changing size. For example, their feet may suddenly appear smaller and more distant, or their hands larger than they had been moments before. Those with the closely related Alice in Wonderland-like syndrome (AWLS) misperceive the size and distance of objects, seeing them as startlingly larger, smaller, fatter, or thinner than their natural state. People who experience both sensations, like I did, are classified as having AWLS.

The syndrome’s name is commonly attributed to English psychiatrist John Todd, who in 1955 described his adult patients’ illusions of corporeal and objective distortions in a paper in the Canadian Medical Association Journal.

© 2015 by The Atlantic Monthly Group.
By Timothy Williams

In January 1972, Cecil Clayton was cutting wood at his family’s sawmill in southeastern Missouri when a piece of lumber flew off the circular saw blade and struck him in the forehead. The impact caved in part of Mr. Clayton’s skull, driving bone fragments into his brain. Doctors saved his life, but in doing so had to remove 20 percent of his frontal lobe, which psychiatrists say led Mr. Clayton to be tormented for years by violent impulses, schizophrenia and extreme paranoia. In 1996, his lawyers say, those impulses drove Mr. Clayton to kill a law enforcement officer.

Today, as Mr. Clayton, 74, sits on death row, his lawyers have returned to that 1972 sawmill accident in a last-ditch effort to save his life, arguing that Missouri’s death penalty law prohibits the execution of severely brain-damaged people. Lawyers for Mr. Clayton, who has an I.Q. of 71, say he should be spared because his injury has made it impossible for him to grasp the significance of his death sentence, scheduled to be carried out on March 17. “There was a profound change in him that he doesn’t understand, and neither did his family,” said Elizabeth Unger Carlyle, one of Mr. Clayton’s lawyers.

While several rulings by the United States Supreme Court in recent years have narrowed the criteria for executing people who have a mental illness, states continue to hold wide sway in establishing who is mentally ill. The debate surrounding Mr. Clayton involves just how profoundly his impairment has affected his ability to understand what is happening to him. Mr. Clayton is missing about 7.7 percent of his brain.

© 2015 The New York Times Company