Chapter 14. Attention and Consciousness
By Christian Jarrett Back in the 1980s, the American scientist Benjamin Libet made a surprising discovery that appeared to rock the foundations of what it means to be human. He recorded people’s brain waves as they made spontaneous finger movements while looking at a clock, with the participants telling researchers the time at which they decided to waggle their fingers. Libet’s revolutionary finding was that these conscious decisions were consistently preceded by several hundred milliseconds of background preparatory brain activity (known technically as “the readiness potential”). The implication was that the decision to move was made nonconsciously, and that the subjective feeling of having made this decision is tagged on afterward. In other words, the results implied that free will as we know it is an illusion — after all, how can our conscious decisions be truly free if they come after the brain has already started preparing for them? For years, various research teams have tried to pick holes in Libet’s original research. It’s been pointed out, for example, that it’s pretty tricky for people to accurately report the time that they made their conscious decision. But, until recently, the broad implications of the finding have weathered these criticisms, at least in the eyes of many hard-nosed neuroscientists, and over the last decade or so his basic result has been replicated and built upon with ever more advanced methods such as fMRI and the direct recording of neuronal activity using implanted electrodes. © 2016, New York Media LLC
Link ID: 21859 - Posted: 02.04.2016
By Anna K. Bobak, Sarah Bate For years scientists have studied the biological basis of human speed, and reported that the fastest athletes are short and muscular in build. However, these conclusions were challenged in 2008 when a new athlete, substantially taller than previous world-record holders, was identified as the fastest man in history. Usain Bolt presented the purest expression of human speed on the planet – and raised the possibility that scientists may need to entirely change the way they think about human biometrics. In the same vein, one might ask whether examinations of the brain at its height of efficiency will present new insights into its workings. Although researchers have historically examined people with a very high IQ (i.e. those with more generalised skills), it has become increasingly clear that some individuals only perform extraordinarily well on specific cognitive tasks. Among the most interesting of these is facial identity recognition. In fact, the extraordinary skills of these so-called “super-recognisers” do not seem to correlate with IQ or memory for objects, yet they claim to recognise faces they have seen only briefly before, or that have undergone substantial changes in appearance. For instance, in a recent scientific report from our laboratory (unpublished), one super-recogniser described bumping into a girl from a children’s swimming class he coached as a teenager. He recognised her immediately, despite the fact that he’d not seen her for over ten years and she was now an adult. So how can these people change the way that scientists think about the human brain? For many years researchers have generally agreed that faces are “special.” © 2016 Scientific American
Link ID: 21853 - Posted: 02.03.2016
By David Shultz Is my yellow the same as your yellow? Does your pain feel like my pain? The question of whether human consciousness is subjective or objective is largely philosophical. But the line between consciousness and unconsciousness is a bit easier to measure. In a new study of how anesthetic drugs affect the brain, researchers suggest that our experience of reality is the product of a delicate balance of connectivity between neurons—too much or too little and consciousness slips away. “It’s a very nice study,” says neuroscientist Melanie Boly at the University of Wisconsin, Madison, who was not involved in the work. “The conclusions that they draw are justified.” Previous studies of the brain have revealed the importance of “cortical integration” in maintaining consciousness, meaning that the brain must process and combine multiple inputs from different senses at once. Our experience of an orange, for example, is made up of sight, smell, taste, touch, and the recollection of our previous experiences with the fruit. The brain merges all of these inputs—photons, aromatic molecules, etc.—into our subjective experience of the object in that moment. “There is new meaning created by the interaction of things,” says Enzo Tagliazucchi, a physicist at the Institute for Medical Psychology in Kiel, Germany. Consciousness ascribes meaning to the pattern of photons hitting your retina, thus differentiating you from a digital camera. Although the brain still receives these data when we lose consciousness, no coherent sense of reality can be assembled. © 2016 American Association for the Advancement of Science.
Link ID: 21830 - Posted: 01.27.2016
Timothy Egan This weekend, I’m going to the Mojave Desert, deep into an arid wilderness of a half-million acres, for some stargazing, bouldering and January sunshine on my public lands. I won’t be out of contact. I checked. If Sarah Palin says something stupid on Donald Trump’s behalf — scratch that. When Sarah Palin says something stupid on Donald Trump’s behalf, I’ll get her speaking-in-tongues buffoonery in real time, along with the rest of the nation. The old me would have despised the new me for admitting such a thing. I’ve tried to go on digital diets, fasting from my screens. I was a friend’s guest at a spa in Arizona once and had so much trouble being “mindful” that they nearly kicked me out. Actually, I just wanted to make sure I didn’t miss the Seahawks game, mindful of Seattle’s woeful offensive line. In the information blur of last year, you may have overlooked news of our incredibly shrinking attention span. A survey of Canadian media consumption by Microsoft concluded that the average attention span had fallen to eight seconds, down from 12 in the year 2000. We now have a shorter attention span than goldfish, the study found. Attention span was defined as “the amount of concentrated time on a task without becoming distracted.” I tried to read the entire 54-page report, but well, you know. Still, a quote from Satya Nadella, the chief executive officer of Microsoft, jumped out at me. “The true scarce commodity” of the near future, he said, will be “human attention.” Putting aside Microsoft’s self-interest in promoting quick-flash digital ads with what may be junk science, there seems little doubt that our devices have rewired our brains. We think in McNugget time. The trash flows, unfiltered, along with the relevant stuff, in an eternal stream. And the last hit of dopamine only accelerates the need for another one. © 2016 The New York Times Company
Link ID: 21812 - Posted: 01.23.2016
By Melissa Dahl It’s the fifth inning and the Tampa Bay Rays are beating the Cleveland Indians 6–2 when Cleveland’s relief pitcher Nick Hagadone steps in. Alas, Hagadone does little to turn around the Indians’ luck that day, closing out the long inning with a score of 10–2. Hagadone, apparently frustrated by his own lackluster performance, heads to the clubhouse and, on the way there, punches a door with his left fist — the fist that is, unfortunately, connected to his pitching arm. That momentary impulse would cost him dearly. Hagadone required surgery and eight months’ recovery time — and, to add insult to a literal injury, his team also relegated him to the minor leagues, a move that shrank his annual salary by more than 80 percent. When asked about what could possibly explain an action like this in a usually easy-going guy, the Indians’ team psychologist, Charlie Maher, could only offer variations on this: “He just snapped.” Unless you are also a relief pitcher in the major leagues, you will likely never be in exactly this situation. But how many times have you reacted aggressively, even violently, in a way that felt almost out of your control? You hurl your smartphone across the room, or you unleash a stream of expletives in a manner that would seem to a calmer, rational mind to be disproportionate to the situation at hand. “I just snapped” is how we explain it to ourselves and others, and then we move on. The phrase has become such a cliché that it’s easy to forget that it doesn’t really explain much of anything. What’s behind this impulsive, immediately regrettable behavior? R. Douglas Fields, a senior investigator at the National Institutes of Health, sought out an explanation in his new book, Why We Snap: Understanding the Rage Circuit in Your Brain, which includes the Hagadone story recounted above. © 2016, New York Media LLC
by Emily Reynolds We know more about what the brain does when it's active than we do when it's at rest. It makes sense -- much neuroscientific research has looked to understand particular (and active) processes. James Kozloski, a researcher at IBM, has investigated what the brain does when it's resting -- what he calls 'the Grand Loop'. "The brain consumes a great amount of energy doing nothing. It's a great mystery of neuroscience," Kozloski told PopSci. He argued that around 90 percent of the energy used by the brain remained "unaccounted for". He believes that the brain is constantly 'looping signals', retracing neural pathways over and over again. It's a "closed loop", according to Kozloski, meaning it isn't reliant on external inputs as much of the brain's activity is. Kozloski tested his theory by running his model through IBM's neural tissue simulator and found that it could potentially help explain the effects of genetic mutations, such as the one behind Huntington's disease. He argued that information created by one mutated gene could, through the 'Grand Loop', affect an entire neural pathway. So what happens when our brain is at work? And how does expending energy affect our neural processes? Much previous research into anxiety has found that people tend to exert more energy or force when they're being watched -- something that leads to slip-ups or mistakes under pressure.
A map for other people’s faces has been discovered in the brain. It could help explain why some of us are better at recognising faces than others. Every part of your body that you can move or feel is represented in the outer layer of your brain. These “maps”, found in the motor and sensory cortices (see diagram, below), tend to preserve the basic spatial layout of the body – neurons that represent our fingers are closer to neurons that represent our arms than our feet, for example. The same goes for other people’s faces, says Linda Henriksson at Aalto University in Helsinki, Finland. Her team scanned 12 people’s brains while they looked at hundreds of images of noses, eyes, mouths and other facial features and recorded which bits of the brain became active. This revealed a region in the occipital face area in which features that are next to each other on a real face are organised together in the brain’s representation of that face. The team have called this map the “faciotopy”. The occipital face area is a region of the brain known to be involved in general facial processing. “Facial recognition is so fundamental to human behaviour that it makes sense that there would be a specialised area of the brain that maps features of the face,” she says. © Copyright Reed Business Information Ltd.
It seems like the ultimate insult, but getting people with brain injuries to do maths may lead to better diagnoses. A trial of the approach has found two people in an apparent vegetative state who may be conscious but “locked-in”. People who are in a vegetative state are awake but have lost all cognitive function. Occasionally, people diagnosed as being in this state are actually minimally conscious with fleeting periods of awareness, or even locked-in. This occurs when they are totally aware but unable to move any part of their body. It can be very difficult to distinguish between each state, which is why a team of researchers in China have devised a brain-computer interface that tests whether people with brain injuries can perform mental arithmetic – a clear sign of conscious awareness. The team, led by Yuanqing Li at South China University of Technology and Jiahui Pan at South China Normal University in Guangzhou, showed 11 people with various diagnoses a maths problem on a screen. This was followed by two possible answers flickering at frequencies designed to evoke different patterns of brain activity. Frames around each number also flashed several times. The participants were asked to focus on the correct answer and count the number of times its frame flashed. The brain patterns from the flickering answers, together with the detection of another kind of brain signal that occurs when someone counts, enabled a computer to tell which answer, if any, the person was focusing on. © Copyright Reed Business Information Ltd.
Link ID: 21801 - Posted: 01.19.2016
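The frequency-tagging idea in the study above can be sketched in a few lines: each on-screen answer flickers at its own rate, which evokes brain activity at that same frequency over visual cortex, so a classifier only needs to find which tagged frequency dominates the EEG power spectrum. The sketch below is a minimal illustration of that general principle, not the Chinese team's actual pipeline; the sampling rate, flicker frequencies, and band width are assumptions, and the "EEG" is a synthetic sine-plus-noise trace.

```python
import numpy as np

def ssvep_choice(eeg, fs, candidate_freqs):
    """Return the index of the flicker frequency that dominates an EEG trace.

    eeg             -- 1-D signal from a single (occipital) electrode
    fs              -- sampling rate in Hz (assumed value, for illustration)
    candidate_freqs -- one flicker frequency per on-screen answer
    """
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2           # power spectrum
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)      # frequency axis in Hz
    powers = []
    for f in candidate_freqs:
        # Sum power in a narrow band (+/- 0.5 Hz) around each tagged frequency
        band = (freqs > f - 0.5) & (freqs < f + 0.5)
        powers.append(spectrum[band].sum())
    return int(np.argmax(powers))                      # index of attended answer

# Synthetic demo: a 4-second trial in which the viewer attends the 7.5 Hz answer
fs = 250                                               # assumed sampling rate
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 7.5 * t) + 0.5 * rng.standard_normal(len(t))
print(ssvep_choice(eeg, fs, [6.0, 7.5]))               # → 1 (the 7.5 Hz answer)
```

A real system would combine this with the counting-related signal the article mentions, and would average over electrodes and trials rather than trusting a single four-second trace.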
Patricia Neighmond When Cathy Fields was in her late 50s, she noticed she was having trouble following conversations with friends. "I could sense something was wrong with me," she says. "I couldn't focus. I could not follow." Fields was worried she had suffered a stroke or was showing signs of early dementia. Instead she found out she had attention deficit hyperactivity disorder or ADHD. Fields is now 66 years old and lives in Ponte Vedra Beach, Fla. She's a former secretary and mother of two grown children. Fields was diagnosed with ADHD about eight years ago. Her doctor ruled out any physical problems and suggested she see a psychiatrist. She went to Dr. David Goodman at Johns Hopkins School of Medicine, who by chance specializes in ADHD. Goodman asked Fields a number of questions about focus, attention and completing tasks. He asked her about her childhood and how she did in school. Since ADHD begins in childhood, it's important for mental health professionals to understand these childhood experiences in order to make an accurate diagnosis of ADHD in adulthood. Online screening tests are available, too, so you can try it yourself. Goodman decided that Fields most definitely had ADHD. She's not alone. Goodman says he's seeing more and more adults over the age of 50 newly diagnosed with ADHD. © 2016 npr
Link ID: 21795 - Posted: 01.18.2016
Laura Sanders Signals in the brain can hint at whether a person undergoing anesthesia will slip under easily or fight the drug, a new study suggests. The results, published January 14 in PLOS Computational Biology, bring scientists closer to being able to tailor doses of the powerful drugs for specific patients. Drug doses are often given with a one-size-fits-all attitude, says bioengineer and neuroscientist Patrick Purdon of Massachusetts General Hospital and Harvard Medical School. But the new study finds clear differences in people’s brain responses to similar doses of an anesthetic drug, Purdon says. “To me, that’s the key and interesting point.” Cognitive neuroscientist Tristan Bekinschtein of the University of Cambridge and colleagues recruited 20 people to receive low doses of the general anesthetic propofol. The low dose wasn’t designed to knock people out, but to instead dial down their consciousness until they teetered on the edge of awareness — a point between being awake and alert and being drowsy and nonresponsive. While the drug was being delivered, participants repeatedly heard either a buzzing sound or a noise and were asked each time which they heard, an annoying question designed to gauge awareness. Of the 20 people, seven were sidelined by the propofol and they began to respond less. Thirteen other participants, however, kept right on responding, “fighting the drug,” Bekinschtein says. © Society for Science & the Public 2000 - 2016.
Maggie Koerth-Baker In 1990, when James Danckert was 18, his older brother Paul crashed his car into a tree. He was pulled from the wreckage with multiple injuries, including head trauma. The recovery proved difficult. Paul had been a drummer, but even after a broken wrist had healed, drumming no longer made him happy. Over and over, Danckert remembers, Paul complained bitterly that he was just — bored. “There was no hint of apathy about it at all,” says Danckert. “It was deeply frustrating and unsatisfying for him to be deeply bored by things he used to love.” A few years later, when Danckert was training to become a clinical neuropsychologist, he found himself working with about 20 young men who had also suffered traumatic brain injury. Thinking of his brother, he asked them whether they, too, got bored more easily than they had before. “And every single one of them,” he says, “said yes.” Those experiences helped to launch Danckert on his current research path. Now a cognitive neuroscientist at the University of Waterloo in Canada, he is one of a small but growing number of investigators engaged in a serious scientific study of boredom. There is no universally accepted definition of boredom. But whatever it is, researchers argue, it is not simply another name for depression or apathy. It seems to be a specific mental state that people find unpleasant — a lack of stimulation that leaves them craving relief, with a host of behavioural, medical and social consequences. © 2016 Nature Publishing Group
Don’t blame impulsive people for their poor decisions. It’s not necessarily their fault. Impulsivity could result from not having enough time to veto our own actions. At least that is the implication of a twist on a classic experiment on free will. In 1983, neuroscientist Benjamin Libet performed an experiment to test whether we have free will. Participants were asked to voluntarily flex a finger while watching a clock-face with a rotating dot. They had to note the position of the dot as soon as they became aware of their intention to act. As they were doing so, Libet recorded their brain activity via EEG electrodes attached to the scalp. He found that a spike in brain activity called the readiness potential, which precedes a voluntary action, occurred about 350 milliseconds before the volunteers became consciously aware of their intention to act. The readiness potential is thought to signal the brain preparing for movement. Libet interpreted his results to mean that free will is an illusion. But we’re not complete slaves to our neurons, he reasoned, as there was a 200-millisecond gap between conscious awareness of our intention and the initiation of movement. Libet argued that this was enough time to consciously veto the action, or exert our “free won’t”. While Libet’s interpretations have remained controversial, this hasn’t stopped scientists carrying out variations of his experiment. Among other things, this has revealed that people with Tourette’s syndrome, who have uncontrollable tics, experience a shorter veto window than people without the condition, as do those with schizophrenia. © Copyright Reed Business Information Ltd.
By Melissa Healy A new study finds that policies on defining brain death vary from hospital to hospital and could result in serious errors. Since 2010, neurologists have had a clear set of standards and procedures to distinguish a brain-dead patient from one who might emerge from an apparent coma. But when profoundly unresponsive patients are rushed to hospitals around the nation, the physicians who make the crucial call are not always steeped in the diagnostic fine points of brain death and the means of identifying it with complete confidence. State laws governing the diagnosis of brain death vary widely. Some states allow any physician to make the diagnosis, while others dictate the level of specialty a physician making the call must have. Some require that a second physician confirm the diagnosis or that a given period of time elapse. Others make no such demands. Given these situations, hospital policies can be invaluable guides for physicians, hospital administrators and patients’ families. In the absence of consistent physician expertise or legal requirements, hospital protocols can translate a scientific consensus into a step-by-step checklist. That would help ensure that no one who is not brain-dead is denied further care or considered a potential organ donor and that the deceased and their families would have every opportunity to donate organs.
Link ID: 21749 - Posted: 01.05.2016
By KARL OVE KNAUSGAARD I arrived in Tirana, Albania, on a Sunday evening in late August, on a flight from Istanbul. The sun had set while the plane was midflight, and as we landed in the dark, images of fading light still filled my mind. The man next to me, a young, red-haired American wearing a straw hat, asked me if I knew how to get into town from the airport. I shook my head, put the book I had been reading into my backpack, got up, lifted my suitcase out of the overhead compartment and stood waiting in the aisle for the door up ahead to open. That book was the reason I had come. It was called “Do No Harm,” and it was written by the British neurosurgeon Henry Marsh. His job is to slice into the brain, the most complex structure we know of in the universe, where everything that makes us human is contained, and the contrast between the extremely sophisticated and the extremely primitive — all of that work with knives, drills and saws — fascinated me deeply. I had sent Marsh an email, asking if I might meet him in London to watch him operate. He wrote a cordial reply saying that he seldom worked there now, but he was sure something could be arranged. In passing, he mentioned that he would be operating in Albania in August and in Nepal in September, and I asked hesitantly whether I could join him in Albania. Now I was here. Tense and troubled, I stepped out of the door of the airplane, having no idea what lay ahead. I knew as little about Albania as I did about brain surgery. The air was warm and stagnant, the darkness dense. A bus was waiting with its engine running. Most of the passengers were silent, and the few who chatted with one another spoke a language I didn’t know. It struck me that 25 years ago, when this was among the last remaining Communist states in Europe, I would not have been allowed to enter; then, the country was closed to the outside world, almost like North Korea today. 
Now the immigration officer barely glanced at my passport before stamping it. She dully handed it back to me, and I entered Albania. © 2015 The New York Times Company
Link ID: 21739 - Posted: 12.30.2015
By Diana Kwon Pupils are a rich source of social information. Although changes in pupil size are automatic and uncontrollable, they can convey interest, arousal, helpful or harmful intentions, and a variety of emotions. According to a new study published in Psychological Science, we even synchronize our pupil size with others—and doing so influences social decisions. Mariska Kret, a psychologist now at the University of Amsterdam in the Netherlands, and her colleagues recruited 69 Dutch university students to take part in an investment game. Each participant decided whether to transfer zero or five euros to a virtual partner after viewing a video of their eyes for four seconds. The invested money was then tripled, and the receiver chose how much to give back to the donor—so subjects had to make quick decisions about how trustworthy each virtual partner seemed. Using an eye tracker, the investigators found that the participants' pupils tended to mimic the changes in the partners' pupils, whether they dilated, constricted or remained static. As expected, subjects were more likely to give more money to partners with dilating pupils, a well-established signal of nonthreatening intentions. The more a subject mirrored the dilating pupils of a partner, the more likely he or she was to invest—but only if they were of the same race. The Caucasian participants trusted Caucasian eyes more than Asian eyes—which suggests that group membership is important when interpreting these subtle signals. © 2015 Scientific American
James Bond's villain in the latest 007 film, Spectre, could use a lesson in neuroanatomy, a Toronto neurosurgeon says. In a scene recorded in a Moroccan desert, Ernst Stavro Blofeld, played by Christoph Waltz, tortures Bond using restraints and a head clamp fused with a robotic drill. The goal is to inflict pain and erase 007's memory bank of faces. But Blofeld didn't have his brain anatomy down and could likely have killed Daniel Craig's character instead, Dr. Michael Cusimano of St. Michael's Hospital, says in a letter published in this week's issue of the journal Nature. Aiming to erase Bond's memory of faces, the villain correctly intends to drill into the lateral fusiform gyrus, an area of the brain responsible for recognizing faces, Cusimano said. But in practice, the drill was placed in the wrong area, aiming for the neck instead of the brain. "Whereas the drill should have been aimed just in front of 007's ear, it was directed below the mastoid process under and behind his left ear," Cusimano wrote. It likely would have triggered a stroke or massive hemorrhage, he said. In a draft of the letter, Cusimano said he was "spellbound" watching the film in a packed theatre, but his enjoyment was somewhat marred by the blunder. "I laughed," he recalled in an interview. "I think people around me kind of looked at me and were wondering why I was laughing because it's a pretty tense part of the movie." ©2015 CBC/Radio-Canada.
Link ID: 21726 - Posted: 12.27.2015
By Ferris Jabr Matthew Brien has struggled with overeating for the past 20 years. At age 24, he stood at 5′10′′ and weighed a trim 135 pounds. Today the licensed massage therapist tips the scales at 230 pounds and finds it particularly difficult to resist bread, pasta, soda, cookies and ice cream—especially those dense pints stuffed with almonds and chocolate chunks. He has tried various weight-loss programs that limit food portions, but he can never keep it up for long. “It's almost subconscious,” he says. “Dinner is done? Okay, I am going to have dessert. Maybe someone else can have just two scoops of ice cream, but I am going to have the whole damn [container]. I can't shut those feelings down.” Eating for the sake of pleasure, rather than survival, is nothing new. But only in the past several years have researchers come to understand deeply how certain foods—particularly fats and sweets—actually change brain chemistry in a way that drives some people to overconsume. Scientists have a relatively new name for such cravings: hedonic hunger, a powerful desire for food in the absence of any need for it; the yearning we experience when our stomach is full but our brain is still ravenous. And a growing number of experts now argue that hedonic hunger is one of the primary contributors to surging obesity rates in developed countries worldwide, particularly in the U.S., where scrumptious desserts and mouthwatering junk foods are cheap and plentiful. “Shifting the focus to pleasure” is a new approach to understanding hunger and weight gain, says Michael Lowe, a clinical psychologist at Drexel University who coined the term “hedonic hunger” in 2007. © 2015 Scientific American
By JOSEPH LEDOUX In this age of terror, we struggle to figure out how to protect ourselves — especially, of late, from active shooters. One suggestion, promoted by the Federal Bureau of Investigation and Department of Homeland Security, and now widely disseminated, is “run, hide, fight.” The idea is: Run if you can; hide if you can’t run; and fight if all else fails. This three-step program appeals to common sense, but whether it makes scientific sense is another question. Underlying the idea of “run, hide, fight” is the presumption that volitional choices are readily available in situations of danger. But the fact is, when you are in danger, whether it is a bicyclist speeding at you or a shooter locked and loaded, you may well find yourself frozen, unable to act and think clearly. Freezing is not a choice. It is a built-in impulse controlled by ancient circuits in the brain involving the amygdala and its neural partners, and is automatically set into motion by external threats. By contrast, the kinds of intentional actions implied by “run, hide, fight” require newer circuits in the neocortex. Contemporary science has refined the old “fight or flight” concept — the idea that those are the two hard-wired options when in mortal danger — to the updated “freeze, flee, fight.” While “freeze, flee, fight” is superficially similar to “run, hide, fight,” the two expressions make fundamentally different assumptions about how and why we do what we do, when in danger. Why do we freeze? It’s part of a predatory defense system that is wired to keep the organism alive. Not only do we do it, but so do other mammals and other vertebrates. Even invertebrates — like flies — freeze. If you are freezing, you are less likely to be detected if the predator is far away, and if the predator is close by, you can postpone the attack (movement by the prey is a trigger for attack). © 2015 The New York Times Company
Scientists showed that they could alter brain activity of rats and either wake them up or put them in an unconscious state by changing the firing rates of neurons in the central thalamus, a region known to regulate arousal. The study, published in eLIFE, was partially funded by the National Institutes of Health. “Our results suggest the central thalamus works like a radio dial that tunes the brain to different states of activity and arousal,” said Jin Hyung Lee, Ph.D., assistant professor of neurology, neurosurgery and bioengineering at Stanford University, and a senior author of the study. Located deep inside the brain, the thalamus acts as a relay station sending neural signals from the body to the cortex. Damage to neurons in the central part of the thalamus may lead to problems with sleep, attention, and memory. Previous studies suggested that stimulation of thalamic neurons may awaken patients who have suffered a traumatic brain injury from minimally conscious states. Dr. Lee’s team flashed laser pulses onto light sensitive central thalamic neurons of sleeping rats, which caused the cells to fire. High frequency stimulation of 40 or 100 pulses per second woke the rats. In contrast, low frequency stimulation of 10 pulses per second sent the rats into a state reminiscent of absence seizures that caused them to stiffen and stare before returning to sleep. “This study takes a big step towards understanding the brain circuitry that controls sleep and arousal,” said Yejun (Janet) He, Ph.D., program director at NIH’s National Institute of Neurological Disorders and Stroke (NINDS).
Link ID: 21711 - Posted: 12.19.2015
By Geoffrey S. Holtzman In November 1834, a 9-year-old boy named Major Mitchell was tried in Maine on one charge of maiming and one charge of felonious assault with intent to maim. He had lured an 8-year-old classmate into a field, beaten him with sticks, attempted to drown him in a stream, and castrated him with a piece of tin. Yet what makes this case so remarkable is neither the age of the defendant nor the violence of his crime, but the nature of his trial. Mitchell’s case marks the first time in U.S. history that a defendant’s attorney sought leniency from a jury on account of there being something wrong with the defendant’s brain. More recently, there has been an explosion in the number of criminals who have sought leniency on similar grounds. While the evidence presented by Mitchell’s defense was long ago debunked as pseudoscience (and was rightly dismissed by the judge), the case for exculpating Major Mitchell may actually be stronger today than it was 181 years ago. In a curious historical coincidence, recent advances in neuroscience suggest that there really might have been something wrong with Major Mitchell’s brain and that neurological deficits really could have contributed to his violent behavior. The case provides a unique window through which to view the relationship between 19th-century phrenology—the pseudoscientific study of the skull as an index of mental faculties—and 21st-century neuroscience. As you might expect, there is a world of difference between the two, but maintaining that difference depends crucially on the responsible use of neuroscience. Major Mitchell’s story cautions against overlooking neuroscience’s limitations, as well as its ability to be exploited for suspect purposes. © 2015 The Slate Group LLC.