Chapter 14. Attention and Consciousness

by Helen Thomson A MAN with the delusional belief that an impostor has taken his wife's place is helping shed light on how we recognise loved ones. Capgras syndrome is a rare condition in which a person insists that someone they are close to – most commonly a spouse – has been replaced by an impostor. Sometimes they even believe that a much-loved pet has been replaced by a lookalike. Anecdotal evidence suggests that people with Capgras misidentify only the people they are closest to. Chris Fiacconi at Western University in London, Ontario, Canada, and his team wanted to explore this. They performed recognition tests and brain scans on two male volunteers with dementia – one who had Capgras, and one who didn't – and compared the results with those of 10 healthy men of a similar age. For months, the man with Capgras believed that his wife had been replaced by an impostor and was resistant to any counterargument, often asking his son why he was so convinced that the woman was his mother. First the team tested whether the volunteers could recognise celebrities they would have been familiar with throughout their lifetime, such as Marilyn Monroe. Volunteers were presented with celebrities' names, voices or pictures, and asked if they recognised them and, if so, how much information they could recall about that person. The man with Capgras was more likely to misidentify the celebrities by face or voice than the volunteer without Capgras or the 10 healthy men. None of the volunteers had problems identifying celebrities by name (Frontiers in Human Neuroscience, doi.org/wrw). © Copyright Reed Business Information Ltd.

Keyword: Attention; Consciousness
Link ID: 20284 - Posted: 11.06.2014

By Christian Jarrett It feels to me like interest in the brain has exploded. I’ve seen huge investments in brain science by the USA and Europe (the BRAIN Initiative and the Human Brain Project), I’ve read about the rise in media coverage of neuroscience, and above all, I’ve noticed how journalists and bloggers now often frame stories as being about the brain as opposed to the person. Look at these recent headlines: “Why your brain loves storytelling” (Harvard Business Review); “How Netflix is changing our brains” (Forbes); and “Why your brain wants to help one child in need — but not millions” (NPR). There are hundreds more, and in each case, the headline could be about “you” but the writer chooses to make it about “your brain”. Consider too the emergence of new fields such as neuroleadership, neuroaesthetics and neuro-law. It was only a matter of time before someone announced that we’re in the midst of a neurorevolution. In 2009 Zach Lynch did just that, publishing The Neuro Revolution: How Brain Science is Changing Our World. Having said all that, I’m conscious that my own perspective is heavily biased. I earn my living writing about neuroscience and psychology. I’m vigilant for all things brain. Maybe the research investment and brain-obsessed media headlines are largely irrelevant to the general public. I looked into this question recently and was surprised by what I found. There’s not a lot of research, but what does exist (such as one study on the teen brain) suggests neuroscience has yet to make an impact on most people’s everyday lives. Indeed, Myth #20 in my new book Great Myths of the Brain is “Neuroscience is transforming human self-understanding”. WIRED.com © 2014 Condé Nast.

Keyword: Attention
Link ID: 20282 - Posted: 11.06.2014

By RICHARD A. FRIEDMAN ATTENTION deficit hyperactivity disorder is now the most prevalent psychiatric illness of young people in America, affecting 11 percent of them at some point between the ages of 4 and 17. The rates of both diagnosis and treatment have increased so much in the past decade that you may wonder whether something that affects so many people can really be a disease. And for a good reason. Recent neuroscience research shows that people with A.D.H.D. are actually hard-wired for novelty-seeking — a trait that had, until relatively recently, a distinct evolutionary advantage. Compared with the rest of us, they have sluggish and underfed brain reward circuits, so much of everyday life feels routine and understimulating. To compensate, they are drawn to new and exciting experiences and get famously impatient and restless with the regimented structure that characterizes our modern world. In short, people with A.D.H.D. may not have a disease, so much as a set of behavioral traits that don’t match the expectations of our contemporary culture. From the standpoint of teachers, parents and the world at large, the problem with people with A.D.H.D. looks like a lack of focus and attention and impulsive behavior. But if you have the “illness,” the real problem is that, to your brain, the world that you live in essentially feels not very interesting. One of my patients, a young woman in her early 20s, is prototypical. “I’ve been on Adderall for years to help me focus,” she told me at our first meeting. Before taking Adderall, she found sitting in lectures unendurable and would lose her concentration within minutes. Like many people with A.D.H.D., she hankered for exciting and varied experiences and also resorted to alcohol to relieve boredom. But when something was new and stimulating, she had laserlike focus. I knew that she loved painting and asked her how long she could maintain her interest in her art. “No problem. I can paint for hours at a stretch.” Rewards like sex, money, drugs and novel situations all cause the release of dopamine in the reward circuit of the brain, a region buried deep beneath the cortex. Aside from generating a sense of pleasure, this dopamine signal tells your brain something like, “Pay attention, this is an important experience that is worth remembering.” © 2014 The New York Times Company

Keyword: ADHD; Learning & Memory
Link ID: 20272 - Posted: 11.03.2014

Maanvi Singh How does a sunset work? We love to look at one, but Jolanda Blackwell wanted her eighth-graders to really think about it, to wonder and question. So Blackwell, who teaches science at Oliver Wendell Holmes Junior High in Davis, Calif., had her students watch a video of a sunset on YouTube as part of a physics lesson on motion. "I asked them: 'So what's moving? And why?' " Blackwell says. The students had a lot of ideas. Some thought the sun was moving; others, of course, knew that a sunset is the result of the Earth spinning around on its axis. Once she got the discussion going, the questions came rapid-fire. "My biggest challenge usually is trying to keep them patient," she says. "They just have so many burning questions." Students asking questions and then exploring the answers. That's something any good teacher lives for. And at the heart of it all is curiosity. Blackwell, like many others teachers, understands that when kids are curious, they're much more likely to stay engaged. But why? What, exactly, is curiosity and how does it work? A study published in the October issue of the journal Neuron suggests that the brain's chemistry changes when we become curious, helping us better learn and retain information. © 2014 NPR

Keyword: Learning & Memory; Attention
Link ID: 20271 - Posted: 11.03.2014

By C. NATHAN DeWALL How many words does it take to know you’re talking to an adult? In “Peter Pan,” J. M. Barrie needed just five: “Do you believe in fairies?” Such belief requires magical thinking. Children suspend disbelief. They trust that events happen with no physical explanation, and they equate an image of something with its existence. Magical thinking was Peter Pan’s key to eternal youth. The ghouls and goblins that will haunt All Hallows’ Eve on Friday also require people to take a leap of faith. Zombies wreak terror because children believe that the once-dead can reappear. At haunted houses, children dip their hands in buckets of cold noodles and spaghetti sauce. Even if you tell them what they touched, they know they felt guts. And children surmise that with the right Halloween makeup, costume and demeanor, they can frighten even the most skeptical adult. We do grow up. We get jobs. We have children of our own. Along the way, we lose our tendencies toward magical thinking. Or at least we think we do. Several streams of research in psychology, neuroscience and philosophy are converging on an uncomfortable truth: We’re more susceptible to magical thinking than we’d like to admit. Consider the quandary facing college students in a clever demonstration of magical thinking. An experimenter hands you several darts and instructs you to throw them at different pictures. Some depict likable objects (for example, a baby), others are neutral (for example, a face-shaped circle). Would your performance differ if you lobbed darts at a baby? It would. Performance plummeted when people threw the darts at the baby. Laura A. King, the psychologist at the University of Missouri who led this investigation, notes that research participants have a “baseless concern that a picture of an object shares an essential relationship with the object itself.” Paul Rozin, a psychology professor at the University of Pennsylvania, argues that these studies demonstrate the magical law of similarity. Our minds subconsciously associate an image with an object. When something happens to the image, we experience a gut-level intuition that the object has changed as well. © 2014 The New York Times Company

Keyword: Attention
Link ID: 20253 - Posted: 10.28.2014

Sarah Boseley, health editor A record haul of “smart” drugs, sold to students to enhance memory and thought processes, help them stay awake and improve concentration, has been seized from a UK website by the medicines regulator, which is alarmed about the recent rise of such sites. The seizure, worth £200,000, illustrates the increasing internet trade in cognitive enhancement drugs and suggests people who want to stay focused and sharp are moving on from black coffee and legally available caffeine tablets. Most of the seized drugs are medicines that should only be available on a doctor’s prescription. One, Sunifiram, is entirely experimental and has never been tested on humans in clinical trials. Investigators from the Medicines and Healthcare products Regulatory Agency (MHRA) are worried by what they see as a new phenomenon – the polished, plausible, commercial website targeting students and others who are looking for a mental edge over the competition. In addition to Ritalin, the drug that helps young people with attention deficit disorder (ADD) focus in class and while writing essays, and Modafinil (sold as Provigil), licensed in the US for people with narcolepsy, the sites are also offering experimental drugs and research chemicals. MHRA head of enforcement, Alastair Jeffrey, said the increase in people buying cognitive-enhancing drugs or “nootropics” is recent and very worrying. “The idea that people are willing to put their overall health at risk in order to attempt to get an intellectual edge over others is deeply troubling,” he said. © 2014 Guardian News and Media Limited

Keyword: Drug Abuse; ADHD
Link ID: 20242 - Posted: 10.27.2014

James Hamblin People whose faces are perceived to look more "competent" are more likely to be CEOs of large, successful companies. Having a face that people deem "dominant" is a predictor of rank advancement in the military. People are more likely to invest money with people who look "trustworthy." These sorts of findings go on and on in recent studies that claim people can accurately guess a variety of personality traits and behavioral tendencies from portraits alone. The findings seem to elucidate either canny human intuition or absurd, misguided bias. There has been a recent boom in research on how people attribute social characteristics to others based on the appearance of faces—independent of cues about age, gender, race, or ethnicity. (At least, as independent as possible.) The results seem to offer some intriguing insight, claiming that people are generally pretty good at predicting who is, for example, trustworthy, competent, introverted or extroverted, based entirely on facial structure. There is strong agreement across studies as to which facial attributes convey which impressions to people. But it's, predictably, not at all so simple. Christopher Olivola, an assistant professor at Carnegie Mellon University, makes the case against face-ism today, in the journal Trends in Cognitive Sciences. In light of many recent articles touting people's judgmental abilities, Olivola and Princeton University's Friederike Funk and Alexander Todorov say that a careful look at the data really doesn't support these claims. And "instead of applauding our ability to make inferences about social characteristics from facial appearances," Olivola said, "the focus should be on the dangers."

Keyword: Emotions; Attention
Link ID: 20234 - Posted: 10.23.2014

David DiSalvo @neuronarrative One of the lively debates spawned from the neuroscience revolution has to do with whether humans possess free will, or merely feel as if we do. If we truly possess free will, then we each consciously control our decisions and actions. If we feel as if we possess free will, then our sense of control is a useful illusion—one that neuroscience will increasingly dispel as it gets better at predicting how brain processes yield decisions. For those in the free-will-as-illusion camp, the subjective experience of decision ownership is not unimportant, but it is predicated on neural dynamics that are scientifically knowable, traceable and—in time—predictable. One piece of evidence supporting this position has come from neuroscience research showing that brain activity underlying a given decision occurs before a person consciously apprehends the decision. In other words, thought patterns leading to conscious awareness of what we’re going to do are already in motion before we know we’ll do it. Without conscious knowledge of why we’re choosing as we’re choosing, the argument follows, we cannot claim to be exercising “free” will. Those supporting a purer view of free will argue that whether or not neuroscience can trace brain activity underlying decisions, making the decision still resides within the domain of an individual’s mind. In this view, parsing unconscious and conscious awareness is less important than the ultimate outcome – a decision, and subsequent action, emerging from a single mind. If free will is drained of its power by scientific determinism, free-will supporters argue, then we’re moving down a dangerous path where people can’t be held accountable for their decisions, since those decisions are triggered by neural activity occurring outside of conscious awareness. Consider how this might play out in a courtroom in which neuroscience evidence is marshalled to defend a murderer on grounds that he couldn’t know why he acted as he did.

Keyword: Consciousness
Link ID: 20232 - Posted: 10.23.2014

By Scott Barry Kaufman “Just because a diagnosis [of ADHD] can be made does not take away from the great traits we love about Calvin and his imaginary tiger friend, Hobbes. In fact, we actually love Calvin BECAUSE of his ADHD traits. Calvin’s imagination, creativity, energy, lack of attention, and view of the world are the gifts that Mr. Watterson gave to this character.” — The Dragonfly Forest In his 2004 book “Creativity is Forever“, Gary Davis reviewed the creativity literature from 1961 to 2003 and identified 22 recurring personality traits of creative people. These included 16 “positive” traits (e.g., independent, risk-taking, high energy, curiosity, humor, artistic, emotional) and 6 “negative” traits (e.g., impulsive, hyperactive, argumentative). In her own review of the creativity literature, Bonnie Cramond found that many of these same traits overlap to a substantial degree with behavioral descriptions of Attention Deficit Hyperactivity Disorder (ADHD) – including higher levels of spontaneous idea generation, mind wandering, daydreaming, sensation seeking, energy, and impulsivity. Research since then has supported the notion that people with ADHD are more likely to reach higher levels of creative thought and achievement than those without ADHD. What’s more, recent research by Darya Zabelina and colleagues has found that real-life creative achievement is associated with the ability to broaden attention and have a “leaky” mental filter – something in which people with ADHD excel. Recent work in cognitive neuroscience also suggests a connection between ADHD and creativity. Both creative thinkers and people with ADHD show difficulty suppressing brain activity coming from the “Imagination Network”. © 2014 Scientific American

Keyword: ADHD; Attention
Link ID: 20228 - Posted: 10.22.2014

By KONIKA BANERJEE and PAUL BLOOM ON April 15, 2013, James Costello was cheering on a friend near the finish line at the Boston Marathon when the bombs exploded, severely burning his arms and legs and sending shrapnel into his flesh. During the months of surgery and rehabilitation that followed, Mr. Costello developed a relationship with one of his nurses, Krista D’Agostino, and they soon became engaged. Mr. Costello posted a picture of the ring on Facebook. “I now realize why I was involved in the tragedy,” he wrote. “It was to meet my best friend, and the love of my life.” Mr. Costello is not alone in finding meaning in life events. People regularly do so for both terrible incidents, such as being injured in an explosion, and positive ones, like being cured of a serious disease. As the phrase goes, everything happens for a reason. Where does this belief come from? One theory is that it reflects religious teachings — we think that events have meaning because we believe in a God that plans for us, sends us messages, rewards the good and punishes the bad. But research from the Yale Mind and Development Lab, where we work, suggests that this can’t be the whole story. In one series of studies, recently published in the journal Cognition, we asked people to reflect on significant events from their own lives, such as graduations, the births of children, falling in love, the deaths of loved ones and serious illnesses. Unsurprisingly, a majority of religious believers said they thought that these events happened for a reason and that they had been purposefully designed (presumably by God). But many atheists did so as well, and a majority of atheists in a related study also said that they believed in fate — defined as the view that life events happen for a reason and that there is an underlying order to life that determines how events turn out. © 2014 The New York Times Company

Keyword: Attention
Link ID: 20219 - Posted: 10.20.2014

By Smitha Mundasad Health reporter, BBC News Scientists have uncovered hidden signatures in the brains of people in vegetative states that suggest they may have a glimmer of consciousness. Doctors normally consider these patients - who have severe brain injuries - to be unaware of the world around them although they appear awake. Researchers hope their work will help identify those who are actually conscious, but unable to communicate. Their report appears in PLoS Computational Biology. After catastrophic brain injuries, for example due to car crashes or major heart attacks, some people can appear to wake up yet do not respond to events around them. Doctors describe these patients as being in a vegetative state. Patients typically open their eyes and look around, but cannot react to commands or make any purposeful movements. Some people remain in this state for many years. But a handful of recent studies have questioned this diagnosis - suggesting some patients may actually be aware of what is going on around them, but unable to communicate. A team of scientists at Cambridge University studied 13 patients in vegetative states, mapping the electrical activity of their nerves using a mesh of electrodes applied to their scalps. The electrical patterns and connections they recorded were then compared with healthy volunteers. The study reveals four of the 13 patients had an electrical signature that was very similar to those seen in the volunteers. Dr Srivas Chennu, who led the research, said: "This suggests some of the brain networks that support consciousness in healthy adults may be well-preserved in a number of people in persistent vegetative state too." BBC © 2014

Keyword: Consciousness
Link ID: 20217 - Posted: 10.18.2014

Daniel Cressey Mirrors are often used to elicit aggression in animal behavioural studies, with the assumption being that creatures unable to recognize themselves will react as if encountering a rival. But research suggests that such work may simply reflect what scientists expect to see, and not actual aggression. For most people, looking in a mirror does not trigger a bout of snarling hostility at the face staring back. But many animals do seem to react aggressively to their mirror image, and for years mirrors have been used to trigger such responses for behavioural research on species ranging from birds to fish. “There’s been a very long history of using a mirror as it’s just so handy,” says Robert Elwood, an animal-behaviour researcher at Queen’s University in Belfast, UK. Using a mirror radically simplifies aggression experiments, cutting down the number of animals required and providing the animal being observed with an ‘opponent’ perfectly matched in terms of size and weight. But in a study just published in Animal Behaviour, Elwood and his team add to evidence that many mirror studies are flawed. The researchers looked at how convict cichlid fish (Amatitlania nigrofasciata) reacted both to mirrors and to real fish of their own species. These fish prefer to display their right side in aggressive encounters, which means that two opponents end up alongside each other in a head-to-tail configuration. It is impossible for a fish to achieve this with its own reflection, but Elwood reasoned that fish faced with a mirror would attempt it, and flip from side to side as they tried to present an aggressive display. On the other hand, if the reflection did not trigger an aggressive reaction, the fish would not display such behaviour as much or as frequently. © 2014 Nature Publishing Group

Keyword: Consciousness; Aggression
Link ID: 20202 - Posted: 10.13.2014

By MICHAEL S. A. GRAZIANO OF the three most fundamental scientific questions about the human condition, two have been answered. First, what is our relationship to the rest of the universe? Copernicus answered that one. We’re not at the center. We’re a speck in a large place. Second, what is our relationship to the diversity of life? Darwin answered that one. Biologically speaking, we’re not a special act of creation. We’re a twig on the tree of evolution. Third, what is the relationship between our minds and the physical world? Here, we don’t have a settled answer. We know something about the body and brain, but what about the subjective life inside? Consider that a computer, if hooked up to a camera, can process information about the wavelength of light and determine that grass is green. But we humans also experience the greenness. We have an awareness of information we process. What is this mysterious aspect of ourselves? Many theories have been proposed, but none has passed scientific muster. I believe a major change in our perspective on consciousness may be necessary, a shift from a credulous and egocentric viewpoint to a skeptical and slightly disconcerting one: namely, that we don’t actually have inner feelings in the way most of us think we do. Imagine a group of scholars in the early 17th century, debating the process that purifies white light and rids it of all colors. They’ll never arrive at a scientific answer. Why? Because despite appearances, white is not pure. It’s a mixture of colors of the visible spectrum, as Newton later discovered. The scholars are working with a faulty assumption that comes courtesy of the brain’s visual system. The scientific truth about white (i.e., that it is not pure) differs from how the brain reconstructs it. © 2014 The New York Times Company

Keyword: Consciousness
Link ID: 20196 - Posted: 10.11.2014

By Gretchen Reynolds Encourage young boys and girls to run, jump, squeal, hop and chase after each other or after erratically kicked balls, and you substantially improve their ability to think, according to the most ambitious study ever conducted of physical activity and cognitive performance in children. The results underscore, yet again, the importance of physical activity for children’s brain health and development, especially in terms of the particular thinking skills that most affect academic performance. The news that children think better if they move is hardly new. Recent studies have shown that children’s scores on math and reading tests rise if they go for a walk beforehand, even if the children are overweight and unfit. Other studies have found correlations between children’s aerobic fitness and their brain structure, with areas of the brain devoted to thinking and learning being generally larger among youngsters who are more fit. But these studies were short-term or associational, meaning that they could not tease out whether fitness had actually changed the children’s brains or if children with well-developed brains just liked exercise. So for the new study, which was published in September in Pediatrics, researchers at the University of Illinois at Urbana-Champaign approached school administrators at public elementary schools in the surrounding communities and asked if they could recruit the schools’ 8- and 9-year-old students for an after-school exercise program. This group was of particular interest to the researchers because previous studies had determined that at that age, children typically experience a leap in their brain’s so-called executive functioning, which is the ability to impose order on your thinking. Executive functions help to control mental multitasking, maintain concentration, and inhibit inappropriate responses to mental stimuli. © 2014 The New York Times Company

Keyword: ADHD; Attention
Link ID: 20174 - Posted: 10.08.2014

By Clare Wilson If you’re facing surgery, this may well be your worst nightmare: waking up while under the knife without medical staff realizing. The biggest-ever study of this phenomenon is shedding light on what such an experience feels like and is causing debate about how best to prevent it. For a one-year period starting in 2012, an anesthetist at every hospital in the United Kingdom and Ireland recorded every case where a patient told a staff member that he had been awake during surgery. Prompted by these reports, the researchers investigated 300 cases, interviewing the patient and doctors involved. One of the most striking findings, says the study’s lead author, Jaideep Pandit of Oxford University Hospitals, was that pain was not generally the worst part of the experience: It was paralysis. For some operations, paralyzing drugs are given to relax muscles and stop reflex movements. “Pain was something they understood, but very few of us have experienced what it’s like to be paralyzed,” Pandit says. “They thought they had been buried alive.” “I thought I was about to die,” says Sandra, who regained consciousness but was unable to move during a dental operation when she was 12 years old. “It felt as though nothing would ever work again — as though the anesthetist had removed everything apart from my soul.”

Keyword: Consciousness
Link ID: 20168 - Posted: 10.07.2014

James Hamblin Mental exercises to build (or rebuild) attention span have shown promise recently as adjuncts or alternatives to amphetamines in addressing symptoms common to Attention Deficit Hyperactivity Disorder (ADHD). Building cognitive control, to be better able to focus on just one thing, or single-task, might involve regular practice with a specialized video game that reinforces "top-down" cognitive modulation, as was the case in a popular paper in Nature last year. Cool but still notional. More insipid but also more clearly critical to addressing what's being called the ADHD epidemic is plain old physical activity. This morning the medical journal Pediatrics published research that found kids who took part in a regular physical activity program showed important enhancement of cognitive performance and brain function. The findings, according to University of Illinois professor Charles Hillman and colleagues, "demonstrate a causal effect of a physical program on executive control, and provide support for physical activity for improving childhood cognition and brain health." If it seems odd that this is something that still needs support, that's because it is odd, yes. Physical activity is clearly a high, high-yield investment for all kids, but especially those attentive or hyperactive. This brand of research is still published and written about as though it were a novel finding, in part because exercise programs for kids remain underfunded and underprioritized in many school curricula, even though exercise is clearly integral to maximizing the utility of time spent in class. The improvements in this case came in executive control, which consists of inhibition (resisting distraction, maintaining focus), working memory, and cognitive flexibility (switching between tasks). The images above show the brain activity in the group of kids who did the program as opposed to the group that didn't. It's the kind of difference that's so dramatic it's a little unsettling. The study only lasted nine months, but when you're only seven years old, nine months is a long time to be sitting in class with a blue head. © 2014 by The Atlantic Monthly Group.

Keyword: ADHD
Link ID: 20152 - Posted: 10.02.2014

By Erik Parens Will advances in neuroscience move reasonable people to abandon the idea that criminals deserve to be punished? Some researchers working at the intersection of psychology, neuroscience and philosophy think the answer is yes. Their reasoning is straightforward: if the idea of deserving punishment depends upon the idea that criminals freely choose their actions, and if neuroscience reveals that free choice is an illusion, then we can see that the idea of deserving punishment is nonsense. As Joshua Greene and Jonathan Cohen speculated in a 2004 essay: “new neuroscience will undermine people’s common sense, libertarian conception of free will and the retributivist thinking that depends on it, both of which have heretofore been shielded by the inaccessibility of sophisticated thinking about the mind and its neural basis.” Just as we need two eyes that integrate slightly different information about one scene to achieve visual depth perception, we need to view ourselves through two lenses to gain a greater depth of understanding of ourselves. This past summer, Greene and several other colleagues did empirical work that appears to confirm that 2004 speculation. The new work finds that when university students learn about “the neural basis of behavior” — quite simply, the brain activity underlying human actions —they become less supportive of the idea that criminals deserve to be punished. According to the study’s authors, once students are led to question the concept of free will — understood as the idea that humans “can generate spontaneous choices and actions not determined by prior events” — they begin to find the idea of “just deserts” untenable. “When genuine choice is deemed impossible, condemnation is less justified,” the authors write. © 2014 The New York Times Company

Keyword: Consciousness; Emotions
Link ID: 20131 - Posted: 09.29.2014

By ROBERT KOLKER Reggie Shaw is the man responsible for the most moving portion of “From One Second to the Next,” the director Werner Herzog’s excruciating (even by Werner Herzog standards) 35-minute public service announcement, released last year as part of AT&T’s “It Can Wait” campaign against texting and driving. In the film, Shaw, now in his 20s, recounts the rainy morning in September 2006 that he crossed the line of a Utah highway, knocking into a car containing two scientists, James Furfaro and Keith O’Dell, who were heading to work nearby. Both men were killed. Shaw says he was texting a girlfriend at the time, adding in unmistakable anguish that he can’t even remember what he was texting about. He is next seen taking part in something almost inconceivable: He enters the scene where one of the dead men’s daughters is being interviewed, and receives from that woman a warm, earnest, tearful, cathartic hug. Reggie Shaw’s redemptive journey — from thoughtless, inadvertent killer to denier of his own culpability to one of the nation’s most powerful spokesmen on the dangers of texting while behind the wheel — was first brought to national attention by Matt Richtel, a reporter for The New York Times, whose series of articles about distracted driving won a Pulitzer Prize in 2010. Now, five years later, in “A Deadly Wandering,” Richtel gives Shaw’s story the thorough, emotional treatment it is due, interweaving a detailed chronicle of the science behind distracted driving. As an instructive social parable, Richtel’s densely reported, at times forced yet compassionate and persuasive book deserves a spot next to “Fast Food Nation” and “To Kill a Mockingbird” in America’s high school curriculums. To say it may save lives is self-evident. What makes the deaths in this book so affecting is how ordinary they are. Two men get up in the morning. They get behind the wheel. A stranger loses track of his car. They crash. The two men die. The temptation is to make the tragedy bigger than it is, to invest it with meaning. Which may explain why Richtel wonders early on if Reggie Shaw lied about texting and driving at first because he was in denial, or because technology “can hijack the brain,” polluting his memory. © 2014 The New York Times Company

Keyword: Attention
Link ID: 20124 - Posted: 09.27.2014

Some people don't just work — they text, Snapchat, check Facebook and Tinder, listen to music and work. And a new study reveals those multitaskers have brains that look different from those of people who stick to one task. Researchers at the University of Sussex scanned 75 adults using fMRI to examine their gray matter. Those who admitted to multitasking with a variety of electronic devices at once had less dense gray matter in their anterior cingulate cortexes (ACC). This region controls executive functions such as working memory, reasoning, planning and execution. There is no way of knowing if people with smaller anterior cingulate cortexes are more likely to multitask or if multitaskers are shrinking their gray matter. The finding could even mean that our brains become more efficient from multitasking, said Dr. Gary Small, director of UCLA’s Memory and Aging Research Center at the Semel Institute for Neuroscience and Human Behavior, who was not involved in the study. “When you exercise the brain … it becomes effective at performing a mental task,” he said. While previous research has shown that multitasking leads to more mistakes, Small said such research remains important to our understanding of something we’re all guilty of doing.

Keyword: Attention
Link ID: 20115 - Posted: 09.25.2014

By Katy Waldman In the opening chapter of Book 1 of My Struggle, by Karl Ove Knausgaard, the 8-year-old narrator sees a ghost in the waves. He is watching a televised report of a rescue effort at sea—“the sky is overcast, the gray-green swell heavy but calm”—when suddenly, on the surface of the water, “the outline of a face emerges.” We might guess from this anecdote that Karl, our protagonist, is both creative and troubled. His limber mind discerns patterns in chaos, but the patterns are illusions. “The lunatic, the lover, and the poet,” Shakespeare wrote, “have such seething brains, such shaping fantasies.” Their imaginations give “to airy nothing a local habitation and a name.” A seething brain can be a great asset for an artist, but, like Knausgaard’s churning, gray-green swell, it can be dangerous too. Inspired metaphors, paranormal beliefs, conspiracy theories, and delusional episodes may all exist on a single spectrum, recent research suggests. The name for the concept that links them is apophenia. A German scientist named Klaus Conrad coined apophanie (from the Greek apo, away, and phaenein, to show) in 1958. He was describing the acute stage of schizophrenia, during which unrelated details seem saturated in connections and meaning. Unlike an epiphany—a true intuition of the world’s interconnectedness—an apophany is a false realization. Swiss psychologist Peter Brugger introduced the term into English when he penned a chapter in a 2001 book on hauntings and poltergeists. Apophenia, he said, was a weakness of human cognition: the “pervasive tendency … to see order in random configurations,” an “unmotivated seeing of connections,” the experience of “delusion as revelation.” On the phone he unveiled his favorite formulation yet: “the tendency to be overwhelmed by meaningful coincidences.” © 2014 The Slate Group LLC.

Keyword: Attention
Link ID: 20088 - Posted: 09.18.2014