Chapter 18. Attention and Higher Cognition
James Hamblin People whose faces are perceived to look more "competent" are more likely to be CEOs of large, successful companies. Having a face that people deem "dominant" is a predictor of rank advancement in the military. People are more likely to invest money with people who look "trustworthy." These sorts of findings go on and on in recent studies that claim people can accurately guess a variety of personality traits and behavioral tendencies from portraits alone. The findings seem to elucidate either canny human intuition or absurd, misguided bias. There has been a recent boom in research on how people attribute social characteristics to others based on the appearance of faces—independent of cues about age, gender, race, or ethnicity. (At least, as independent as possible.) The results seem to offer some intriguing insight, claiming that people are generally pretty good at predicting who is, for example, trustworthy, competent, introverted or extroverted, based entirely on facial structure. There is strong agreement across studies as to which facial attributes signal what to people. But, predictably, it's not at all so simple. Christopher Olivola, an assistant professor at Carnegie Mellon University, makes the case against face-ism in the journal Trends in Cognitive Sciences. In light of many recent articles touting people's judgmental abilities, Olivola and Princeton University's Friederike Funk and Alexander Todorov say that a careful look at the data really doesn't support these claims. And "instead of applauding our ability to make inferences about social characteristics from facial appearances," Olivola said, "the focus should be on the dangers."
David DiSalvo @neuronarrative One of the lively debates spawned from the neuroscience revolution has to do with whether humans possess free will, or merely feel as if we do. If we truly possess free will, then we each consciously control our decisions and actions. If we feel as if we possess free will, then our sense of control is a useful illusion—one that neuroscience will increasingly dispel as it gets better at predicting how brain processes yield decisions. For those in the free-will-as-illusion camp, the subjective experience of decision ownership is not unimportant, but it is predicated on neural dynamics that are scientifically knowable, traceable and—in time—predictable. One piece of evidence supporting this position has come from neuroscience research showing that brain activity underlying a given decision occurs before a person consciously apprehends the decision. In other words, thought patterns leading to conscious awareness of what we’re going to do are already in motion before we know we’ll do it. Without conscious knowledge of why we’re choosing as we’re choosing, the argument follows, we cannot claim to be exercising “free” will. Those supporting a purer view of free will argue that whether or not neuroscience can trace brain activity underlying decisions, making the decision still resides within the domain of an individual’s mind. In this view, parsing unconscious and conscious awareness is less important than the ultimate outcome – a decision, and subsequent action, emerging from a single mind. If free will is drained of its power by scientific determinism, free-will supporters argue, then we’re moving down a dangerous path where people can’t be held accountable for their decisions, since those decisions are triggered by neural activity occurring outside of conscious awareness. Consider how this might play out in a courtroom in which neuroscience evidence is marshalled to defend a murderer on grounds that he couldn’t know why he acted as he did.
Link ID: 20232 - Posted: 10.23.2014
By Scott Barry Kaufman “Just because a diagnosis [of ADHD] can be made does not take away from the great traits we love about Calvin and his imaginary tiger friend, Hobbes. In fact, we actually love Calvin BECAUSE of his ADHD traits. Calvin’s imagination, creativity, energy, lack of attention, and view of the world are the gifts that Mr. Watterson gave to this character.” — The Dragonfly Forest In his 2004 book “Creativity is Forever”, Gary Davis reviewed the creativity literature from 1961 to 2003 and identified 22 recurring personality traits of creative people. This included 16 “positive” traits (e.g., independent, risk-taking, high energy, curiosity, humor, artistic, emotional) and 6 “negative” traits (e.g., impulsive, hyperactive, argumentative). In her own review of the creativity literature, Bonnie Cramond found that many of these same traits overlap to a substantial degree with behavioral descriptions of Attention Deficit Hyperactivity Disorder (ADHD), including higher levels of spontaneous idea generation, mind wandering, daydreaming, sensation seeking, energy, and impulsivity. Research since then has supported the notion that people with ADHD are more likely to reach higher levels of creative thought and achievement than those without ADHD. What’s more, recent research by Darya Zabelina and colleagues has found that real-life creative achievement is associated with the ability to broaden attention and have a “leaky” mental filter, something in which people with ADHD excel. Recent work in cognitive neuroscience also suggests a connection between ADHD and creativity. Both creative thinkers and people with ADHD show difficulty suppressing brain activity coming from the “Imagination Network.” © 2014 Scientific American
By KONIKA BANERJEE and PAUL BLOOM. On April 15, 2013, James Costello was cheering on a friend near the finish line at the Boston Marathon when the bombs exploded, severely burning his arms and legs and sending shrapnel into his flesh. During the months of surgery and rehabilitation that followed, Mr. Costello developed a relationship with one of his nurses, Krista D’Agostino, and they soon became engaged. Mr. Costello posted a picture of the ring on Facebook. “I now realize why I was involved in the tragedy,” he wrote. “It was to meet my best friend, and the love of my life.” Mr. Costello is not alone in finding meaning in life events. People regularly do so for both terrible incidents, such as being injured in an explosion, and positive ones, like being cured of a serious disease. As the phrase goes, everything happens for a reason. Where does this belief come from? One theory is that it reflects religious teachings — we think that events have meaning because we believe in a God that plans for us, sends us messages, rewards the good and punishes the bad. But research from the Yale Mind and Development Lab, where we work, suggests that this can’t be the whole story. In one series of studies, recently published in the journal Cognition, we asked people to reflect on significant events from their own lives, such as graduations, the births of children, falling in love, the deaths of loved ones and serious illnesses. Unsurprisingly, a majority of religious believers said they thought that these events happened for a reason and that they had been purposefully designed (presumably by God). But many atheists did so as well, and a majority of atheists in a related study also said that they believed in fate — defined as the view that life events happen for a reason and that there is an underlying order to life that determines how events turn out. © 2014 The New York Times Company
Link ID: 20219 - Posted: 10.20.2014
By Smitha Mundasad Health reporter, BBC News Scientists have uncovered hidden signatures in the brains of people in vegetative states that suggest they may have a glimmer of consciousness. Doctors normally consider these patients - who have severe brain injuries - to be unaware of the world around them although they appear awake. Researchers hope their work will help identify those who are actually conscious, but unable to communicate. Their report appears in PLoS Computational Biology. After catastrophic brain injuries, for example due to car crashes or major heart attacks, some people can appear to wake up yet do not respond to events around them. Doctors describe these patients as being in a vegetative state. Patients typically open their eyes and look around, but cannot react to commands or make any purposeful movements. Some people remain in this state for many years. But a handful of recent studies have questioned this diagnosis - suggesting some patients may actually be aware of what is going on around them, but unable to communicate. A team of scientists at Cambridge University studied 13 patients in vegetative states, mapping the electrical activity of their nerves using a mesh of electrodes applied to their scalps. The electrical patterns and connections they recorded were then compared with those of healthy volunteers. The study reveals four of the 13 patients had an electrical signature that was very similar to those seen in the volunteers. Dr Srivas Chennu, who led the research, said: "This suggests some of the brain networks that support consciousness in healthy adults may be well-preserved in a number of people in persistent vegetative state too." BBC © 2014
Link ID: 20217 - Posted: 10.18.2014
Daniel Cressey Mirrors are often used to elicit aggression in animal behavioural studies, with the assumption being that creatures unable to recognize themselves will react as if encountering a rival. But research suggests that such work may simply reflect what scientists expect to see, and not actual aggression. For most people, looking in a mirror does not trigger a bout of snarling hostility at the face staring back. But many animals do seem to react aggressively to their mirror image, and for years mirrors have been used to trigger such responses for behavioural research on species ranging from birds to fish. “There’s been a very long history of using a mirror as it’s just so handy,” says Robert Elwood, an animal-behaviour researcher at Queen’s University in Belfast, UK. Using a mirror radically simplifies aggression experiments, cutting down the number of animals required and providing the animal being observed with an ‘opponent’ perfectly matched in terms of size and weight. But in a study just published in Animal Behaviour, Elwood and his team add to evidence that many mirror studies are flawed. The researchers looked at how convict cichlid fish (Amatitlania nigrofasciata) reacted both to mirrors and to real fish of their own species. This species prefers to display its right side in aggression displays, which means that two displaying fish end up alongside each other in a head-to-tail configuration. It is impossible for a fish to achieve this with its own reflection, but Elwood reasoned that fish faced with a mirror would attempt it, and flip from side to side as they tried to present an aggressive display. On the other hand, if the reflection did not trigger an aggressive reaction, the fish would not display such behaviour as much or as frequently. © 2014 Nature Publishing Group,
By MICHAEL S. A. GRAZIANO Of the three most fundamental scientific questions about the human condition, two have been answered. First, what is our relationship to the rest of the universe? Copernicus answered that one. We’re not at the center. We’re a speck in a large place. Second, what is our relationship to the diversity of life? Darwin answered that one. Biologically speaking, we’re not a special act of creation. We’re a twig on the tree of evolution. Third, what is the relationship between our minds and the physical world? Here, we don’t have a settled answer. We know something about the body and brain, but what about the subjective life inside? Consider that a computer, if hooked up to a camera, can process information about the wavelength of light and determine that grass is green. But we humans also experience the greenness. We have an awareness of information we process. What is this mysterious aspect of ourselves? Many theories have been proposed, but none has passed scientific muster. I believe a major change in our perspective on consciousness may be necessary, a shift from a credulous and egocentric viewpoint to a skeptical and slightly disconcerting one: namely, that we don’t actually have inner feelings in the way most of us think we do. Imagine a group of scholars in the early 17th century, debating the process that purifies white light and rids it of all colors. They’ll never arrive at a scientific answer. Why? Because despite appearances, white is not pure. It’s a mixture of colors of the visible spectrum, as Newton later discovered. The scholars are working with a faulty assumption that comes courtesy of the brain’s visual system. The scientific truth about white (i.e., that it is not pure) differs from how the brain reconstructs it. © 2014 The New York Times Company
Link ID: 20196 - Posted: 10.11.2014
By Gretchen Reynolds Encourage young boys and girls to run, jump, squeal, hop and chase after each other or after erratically kicked balls, and you substantially improve their ability to think, according to the most ambitious study ever conducted of physical activity and cognitive performance in children. The results underscore, yet again, the importance of physical activity for children’s brain health and development, especially in terms of the particular thinking skills that most affect academic performance. The news that children think better if they move is hardly new. Recent studies have shown that children’s scores on math and reading tests rise if they go for a walk beforehand, even if the children are overweight and unfit. Other studies have found correlations between children’s aerobic fitness and their brain structure, with areas of the brain devoted to thinking and learning being generally larger among youngsters who are more fit. But these studies were short-term or associational, meaning that they could not tease out whether fitness had actually changed the children’s brains or if children with well-developed brains just liked exercise. So for the new study, which was published in September in Pediatrics, researchers at the University of Illinois at Urbana-Champaign approached school administrators at public elementary schools in the surrounding communities and asked if they could recruit the school’s 8- and 9-year-old students for an after-school exercise program. This group was of particular interest to the researchers because previous studies had determined that at that age, children typically experience a leap in their brain’s so-called executive functioning, which is the ability to impose order on your thinking. Executive functions help to control mental multitasking, maintain concentration, and inhibit inappropriate responses to mental stimuli. © 2014 The New York Times Company
By Clare Wilson If you’re facing surgery, this may well be your worst nightmare: waking up while under the knife without medical staff realizing. The biggest-ever study of this phenomenon is shedding light on what such an experience feels like and is causing debate about how best to prevent it. For a one-year period starting in 2012, an anesthetist at every hospital in the United Kingdom and Ireland recorded every case where a patient told a staff member that he had been awake during surgery. Prompted by these reports, the researchers investigated 300 cases, interviewing the patient and doctors involved. One of the most striking findings, says the study’s lead author, Jaideep Pandit of Oxford University Hospitals, was that pain was not generally the worst part of the experience: It was paralysis. For some operations, paralyzing drugs are given to relax muscles and stop reflex movements. “Pain was something they understood, but very few of us have experienced what it’s like to be paralyzed,” Pandit says. “They thought they had been buried alive.” “I thought I was about to die,” says Sandra, who regained consciousness but was unable to move during a dental operation when she was 12 years old. “It felt as though nothing would ever work again — as though the anesthetist had removed everything apart from my soul.”
Link ID: 20168 - Posted: 10.07.2014
James Hamblin Mental exercises to build (or rebuild) attention span have shown promise recently as adjuncts or alternatives to amphetamines in addressing symptoms common to Attention Deficit Hyperactivity Disorder (ADHD). Building cognitive control, to be better able to focus on just one thing, or single-task, might involve regular practice with a specialized video game that reinforces "top-down" cognitive modulation, as was the case in a popular paper in Nature last year. Cool but still notional. More insipid but also more clearly critical to addressing what's being called the ADHD epidemic is plain old physical activity. This morning the medical journal Pediatrics published research that found kids who took part in a regular physical activity program showed important enhancement of cognitive performance and brain function. The findings, according to University of Illinois professor Charles Hillman and colleagues, "demonstrate a causal effect of a physical activity program on executive control, and provide support for physical activity for improving childhood cognition and brain health." If it seems odd that this is something that still needs support, that's because it is odd, yes. Physical activity is clearly a high, high-yield investment for all kids, but especially those who are inattentive or hyperactive. This brand of research is still published and written about as though it were a novel finding, in part because exercise programs for kids remain underfunded and underprioritized in many school curricula, even though exercise is clearly integral to maximizing the utility of time spent in class. The improvements in this case came in executive control, which consists of inhibition (resisting distraction, maintaining focus), working memory, and cognitive flexibility (switching between tasks). The images above show the brain activity in the group of kids who did the program as opposed to the group that didn't. It's the kind of difference that's so dramatic it's a little unsettling.
The study only lasted nine months, but when you're only seven years old, nine months is a long time to be sitting in class with a blue head. © 2014 by The Atlantic Monthly Group.
Link ID: 20152 - Posted: 10.02.2014
By Erik Parens Will advances in neuroscience move reasonable people to abandon the idea that criminals deserve to be punished? Some researchers working at the intersection of psychology, neuroscience and philosophy think the answer is yes. Their reasoning is straightforward: if the idea of deserving punishment depends upon the idea that criminals freely choose their actions, and if neuroscience reveals that free choice is an illusion, then we can see that the idea of deserving punishment is nonsense. As Joshua Greene and Jonathan Cohen speculated in a 2004 essay: “new neuroscience will undermine people’s common sense, libertarian conception of free will and the retributivist thinking that depends on it, both of which have heretofore been shielded by the inaccessibility of sophisticated thinking about the mind and its neural basis.” Just as we need two eyes that integrate slightly different information about one scene to achieve visual depth perception, we need to view ourselves through two lenses to gain a greater depth of understanding of ourselves. This past summer, Greene and several other colleagues did empirical work that appears to confirm that 2004 speculation. The new work finds that when university students learn about “the neural basis of behavior” — quite simply, the brain activity underlying human actions —they become less supportive of the idea that criminals deserve to be punished. According to the study’s authors, once students are led to question the concept of free will — understood as the idea that humans “can generate spontaneous choices and actions not determined by prior events” — they begin to find the idea of “just deserts” untenable. “When genuine choice is deemed impossible, condemnation is less justified,” the authors write. © 2014 The New York Times Company
By ROBERT KOLKER Reggie Shaw is the man responsible for the most moving portion of “From One Second to the Next,” the director Werner Herzog’s excruciating (even by Werner Herzog standards) 35-minute public service announcement, released last year as part of AT&T’s “It Can Wait” campaign against texting and driving. In the film, Shaw, now in his 20s, recounts the rainy morning in September 2006 that he crossed the line of a Utah highway, knocking into a car containing two scientists, James Furfaro and Keith O’Dell, who were heading to work nearby. Both men were killed. Shaw says he was texting a girlfriend at the time, adding in unmistakable anguish that he can’t even remember what he was texting about. He is next seen taking part in something almost inconceivable: He enters the scene where one of the dead men’s daughters is being interviewed, and receives from that woman a warm, earnest, tearful, cathartic hug. Reggie Shaw’s redemptive journey — from thoughtless, inadvertent killer to denier of his own culpability to one of the nation’s most powerful spokesmen on the dangers of texting while behind the wheel — was first brought to national attention by Matt Richtel, a reporter for The New York Times, whose series of articles about distracted driving won a Pulitzer Prize in 2010. Now, five years later, in “A Deadly Wandering,” Richtel gives Shaw’s story the thorough, emotional treatment it is due, interweaving a detailed chronicle of the science behind distracted driving. As an instructive social parable, Richtel’s densely reported, at times forced yet compassionate and persuasive book deserves a spot next to “Fast Food Nation” and “To Kill a Mockingbird” in America’s high school curriculums. To say it may save lives is self-evident. What makes the deaths in this book so affecting is how ordinary they are. Two men get up in the morning. They get behind the wheel. A stranger loses track of his car. They crash. The two men die. 
The temptation is to make the tragedy bigger than it is, to invest it with meaning. Which may explain why Richtel wonders early on if Reggie Shaw lied about texting and driving at first because he was in denial, or because technology “can hijack the brain,” polluting his memory. © 2014 The New York Times Company
Link ID: 20124 - Posted: 09.27.2014
Some people don't just work — they text, Snapchat, check Facebook and Tinder, listen to music and work. And a new study reveals those multitaskers have brains that look different from those of people who stick to one task. Researchers at the University of Sussex scanned 75 adults using fMRI to examine their gray matter. Those who admitted to multitasking with a variety of electronic devices at once had less dense gray matter in their anterior cingulate cortexes (ACC). This region supports executive functions, such as working memory, reasoning, planning and execution. There is no way of knowing if people with smaller anterior cingulate cortexes are more likely to multitask or if multitaskers are shrinking their gray matter. It could even show that our brains become more efficient from multitasking, said Dr. Gary Small, director of UCLA’s Memory and Aging Research Center at the Semel Institute for Neuroscience and Human Behavior, who was not involved in the study. “When you exercise the brain … it becomes effective at performing a mental task,” he said. While previous research has shown that multitasking leads to more mistakes, Small said research remains important to our understanding of something we’re all guilty of doing.
Link ID: 20115 - Posted: 09.25.2014
By Katy Waldman In the opening chapter of Book 1 of My Struggle, by Karl Ove Knausgaard, the 8-year-old narrator sees a ghost in the waves. He is watching a televised report of a rescue effort at sea—“the sky is overcast, the gray-green swell heavy but calm”—when suddenly, on the surface of the water, “the outline of a face emerges.” We might guess from this anecdote that Karl, our protagonist, is both creative and troubled. His limber mind discerns patterns in chaos, but the patterns are illusions. “The lunatic, the lover, and the poet,” Shakespeare wrote, “have such seething brains, such shaping fantasies.” Their imaginations give “to airy nothing a local habitation and a name.” A seething brain can be a great asset for an artist, but, like Knausgaard’s churning, gray-green swell, it can be dangerous too. Inspired metaphors, paranormal beliefs, conspiracy theories, and delusional episodes may all exist on a single spectrum, recent research suggests. The name for the concept that links them is apophenia. A German scientist named Klaus Conrad coined apophanie (from the Greek apo, away, and phaenein, to show) in 1958. He was describing the acute stage of schizophrenia, during which unrelated details seem saturated in connections and meaning. Unlike an epiphany—a true intuition of the world’s interconnectedness—an apophany is a false realization. Swiss psychologist Peter Brugger introduced the term into English when he penned a chapter in a 2001 book on hauntings and poltergeists. Apophenia, he said, was a weakness of human cognition: the “pervasive tendency … to see order in random configurations,” an “unmotivated seeing of connections,” the experience of “delusion as revelation.” On the phone he unveiled his favorite formulation yet: “the tendency to be overwhelmed by meaningful coincidences.” © 2014 The Slate Group LLC.
Link ID: 20088 - Posted: 09.18.2014
by Bethany Brookshire Most of us wish we ate better. I know I certainly do. But when hunger strikes, and you’re standing in line at the grab-and-go food joint, that salad seems really lackluster sitting next to that tasty-looking cookie. I can’t help but think that my diet — and my waistline — would look a lot better if I just craved lettuce a little more. Now a new study shows that although we may never cease to love cookies, we might be able to make that carrot a little more appealing. In overweight people, a behavioral intervention was associated with changes in how their brains responded to high- and low-calorie foods. The small pilot study is intriguing, but with just 13 participants, a larger study is needed before scientists will know if training the brain can make us abstain. “Everyone responds more strongly to high-calorie foods than low-calorie foods. It’s just normal,” says study coauthor Susan Roberts, a behavioral nutrition scientist from Tufts University in Medford, Mass. While most people prefer brownies over beets, people who are overweight or obese have a harder time avoiding high-calorie foods, she says. “When someone becomes overweight, there’s a dampening effect on a number of brain structures, including the reward system,” she says. “It’s harder to enjoy food generally, and so when someone becomes overweight, they really want to eat those high-calorie foods, because those are the foods that activate reward systems to the biggest extent.” Craving is a particular issue. Craving is distinct from hunger and focuses on a particular food, often foods that are high calorie. Other studies show that people who are obese have more cravings than those who are not. © Society for Science & the Public 2000 - 2014
By Megan Allison Diagnoses of Attention Deficit Hyperactivity Disorder are on the rise. The Centers for Disease Control and Prevention calculated that by 2011, 11 percent of children had been diagnosed with ADHD, and 6.1 percent of all US children were taking an ADHD medication. But could a solution be as simple as exercise? A study published this month in the Journal of Abnormal Child Psychology found that aerobic activity sessions before school helped children with ADHD with their moods and attention spans. The study involved a group of just over 200 students in kindergarten through second grade at schools in Indiana and Vermont. For 12 weeks, the students did 31 minutes of physical activity before school. The control group participated in classroom activities during this time. Although the results showed that all students showed improvement, authors noted that the exercise especially helped kids with ADHD. “It benefits all the kids, but I definitely see where it helps the kids with ADHD a lot,” said Jill Fritz, a fourth-grade teacher in Jacksonville, Fla., in an interview with The Wall Street Journal. “It really helps them get back on track and get focused.” In the Boston area, Dr. Sarah Sparrow Benes, Program Director of Physical and Health Education Programs at Boston University, teaches elementary and special educators how to use movement as a strategy in their classroom for learning. She finds that all students can benefit from exercise.
Ewen Callaway A dozen volunteers watched Alfred Hitchcock for science while lying motionless in a magnetic-resonance scanner. Another participant, a man who has lived in a vegetative state for 16 years, showed brain activity remarkably similar to that of the healthy volunteers — suggesting that plot structure had an impact on him. The study is published in this week's Proceedings of the National Academy of Sciences. The film, a 1961 episode of the TV show Alfred Hitchcock Presents that had been condensed to 8 minutes, is a study in suspense. In it, a 5-year-old totes a partially loaded revolver — which she thinks is a toy — around her suburban neighbourhood, shouting “bang” each time she aims at someone and squeezes the trigger. While the study participants watched the film, researchers monitored their brain activity by functional magnetic resonance imaging (fMRI). All 12 healthy participants showed similar patterns of activity, particularly in parts of the brain that have been linked to higher cognition (frontal and parietal regions) as well as in regions involved in processing sensory information (auditory and visual cortices). One behaviourally non-responsive person, a 20-year-old woman, showed patterns of brain activity only in sensory areas. But another person, a 34-year-old man who has been in a vegetative state since he was 18, had patterns of brain activity in the executive and sensory brain areas similar to those of the healthy subjects. “It was actually indistinguishable from a healthy participant watching the movie,” says Adrian Owen, a neuroscientist at the University of Western Ontario in London, Canada (see: 'Neuroscience: The mind reader'). © 2014 Nature Publishing Group
Link ID: 20080 - Posted: 09.16.2014
By GARY GUTTING Sam Harris is a neuroscientist and prominent “new atheist,” who along with others like Richard Dawkins, Daniel Dennett and Christopher Hitchens helped put criticism of religion at the forefront of public debate in recent years. In two previous books, “The End of Faith” and “Letter to a Christian Nation,” Harris argued that theistic religion has no place in a world of science. In his latest book, “Waking Up,” his thought takes a new direction. While still rejecting theism, Harris nonetheless makes a case for the value of “spirituality,” which he bases on his experiences in meditation. I interviewed him recently about the book and some of the arguments he makes in it. Gary Gutting: A common basis for atheism is naturalism — the view that only science can give a reliable account of what’s in the world. But in “Waking Up” you say that consciousness resists scientific description, which seems to imply that it’s a reality beyond the grasp of science. Have you moved away from an atheistic view? Sam Harris: I don’t actually argue that consciousness is “a reality” beyond the grasp of science. I just think that it is conceptually irreducible — that is, I don’t think we can fully understand it in terms of unconscious information processing. Consciousness is “subjective”— not in the pejorative sense of being unscientific, biased or merely personal, but in the sense that it is intrinsically first-person, experiential and qualitative. The only thing in this universe that suggests the reality of consciousness is consciousness itself. Many philosophers have made this argument in one way or another — Thomas Nagel, John Searle, David Chalmers. And while I don’t agree with everything they say about consciousness, I agree with them on this point. © 2014 The New York Times Company
Link ID: 20056 - Posted: 09.10.2014
By Smitha Mundasad Health reporter, BBC News More than 300 people a year in the UK and Ireland report they have been conscious during surgery - despite being given general anaesthesia. In the largest study of its kind, scientists suggest this happens in one in every 19,000 operations. They found episodes were more likely when women were given general anaesthesia for Caesarean sections or patients were given certain drugs. Experts say that though such cases are rare, much more needs to be done to prevent them. Led by the Royal College of Anaesthetists and the Association of Anaesthetists of Great Britain and Ireland, researchers studied three million operations over a period of one year. More than 300 people reported they had experienced some level of awareness during surgery. Most episodes were short-lived and occurred before surgery started or after operations were completed. But some 41% of cases resulted in long-term psychological harm. Patients described a variety of experiences - from panic and pain to choking - though not all episodes caused concern. The most alarming were feelings of paralysis and being unable to communicate, the researchers say. One patient, who wishes to remain anonymous, described her experiences of routine orthodontic surgery at the age of 12. She said: "I could hear voices around me and I realised with horror that I had woken up in the middle of the operation but couldn't move a muscle. BBC © 2014
By Jena McGregor We've all heard the conventional wisdom for better managing our time and organizing our professional and personal lives. Don't try to multitask. Turn the email and Facebook alerts off to help stay focused. Make separate to-do lists for tasks that require a few minutes, a few hours and long-term planning. But what's grounded in real evidence and what's not? In his new book The Organized Mind, Daniel Levitin — a McGill University professor of psychology and behavioral neuroscience — explores how having a basic understanding of the way the brain works can help us think about organizing our homes, our businesses, our time and even our schools in an age of information overload. We spoke with Levitin about why multi-tasking never works, what images of good leaders' brains actually look like, and why email and Twitter are so incredibly addicting. The following transcript of our conversation has been edited for length and clarity. Q. What was your goal in writing this book? A. Neuroscientists have learned a lot in the last 10 or 15 years about how the brain organizes information, and why we pay attention to some things and forget others. But most of this information hasn't trickled down to the average reader. There are a lot of books about how to get organized and a lot of books about how to be better and more productive at business, but I don't know of one that grounds any of these in the science.
Link ID: 20049 - Posted: 09.09.2014