Links for Keyword: Consciousness


Links 1 - 20 of 81

David DiSalvo @neuronarrative One of the lively debates spawned from the neuroscience revolution has to do with whether humans possess free will, or merely feel as if we do. If we truly possess free will, then we each consciously control our decisions and actions. If we feel as if we possess free will, then our sense of control is a useful illusion—one that neuroscience will increasingly dispel as it gets better at predicting how brain processes yield decisions. For those in the free-will-as-illusion camp, the subjective experience of decision ownership is not unimportant, but it is predicated on neural dynamics that are scientifically knowable, traceable and—in time—predictable. One piece of evidence supporting this position has come from neuroscience research showing that brain activity underlying a given decision occurs before a person consciously apprehends the decision. In other words, thought patterns leading to conscious awareness of what we’re going to do are already in motion before we know we’ll do it. Without conscious knowledge of why we’re choosing as we’re choosing, the argument follows, we cannot claim to be exercising “free” will. Those supporting a purer view of free will argue that whether or not neuroscience can trace brain activity underlying decisions, making the decision still resides within the domain of an individual’s mind. In this view, parsing unconscious and conscious awareness is less important than the ultimate outcome – a decision, and subsequent action, emerging from a single mind. If free will is drained of its power by scientific determinism, free-will supporters argue, then we’re moving down a dangerous path where people can’t be held accountable for their decisions, since those decisions are triggered by neural activity occurring outside of conscious awareness. Consider how this might play out in a courtroom in which neuroscience evidence is marshalled to defend a murderer on grounds that he couldn’t know why he acted as he did.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20232 - Posted: 10.23.2014

By Smitha Mundasad Health reporter, BBC News Scientists have uncovered hidden signatures in the brains of people in vegetative states that suggest they may have a glimmer of consciousness. Doctors normally consider these patients - who have severe brain injuries - to be unaware of the world around them although they appear awake. Researchers hope their work will help identify those who are actually conscious, but unable to communicate. Their report appears in PLoS Computational Biology. After catastrophic brain injuries, for example due to car crashes or major heart attacks, some people can appear to wake up yet do not respond to events around them. Doctors describe these patients as being in a vegetative state. Patients typically open their eyes and look around, but cannot react to commands or make any purposeful movements. Some people remain in this state for many years. But a handful of recent studies have questioned this diagnosis - suggesting some patients may actually be aware of what is going on around them, but unable to communicate. A team of scientists at Cambridge University studied 13 patients in vegetative states, mapping the electrical activity of their brains using a mesh of electrodes applied to their scalps. The electrical patterns and connections they recorded were then compared with those of healthy volunteers. The study reveals that four of the 13 patients had an electrical signature very similar to that seen in the volunteers. Dr Srivas Chennu, who led the research, said: "This suggests some of the brain networks that support consciousness in healthy adults may be well-preserved in a number of people in persistent vegetative state too." BBC © 2014
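
As a rough illustration of what comparing "electrical patterns and connections" across people can look like in practice, here is a minimal sketch, and only a sketch: it is not the Cambridge group's published pipeline, and the function name, channel count, sampling rate and synthetic data are all assumptions made for the example. It summarizes a multichannel EEG recording by its average pairwise alpha-band coherence, a crude stand-in for the connectivity measures such studies report.

```python
import numpy as np
from scipy.signal import coherence

def mean_alpha_coherence(eeg, fs=250.0, band=(8.0, 13.0)):
    """Average pairwise spectral coherence in the alpha band across all
    channel pairs, as a crude one-number summary of EEG 'connectivity'.

    eeg : array of shape (n_channels, n_samples)
    """
    n_ch = eeg.shape[0]
    vals = []
    for i in range(n_ch):
        for j in range(i + 1, n_ch):
            f, cxy = coherence(eeg[i], eeg[j], fs=fs, nperseg=int(2 * fs))
            in_band = (f >= band[0]) & (f <= band[1])
            vals.append(cxy[in_band].mean())
    return float(np.mean(vals))

# Hypothetical usage: random noise stands in for real recordings.
rng = np.random.default_rng(0)
patient = rng.standard_normal((8, 5000))
controls = [rng.standard_normal((8, 5000)) for _ in range(5)]
print("patient :", round(mean_alpha_coherence(patient), 3))
print("controls:", [round(mean_alpha_coherence(c), 3) for c in controls])
```

In this toy setup, a patient whose summary value falls within the range seen in healthy controls would correspond to the "well-preserved networks" signature Chennu describes; the published study uses much richer connectivity and graph measures than this.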

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20217 - Posted: 10.18.2014

Daniel Cressey Mirrors are often used to elicit aggression in animal behavioural studies, with the assumption being that creatures unable to recognize themselves will react as if encountering a rival. But research suggests that such work may simply reflect what scientists expect to see, and not actual aggression. For most people, looking in a mirror does not trigger a bout of snarling hostility at the face staring back. But many animals do seem to react aggressively to their mirror image, and for years mirrors have been used to trigger such responses for behavioural research on species ranging from birds to fish. “There’s been a very long history of using a mirror as it’s just so handy,” says Robert Elwood, an animal-behaviour researcher at Queen’s University in Belfast, UK. Using a mirror radically simplifies aggression experiments, cutting down the number of animals required and providing the animal being observed with an ‘opponent’ perfectly matched in terms of size and weight. But in a study just published in Animal Behaviour, Elwood and his team add to evidence that many mirror studies are flawed. The researchers looked at how convict cichlid fish (Amatitlania nigrofasciata) reacted both to mirrors and to real fish of their own species. This species prefers to present its right side in aggression displays, which means that two rivals end up alongside each other in a head-to-tail configuration. It is impossible for a fish to achieve this with its own reflection, but Elwood reasoned that fish faced with a mirror would attempt it, and flip from side to side as they tried to present an aggressive display. On the other hand, if the reflection did not trigger an aggressive reaction, the fish would not display such behaviour as much or as frequently. © 2014 Nature Publishing Group

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 20202 - Posted: 10.13.2014

By MICHAEL S. A. GRAZIANO OF the three most fundamental scientific questions about the human condition, two have been answered. First, what is our relationship to the rest of the universe? Copernicus answered that one. We’re not at the center. We’re a speck in a large place. Second, what is our relationship to the diversity of life? Darwin answered that one. Biologically speaking, we’re not a special act of creation. We’re a twig on the tree of evolution. Third, what is the relationship between our minds and the physical world? Here, we don’t have a settled answer. We know something about the body and brain, but what about the subjective life inside? Consider that a computer, if hooked up to a camera, can process information about the wavelength of light and determine that grass is green. But we humans also experience the greenness. We have an awareness of information we process. What is this mysterious aspect of ourselves? Many theories have been proposed, but none has passed scientific muster. I believe a major change in our perspective on consciousness may be necessary, a shift from a credulous and egocentric viewpoint to a skeptical and slightly disconcerting one: namely, that we don’t actually have inner feelings in the way most of us think we do. Imagine a group of scholars in the early 17th century, debating the process that purifies white light and rids it of all colors. They’ll never arrive at a scientific answer. Why? Because despite appearances, white is not pure. It’s a mixture of colors of the visible spectrum, as Newton later discovered. The scholars are working with a faulty assumption that comes courtesy of the brain’s visual system. The scientific truth about white (i.e., that it is not pure) differs from how the brain reconstructs it. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20196 - Posted: 10.11.2014

By Clare Wilson If you’re facing surgery, this may well be your worst nightmare: waking up while under the knife without medical staff realizing. The biggest-ever study of this phenomenon is shedding light on what such an experience feels like and is causing debate about how best to prevent it. For a one-year period starting in 2012, an anesthetist at every hospital in the United Kingdom and Ireland recorded every case where a patient told a staff member that he had been awake during surgery. Prompted by these reports, the researchers investigated 300 cases, interviewing the patient and doctors involved. One of the most striking findings, says the study’s lead author, Jaideep Pandit of Oxford University Hospitals, was that pain was not generally the worst part of the experience: It was paralysis. For some operations, paralyzing drugs are given to relax muscles and stop reflex movements. “Pain was something they understood, but very few of us have experienced what it’s like to be paralyzed,” Pandit says. “They thought they had been buried alive.” “I thought I was about to die,” says Sandra, who regained consciousness but was unable to move during a dental operation when she was 12 years old. “It felt as though nothing would ever work again — as though the anesthetist had removed everything apart from my soul.”

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20168 - Posted: 10.07.2014

By Erik Parens Will advances in neuroscience move reasonable people to abandon the idea that criminals deserve to be punished? Some researchers working at the intersection of psychology, neuroscience and philosophy think the answer is yes. Their reasoning is straightforward: if the idea of deserving punishment depends upon the idea that criminals freely choose their actions, and if neuroscience reveals that free choice is an illusion, then we can see that the idea of deserving punishment is nonsense. As Joshua Greene and Jonathan Cohen speculated in a 2004 essay: “new neuroscience will undermine people’s common sense, libertarian conception of free will and the retributivist thinking that depends on it, both of which have heretofore been shielded by the inaccessibility of sophisticated thinking about the mind and its neural basis.” This past summer, Greene and several other colleagues did empirical work that appears to confirm that 2004 speculation. The new work finds that when university students learn about “the neural basis of behavior” — quite simply, the brain activity underlying human actions — they become less supportive of the idea that criminals deserve to be punished. According to the study’s authors, once students are led to question the concept of free will — understood as the idea that humans “can generate spontaneous choices and actions not determined by prior events” — they begin to find the idea of “just deserts” untenable. “When genuine choice is deemed impossible, condemnation is less justified,” the authors write. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 20131 - Posted: 09.29.2014

By Dan Jones The vast majority of people think we have free will and are the authors of our own life stories. But if neuroscientists were one day able to predict our every action based on brain scans, would people abandon this belief in droves? A new study concludes that such knowledge would not by itself be enough to shake our confidence in our own volition. Many neuroscientists, such as the late Francis Crick, have argued that our sense of free will is no more than the behaviour of a vast assembly of nerve cells. This is tied to the idea of determinism, which has it that every effect is preceded by a cause, with cause and effect connected by physical laws. This is why the behaviour of physical systems can be predicted – even the brain, in principle. As author Sam Harris puts it: "If determinism is true, the future is set – and this includes all our future states of mind and our subsequent behaviour." If people lost their belief in their own free will, that would have important consequences for how we think about moral responsibility, and even how we behave. For example, numerous studies have shown that when people are led to reject free will they are more likely to cheat, and are also less bothered about punishing other wrongdoers. For those who argue that what we know about neuroscience is incompatible with free will, predicting what our brain is about to do should reveal the illusory nature of free will, and lead people to reject it. Experimental philosopher Eddy Nahmias at Georgia State University in Atlanta dubs this view "willusionism". He recently set out to test it. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 20102 - Posted: 09.22.2014

Ewen Callaway A dozen volunteers watched Alfred Hitchcock for science while lying motionless in a magnetic-resonance scanner. Another participant, a man who has lived in a vegetative state for 16 years, showed brain activity remarkably similar to that of the healthy volunteers — suggesting that plot structure had an impact on him. The study is published in this week's Proceedings of the National Academy of Sciences. The film, a 1961 episode of the TV show Alfred Hitchcock Presents that had been condensed down to 8 minutes, is a study in suspense. In it, a 5-year-old totes a partially loaded revolver — which she thinks is a toy — around her suburban neighbourhood, shouting “bang” each time she aims at someone and squeezes the trigger. While the study participants watched the film, researchers monitored their brain activity by functional magnetic resonance imaging (fMRI). All 12 healthy participants showed similar patterns of activity, particularly in parts of the brain that have been linked to higher cognition (frontal and parietal regions) as well as in regions involved in processing sensory information (auditory and visual cortices). One behaviourally non-responsive person, a 20-year-old woman, showed patterns of brain activity only in sensory areas. But another person, a 34-year-old man who has been in a vegetative state since he was 18, had patterns of brain activity in the executive and sensory brain areas, similar to those of the healthy subjects. “It was actually indistinguishable from a healthy participant watching the movie,” says Adrian Owen, a neuroscientist at the University of Western Ontario in London, Canada (see: 'Neuroscience: The mind reader'). © 2014 Nature Publishing Group
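
The claim that one patient's activity was "indistinguishable from a healthy participant" is the kind of statement an inter-subject correlation analysis makes precise: correlate each person's regional time course with the group's. The sketch below illustrates that general idea only; it is not the study's actual analysis, and the array shapes, variable names and synthetic data are assumptions for the example.

```python
import numpy as np

def intersubject_correlation(subject_ts, group_ts):
    """Pearson correlation between one participant's ROI time series and the
    mean time series of a reference group, returning one value per region.

    subject_ts : array (n_rois, n_timepoints) for the test participant
    group_ts   : array (n_subjects, n_rois, n_timepoints) for the reference group
    """
    group_mean = group_ts.mean(axis=0)  # average group response, per ROI
    return np.array([
        np.corrcoef(subject_ts[roi], group_mean[roi])[0, 1]
        for roi in range(subject_ts.shape[0])
    ])

# Hypothetical usage: synthetic data standing in for movie-watching fMRI.
rng = np.random.default_rng(1)
controls = rng.standard_normal((12, 10, 200))              # 12 healthy viewers, 10 ROIs
patient = controls.mean(axis=0) + 0.5 * rng.standard_normal((10, 200))
print(intersubject_correlation(patient, controls).round(2))
```

High correlations in frontal and parietal regions, not just in sensory cortex, would be the pattern the article describes for the 34-year-old patient.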

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20080 - Posted: 09.16.2014

By GARY GUTTING Sam Harris is a neuroscientist and prominent “new atheist,” who along with others like Richard Dawkins, Daniel Dennett and Christopher Hitchens helped put criticism of religion at the forefront of public debate in recent years. In two previous books, “The End of Faith” and “Letter to a Christian Nation,” Harris argued that theistic religion has no place in a world of science. In his latest book, “Waking Up,” his thought takes a new direction. While still rejecting theism, Harris nonetheless makes a case for the value of “spirituality,” which he bases on his experiences in meditation. I interviewed him recently about the book and some of the arguments he makes in it. Gary Gutting: A common basis for atheism is naturalism — the view that only science can give a reliable account of what’s in the world. But in “Waking Up” you say that consciousness resists scientific description, which seems to imply that it’s a reality beyond the grasp of science. Have you moved away from an atheistic view? Sam Harris: I don’t actually argue that consciousness is “a reality” beyond the grasp of science. I just think that it is conceptually irreducible — that is, I don’t think we can fully understand it in terms of unconscious information processing. Consciousness is “subjective”— not in the pejorative sense of being unscientific, biased or merely personal, but in the sense that it is intrinsically first-person, experiential and qualitative. The only thing in this universe that suggests the reality of consciousness is consciousness itself. Many philosophers have made this argument in one way or another — Thomas Nagel, John Searle, David Chalmers. And while I don’t agree with everything they say about consciousness, I agree with them on this point. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20056 - Posted: 09.10.2014

By Smitha Mundasad Health reporter, BBC News More than 300 people a year in the UK and Ireland report they have been conscious during surgery - despite being given general anaesthesia. In the largest study of its kind, scientists suggest this happens in one in every 19,000 operations. They found episodes were more likely when women were given general anaesthesia for Caesarean sections or patients were given certain drugs. Experts say that, though such cases are rare, much more needs to be done to prevent them. Led by the Royal College of Anaesthetists and Association of Anaesthetists of Great Britain and Ireland, researchers studied three million operations over a period of one year. More than 300 people reported they had experienced some level of awareness during surgery. Most episodes were short-lived and occurred before surgery started or after operations were completed. But some 41% of cases resulted in long-term psychological harm. Patients described a variety of experiences - from panic and pain to choking - though not all episodes caused concern. The most alarming were feelings of paralysis and being unable to communicate, the researchers say. One patient, who wishes to remain anonymous, described her experiences of routine orthodontic surgery at the age of 12. She said: "I could hear voices around me and I realised with horror that I had woken up in the middle of the operation but couldn't move a muscle." BBC © 2014

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 14: Biological Rhythms, Sleep, and Dreaming
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 10: Biological Rhythms and Sleep
Link ID: 20055 - Posted: 09.10.2014

by Tom Siegfried René Descartes was a very clever thinker. He proved his own existence, declaring that because he thought, he must exist: “I think, therefore I am.” But the 17th century philosopher-mathematician-scientist committed a serious mental blunder when he decided that the mind doing the thinking was somehow separate from the brain it lived in. Descartes believed that thought was insubstantial, transmitted from the ether to the pineal gland, which played the role of something like a Wi-Fi receiver embedded deep in the brain. Thereafter mind-brain dualism became the prevailing prejudice. Nowadays, though, everybody with a properly working brain realizes that the mind and brain are coexistent. Thought processes and associated cognitive mental activity all reflect the physics and chemistry of cells and molecules inhabiting the brain’s biological tissue. Many people today do not realize, though, that there’s a modern version of Descartes’ mistaken dichotomy. Just as he erroneously believed the mind was distinct from the brain, some scientists have mistakenly conceived of the brain as distinct from the body. Much of the early research in artificial intelligence, for instance, modeled the brain as a computer, seeking to replicate mental life as information processing, converting inputs to outputs by logical rules. But even if such a machine could duplicate the circuitry of the brain, it would be missing essential peripheral input from an attached body. Actual intelligence requires both body and brain, as the neurologist Antonio Damasio pointed out in his 1994 book, Descartes’ Error. “Mental activity, from its simplest aspects to its most sublime, requires both brain and body proper,” Damasio wrote. © Society for Science & the Public 2000 - 2013.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 5: The Sensorimotor System
Link ID: 20002 - Posted: 08.27.2014

By KATHARINE Q. SEELYE SPARTA, N.J. — When Gail Morris came home late one night after taking her daughter to college, she saw her teenage son, Alex, asleep on the sofa in the family room. Nothing seemed amiss. An unfinished glass of apple juice sat on the table. She tucked him in under a blanket and went to bed. The next morning, he would not wake up. He was stiff and was hardly breathing. Over the next several hours, Ms. Morris was shocked to learn that her son had overdosed on heroin. She was told he would not survive. He did survive, but barely. He was in a coma for six weeks. He went blind and had no function in his arms or legs. He could not speak or swallow. Hospitalized for 14 months, Alex, who is 6-foot-1, dropped to 90 pounds. One of his doctors said that Alex had come as close to dying as anyone he knew who had not actually died. Most people who overdose on heroin either die or fully recover. But Alex plunged into a state that was neither dead nor functional. There are no national statistics on how often opioid overdose leads to cases like Alex’s, but doctors say they worry that with the dramatic increase in heroin abuse and overdoses, they will see more such outcomes. “I would expect that we will,” said Dr. Nora Volkow, director of the National Institute on Drug Abuse. “They are starting to report isolated cases like this. And I would not be surprised if you have more intermediate cases with more subtle impairment.” More than 660,000 Americans used heroin in 2012, the federal government says, double the number of five years earlier. Officials attribute much of the increase to a crackdown on prescription painkillers, prompting many users to turn to heroin, which is cheaper and easier to get than other opioids. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Link ID: 19935 - Posted: 08.11.2014

By William Skaggs One of the most frustrating and mysterious medical conditions affecting the mind is impaired consciousness, as can occur with brain damage. Patients in a coma or a vegetative or minimally conscious state sometimes spontaneously recover to varying degrees, but in most cases there is little that doctors can do to help. Now a rigorous study by a group at Liège University Hospital Center in Belgium has found that a simple treatment called transcranial direct-current stimulation (tDCS) can temporarily raise awareness in minimally conscious patients. In tDCS, electrodes are glued to the scalp, and a weak electric current is passed through them to stimulate the underlying brain tissue. Scientists led by neurologist Steven Laureys applied the electric current for 20 minutes to patients' left prefrontal cortex, an area known to be involved in attentiveness and working memory. Afterward, the effects on consciousness were measured by doctors who did not know whether the patient had received real tDCS or a sham treatment, in which the apparatus ran, but no current was delivered. For patients in a vegetative state, who display no communication or purposeful behavior, the stimulation might have led to improvement in two patients, but no statistically compelling evidence emerged. Yet 13 of 30 patients in a minimally conscious state—defined by occasional moments of low-level awareness—showed measurable gains in their responses to questions and sensory stimuli. Some had only recently been injured, but others had been minimally conscious for months. © 2014 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 19934 - Posted: 08.11.2014

By KATE MURPHY ONE of the biggest complaints in modern society is being overscheduled, overcommitted and overextended. Ask people at a social gathering how they are and the stock answer is “super busy,” “crazy busy” or “insanely busy.” Nobody is just “fine” anymore. When people aren’t super busy at work, they are crazy busy exercising, entertaining or taking their kids to Chinese lessons. Or maybe they are insanely busy playing fantasy football, tracing their genealogy or churning their own butter. And if there is ever a still moment for reflective thought — say, while waiting in line at the grocery store or sitting in traffic — out comes the mobile device. So it’s worth noting a study published last month in the journal Science, which shows how far people will go to avoid introspection. “We had noted how wedded to our devices we all seem to be and that people seem to find any excuse they can to keep busy,” said Timothy Wilson, a psychology professor at the University of Virginia and lead author of the study. “No one had done a simple study letting people go off on their own and think.” The results surprised him and have created a stir in the psychology and neuroscience communities. In 11 experiments involving more than 700 people, the majority of participants reported that they found it unpleasant to be alone in a room with their thoughts for just 6 to 15 minutes. Moreover, in one experiment, 64 percent of men and 15 percent of women began self-administering electric shocks when left alone to think. These same people, by the way, had previously said they would pay money to avoid receiving the painful jolt. It didn’t matter if the subjects engaged in the contemplative exercise at home or in the laboratory, or if they were given suggestions of what to think about, like a coming vacation; they just didn’t like being in their own heads. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 19884 - Posted: 07.26.2014

By Neuroskeptic An entertaining paper just out in Frontiers in Systems Neuroscience offers a panoramic view of the whole of neuroscience: “Enlarging the scope: grasping brain complexity”. The paper is remarkable not just for its content but also for its style. Some examples: “How does the brain work? This nagging question is an habitué from the top ten lists of enduring problems in Science’s grand challenges. Grasp this paradox: how is one human brain – a chef d’oeuvre of complexity honed by Nature – ever to reach such a feast as to understand itself? Where one brain may fail at this notorious philosophical riddle, may be a strong and diversely-skilled army of brains may come closer.” Or: “It remains an uneasy feeling that so much of Brain Science is built upon the foundation of a pair of neurons, outside the context of their networks, and with two open-ended areas of darkness at either of their extremities that must be thought of as the entire remainder of the organism’s brain (and body).” And my favorite: “As humans tend to agree, increased size makes up for smarter brains (disclosure: both authors are human)”. I love it. I’m not sure I understand it, though. The authors, Tognoli and Kelso, begin by framing a fundamental tension between directed information transfer and neural synchrony, pointing out that neurons firing perfectly in synch with each other could not transfer information between themselves.
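
That last point can be made concrete with a toy calculation; it is my illustration, not anything from the paper. For two binary "spike trains", a plug-in estimate of transfer entropy is essentially zero when the trains are perfectly synchronized, because the receiver's own past already predicts its future, but clearly positive when one train drives the other at a delay. The sequences and noise level below are invented for the example.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of transfer entropy TE(X -> Y), in bits,
    for two binary sequences of equal length (history length 1)."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))          # (y_next, y_now, x_now)
    n = len(triples)
    c_xyz = Counter(triples)
    c_yz = Counter((yn, yc) for yn, yc, _ in triples)   # (y_next, y_now)
    c_z = Counter((yc, xc) for _, yc, xc in triples)    # (y_now, x_now)
    c_y = Counter(yc for _, yc, _ in triples)           # y_now
    te = 0.0
    for (yn, yc, xc), count in c_xyz.items():
        p_joint = count / n
        p_full = count / c_z[(yc, xc)]                  # p(y_next | y_now, x_now)
        p_reduced = c_yz[(yn, yc)] / c_y[yc]            # p(y_next | y_now)
        te += p_joint * np.log2(p_full / p_reduced)
    return te

rng = np.random.default_rng(0)
driver = rng.integers(0, 2, 20000)

synced = driver.copy()                                  # perfect zero-lag synchrony
follower = np.roll(driver, 1)                           # copies the driver one step later
noise = rng.random(20000) < 0.1
follower = np.where(noise, 1 - follower, follower)      # 10% transmission noise

print("TE driver -> synchronized copy:", round(transfer_entropy(driver, synced), 3))
print("TE driver -> lagged follower  :", round(transfer_entropy(driver, follower), 3))
```

Running this prints roughly 0.0 bits for the synchronized pair and about 0.5 bits for the lagged, noisy follower, which is the sense in which perfect synchrony carries no directed information.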

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 19829 - Posted: 07.15.2014

By Dominic Basulto It turns out that the human brain may not be as mysterious as it has always seemed to be. Researchers at George Washington University, led by Mohamad Koubeissi, may have found a way to turn human consciousness on and off by targeting a specific region of the brain with electrical currents. For brain researchers, unlocking the mystery of human consciousness has always been viewed as one of the keys for eventually building an artificial brain, and so this could be a big win for the future of brain research. What the researchers did was deliver a series of high-frequency electrical impulses to the claustrum region of the brain in a woman suffering from epilepsy. Before the electric shocks, the woman was capable of writing and talking. During the electric shocks, the woman faded out of consciousness, and started staring blankly into space, incapable of even the most basic sensory functions. Even her breathing slowed. As soon as the electrical shocks stopped, the woman immediately regained her sensory skills with no memory of the event. The researchers claim that this test case is evidence of being able to turn consciousness on and off. Granted, there’s a lot still to be done. That George Washington test, for example, has only been successfully performed on one person. And that woman had already had part of her hippocampus removed, so at least one researcher says the whole experiment must be interpreted carefully. There have been plenty of scientific experiments that have been “one and done,” so it remains to be seen whether these results can be replicated.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 19817 - Posted: 07.12.2014

by Helen Thomson ONE moment you're conscious, the next you're not. For the first time, researchers have switched off consciousness by electrically stimulating a single brain area. Scientists have been probing individual regions of the brain for over a century, exploring their function by zapping them with electricity and temporarily putting them out of action. Despite this, they have never been able to turn off consciousness – until now. Although only tested in one person, the discovery suggests that a single area – the claustrum – might be integral to combining disparate brain activity into a seamless package of thoughts, sensations and emotions. It takes us a step closer to answering a problem that has confounded scientists and philosophers for millennia – namely how our conscious awareness arises. Many theories abound but most agree that consciousness has to involve the integration of activity from several brain networks, allowing us to perceive our surroundings as one single unifying experience rather than isolated sensory perceptions. One proponent of this idea was Francis Crick, a pioneering neuroscientist who earlier in his career had identified the structure of DNA. Just days before he died in July 2004, Crick was working on a paper that suggested our consciousness needs something akin to an orchestra conductor to bind all of our different external and internal perceptions together. With his colleague Christof Koch, at the Allen Institute for Brain Science in Seattle, he hypothesised that this conductor would need to rapidly integrate information across distinct regions of the brain and bind together information arriving at different times. For example, information about the smell and colour of a rose, its name, and a memory of its relevance, can be bound into one conscious experience of being handed a rose on Valentine's day. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 19787 - Posted: 07.03.2014

by Tania Lombrozo Science doesn't just further technology and help us predict and control our environment. It also changes the way we understand ourselves and our place in the natural world. This understanding can inspire wonder and a sense of hope. But it can also be unsettling, especially when it calls into question our basic assumptions about the kinds of creatures we are and the universe we inhabit. Current developments in neuroscience seem to be triggering precisely this jumble of reactions: wonder alongside disquiet, hope alongside alarm. A recent article at Salon.com, for example, promises an explanation for "how neuroscience could save addicts from relapse," while an article by Nathan Greenslit at The Atlantic, published less than a week later, raises worries that neuroscience is being used to reinforce racist drug policy. Obama's BRAIN Initiative hails a new era of brain research, but with it comes the need to rapidly work out the implications of what we're learning about the brain and about ourselves. We're learning fast, but we're not always sure what to make of it. In a recent paper in the journal Psychological Science, psychologists Azim Shariff, Joshua Greene and six of their colleagues bring these heady issues down to earth by considering whether learning about neuroscience can influence judgments in a real-world situation: deciding how someone who commits a crime should be punished. The motivating intuition is this: to hold someone responsible for her actions, she must have acted with free will. ©2014 NPR

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 19737 - Posted: 06.17.2014

Neil Levy Can human beings still be held responsible in the age of neuroscience? Some people say no: they say once we understand how the brain processes information and thereby causes behaviour, there’s nothing left over for the person to do. This argument has not impressed philosophers, who say there doesn’t need to be anything left for the person to do in order to be responsible. People are not anything over and above the causal systems involved in information processing; we are our brains (plus some other, equally physical stuff). We are responsible if our information processing systems are suitably attuned to reasons, most philosophers think. There are big philosophical debates concerning what it takes to be suitably attuned to reasons, and whether this is really enough for responsibility. But I want to set those debates aside here. It’s more interesting to ask what we can learn from neuroscience about the nature of responsibility and about when we’re responsible. Even if neuroscience doesn’t tell us that no one is ever responsible, it might be able to tell us if particular people are responsible for particular actions. A worthy case study. Consider a case like this: early one morning in 1987, a Canadian man named Ken Parks got up from the sofa where he had fallen asleep and drove to his parents-in-law’s house. There he stabbed them both before driving to the police station, where he told police he thought he had killed someone. He had: his mother-in-law died from her injuries. © 2010–2014, The Conversation Trust (UK)

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 19702 - Posted: 06.06.2014

By Matthew R. Francis Possibly no subject in science has inspired more nonsense than quantum mechanics. Sure, it’s a complicated field of study, with a few truly mysterious facets that are not settled to everyone’s satisfaction after nearly a century of work. At the same time, though, using quantum to mean “we just don’t know” is ridiculous—and simply wrong. Quantum mechanics is the basis for pretty much all our modern technology, from smartphones to fluorescent lights, digital cameras to fiber-optic communications. If I had to pick a runner-up in the nonsense sweepstakes, it would be human consciousness, another subject with a lot of mysterious aspects. We are made of ordinary matter yet are self-aware, capable of abstractly thinking about ourselves and of recognizing others (including nonhumans) as separate entities with their own needs. As a physicist, I’m fascinated by the notion that our consciousness can imagine realities other than our own: The universe is one way, but we are perfectly happy to think of how it might be otherwise. I hold degrees in physics and have spent a lot of time learning and teaching quantum mechanics. Nonphysicists seem to have the impression that quantum physics is really esoteric, with those who study it spending their time debating the nature of reality. In truth, most of a quantum mechanics class is lots and lots of math, in the service of using a particle’s quantum state—the bundle of physical properties such as position, energy, spin, and the like—to describe the outcomes of experiments. Sure, there’s some weird stuff and it’s fun to talk about, but quantum mechanics is aimed at being practical (ideally, at least). © 2014 The Slate Group LLC.
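
For readers curious what that practical math looks like at its simplest, here is a tiny sketch of the standard workflow: write down a spin-1/2 particle's quantum state and apply the Born rule to turn it into predicted measurement probabilities. The particular amplitudes are arbitrary, chosen only for the example.

```python
import numpy as np

# A spin-1/2 (qubit) state written in the z basis: |psi> = a|up> + b|down>.
a, b = 1 / np.sqrt(3), np.sqrt(2 / 3) * 1j            # any normalized pair works
psi = np.array([a, b])
assert np.isclose(np.vdot(psi, psi).real, 1.0)        # normalization check

# Born rule: probability of an outcome = |<outcome|psi>|^2.
up_z, down_z = np.array([1, 0]), np.array([0, 1])
up_x = np.array([1, 1]) / np.sqrt(2)                  # spin-up along x
p_up_z = abs(np.vdot(up_z, psi)) ** 2
p_down_z = abs(np.vdot(down_z, psi)) ** 2
p_up_x = abs(np.vdot(up_x, psi)) ** 2

print(f"P(up, z)   = {p_up_z:.3f}")    # 1/3
print(f"P(down, z) = {p_down_z:.3f}")  # 2/3
print(f"P(up, x)   = {p_up_x:.3f}")    # 1/2 for this particular state
```

Much of a first quantum mechanics course is, in effect, an elaboration of this move: evolve the state, change basis, and read off the probabilities.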

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 19671 - Posted: 05.31.2014