Chapter 14. Attention and Consciousness




By Colin Barras Subtract 8 from 52. Did you see the calculation in your head? While a leading theory suggests our visual experiences are linked to our understanding of numbers, a study of people who have been blind from birth suggests the opposite. The link between vision and number processing is strong. Sighted people can estimate the number of people in a crowd just by looking, for instance, while children who can mentally rotate an object and correctly imagine how it might look from a different angle often develop better mathematical skills. “It’s actually hard to think of a situation when you might process numbers through any modality other than vision,” says Shipra Kanjlia at Johns Hopkins University in Baltimore, Maryland. But blind people can do maths too. To understand how they might compensate for their lack of visual experience, Kanjlia and her colleagues asked 36 volunteers – 17 of whom had been blind since birth – to do simple mental arithmetic inside an fMRI scanner. To level the playing field, the sighted participants wore blindfolds. We know that a region of the brain called the intraparietal sulcus (IPS) is active when sighted people process numbers, and brain scans revealed that the same area is similarly active in blind people too. “It’s really surprising,” says Kanjlia. “It turns out brain activity is remarkably similar, at least in terms of classic number processing.” This may mean our ability to handle numbers is entirely independent of visual experience, suggesting we are all born with a natural understanding of numbers – an idea many researchers find difficult to accept. © Copyright Reed Business Information Ltd.

Keyword: Vision; Attention
Link ID: 22664 - Posted: 09.17.2016

Dean Burnett You remember that time a children’s TV presenter, one who has been working in children’s television for decades and is now employed on a channel aimed at under-8-year-olds, decided to risk it all and say one of the worst possible swear words on a show for pre-schoolers that he is famous for co-hosting? Remember how he took a huge risk for no appreciable gain and uttered a context-free profanity to an audience of toddlers? How he must have wanted to swear on children’s TV but paradoxically didn’t want anyone to notice so “snuck it in” as part of a song, where it would be more ambiguous? How all the editors and regulators at the BBC happened to completely miss it and allow it to be aired? Remember this happening? Well, you shouldn’t, because it clearly didn’t. No presenter and/or channel would risk their whole livelihood in such a pointless, meaningless way, especially not the ever-pressured BBC. And yet an alarming number of people do think it happened. Apparently, there have been some “outraged parents” who are aghast at the whole thing. This seems reasonable in some respects; if your toddler was subjected to extreme cursing then as a parent you probably would object. On the other hand, if your very small child is able to recognise strong expletives, then perhaps misheard lyrics on cheerful TV shows aren’t the most pressing issue in their life. Regardless, a surprising number of people report that they did genuinely “hear” the c-word. This is less likely to be due to a TV presenter having some sort of extremely fleeting breakdown, and more likely due to the quirks and questionable processing of our senses by our powerful yet imperfect brains. © 2016 Guardian News and Media Limited

Keyword: Hearing; Attention
Link ID: 22662 - Posted: 09.17.2016

André Corrêa d’Almeida and Amanda Sue Grossi Development. Poverty. Africa. These are just three words on a page – almost no information at all – but how many realities did our readers just conjure? And how many thoughts filled the spaces in-between? Cover yourselves. Your biases are showing. In the last few decades, groundbreaking work by psychologists and behavioural economists has exposed unconscious biases in the way we think. And as the World Bank’s 2015 World Development Report points out, development professionals are not immune to these biases. There is a real possibility that seemingly unbiased and well-intentioned development professionals are capable of making consequential mistakes, with significant impacts upon the lives of others, namely the poor. The problem arises when mindsets are just that – set. As the work of Daniel Kahneman and Amos Tversky has shown, development professionals – like people generally – have two systems of thinking: the automatic and the deliberative. With the automatic system, instead of performing complex rational calculations every time we need to make a decision, much of our thinking relies on pre-existing mental models and shortcuts. These are based on assumptions we create throughout our lives and that stem from our experiences and education. More often than not, these mental models are incomplete, and shortcuts can lead us down the wrong path. Thinking automatically then becomes thinking harmfully. © 2016 Guardian News and Media Limited

Keyword: Attention
Link ID: 22653 - Posted: 09.15.2016

Laura Sanders By sneakily influencing brain activity, scientists changed people’s opinions of faces. This covert neural sculpting relied on a sophisticated brain training technique in which people learn to direct their thoughts in specific ways. The results, published September 8 in PLOS Biology, support the idea that neurofeedback methods could help reveal how the brain’s behavior gives rise to perceptions and emotions. What’s more, the technique may ultimately prove useful for easing traumatic memories and treating disorders such as depression. The research is still at an early stage, says neurofeedback researcher Michelle Hampson of Yale University, but, she notes, “I think it has great promise.” Takeo Watanabe of Brown University and colleagues used functional MRI to measure people’s brain activity in an area called the cingulate cortex as participants saw pictures of faces. After participants had rated each face, a computer algorithm sorted their brain responses into patterns that corresponded to faces they liked and faces they disliked. With this knowledge in hand, the researchers then attempted to change people’s face preferences by subtly nudging brain activity in the cingulate cortex. In the second step of the experiment, participants returned to the fMRI scanner and saw an image of a face that they had previously rated as neutral. Just after that, they were shown a disk. The goal, the participants were told, was simple: make the disk bigger by using their brains. They had no idea that the only way to make the disk grow was to think in a very particular way. © Society for Science & the Public 2000 - 2016
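The article doesn't give Watanabe's implementation, but decoded neurofeedback of this kind has two essential pieces: a pattern classifier trained on a participant's fMRI responses, and a feedback display driven by the classifier's output. Below is a minimal sketch of that loop, assuming simulated voxel patterns and an off-the-shelf logistic-regression decoder; all names and parameters here are illustrative, not from the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stand-in for step 1: simulated cingulate-cortex voxel patterns for
# faces the participant rated as liked (1) or disliked (0).
n_trials, n_voxels = 200, 50
labels = rng.integers(0, 2, n_trials)
effect = np.outer(labels - 0.5, rng.normal(size=n_voxels))
patterns = effect + rng.normal(size=(n_trials, n_voxels))

# The "computer algorithm" that sorts brain responses into patterns.
decoder = LogisticRegression().fit(patterns, labels)

# Stand-in for step 2: the disk grows with how "liked" the current
# brain pattern looks; participants are never told this mapping.
def disk_diameter(pattern, min_px=20, max_px=200):
    p_liked = decoder.predict_proba(pattern.reshape(1, -1))[0, 1]
    return min_px + p_liked * (max_px - min_px)

probe = patterns[0] + rng.normal(scale=0.5, size=n_voxels)
print(f"disk diameter: {disk_diameter(probe):.0f} px")
```

Through trial and error, participants settle on whatever way of thinking inflates the disk, and in doing so nudge their own cingulate activity toward the "liked" pattern.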

Keyword: Attention; Learning & Memory
Link ID: 22646 - Posted: 09.12.2016

By Karen Zusi At least one type of social learning, or the ability to learn from observing others’ actions, is processed by individual neurons within a region of the human brain called the rostral anterior cingulate cortex (rACC), according to a study published today (September 6) in Nature Communications. The work is the first direct analysis in humans of the neuronal activity that encodes information about others’ behavior. “The idea [is] that there could be an area that’s specialized for processing things about other people,” says Matthew Apps, a neuroscientist at the University of Oxford who was not involved with the study. “How we think about other people might use distinct processes from how we might think about ourselves.” During the social learning experiments, a research team based at the University of California, Los Angeles (UCLA) and Caltech recorded the activity of individual neurons in the brains of epilepsy patients. The patients were undergoing a weeks-long procedure at the Ronald Reagan UCLA Medical Center in which their brains were implanted with electrodes to locate the origin of their epileptic seizures. Access to this patient population was key to the study. “It’s a very rare dataset,” says Apps. “It really does add a lot to the story.” With data streaming out of the patients’ brains, the researchers taught the subjects to play a card game on a laptop. Each turn, the patients could select from one of two decks of face-down cards: the cards either gave $10 or $100 in virtual winnings, or subtracted $10 or $100. In one deck, 70 percent of the cards were winning cards, while in the other only 30 percent were. The goal was to rack up the most money. © 1986-2016 The Scientist
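The summary doesn't say how the team modeled the patients' learning, but two-deck tasks with 70/30 win rates are classic territory for simple reinforcement-learning models, which update a value estimate for each deck by a prediction error. A sketch of the task structure and such a learner, with the learning and exploration rates chosen purely for illustration:

```python
import random

random.seed(1)

P_WIN = {"A": 0.7, "B": 0.3}        # deck A: 70% winning cards
STAKES = (10, 100)                  # cards pay or cost $10 or $100

def draw(deck):
    amount = random.choice(STAKES)
    return amount if random.random() < P_WIN[deck] else -amount

# Simple value learner: estimate each deck's worth from experience.
values = {"A": 0.0, "B": 0.0}
alpha = 0.1                         # learning rate (assumed)
total = 0
for trial in range(200):
    # Epsilon-greedy choice: mostly pick the better-looking deck.
    if random.random() < 0.1:
        deck = random.choice("AB")
    else:
        deck = max(values, key=values.get)
    outcome = draw(deck)
    total += outcome
    values[deck] += alpha * (outcome - values[deck])  # prediction error

print(f"learned values: {values}, winnings: ${total}")
```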

Keyword: Learning & Memory; Attention
Link ID: 22640 - Posted: 09.10.2016

Chris Chambers One of the most compelling impressions in everyday life is that wherever we look, we “see” everything that is happening in front of us – much like a camera. But this impression is deceiving. In reality our senses are bombarded by continual waves of stimuli, triggering an avalanche of sensations that far exceed the brain’s capacity. To make sense of the world, the brain needs to determine which sensations are the most important for our current goals, focusing resources on the ones that matter and throwing away the rest. These computations are astonishingly complex, and what makes attention even more remarkable is just how effortless it is. The mammalian attention system is perhaps the most efficient and precisely tuned junk filter we know of, refined through millions of years of annoying siblings (and some evolution). Attention is amazing but no system is ever perfect. Our brain’s computational reserves are large but not infinite, and under the right conditions we can “break it” and peek behind the curtain. This isn’t just a fun trick – understanding these limits can yield important insights into psychology and neurobiology, helping us to diagnose and treat impairments that follow brain injury and disease. Thanks to over a hundred years of psychology research, it’s relatively easy to reveal attention in action. One way is through the phenomenon of change blindness. Try it yourself by following the instructions in the short video below (no sound). When we think of the term “blindness” we tend to assume a loss of vision caused by damage to the eye or optic nerves. But as you saw in the video, change blindness is completely normal and is caused by maxing out your attentional capacity. © 2016 Guardian News and Media Limited

Keyword: Attention; Vision
Link ID: 22633 - Posted: 09.06.2016

A new study by investigators at Brigham and Women's Hospital, in collaboration with researchers at the Universities of York and Leeds in the UK and MD Anderson Cancer Center in Texas, puts to the test anecdotes about experienced radiologists' ability to sense when a mammogram is abnormal. In a paper published August 29 in the Proceedings of the National Academy of Sciences, visual attention researchers showed radiologists mammograms for half a second and found that they could identify abnormal mammograms at better than chance levels. They further tested this ability through a series of experiments to explore what signal may alert radiologists to the presence of a possible abnormality, in the hopes of using these insights to improve breast cancer screening and early detection. "Radiologists can have 'hunches' after a first look at a mammogram. We found that these hunches are based on something real in the images. It's really striking that in the blink of an eye, an expert can pick up on something about that mammogram that indicates abnormality," said Jeremy Wolfe, PhD, senior author of the study and director of the Visual Attention Laboratory at BWH. "Not only that, but they can detect something abnormal in the other breast, the breast that does not contain a lesion." In the clinic, radiologists carefully evaluate mammograms and may use computer automated systems to help screen the images. Although they would never assess an image in half a second in the clinic, the ability of experts to extract the "gist" of an image quickly suggests that there may be detectable signs of breast cancer that radiologists are rapidly picking up. Copyright 2016 ScienceDaily
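The load-bearing phrase is "better than chance levels." However the paper itself quantified this (signal-detection measures such as d′ are common in this literature), the basic question can be illustrated with a binomial test against 50 percent guessing, using hypothetical numbers:

```python
from scipy.stats import binomtest

# Hypothetical numbers, not the study's data: a radiologist classifies
# 100 half-second mammogram flashes and calls 62 of them correctly.
n_trials, n_correct = 100, 62

# Could 62/100 plausibly arise from guessing (p = 0.5)?
result = binomtest(n_correct, n_trials, p=0.5, alternative="greater")
print(f"accuracy = {n_correct / n_trials:.2f}, p = {result.pvalue:.4f}")
```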

Keyword: Attention; Vision
Link ID: 22627 - Posted: 09.05.2016

By STEVE SILBERMAN In the late 1930s, Charles Bradley, the director of a home for “troublesome” children in Rhode Island, had a problem. The field of neuroscience was still in its infancy, and one of the few techniques available to allow psychiatrists like Bradley to ponder the role of the brain in emotional disorders was a procedure that required replacing a volume of cerebrospinal fluid in the patient’s skull with air. This painstaking process allowed any irregularities to stand out clearly in X-ray images, but many patients suffered excruciating headaches that lasted for weeks afterward. Meanwhile, a pharmaceutical company called Smith, Kline & French was facing a different sort of problem. The firm had recently acquired the rights to sell a powerful stimulant then called “benzedrine sulfate” and was trying to create a market for it. Toward that end, the company made quantities of the drug available at no cost to doctors who volunteered to run studies on it. Bradley was a firm believer that struggling children needed more than a handful of pills to get better; they also needed psychosocial therapy and the calming and supportive environment that he provided at the home. But he took up the company’s offer, hoping that the drug might eliminate his patients’ headaches. It did not. But the Benzedrine did have an effect that was right in line with Smith, Kline & French’s aspirations for its new product: The drug seemed to boost the children’s eagerness to learn in the classroom while making them more amenable to following the rules. The drug seemed to calm the children’s mood swings, allowing them to become, in the words of their therapists, more “attentive” and “serious,” able to complete their schoolwork and behave. Bradley was amazed that Benzedrine, a forerunner of Ritalin and Adderall, was such a great normalizer, turning typically hard-to-manage kids into models of compliance and decorum. But even after marveling at the effects of the drug, he maintained that medication should be considered for children only in addition to other forms of therapy. © 2016 The New York Times Company

Keyword: ADHD; Drug Abuse
Link ID: 22612 - Posted: 08.30.2016

By Usha Lee McFarling @ushamcfarling LOS ANGELES — A team of physicians and neuroscientists on Wednesday reported the successful use of ultrasound waves to “jump start” the brain of a 25-year-old man recovering from coma — and plan to launch a much broader test of the technique, in hopes of finding a way to help at least some of the tens of thousands of patients in vegetative states. The team, based at the University of California, Los Angeles, cautions that the evidence so far is thin: They have no way to know for sure whether the ultrasound stimulation made the difference for their young patient, or whether he spontaneously recovered by coincidence shortly after the therapy. But the region of the brain they targeted with the ultrasound — the thalamus — has previously been shown to be important in restoring consciousness. In 2007, a 38-year-old man who had been minimally conscious for six years regained some functions after electrodes were implanted in his brain to stimulate the thalamus. The ultrasound technique is a “good idea” that merits further study, said Dr. Nicholas Schiff, a pioneer in the field of using brain stimulation to restore consciousness who conducted the 2007 study. “It’s intriguing and it’s an interesting possibility,” said Schiff, a neuroscientist at Weill Cornell Medicine. The UCLA procedure used an experimental device, about the size of a teacup saucer, to focus ultrasonic waves on the thalamus, two walnut-sized bulbs in the center of the brain that serve as a critical hub for information flow and help regulate consciousness and sleep.

Keyword: Consciousness
Link ID: 22606 - Posted: 08.27.2016

Laura Sanders Brain scientists Eric Jonas and Konrad Kording had grown skeptical. They weren’t convinced that the sophisticated, big data experiments of neuroscience were actually accomplishing anything. So they devised a devilish experiment. Instead of studying the brain of a person, or a mouse, or even a lowly worm, the two used advanced neuroscience methods to scrutinize the inner workings of another information processor — a computer chip. The unorthodox experimental subject, the MOS 6502, is the same chip that dazzled early tech junkies and kids alike in the 1980s by powering Donkey Kong, Space Invaders and Pitfall, as well as the Apple I and II computers. Of course, these experiments were rigged. The scientists already knew everything about how the 6502 works. “The beauty of the microprocessor is that unlike anything in biology, we understand it on every level,” says Jonas, of the University of California, Berkeley. Using a simulation of the MOS 6502, Jonas and Kording, of Northwestern University in Chicago, studied the behavior of electricity-moving transistors, along with aspects of the chip’s connections and its output, to reveal how it handles information; the three games served as the chip’s “behavior.” Since they already knew what the outcomes should be, they were actually testing the methods. By the end of their experiments, Jonas and Kording had discovered almost nothing. © Society for Science & the Public 2000 - 2016
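One of the borrowed methods was the lesion study: disable one transistor at a time in the simulated chip and check whether each game still boots, just as a neuroscientist infers function from brain damage. A full 6502 simulator won't fit here, so the sketch below reproduces only the logic of that analysis on an invented three-gate stand-in circuit:

```python
# Toy "chip": three gates standing in for the 6502's thousands of
# transistors. Lesioning = forcing one gate's output to 0, then testing
# whether the circuit's "behavior" (its input-output function) survives.

def run_chip(a, b, c, lesioned=None):
    gates = {}
    gates["g1"] = a and b                     # AND stage
    gates["g2"] = b or c                      # OR stage
    if lesioned in gates:
        gates[lesioned] = 0                   # knock out an early gate
    gates["g3"] = gates["g1"] ^ gates["g2"]   # XOR "output stage"
    if lesioned == "g3":
        gates["g3"] = 0                       # knock out the output gate
    return gates["g3"]

inputs = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
baseline = [run_chip(*i) for i in inputs]

for gate in ("g1", "g2", "g3"):
    lesioned_out = [run_chip(*i, lesioned=gate) for i in inputs]
    broken = lesioned_out != baseline
    print(f"lesion {gate}: behavior {'broken' if broken else 'intact'}")
```

In the real study, some single-transistor lesions crashed one game but not the others, results that might tempt an observer to posit "Donkey Kong transistors" even though no transistor is dedicated to any game; the gap between such findings and genuine understanding was part of the authors' point.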

Keyword: Brain imaging
Link ID: 22597 - Posted: 08.24.2016

By Jessica Hamzelou Feel like you’ve read this before? Most of us have experienced the eerie familiarity of déjà vu, and now the first brain scans of this phenomenon have revealed why – it’s a sign of our brain checking its memory. Déjà vu was thought to be caused by the brain making false memories, but research by Akira O’Connor at the University of St Andrews, UK, and his team now suggests this is wrong. Exactly how déjà vu works has long been a mystery, partly because its fleeting and unpredictable nature makes it difficult to study. To get around this, O’Connor and his colleagues developed a way to trigger the sensation of déjà vu in the lab. The team’s technique uses a standard method to trigger false memories. It involves telling a person a list of related words – such as bed, pillow, night, dream – but not the key word linking them together, in this case, sleep. When the person is later quizzed on the words they’ve heard, they tend to believe they have also heard “sleep” – a false memory. To create the feeling of déjà vu, O’Connor’s team first asked people if they had heard any words beginning with the letter “s”. The volunteers replied that they hadn’t. This meant that when they were later asked if they had heard the word sleep, they were able to remember that they couldn’t have, but at the same time, the word felt familiar. “They report having this strange experience of déjà vu,” says O’Connor. © Copyright Reed Business Information Ltd.

Keyword: Attention; Learning & Memory
Link ID: 22565 - Posted: 08.17.2016

By MIKE SACKS You’ve seen me. I know you have. I’m the guy wearing gloves on the subway in October. Or even into April. Perhaps I’m wearing just one glove, allowing my naked hand to turn the pages of a book. No big deal. Just another one-gloved commuter, heading home. If it’s crowded, you may have noticed me doing my best to “surf,” sans contact, until the car comes to a stop, in which case I may knock into a fellow passenger. Aboveground you may have seen me acting the gentleman, opening doors for others with a special paper towel I carry in my front left pocket for just such a momentous occasion. No? How about that guy walking quickly ahead of you, the one impishly avoiding sidewalk cracks? Or perhaps you’ve noticed a stranger who turns and makes eye contact with you for seemingly no reason. You may have asked, “You got a problem?” Oh, I definitely have a problem. But it has nothing to do with you, sir or madam. (And, yes, even in my thoughts I refer to you as “sir” and “madam.”) The problem here is what multiple doctors have diagnosed as obsessive-compulsive disorder. You may refer to it by its kicky abbreviation, O.C.D. I prefer to call it Da Beast. Da Beast is a creature I have lived with since I was 11, a typical age for O.C.D. to snarl into one’s life without invitation or warning. According to the International O.C.D. Foundation, roughly one in 100 adults suffers from the disorder. Each of us has his or her own obsessive thoughts and fears to contend with. My particular beast of burden is a fear of germs and sickness. It’s a popular one, perhaps the most common. © 2016 The New York Times Company

Keyword: OCD - Obsessive Compulsive Disorder
Link ID: 22541 - Posted: 08.11.2016

By ABBY GOODNOUGH TUSCALOOSA, Ala. — Roslyn Lewis was at work at a dollar store here in Tuscaloosa, pushing a heavy cart of dog food, when something popped in her back: an explosion of pain. At the emergency room the next day, doctors gave her Motrin and sent her home. Her employer paid for a nerve block that helped temporarily, numbing her lower back, but she could not afford more injections or physical therapy. A decade later, the pain radiates to her right knee and remains largely unaddressed, so deep and searing that on a recent day she sat stiffly on her couch, her curtains drawn, for hours. The experience of African-Americans, like Ms. Lewis, and other minorities illustrates a problem as persistent as it is complex: Minorities tend to receive less treatment for pain than whites, and suffer more disability as a result. While an epidemic of prescription opioid abuse has swept across the United States, African-Americans and Hispanics have been affected at much lower rates than whites. Researchers say minority patients use fewer opioids, and they offer a thicket of possible explanations, including a lack of insurance coverage and a greater reluctance among members of minority groups to take opioid painkillers even if they are prescribed. But the researchers have also found evidence of racial bias and stereotyping in recognizing and treating pain among minorities, particularly black patients. “We’ve done a good job documenting that these disparities exist,” said Salimah Meghani, a pain researcher at the University of Pennsylvania. “We have not done a good job doing something about them.” Dr. Meghani’s 2012 analysis of 20 years of published research found that blacks were 34 percent less likely than whites to be prescribed opioids for conditions such as backaches, abdominal pain and migraines, and 14 percent less likely to receive opioids for pain from traumatic injuries or surgery. © 2016 The New York Times Company

Keyword: Pain & Touch; Attention
Link ID: 22532 - Posted: 08.09.2016

By EUGENE M. CARUSO, ZACHARY C. BURNS and BENJAMIN A. CONVERSE Watching slow-motion footage of an event can certainly improve our judgment of what happened. But can it also impair judgment? This question arose in the 2009 murder trial of a man named John Lewis, who killed a police officer during an armed robbery of a Dunkin’ Donuts in Philadelphia. Mr. Lewis pleaded guilty; the only question for the jury was whether the murder resulted from a “willful, deliberate and premeditated” intent to kill or — as Mr. Lewis argued — from a spontaneous, panicked reaction to seeing the officer enter the store unexpectedly. The key piece of evidence was a surveillance video of the shooting, which the jury saw both in real time and in slow motion. The jury found that Mr. Lewis had acted with premeditation, and he was sentenced to death. Mr. Lewis appealed the decision, arguing that the slow-motion video was prejudicial. Specifically, he claimed that watching the video in slow motion artificially stretched the relevant time period and created a “false impression of premeditation.” Did it? We recently conducted a series of experiments whose results are strikingly consistent with that claim. Our studies, published this week in the Proceedings of the National Academy of Sciences, show that seeing replays of an action in slow motion leads viewers to believe that the actor had more time to think before acting than he actually did. The result is that slow motion makes actions seem more intentional, more premeditated. In one of our studies, participants watched surveillance video of a fatal shooting that occurred outside a convenience store during an armed robbery. We gave them a set of instructions similar to those given to the jurors in Mr. Lewis’s case, asking them to decide whether the crime was premeditated or not. We assigned half our participants to watch the video in slow motion and the other half to watch it at regular speed. © 2016 The New York Times Company

Keyword: Attention
Link ID: 22525 - Posted: 08.08.2016

By BENEDICT CAREY Solving a hairy math problem might send a shudder of exultation along your spinal cord. But scientists have historically struggled to deconstruct the exact mental alchemy that occurs when the brain successfully leaps the gap from “Say what?” to “Aha!” Now, using an innovative combination of brain-imaging analyses, researchers have captured four fleeting stages of creative thinking in math. In a paper published in Psychological Science, a team led by John R. Anderson, a professor of psychology and computer science at Carnegie Mellon University, demonstrated a method for reconstructing how the brain moves from understanding a problem to solving it, including the time the brain spends in each stage. The imaging analysis found four stages in all: encoding (downloading), planning (strategizing), solving (performing the math), and responding (typing out an answer). “I’m very happy with the way the study worked out, and I think this precision is about the limit of what we can do” with the brain imaging tools available, said Dr. Anderson, who wrote the report with Aryn A. Pyke and Jon M. Fincham, both also at Carnegie Mellon. To capture these quicksilver mental operations, the team first taught 80 men and women how to interpret a set of math symbols and equations they had not seen before. The underlying math itself wasn’t difficult, mostly addition and subtraction, but manipulating the newly learned symbols required some thinking. The research team could vary the problems to burden specific stages of the thinking process — some were hard to encode, for instance, while others extended the length of the planning stage. The scientists used two techniques of M.R.I. data analysis to sort through what the participants’ brains were doing. One technique tracked the neural firing patterns during the solving of each problem; the other identified significant shifts from one kind of mental state to another. The subjects solved 88 problems each, and the research team analyzed the imaging data from those solved successfully. © 2016 The New York Times Company
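The second analysis technique, identifying "significant shifts from one kind of mental state to another," is at heart a changepoint-detection problem. The study's actual models are more elaborate, but the core idea can be sketched on simulated data: find the split that best explains the signal as two states rather than one.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated activity: an "encoding" state, then a shift into "solving".
signal = np.concatenate([
    rng.normal(0.0, 1.0, 60),    # state 1
    rng.normal(1.5, 1.0, 40),    # state 2 (higher mean activity)
])

# Score each candidate split: total within-segment variance when the
# series is modeled as two states with different means.
def split_cost(x, t):
    return np.var(x[:t]) * t + np.var(x[t:]) * (len(x) - t)

costs = [split_cost(signal, t) for t in range(5, len(signal) - 5)]
best = int(np.argmin(costs)) + 5
print(f"estimated state shift at sample {best} (true shift: 60)")
```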

Keyword: Attention; Brain imaging
Link ID: 22496 - Posted: 07.30.2016

By ERICA GOODE You are getting sleepy. Very sleepy. You will forget everything you read in this article. Hypnosis has become a common medical tool, used to reduce pain, help people stop smoking and cure them of phobias. But scientists have long argued about whether the hypnotic “trance” is a separate neurophysiological state or simply a product of a hypnotized person’s expectations. A study published on Thursday by Stanford researchers offers some evidence for the first explanation, finding that some parts of the brain function differently under hypnosis than during normal consciousness. The study was conducted with functional magnetic resonance imaging, a scanning method that measures blood flow in the brain. It found changes in activity in brain areas that are thought to be involved in focused attention, the monitoring and control of the body’s functioning, and the awareness and evaluation of a person’s internal and external environments. “I think we have pretty definitive evidence here that the brain is working differently when a person is in hypnosis,” said Dr. David Spiegel, a professor of psychiatry and behavioral sciences at Stanford who has studied the effectiveness of hypnosis. Functional imaging is a blunt instrument and the findings can be difficult to interpret, especially when a study is looking at activity levels in many brain areas. Still, Dr. Spiegel said, the findings might help explain the intense absorption, lack of self-consciousness and suggestibility that characterize the hypnotic state. © 2016 The New York Times Company

Keyword: Attention; Brain imaging
Link ID: 22493 - Posted: 07.30.2016

By ANNA WEXLER EARLIER this month, in the journal Annals of Neurology, four neuroscientists published an open letter to practitioners of do-it-yourself brain stimulation. These are people who stimulate their own brains with low levels of electricity, largely for purposes like improved memory or learning ability. The letter, which was signed by 39 other researchers, outlined what is known and unknown about the safety of such noninvasive brain stimulation, and asked users to give careful consideration to the risks. For the last three years, I have been studying D.I.Y. brain stimulators. Their conflict with neuroscientists offers a fascinating case study of what happens when experimental tools normally kept behind the closed doors of academia — in this case, transcranial direct current stimulation — are appropriated for use outside them. Neuroscientists began experimenting in earnest with transcranial direct current stimulation about 15 years ago. In such stimulation, electric current is administered at levels that are hundreds of times less than those used in electroconvulsive therapy. To date, more than 1,000 peer-reviewed studies of the technique have been published. Studies have suggested, among other things, that the stimulation may be beneficial for treating problems like depression and chronic pain as well as enhancing cognition and learning in healthy individuals. The device scientists use for stimulation is essentially a nine-volt battery attached to two wires that are connected to electrodes placed at various spots on the head. A crude version can be constructed with just a bit of electrical know-how. Consequently, as reports of the effects of the technique began to appear in scientific journals and in newspapers, people began to build their own devices at home. By late 2011 and early 2012, diagrams, schematics and videos began to appear online. © 2016 The New York Times Company
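For a sense of scale, the current such a device delivers follows from Ohm's law, I = V/R: nine volts across the few kilohms typical of scalp and electrodes yields a low-milliamp current, consistent with the 1–2 mA commonly reported for regulated research devices. The resistance values in this back-of-the-envelope sketch are assumptions, not measurements:

```python
# Ohm's law: I = V / R. Resistance values below are illustrative
# assumptions; real research devices use active current regulation.
V = 9.0                              # volts (battery)
for R_kohm in (5, 10, 20):           # assumed scalp+electrode resistance
    I_mA = V / (R_kohm * 1000) * 1000
    print(f"R = {R_kohm:>2} kΩ  ->  I = {I_mA:.2f} mA")
```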

Keyword: ADHD
Link ID: 22471 - Posted: 07.23.2016

Rachel Ehrenberg The brain doesn’t really go out like a light when anesthesia kicks in. Nor does neural activity gradually dim, a new study in monkeys reveals. Rather, intermittent flickers of brain activity appear as the effects of an anesthetic take hold. Some synchronized networks of brain activity fall out of step as the monkeys gradually drift from wakefulness, the study showed. But those networks resynchronized when deep unconsciousness set in, researchers reported in the July 20 Journal of Neuroscience. That the two networks behave so differently during the drifting-off stage is surprising, says study coauthor Yumiko Ishizawa of Harvard Medical School and Massachusetts General Hospital. It isn’t clear what exactly is going on, she says, except that the anesthetic’s effects are a lot more complex than previously thought. Most studies examining how anesthesia works use electroencephalograms, or EEGs, which record brain activity using electrodes on the scalp. The new study offers unprecedented surveillance by eavesdropping via electrodes implanted inside macaque monkeys’ brains. This new view provides clues to how the brain loses and gains consciousness. “It’s a very detailed description of something we know very little about,” says cognitive neuroscientist Tristan Bekinschtein of the University of Cambridge, who was not involved with the work. Although the study is elegant, it isn’t clear what to make of the findings, he says. “These are early days.” © Society for Science & the Public 2000 - 2016.
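Whether networks are "in step" is commonly quantified with a synchrony measure such as correlation or phase-locking between recording sites. Below is a minimal sketch of the idea, assuming simulated signals rather than the study's intracortical recordings and its presumably more sophisticated measures:

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.linspace(0, 2, 500)
base = np.sin(2 * np.pi * 10 * t)          # shared 10 Hz rhythm

def sync(noise_level):
    # Two "sites" sharing the rhythm, each with independent noise;
    # more noise means the shared rhythm contributes less.
    a = base + rng.normal(0, noise_level, t.size)
    b = base + rng.normal(0, noise_level, t.size)
    return np.corrcoef(a, b)[0, 1]

print(f"in-step sites:     r = {sync(0.3):.2f}")
print(f"out-of-step sites: r = {sync(3.0):.2f}")
```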

Keyword: Consciousness
Link ID: 22457 - Posted: 07.20.2016

James M. Broadway “Where did the time go?” middle-aged and older adults often remark. Many of us feel that time passes more quickly as we age, a perception that can lead to regrets. According to psychologist and BBC columnist Claudia Hammond, “the sensation that time speeds up as you get older is one of the biggest mysteries of the experience of time.” Fortunately, our attempts to unravel this mystery have yielded some intriguing findings. In 2005, for instance, psychologists Marc Wittmann and Sandra Lenhoff, both then at Ludwig Maximilian University of Munich, surveyed 499 participants, ranging in age from 14 to 94 years, about the pace at which they felt time moving—from “very slowly” to “very fast.” For shorter durations—a week, a month, even a year—the subjects' perception of time did not appear to increase with age. Most participants felt that the clock ticked by quickly. But for longer durations, such as a decade, a pattern emerged: older people tended to perceive time as moving faster. When asked to reflect on their lives, the participants older than 40 felt that time elapsed slowly in their childhood but then accelerated steadily through their teenage years into early adulthood. There are good reasons why older people may feel that way. When it comes to how we perceive time, humans can estimate the length of an event from two very different perspectives: a prospective vantage, while an event is still occurring, or a retrospective one, after it has ended. In addition, our experience of time varies with whatever we are doing and how we feel about it. In fact, time does fly when we are having fun. Engaging in a novel exploit makes time appear to pass more quickly in the moment. But if we remember that activity later on, it will seem to have lasted longer than more mundane experiences. © 2016 Scientific American

Keyword: Attention; Development of the Brain
Link ID: 22447 - Posted: 07.16.2016

Michael Egnor The most intractable question in modern neuroscience and philosophy of the mind is often phrased "What is consciousness?" The problem has been summed up nicely by philosopher David Chalmers as what he calls the Hard Problem of consciousness: How is it that we are subjects, and not just objects? Chalmers contrasts this hard question with what he calls the Easy Problem of consciousness: What are the neurobiological substrates underlying such things as wakefulness, alertness, attention, arousal, etc. Chalmers doesn't mean of course that the neurobiology of arousal is easy. He merely means to show that even if we can understand arousal from a neurobiological standpoint, we haven't yet solved the hard problem: the problem of subjective experience. Why am I an I, and not an it? Chalmers's point is a good one, and I think that it has a rather straightforward solution. First, some historical background is necessary. "What is consciousness?" is a modern question. It wasn't asked before the 17th century, because no one before Descartes thought that the mind was particularly mysterious. The problem of consciousness was created by moderns. The scholastic philosophers, following Aristotle and Aquinas, understood the soul as the animating principle of the body. In a human being, the powers of the soul -- intellect, will, memory, perception, appetite, and such -- were no more mysterious than the other powers of the soul, such as respiration, circulation, etc. Of course, biology in the Middle Ages wasn't as advanced as it is today, so there was much they didn't understand about human physiology, but in principle the mind was just another aspect of human biology, not inherently mysterious. In modern parlance, the scholastics saw the mind as the Easy Problem, no more intractable than understanding how breathing or circulation work.

Keyword: Consciousness
Link ID: 22441 - Posted: 07.15.2016