Links for Keyword: Attention

Links 81 - 100 of 416

By YUDHIJIT BHATTACHARJEE One summer night in 2011, a tall, 40-something professor named Diederik Stapel stepped out of his elegant brick house in the Dutch city of Tilburg to visit a friend around the corner. It was close to midnight, but his colleague Marcel Zeelenberg had called and texted Stapel that evening to say that he wanted to see him about an urgent matter. The two had known each other since the early ’90s, when they were Ph.D. students at the University of Amsterdam; now both were psychologists at Tilburg University. In 2010, Stapel became dean of the university’s School of Social and Behavioral Sciences and Zeelenberg head of the social psychology department. Stapel and his wife, Marcelle, had supported Zeelenberg through a difficult divorce a few years earlier. As he approached Zeelenberg’s door, Stapel wondered if his colleague was having problems with his new girlfriend. Zeelenberg, a stocky man with a shaved head, led Stapel into his living room. “What’s up?” Stapel asked, settling onto a couch. Two graduate students had made an accusation, Zeelenberg explained. His eyes began to fill with tears. “They suspect you have been committing research fraud.” Stapel was an academic star in the Netherlands and abroad, the author of several well-regarded studies on human attitudes and behavior. That spring, he published a widely publicized study in Science about an experiment done at the Utrecht train station showing that a trash-filled environment tended to bring out racist tendencies in individuals. And just days earlier, he received more media attention for a study indicating that eating meat made people selfish and less social. © 2013 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 18090 - Posted: 04.29.2013

by Helen Thomson "I feel like I have been dropped into my body. I know this is my voice and these are my memories, but they don't feel like they belong to me." It happened out of the blue. Louise Airey was 8 years old, off sick from school, when suddenly she felt like she had been dropped into her own body. "It's just so difficult to verbalise what this feels like," she says. "All of a sudden you're hyper aware, and everything else in the world seems unreal, like a movie." She panicked, but told no one. The feeling soon passed but returned several times until, at the age of 19, a migraine triggered a sensation of being disconnected from the world that was to last 18 months. When she was in her 30s she was diagnosed with depersonalisation disorder – an altered sense of self with all-encompassing feelings of not occupying your own body, and detachment from your thoughts and actions. It has come and gone throughout her life, but since a traumatic pregnancy 20 months ago, these feelings have remained constant. "Other people seem like robots," Airey says. "It's like I'm watching a film, like I'm on my own in the centre of everything and nothing else is real. I'll be speaking to my children and I'll catch my voice talking and it seems really alien and foreign. It makes you feel very separated and lonely from everything, like you're the only person that is real." Depersonalisation disorder is not as rare as you might think, says Anthony David at King's College London and the Maudsley Hospital: it may affect almost 1 per cent of the British population (Social Psychiatry and Psychiatric Epidemiology, DOI: 10.1007/s00127-010-0327-7). We've all probably experienced mild versions of it at some point, in the unreal, spaced-out feeling you might get while severely jet-lagged or hung-over, for example. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 18077 - Posted: 04.27.2013

Jennifer Raymond I have a bias against women in science. Please don't hold this against me. I am a woman scientist, mentor and advocate for women in science, and an associate dean in my school's Office of Diversity, with a budding field biologist as a daughter. Yet my performance on the Implicit Association Test (https://implicit.harvard.edu/implicit/demo), which measures unconscious associations between concepts, revealed that I have a tendency to associate men with science and career, and women with liberal arts and family. I didn't even need to wait for my score; I could feel that my responses were slower and that I made more mistakes when I had to group science words such as 'astronomy' with female words such as 'wife' rather than male words such as 'uncle'. The results from hundreds of thousands of people indicate that I am not an outlier — 70% of men and women across 34 countries view science as more male than female [1]. Gender bias is not just a problem in science. A host of studies shows that people tend to rate women as less competent than men across many domains, from musical abilities to leadership [2], and that many individuals hold biases about competency on the basis of other irrelevant attributes, such as skin colour, body weight, religion, sexual orientation and parental status. Such biases have important consequences in the workplace. One study showed that mothers are 79% less likely to be hired and are offered US$11,000 less salary than women with no children [3]. By contrast, the same study shows that parenthood confers an advantage to men in the workplace. © 2013 Nature Publishing Group,
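
To make the reaction-time logic concrete, here is a minimal sketch in Python, with invented numbers, of the kind of comparison the test rests on: responses in the block where science and male words share a response key versus the block where science and female words do. It is a simplified stand-in, not the official IAT scoring procedure, and every name and value in it is illustrative.

```python
# Simplified IAT-style effect: compare reaction times between the two pairing
# blocks and standardize by the pooled spread. Positive values mean the
# "science + female" pairing was answered more slowly. All data are invented.
from statistics import mean, stdev

def iat_style_effect(congruent_rts, incongruent_rts):
    """Standardized difference between the two blocks' reaction times (ms)."""
    pooled = list(congruent_rts) + list(incongruent_rts)
    return (mean(incongruent_rts) - mean(congruent_rts)) / stdev(pooled)

congruent = [612, 655, 598, 701, 640, 623]    # science/male share a response key
incongruent = [734, 802, 765, 690, 811, 778]  # science/female share a response key

print(f"IAT-style effect size: {iat_style_effect(congruent, incongruent):.2f}")
```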

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 12: Sex: Evolutionary, Hormonal, and Neural Bases
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 8: Hormones and Sex
Link ID: 17880 - Posted: 03.09.2013

by Trevor Quirk Many smartphones claim to filter out background noise, but they've got nothing on the human brain. We can tune in to just one speaker at a noisy cocktail party with little difficulty—an ability that has been a scientific mystery since the early 1950s. Now, researchers argue that the competing noise of other partygoers is filtered out in the brain before it reaches regions involved in higher cognitive functions, such as language and attention control. Their experiments were the first to demonstrate this process. The scientists didn't do anything as social as attend a noisy party. Instead, Charles Schroeder, a psychiatrist at the Columbia University College of Physicians and Surgeons in New York City, and colleagues recorded the brain activity of six people with intractable epilepsy who required brain surgery. In order to identify the part of their brains responsible for seizures, the patients underwent 1 to 4 weeks of observation through electrocorticography (ECoG), a technique that provides precise neural recordings via electrodes placed directly on the surface of the brain. Schroeder and his team, using the ECoG data, conducted their experiments during this time. The researchers showed the patients two videos simultaneously, each of a person telling a 9- to 12-second story; they were asked to concentrate on just one speaker. To determine which neural recordings corresponded to the "ignored" and "attended" speech, the team reconstructed speech patterns from the brain's electrical activity using a mathematical model. The scientists then matched the reconstructed patterns with the original patterns coming from the ignored and attended speakers. © 2010 American Association for the Advancement of Science.
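
The excerpt does not spell out the reconstruction model Schroeder's team used, but the matching step it describes (compare the speech pattern decoded from brain activity against the attended and the ignored streams, and see which it resembles more) can be sketched roughly as follows. This is an illustration on synthetic envelopes, assuming the decoded signal is already available; it is not the study's actual analysis.

```python
# Rough sketch of the matching step: correlate a speech envelope decoded from
# neural activity with the envelopes of the attended and ignored talkers, and
# label the trial by whichever correlation is higher. Data here are synthetic.
import numpy as np

def classify_attention(decoded, attended_env, ignored_env):
    """Return ('attended' or 'ignored', r_attended, r_ignored)."""
    r_att = np.corrcoef(decoded, attended_env)[0, 1]
    r_ign = np.corrcoef(decoded, ignored_env)[0, 1]
    label = "attended" if r_att > r_ign else "ignored"
    return label, r_att, r_ign

rng = np.random.default_rng(0)
t = np.arange(1000)                                   # 10 s at a nominal 100 Hz
attended = np.abs(np.sin(0.02 * t)) + 0.1 * rng.standard_normal(t.size)
ignored = np.abs(np.sin(0.05 * t + 1.0)) + 0.1 * rng.standard_normal(t.size)
decoded = 0.7 * attended + 0.3 * rng.standard_normal(t.size)  # noisy copy of the attended stream

label, r_att, r_ign = classify_attention(decoded, attended, ignored)
print(f"decoded envelope matches the {label} talker (r_att={r_att:.2f}, r_ign={r_ign:.2f})")
```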

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 17876 - Posted: 03.07.2013

By George Johnson In the week since I wrote about Oliver Sacks and the idiot savant twins, I’ve been catching up with Season 2 of “Touch,” the TV series about an autistic boy named Jake who has an inexplicable ability to commune with a secret world of numbers — a buried skein of mathematics in which the Golden Mean, the Fibonacci sequence, the genetic code, and the Kabbalah are all mysteriously connected. Jungian synchronicity, quantum entanglement, chaos theory — all turn out to be manifestations of an underlying order in which everything that perplexes us ultimately makes sense. It is the dream of both mystics and scientists, and I had wondered shortly after the show first began how the conceit was going to be sustained through more than a few episodes. The connecting thread has turned out to be a conspiracy by a shadowy corporation called AsterCorp — as secretive and powerful as Massive Dynamic, purveyors of the mind-enhancing medicine Cortexiphan in “Fringe” — to kidnap Jake and others like him in their attempt to control the world. Or the universe. It is too soon to tell. Dr. Sacks’s twins, with their power to see, hear, smell — somehow sense within minutes if a number was prime — would also have been on AsterCorp’s wish list. Something keeps pulling me back to Sacks’s story. That is how enchanting a writer he is. (His memoir, Uncle Tungsten, is my favorite of his books.) There are plenty of accounts in the psychiatric literature of amazing human calculators and mnemonists. Sacks describes some famous cases in his essay. But what he thought he saw in the twins went far beyond that. Somehow, as Sacks described it, they could recognize that a number is prime in the way that one might recognize a face. Something on the surface of 3334401341 told them it was prime while 3334401343 was not.
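
For contrast with the perceptual feat Sacks describes, here is how a machine would check the same claim mechanically: a minimal trial-division primality test in Python. The comment naming which of the two ten-digit numbers is prime simply repeats the article's claim; the code itself assumes nothing beyond ordinary arithmetic.

```python
# Minimal trial-division primality check: test odd divisors up to sqrt(n).
def is_prime(n: int) -> bool:
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

# The two numbers from the article (it reports the first as prime, the second as not).
for n in (3334401341, 3334401343):
    print(n, is_prime(n))
```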

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 17867 - Posted: 03.05.2013

By Daisy Yuhas It's news chocolate lovers have been craving: raw cocoa may be packed with brain-boosting compounds. Researchers at the University of L'Aquila in Italy, with scientists from Mars, Inc., and their colleagues published findings last September that suggest cognitive function in the elderly is improved by ingesting high levels of natural compounds found in cocoa called flavanols. The study included 90 individuals with mild cognitive impairment, a precursor to Alzheimer's disease. Subjects who drank a cocoa beverage containing either moderate or high levels of flavanols daily for eight weeks demonstrated greater cognitive function than those who consumed low levels of flavanols on three separate tests that measured factors that included verbal fluency, visual searching and attention. Exactly how cocoa causes these changes is still unknown, but emerging research points to one flavanol in particular: (-)-epicatechin, pronounced “minus epicatechin.” Its name signifies its structure, differentiating it from other catechins, organic compounds highly abundant in cocoa and present in apples, wine and tea. The graph below shows how (-)-epicatechin fits into the world of brain-altering food molecules. Other studies suggest that the compound supports increased circulation and the growth of blood vessels, which could explain improvements in cognition, because better blood flow would bring the brain more oxygen and improve its function. Animal research has already demonstrated how pure (-)-epicatechin enhances memory. Findings published last October in the Journal of Experimental Biology note that snails can remember a trained task—such as holding their breath in deoxygenated water—for more than a day when given (-)-epicatechin but for less than three hours without the flavanol. Salk Institute neuroscientist Fred Gage and his colleagues found previously that (-)-epicatechin improves spatial memory and increases vasculature in mice. “It's amazing that a single dietary change could have such profound effects on behavior,” Gage says. If further research confirms the compound's cognitive effects, flavanol supplements—or raw cocoa beans—could be just what the doctor ordered. © 2013 Scientific American

Related chapters from BP7e: Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology; Chapter 14: Attention and Consciousness
Link ID: 17863 - Posted: 03.02.2013

By Ingrid Wickelgren How many times have you arrived someplace but had no memory of the trip there? Have you ever been sitting in an auditorium daydreaming, not registering what the people on stage are saying or playing? We often spin through our days lost in mental time travel, thinking about something from the past or future, leaving us oblivious to what is happening right around us right now. In doing so, we miss much of life. We also make ourselves relatively miserable, and prone to poor performance and mishaps. The opposite mental state, mindfulness, is a calm, focused awareness of the present. Cultivating that state is associated with improvements in both mental and physical health, as you will learn from the current cover story of Scientific American Mind (see “Mindfulness Can Improve Your Attention and Health” by Amishi P. Jha). It can even ameliorate mental illness. It turns out that mindfulness training works in large part by training our ability to pay attention. As we learn to focus on the here and now, we also learn to manipulate our mental focus more generally. The ability to direct our own minds at will means we control what we think about. It is no wonder that honing such a skill can make us happier. It can also boost the performance of soldiers, surgeons, athletes and many others who need to maintain a tight focus on what they are doing. Some people are naturally more mindful than others, but it is possible to train yourself to enter this state more often. Simple exercises performed for as little as 12 minutes daily can help you become more mindful. For a sample exercise, watch this video “Learn to Live in the Now.” © 2013 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 17824 - Posted: 02.19.2013

by Nic Halverson By studying a magic trick that has been around for thousands of years, neuroscientists have shed light on human attention and visual systems -- as well as on the trick itself. "Magicians, in particular, are very intellectual performance artists. They are very interested in the mind and how behavior happens," Dr. Stephen Macknik, director of the Laboratory of Behavioral Neurophysiology at the Barrow Neurological Institute (BNI), told Discovery News. "What scientists are doing when we study perception is pretty much the same thing, except we're using the scientific method." The hope is that magicians' intuitive insight could help instruct the field of neuroscience and perhaps even be applied in medicine to help people with attention deficit issues. In their study, recently published in the inaugural issue of PeerJ, the researchers focused upon a famous trick by a pair of very famous magicians. Penn & Teller's 10-year run at The Rio All-Suite Hotel & Casino has made them one of the longest-running and most beloved acts in Las Vegas history. Their trick, "Cups and Balls," is a classic illusion performed by Roman magicians as far back as 2,000 years ago when gladiators still battled in the Colosseum. While the trick has many derivatives, the most common uses three brightly colored balls and three opaque cups. Using sleight-of-hand, the magician seemingly makes the balls pass through the bottoms of cups, jump from cup to cup, disappear and reappear elsewhere or turn into entirely different objects. In Penn & Teller's case, that different object is often a potato. © 2013 Discovery Communications, LLC.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 17792 - Posted: 02.13.2013

Wray Herbert The Invisible Gorilla is part of the popular culture nowadays, thanks largely to a widely-read 2010 book of that title. In that book, authors and cognitive psychologists Dan Simons and Christopher Chabris popularized a phenomenon of human perception—known in the jargon as “inattentional blindness”—which they had demonstrated in a study some years before. In the best known version of the experiment, volunteers were told to keep track of how many times some basketball players tossed a basketball. While they did this, someone in a gorilla suit walked across the basketball court, in plain view, yet many of the volunteers failed even to notice the beast. What the invisible gorilla study shows is that, if we are paying very close attention to one thing, we often fail to notice other things in our field of vision—even very obvious things. We all love these quirks of human perception. It’s entertaining to know that our senses can play tricks on us. And that’s no doubt the extent of most people’s familiarity with this psychological phenomenon. But what if this perceptual quirk has serious implications—even life-threatening implications? A new study raises that disturbing possibility. Three psychological scientists at Brigham and Women’s Hospital in Boston—Trafton Drew, Melissa Vo and Jeremy Wolfe—wondered if expert observers are also subject to this perceptual blindness. The subjects in the classic study were “naïve”—untrained in any particular domain of expertise and performing a task nobody does in real life. But what about highly trained professionals who make their living doing specialized kinds of observations? The scientists set out to explore this, and in an area of great importance to many people—cancer diagnosis. © Association for Psychological Science

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 17744 - Posted: 02.02.2013

Ewen Callaway In the mid-1980s, Paul Moorcraft, then a war correspondent, journeyed with a film crew into Afghanistan to produce a documentary about the fifth anniversary of the Soviet invasion. The trip took them behind Soviet lines. “We were attacked every fucking day by the Russians,” says the colourful Welshman. But the real trouble started later, when Moorcraft tried to tally his expenses, such as horses and local garb for his crew. Even with a calculator, the simple sums took him ten times longer than they should have. “It was an absolute nightmare. I spent days and days and days.” When he finally sent the bill to an accountant, he had not realized that after adding a zero he was claiming millions of pounds for a trip that had cost a couple of hundred thousand. “He knew I was an honest guy and assumed that it was just a typo.” Such mistakes were part of a lifelong pattern for Moorcraft, now director of the Centre for Foreign Policy Analysis in London and the author of more than a dozen books. He hasn't changed his phone number or PIN in years for fear that he would never remember new ones, and when working for Britain's Ministry of Defence he put subordinates in charge of remembering safe codes. In 2003, a mistaken phone number — one of hundreds before it — lost him a girlfriend who was convinced he was out gallivanting. That finally convinced him to seek an explanation. At the suggestion of a friend who teaches children with learning disabilities, Moorcraft contacted Brian Butterworth, a cognitive neuroscientist at University College London who studies numerical cognition. After conducting some tests, Butterworth concluded that Moorcraft was “a disaster at arithmetic” and diagnosed him with dyscalculia, a little-known learning disability sometimes called number blindness and likened to dyslexia for maths. Researchers estimate that as much as 7% of the population has dyscalculia, which is marked by severe difficulties in dealing with numbers despite otherwise normal (or, in Moorcraft's case, probably well above normal) intelligence. © 2013 Nature Publishing Group

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 17666 - Posted: 01.10.2013

by Virginia Morell Hide some gold coins in your backyard, and you'll probably check around to make sure no one is spying on where you stash them. Eurasian jays are no different. A new study finds that the pinkish-gray birds with striking blue wing patches are not only aware that others may be watching while they stash their nuts and seeds for the winter, but also might be surreptitiously listening, too. In response, they change their behaviors—stashing nuts in quieter places, for example. The findings suggest that the jays may be able to understand another's point of view, an ability rarely seen in animals other than humans. Several species of jays and crows, collectively called corvids, cache food to eat later. They also spy on one another and steal from each other's caches. The behaviors have led to what researchers term an evolutionary arms race, with the birds evolving various strategies to outwit their rivals, such as hiding nuts in the shade or behind barriers, or moving their cache to new locations. In the wild, Eurasian jays are often robbed by other species of birds such as Jackdaws and crows, as well as by their own mates. "They're also very good vocal mimics, imitating the calls of raptors and songbirds in the wild, and our voices in the lab. And that means that auditory information is a big part of their cognitive repertoire," says Rachael Shaw, a behavioral ecologist at the University of Cambridge in the United Kingdom, who led the new study while a graduate student in comparative psychologist Nicola Clayton's lab at Cambridge. But do the birds, which are also very secretive, understand that the scratching and rustling sounds they make while caching their nuts in the ground might draw the attention of another bird? Other researchers working with Clayton had previously shown that Western scrub jays from North America would avoid hiding nuts in noisy gravel if a rival was nearby and could hear them. © 2010 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 17578 - Posted: 12.05.2012

by Elizabeth Norton Despite long experience with the ways of the world, older people are especially vulnerable to fraud. According to the Federal Trade Commission (FTC), up to 80% of scam victims are over 65. One explanation may lie in a brain region that serves as a built-in crook detector. Called the anterior insula, this structure—which fires up in response to the face of an unsavory character—is less active in older people, possibly making them less cagey than younger folks, a new study finds. Both the FTC and the Federal Bureau of Investigation have found that older people are easy marks due in part to their tendency to accentuate the positive. According to social neuroscientist Shelley Taylor of the University of California, Los Angeles, research backs up the idea that older people can put a positive spin on things—emotionally charged pictures, for example, and playing virtual games in which they risk the loss of money. "Older people are good at regulating their emotions, seeing things in a positive light, and not overreacting to everyday problems," she says. But this trait may make them less wary. To see if older people really are less able to spot a shyster, Taylor and colleagues showed photos of faces considered trustworthy, neutral, or untrustworthy to a group of 119 older adults (ages 55 to 84) and 24 younger adults (ages 20 to 42). Signs of untrustworthiness include averted eyes; an insincere smile that doesn't reach the eyes; a smug, smirky mouth; and a backward tilt to the head. The participants were asked to rate each face on a scale from -3 (very untrustworthy) to 3 (very trustworthy). © 2010 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 17576 - Posted: 12.04.2012

By Kyle Hill You careen headlong into a blinding light. Around you, phantasms of people and pets lost. Clouds billow and sway, giving way to a gilded and golden entrance. You feel the air, thrust downward by delicate wings. Everything is soothing, comforting, familiar. Heaven. It’s a paradise that some experience during an apparent demise. The surprising consistency of heavenly visions during a “near death experience” (or NDE) indicates for many that an afterlife awaits us. Religious believers interpret these similar yet varying accounts like blind men exploring an elephant—they each feel something different (the tail is a snake and the legs are tree trunks, for example); yet all touch the same underlying reality. Skeptics point to the curious tendency for Heaven to conform to human desires, or for Heaven’s fleeting visage to be so dependent on culture or time period. Heaven, in a theological view, has some kind of entrance. When you die, this entrance is supposed to appear—a Platform 9 ¾ for those running towards the grave. Of course, the purported way to see Heaven without having to take the final run at the platform wall is the NDE. Thrust back into popular consciousness by a surgeon claiming that “Heaven is Real,” the NDE has come under both theological and scientific scrutiny for its supposed ability to preview the great gig in the sky. But getting to see Heaven is hell—you have to die. Or do you? © 2012 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 17570 - Posted: 12.04.2012

Barry Gordon, professor of neurology and cognitive science at the Johns Hopkins University School of Medicine, replies: Forgive your mind this minor annoyance because it has worked to save your life—or more accurately, the lives of your ancestors. Most likely you have not needed to worry whether the rustling in the underbrush is a rabbit or a leopard, or had to identify the best escape route on a walk by the lake, or to wonder whether the funny pattern in the grass is a snake or dead branch. Yet these were life-or-death decisions to our ancestors. Optimal moment-to-moment readiness requires a brain that is working constantly, an effort that takes a great deal of energy. (To put this in context, the modern human brain is only 2 percent of our body weight, but it uses 20 percent of our resting energy.) Such an energy-hungry brain, one that is constantly seeking clues, connections and mechanisms, is only possible with a mammalian metabolism tuned to a constant high rate. Constant thinking is what propelled us from being a favorite food on the savanna—and a species that nearly went extinct—to becoming the most accomplished life-form on this planet. Even in the modern world, our mind always churns to find hazards and opportunities in the data we derive from our surroundings, somewhat like a search engine server. Our brain goes one step further, however, by also thinking proactively, a task that takes even more mental processing. So even though most of us no longer worry about leopards in the grass, we do encounter new dangers and opportunities: employment, interest rates, “70 percent off” sales and swindlers offering $20 million for just a small investment on our part. Our primate heritage brought us another benefit: the ability to navigate a social system. As social animals, we must keep track of who's on top and who's not and who might help us and who might hurt us. To learn and understand this information, our mind is constantly calculating “what if?” scenarios. What do I have to do to advance in the workplace or social or financial hierarchy? What is the danger here? The opportunity? © 2012 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 17562 - Posted: 12.03.2012

by Douglas Heaven What is nine plus six, plus eight? You may not realise it, but you already know the answer. It seems that we unconsciously perform more complicated feats of reasoning than previously thought – including reading and basic mathematics. The discovery raises questions about the necessity of consciousness for abstract thought, and supports the idea that maths might not be an exclusively human trait. Previous studies have shown that we can subliminally process single words and numbers. To identify whether we can unconsciously perform more complicated processing, Ran Hassin at the Hebrew University of Jerusalem, Israel, and his colleagues used a technique called continuous flash suppression. The technique works by presenting a volunteer's left eye with a stimulus – a mathematical sum, say – for a short period of time, while bombarding the right eye with rapidly changing colourful shapes. The volunteer's awareness is dominated by what the right eye sees, so they remain unconscious of what is presented to the left eye. In the team's first experiment, a three-part calculation was flashed to the left eye. This was immediately followed by one number being presented to both eyes, which the volunteer had to say as fast as possible. When the number was the same as the answer to the sum, people were quicker to announce it, suggesting that they had subconsciously worked out the answer, and primed themselves with that number. © Copyright Reed Business Information Ltd.
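
A minimal sketch, in Python with invented trial data, of how the priming effect described here could be quantified: naming times should be shorter when the target digit equals the answer to the masked sum (congruent trials) than when it does not. Nothing below is the study's data or analysis.

```python
# Compare mean naming times on congruent trials (target equals the answer to
# the subliminal sum) versus incongruent trials. All values are invented.
from statistics import mean

trials = [
    {"prime": "9 + 6 + 8", "target": 23, "rt_ms": 512},  # congruent: 9 + 6 + 8 = 23
    {"prime": "9 + 6 + 8", "target": 27, "rt_ms": 568},  # incongruent
    {"prime": "4 + 3 + 5", "target": 12, "rt_ms": 530},  # congruent
    {"prime": "4 + 3 + 5", "target": 16, "rt_ms": 590},  # incongruent
]

def is_congruent(trial):
    answer = sum(int(term) for term in trial["prime"].split(" + "))
    return answer == trial["target"]

congruent_rt = mean(t["rt_ms"] for t in trials if is_congruent(t))
incongruent_rt = mean(t["rt_ms"] for t in trials if not is_congruent(t))
print(f"priming effect: {incongruent_rt - congruent_rt:.0f} ms faster when the target matches the answer")
```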

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 17493 - Posted: 11.14.2012

One cannot travel far in spiritual circles without meeting people who are fascinated by the “near-death experience” (NDE). The phenomenon has been described as follows: Frequently recurring features include feelings of peace and joy; a sense of being out of one’s body and watching events going on around one’s body and, occasionally, at some distant physical location; a cessation of pain; seeing a dark tunnel or void; seeing an unusually bright light, sometimes experienced as a “Being of Light” that radiates love and may speak or otherwise communicate with the person; encountering other beings, often deceased persons whom the experiencer recognizes; experiencing a revival of memories or even a full life review, sometimes accompanied by feelings of judgment; seeing some “other realm,” often of great beauty; sensing a barrier or border beyond which the person cannot go; and returning to the body, often reluctantly. Such accounts have led many people to believe that consciousness must be independent of the brain. Unfortunately, these experiences vary across cultures, and no single feature is common to them all. One would think that if a nonphysical domain were truly being explored, some universal characteristics would stand out. Hindus and Christians would not substantially disagree—and one certainly wouldn’t expect the after-death state of South Indians to diverge from that of North Indians, as has been reported. It should also trouble NDE enthusiasts that only 10 to 20 percent of people who approach clinical death recall having any experience at all. Copyright 2012 Sam Harris

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 17490 - Posted: 11.14.2012

By Fergus Walsh Medical correspondent A Canadian man who was believed to have been in a vegetative state for more than a decade has been able to tell scientists that he is not in any pain. It's the first time an uncommunicative, severely brain-injured patient has been able to give answers clinically relevant to their care. Scott Routley, 39, was asked questions while having his brain activity scanned in an fMRI machine. His doctor says the discovery means medical textbooks will need rewriting. Vegetative patients emerge from a coma into a condition where they have periods awake, with their eyes open, but have no perception of themselves or the outside world. Mr Routley suffered a severe brain injury in a car accident 12 years ago. None of his physical assessments since then have shown any sign of awareness, or ability to communicate. But the British neuroscientist Prof Adrian Owen - who led the team at the Brain and Mind Institute, University of Western Ontario - said Mr Routley was clearly not vegetative. BBC © 2012

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 17487 - Posted: 11.13.2012

By SETH S. HOROWITZ HERE’S a trick question. What do you hear right now? If your home is like mine, you hear the humming sound of a printer, the low throbbing of traffic from the nearby highway and the clatter of plastic followed by the muffled impact of paws landing on linoleum — meaning that the cat has once again tried to open the catnip container atop the fridge and succeeded only in knocking it to the kitchen floor. The slight trick in the question is that, by asking you what you were hearing, I prompted your brain to take control of the sensory experience — and made you listen rather than just hear. That, in effect, is what happens when an event jumps out of the background enough to be perceived consciously rather than just being part of your auditory surroundings. The difference between the sense of hearing and the skill of listening is attention. Hearing is a vastly underrated sense. We tend to think of the world as a place that we see, interacting with things and people based on how they look. Studies have shown that conscious thought takes place at about the same rate as visual recognition, requiring a significant fraction of a second per event. But hearing is a quantitatively faster sense. While it might take you a full second to notice something out of the corner of your eye, turn your head toward it, recognize it and respond to it, the same reaction to a new or sudden sound happens at least 10 times as fast. This is because hearing has evolved as our alarm system — it operates out of line of sight and works even while you are asleep. And because there is no place in the universe that is totally silent, your auditory system has evolved a complex and automatic “volume control,” fine-tuned by development and experience, to keep most sounds off your cognitive radar unless they might be of use as a signal that something dangerous or wonderful is somewhere within the kilometer or so that your ears can detect. © 2012 The New York Times Company

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 14: Attention and Consciousness
Link ID: 17474 - Posted: 11.11.2012

by Elizabeth Norton The ability to recognize faces is so important in humans that the brain appears to have an area solely devoted to the task: the fusiform gyrus. Brain imaging studies consistently find that this region of the temporal lobe becomes active when people look at faces. Skeptics have countered, however, that these studies show only a correlation, but not proof, that activity in this area is essential for face recognition. Now, thanks to the willingness of an intrepid patient, a new study provides the first cause-and-effect evidence that neurons in this area help humans recognize faces—and only faces, not other body parts or objects. An unusual collaboration between researchers and an epilepsy patient led to the discovery. Ron Blackwell, an engineer in Santa Clara, California, came to Stanford University in Palo Alto, California, in 2011 seeking better treatment for his epilepsy. He had suffered seizures since he was a teenager, and at age 47, his medication was becoming less effective. Stanford neurologist Josef Parvizi suggested some tests to locate the source of the seizures—and also suggested that it might be possible to eliminate the seizures by surgically destroying a tiny area of brain tissue where they occurred. Parvizi used electrodes placed on Blackwell's scalp to trace the seizures to the temporal lobe, about an inch above Blackwell's right ear. Then, surgeons placed more electrodes on the surface of Blackwell's brain, near the suspect point of origin in the temporal lobe. Parvizi stimulated each electrode in turn with a mild current, trying to trigger Blackwell's seizure symptoms under safe conditions. "If we get those symptoms, we know that we are tickling the seizure node," he explains. © 2010 American Association for the Advancement of Science.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 17414 - Posted: 10.24.2012

By Maria Konnikova I don’t remember if I had any problems paying attention to Jane Austen’s Mansfield Park when I first read it. I doubt it, though. I devoured all of my Austen in one big gulp, book after book, line after line, sometime around the eighth grade. My mom had given me a huge, bright blue hardcover, with text as small as the book was weighty, that contained the Jane Austen oeuvre from start to finish. And from start to finish I went. I’ve since revisited most of the novels—there’s only so much you retain, absorb, and process on a thirteen-year-old’s reading binge—but Mansfield Park hasn’t fared quite as well as some of the others. I’m not sure why. I’ve just never gone back. Until a few weeks ago, that is, when I saw that this somewhat neglected (and often frowned upon) novel had been made the center of an intriguing new study of reading and attention. “This is your brain on Jane Austen,” rang the headline. Oh, no, not another one, went my head. It seems like every day, we get another “your brain on…” announcement, and at this point, an allergic reaction seems in order. This one, however, proved to be different. It’s not about your brain on Jane Austen. Not really. It’s about a far more interesting question: can our brains pay close attention in different ways? The neural correlates of attention are a hot research topic—and with good reason. After all, with the explosion of new media streams, new ways of digesting material, new ways of interacting with the world, it would make sense for us to be curious about how it all affects us at the most basic level of the brain. Usually, though, the research deals with the differences between paying attention, like really paying attention, and not paying attention all that much, be it because of increased cognitive load or other forms of multitasking or divided attention. © 2012 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 17409 - Posted: 10.23.2012