Links for Keyword: Attention
Wray Herbert The Invisible Gorilla is part of the popular culture nowadays, thanks largely to a widely-read 2010 book of that title. In that book, authors and cognitive psychologists Dan Simons and Christopher Chabris popularized a phenomenon of human perception—known in the jargon as “inattentional blindness”—which they had demonstrated in a study some years before. In the best known version of the experiment, volunteers were told to keep track of how many times some basketball players tossed a basketball. While they did this, someone in a gorilla suit walked across the basketball court, in plain view, yet many of the volunteers failed even to notice the beast. What the invisible gorilla study shows is that, if we are paying very close attention to one thing, we often fail to notice other things in our field of vision—even very obvious things. We all love these quirks of human perception. It’s entertaining to know that our senses can play tricks on us. And that’s no doubt the extent of most people’s familiarity with this psychological phenomenon. But what if this perceptual quirk has serious implications—even life-threatening implications? A new study raises that disturbing possibility. Three psychological scientists at Brigham and Women’s Hospital in Boston—Trafton Drew, Melissa Vo and Jeremy Wolfe—wondered if expert observers are also subject to this perceptual blindness. The subjects in the classic study were “naïve”—untrained in any particular domain of expertise and performing a task nobody does in real life. But what about highly trained professionals who make their living doing specialized kinds of observations? The scientists set out to explore this, and in an area of great importance to many people—cancer diagnosis. © Association for Psychological Science
Ewen Callaway In the mid-1980s, Paul Moorcraft, then a war correspondent, journeyed with a film crew into Afghanistan to produce a documentary about the fifth anniversary of the Soviet invasion. The trip took them behind Soviet lines. “We were attacked every fucking day by the Russians,” says the colourful Welshman. But the real trouble started later, when Moorcraft tried to tally his expenses, such as horses and local garb for his crew. Even with a calculator, the simple sums took him ten times longer than they should have. “It was an absolute nightmare. I spent days and days and days.” When he finally sent the bill to an accountant, he had not realized that after adding a zero he was claiming millions of pounds for a trip that had cost a couple of hundred thousand. “He knew I was an honest guy and assumed that it was just a typo.” Such mistakes were part of a lifelong pattern for Moorcraft, now director of the Centre for Foreign Policy Analysis in London and the author of more than a dozen books. He hasn't changed his phone number or PIN in years for fear that he would never remember new ones, and when working for Britain's Ministry of Defence he put subordinates in charge of remembering safe codes. In 2003, a mistaken phone number — one of hundreds before it — lost him a girlfriend who was convinced he was out gallivanting. That finally convinced him to seek an explanation. At the suggestion of a friend who teaches children with learning disabilities, Moorcraft contacted Brian Butterworth, a cognitive neuroscientist at University College London who studies numerical cognition. After conducting some tests, Butterworth concluded that Moorcraft was “a disaster at arithmetic” and diagnosed him with dyscalculia, a little-known learning disability sometimes called number blindness and likened to dyslexia for maths. 
Researchers estimate that as much as 7% of the population has dyscalculia, which is marked by severe difficulties in dealing with numbers despite otherwise normal (or, in Moorcraft's case, probably well above normal) intelligence. © 2013 Nature Publishing Group
by Virginia Morell Hide some gold coins in your backyard, and you'll probably check around to make sure no one is spying on where you stash them. Eurasian jays are no different. A new study finds that the pinkish-gray birds with striking blue wing patches are not only aware that others may be watching while they stash their nuts and seeds for the winter, but also that others might be surreptitiously listening. In response, they change their behaviors—stashing nuts in quieter places, for example. The findings suggest that the jays may be able to understand another's point of view, an ability rarely seen in animals other than humans. Several species of jays and crows, collectively called corvids, cache food to eat later. They also spy on one another and steal from each other's caches. The behaviors have led to what researchers term an evolutionary arms race, with the birds evolving various strategies to outwit their rivals, such as hiding nuts in the shade or behind barriers, or moving their cache to new locations. In the wild, Eurasian jays are often robbed by other species of birds such as jackdaws and crows, as well as by their own mates. "They're also very good vocal mimics, imitating the calls of raptors and songbirds in the wild, and our voices in the lab. And that means that auditory information is a big part of their cognitive repertoire," says Rachael Shaw, a behavioral ecologist at the University of Cambridge in the United Kingdom, who led the new study while a graduate student in comparative psychologist Nicola Clayton's lab at Cambridge. But do the birds, which are also very secretive, understand that the scratching and rustling sounds they make while caching their nuts in the ground might draw the attention of another bird? Other researchers working with Clayton had previously shown that Western scrub jays from North America would avoid hiding nuts in noisy gravel if a rival was nearby and could hear them.
© 2010 American Association for the Advancement of Science
by Elizabeth Norton Despite long experience with the ways of the world, older people are especially vulnerable to fraud. According to the Federal Trade Commission (FTC), up to 80% of scam victims are over 65. One explanation may lie in a brain region that serves as a built-in crook detector. Called the anterior insula, this structure—which fires up in response to the face of an unsavory character—is less active in older people, possibly making them less cagey than younger folks, a new study finds. Both the FTC and the Federal Bureau of Investigation have found that older people are easy marks due in part to their tendency to accentuate the positive. According to social neuroscientist Shelley Taylor of the University of California, Los Angeles, research backs up the idea that older people put a positive spin on things such as emotionally charged pictures and virtual games in which they risk losing money. "Older people are good at regulating their emotions, seeing things in a positive light, and not overreacting to everyday problems," she says. But this trait may make them less wary. To see if older people really are less able to spot a shyster, Taylor and colleagues showed photos of faces considered trustworthy, neutral, or untrustworthy to a group of 119 older adults (ages 55 to 84) and 24 younger adults (ages 20 to 42). Signs of untrustworthiness include averted eyes; an insincere smile that doesn't reach the eyes; a smug, smirky mouth; and a backward tilt to the head. The participants were asked to rate each face on a scale from -3 (very untrustworthy) to 3 (very trustworthy). © 2010 American Association for the Advancement of Science
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 17576 - Posted: 12.04.2012
By Kyle Hill You careen headlong into a blinding light. Around you, phantasms of people and pets lost. Clouds billow and sway, giving way to a gilded and golden entrance. You feel the air, thrust downward by delicate wings. Everything is soothing, comforting, familiar. Heaven. It’s a paradise that some experience during an apparent demise. The surprising consistency of heavenly visions during a “near death experience” (or NDE) indicates for many that an afterlife awaits us. Religious believers interpret these similar yet varying accounts like blind men exploring an elephant—they each feel something different (the tail is a snake and the legs are tree trunks, for example), yet all touch the same underlying reality. Skeptics point to the curious tendency for Heaven to conform to human desires, or for Heaven’s fleeting visage to be so dependent on culture or time period. Heaven, in a theological view, has some kind of entrance. When you die, this entrance is supposed to appear—a Platform 9 ¾ for those running towards the grave. Of course, the purported way to see Heaven without having to take the final run at the platform wall is the NDE. Thrust back into popular consciousness by a surgeon claiming that “Heaven is Real,” the NDE has come under both theological and scientific scrutiny for its supposed ability to preview the great gig in the sky. But getting to see Heaven is hell—you have to die. Or do you? © 2012 Scientific American
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 17570 - Posted: 12.04.2012
Barry Gordon, professor of neurology and cognitive science at the Johns Hopkins University School of Medicine, replies: Forgive your mind this minor annoyance because it has worked to save your life—or more accurately, the lives of your ancestors. Most likely you have not needed to worry whether the rustling in the underbrush is a rabbit or a leopard, or had to identify the best escape route on a walk by the lake, or to wonder whether the funny pattern in the grass is a snake or dead branch. Yet these were life-or-death decisions to our ancestors. Optimal moment-to-moment readiness requires a brain that is working constantly, an effort that takes a great deal of energy. (To put this in context, the modern human brain is only 2 percent of our body weight, but it uses 20 percent of our resting energy.) Such an energy-hungry brain, one that is constantly seeking clues, connections and mechanisms, is only possible with a mammalian metabolism tuned to a constant high rate. Constant thinking is what propelled us from being a favorite food on the savanna—and a species that nearly went extinct—to becoming the most accomplished life-form on this planet. Even in the modern world, our mind always churns to find hazards and opportunities in the data we derive from our surroundings, somewhat like a search engine server. Our brain goes one step further, however, by also thinking proactively, a task that takes even more mental processing. So even though most of us no longer worry about leopards in the grass, we do encounter new dangers and opportunities: employment, interest rates, “70 percent off” sales and swindlers offering $20 million for just a small investment on our part. Our primate heritage brought us another benefit: the ability to navigate a social system. As social animals, we must keep track of who's on top and who's not and who might help us and who might hurt us. To learn and understand this information, our mind is constantly calculating “what if?” scenarios. 
What do I have to do to advance in the workplace or social or financial hierarchy? What is the danger here? The opportunity? © 2012 Scientific American
by Douglas Heaven What is nine plus six, plus eight? You may not realise it, but you already know the answer. It seems that we unconsciously perform more complicated feats of reasoning than previously thought – including reading and basic mathematics. The discovery raises questions about the necessity of consciousness for abstract thought, and supports the idea that maths might not be an exclusively human trait. Previous studies have shown that we can subliminally process single words and numbers. To identify whether we can unconsciously perform more complicated processing, Ran Hassin at the Hebrew University of Jerusalem, Israel, and his colleagues used a technique called continuous flash suppression. The technique works by presenting a volunteer's left eye with a stimulus – a mathematical sum, say – for a short period of time, while bombarding the right eye with rapidly changing colourful shapes. The volunteer's awareness is dominated by what the right eye sees, so they remain unconscious of what is presented to the left eye. In the team's first experiment, a three-part calculation was flashed to the left eye. This was immediately followed by one number being presented to both eyes, which the volunteer had to say as fast as possible. When the number was the same as the answer to the sum, people were quicker to announce it, suggesting that they had subconsciously worked out the answer, and primed themselves with that number. © Copyright Reed Business Information Ltd.
One cannot travel far in spiritual circles without meeting people who are fascinated by the “near-death experience” (NDE). The phenomenon has been described as follows: Frequently recurring features include feelings of peace and joy; a sense of being out of one’s body and watching events going on around one’s body and, occasionally, at some distant physical location; a cessation of pain; seeing a dark tunnel or void; seeing an unusually bright light, sometimes experienced as a “Being of Light” that radiates love and may speak or otherwise communicate with the person; encountering other beings, often deceased persons whom the experiencer recognizes; experiencing a revival of memories or even a full life review, sometimes accompanied by feelings of judgment; seeing some “other realm,” often of great beauty; sensing a barrier or border beyond which the person cannot go; and returning to the body, often reluctantly. Such accounts have led many people to believe that consciousness must be independent of the brain. Unfortunately, these experiences vary across cultures, and no single feature is common to them all. One would think that if a nonphysical domain were truly being explored, some universal characteristics would stand out. Hindus and Christians would not substantially disagree—and one certainly wouldn’t expect the after-death state of South Indians to diverge from that of North Indians, as has been reported. It should also trouble NDE enthusiasts that only 10−20 percent of people who approach clinical death recall having any experience at all. Copyright 2012 Sam Harris
By Fergus Walsh Medical correspondent A Canadian man who was believed to have been in a vegetative state for more than a decade has been able to tell scientists that he is not in any pain. It's the first time an uncommunicative, severely brain-injured patient has been able to give answers clinically relevant to their care. Scott Routley, 39, was asked questions while having his brain activity scanned in an fMRI machine. His doctor says the discovery means medical textbooks will need rewriting. Vegetative patients emerge from a coma into a condition where they have periods awake, with their eyes open, but have no perception of themselves or the outside world. Mr Routley suffered a severe brain injury in a car accident 12 years ago. None of his physical assessments since then have shown any sign of awareness, or ability to communicate. But the British neuroscientist Prof Adrian Owen - who led the team at the Brain and Mind Institute, University of Western Ontario - said Mr Routley was clearly not vegetative. BBC © 2012
By SETH S. HOROWITZ HERE’S a trick question. What do you hear right now? If your home is like mine, you hear the humming sound of a printer, the low throbbing of traffic from the nearby highway and the clatter of plastic followed by the muffled impact of paws landing on linoleum — meaning that the cat has once again tried to open the catnip container atop the fridge and succeeded only in knocking it to the kitchen floor. The slight trick in the question is that, by asking you what you were hearing, I prompted your brain to take control of the sensory experience — and made you listen rather than just hear. That, in effect, is what happens when an event jumps out of the background enough to be perceived consciously rather than just being part of your auditory surroundings. The difference between the sense of hearing and the skill of listening is attention. Hearing is a vastly underrated sense. We tend to think of the world as a place that we see, interacting with things and people based on how they look. Studies have shown that conscious thought takes place at about the same rate as visual recognition, requiring a significant fraction of a second per event. But hearing is a quantitatively faster sense. While it might take you a full second to notice something out of the corner of your eye, turn your head toward it, recognize it and respond to it, the same reaction to a new or sudden sound happens at least 10 times as fast. This is because hearing has evolved as our alarm system — it operates out of line of sight and works even while you are asleep. And because there is no place in the universe that is totally silent, your auditory system has evolved a complex and automatic “volume control,” fine-tuned by development and experience, to keep most sounds off your cognitive radar unless they might be of use as a signal that something dangerous or wonderful is somewhere within the kilometer or so that your ears can detect. © 2012 The New York Times Company
Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 14: Attention and Consciousness
Link ID: 17474 - Posted: 11.11.2012
by Elizabeth Norton The ability to recognize faces is so important in humans that the brain appears to have an area solely devoted to the task: the fusiform gyrus. Brain imaging studies consistently find that this region of the temporal lobe becomes active when people look at faces. Skeptics have countered, however, that these studies show only a correlation, but not proof, that activity in this area is essential for face recognition. Now, thanks to the willingness of an intrepid patient, a new study provides the first cause-and-effect evidence that neurons in this area help humans recognize faces—and only faces, not other body parts or objects. An unusual collaboration between researchers and an epilepsy patient led to the discovery. Ron Blackwell, an engineer in Santa Clara, California, came to Stanford University in Palo Alto, California, in 2011 seeking better treatment for his epilepsy. He had suffered seizures since he was a teenager, and at age 47, his medication was becoming less effective. Stanford neurologist Josef Parvizi suggested some tests to locate the source of the seizures—and also suggested that it might be possible to eliminate the seizures by surgically destroying a tiny area of brain tissue where they occurred. Parvizi used electrodes placed on Blackwell's scalp to trace the seizures to the temporal lobe, about an inch above Blackwell's right ear. Then, surgeons placed more electrodes on the surface of Blackwell's brain, near the suspect point of origin in the temporal lobe. Parvizi stimulated each electrode in turn with a mild current, trying to trigger Blackwell's seizure symptoms under safe conditions. "If we get those symptoms, we know that we are tickling the seizure node," he explains. © 2010 American Association for the Advancement of Science.
By Maria Konnikova I don’t remember if I had any problems paying attention to Jane Austen’s Mansfield Park when I first read it. I doubt it, though. I devoured all of my Austen in one big gulp, book after book, line after line, sometime around the eighth grade. My mom had given me a huge, bright blue hardcover, with text as small as the book was weighty, that contained the Jane Austen oeuvre from start to finish. And from start to finish I went. I’ve since revisited most of the novels—there’s only so much you retain, absorb, and process on a thirteen-year-old’s reading binge—but Mansfield Park hasn’t fared quite as well as some of the others. I’m not sure why. I’ve just never gone back. Until a few weeks ago, that is, when I saw that this somewhat neglected (and often frowned upon) novel had been made the center of an intriguing new study of reading and attention. “This is your brain on Jane Austen,” rang the headline. Oh, no, not another one, went my head. It seems like every day, we get another “your brain on…” announcement, and at this point, an allergic reaction seems in order. This one, however, proved to be different. It’s not about your brain on Jane Austen. Not really. It’s about a far more interesting question: can our brains pay close attention in different ways? The neural correlates of attention are a hot research topic—and with good reason. After all, with the explosion of new media streams, new ways of digesting material, new ways of interacting with the world, it would make sense for us to be curious about how it all affects us at the most basic level of the brain. Usually, though, the research deals with the differences between paying attention, like really paying attention, and not paying attention all that much, be it because of increased cognitive load or other forms of multitasking or divided attention. © 2012 Scientific American
By John McCarthy Humans can focus on one thing amidst many. “Searchlight of attention” is the metaphor. You recall a childhood friend’s face one moment, then perhaps the dog you loved back then, and then…what you will. Your son’s face on stage rivets your attention; the rest of the cast is unseen. No “ghost” in the brain aims that searchlight. What does? Neurons do, somehow, but how is a mystery that new research actually deepened. The experiment used monkeys. They can focus attention like people do. They can zero in on a red square on a screen full of distractions, for instance. When the square moves, a trained monkey will press a button. Electrodes inserted in a monkey neuron will reveal “firing” (minuscule electrical ripples) simultaneous with attention. This may locate brain areas by which the monkey watched that red square. It’s not only the explosive firing in neurons that instruments detect. They also spot the milder priming to fire, when the monkey expects (from training) that neurons are about to be stimulated. Neurons in one area of the cortex fire when an object moves (but not, for instance, if it gets brighter but stays still). If a monkey learns that an onscreen cue (a blip of light) signals that the red square is about to move, the cue alone primes the motion-sensing neurons. They also synchronize more tightly (i.e., reduce random noise among them). Cues cock neurons, like a gun. It’s like Pavlov’s dogs salivating at the bell that preceded feeding. © 2012 Scientific American
By DAVID P. BARASH ZOMBIE bees? That’s right: zombie bees. First reported in California in 2008, these stranger-than-fiction creatures have spread to North Dakota and, just recently, to my home in Washington State. Of course, they’re not really zombies, although they act disquietingly like them, showing abnormal behavior like flying at night (almost unheard-of in healthy bees), moving erratically and then dying. These “zombees” are victims of a parasitic fly, Apocephalus borealis. The fly lays eggs within honeybees, inducing their hosts to make a nocturnal “flight of the living dead,” after which the larval flies emerge, having consumed the bee from the inside out. These events, although bizarre, aren’t all that unusual in the animal world. Many fly and wasp species lay their eggs inside hosts. What is especially interesting, and a bit more unusual, is the way an internal parasite not only feeds on its host, but also frequently alters its behavior, in a way that favors the continued survival and reproduction of the parasite. Not all internal parasites kill their hosts, of course: pretty much every multicellular animal is home to numerous fellow travelers, each of which has its own agenda, which in some cases involves influencing, or taking control of, part or all of the body in which they temporarily reside. And this, in turn, leads to the question: who’s in charge of your own mind? Think of the morgue scene in the movie “Men in Black,” when a human corpse is revealed to be a robot, its skull inhabited by a little green man from outer space. Science fiction, but less bizarre than you might expect, or want to believe. © 2012 The New York Times Company
By Sarah Estes and Jesse Graham It might be time to pencil in "awe cultivation" on your to-do list. Although religious thinkers like Søren Kierkegaard cast awe as a state of existential fear and trembling, new research by psychologists at Stanford and the University of Minnesota shows that experiencing awe can actually increase well-being, by giving people the sense that they have more time available. That sounds much more enjoyable than trying to power through one more hour on Red Bull and fumes. Just what is this elusive emotion, and how can one nurture it in our time-pressed world? Although awe has played a significant role in the histories of religion, art, and other transcendental pursuits, it has received scant attention from emotion researchers. Noting the paucity of data, social psychologists Dacher Keltner and Jonathan Haidt developed a working prototype in a 2003 paper, delineating awe's standing in the research taxonomy. After reviewing accounts of psychological, sociological, religious, artistic, and even primordial awe (awe toward power), the researchers surmised that awe universally involved the perception of vastness and the need to accommodate the experience into one's present worldview. That is, awe is triggered by some experience so expansive (in either a positive or negative way) that one’s mental schemas have to be adjusted in order to process it. Nearly ten years later, awe research is beginning to come into its own. The self-help market has continued to grow quickly, and research on positive emotions has kept pace. Even corporations and politicians have taken note of some of the ways that emotion research links into everything from productivity to voting and buying behavior. So it should come as no surprise that psychologists are now experimenting in domains formerly left to clergy, clinicians, and artists. © 2012 Scientific American
By Susan Milius Let’s take a minute to turn faces upside down. Pick any face. Ignore beards, glasses, hairdos or lack of any hair to do, and upend the facial features of Charles Darwin, Ray Charles or anyone named Charlotte who reads Science News. People who normally remember or match a face perfectly well have trouble when it is standing on its head. But before there’s a chorus of “well, obviously,” let’s try turning dogs upside down, too. Most people who don’t breed dogs or judge shows don’t recognize an individual dog nearly as well as a person’s face to begin with. And when pictures of poodles and Irish setters flip upside down in quizzes of learning and memory, people struggle a bit more than they do with the natural versions. But scores drop only modestly with these flipped-dog pics, compared with the dramatic drop for facial flips. The disproportionate decline in remembering inverted faces has shown up in a variety of recall tests, with comparison groups from dogs to bridges, airplanes, stick figures, even clothing from 17th and 18th century paintings. Upside-down faces are where quiz scores really slump, and researchers view that slump as one of the signs that test-takers are actually experts at face perception. A dog is a dog in any orientation. Same for other organisms and objects. But right-side-up faces apparently are so compelling that people have become especially masterful at recognizing the human visage. Know-it-at-a-glance holistic techniques behind this mastery fail when the world turns upside down. © Society for Science & the Public 2000 - 2012
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 15: Language and Our Divided Brain
Link ID: 17291 - Posted: 09.22.2012
by Douglas Heaven Ever wish you could make better choices? That could one day be possible thanks to an electronic brain implant that can enhance short-term memory and decision-making in primates. The implant can also restore these functions in an animal model of Alzheimer's disease and other types of brain damage, paving the way for the development of new treatments for people with these conditions. Sam Deadwyler at Wake Forest University School of Medicine in Winston-Salem, North Carolina, and colleagues have previously shown that a neural implant can restore some motor and sensory functions in rats. Now they have used a similar implant to stimulate higher-level thinking in monkeys. During normal brain function, neurons "fire" when they receive an input from another neuron via the connection between them, called a synapse. The spatial and temporal pattern of this activity – where and when the neurons fire – can be detected and recorded. To find out if it is possible to hijack and then retune these patterns of activity, Deadwyler's team first trained five rhesus macaques to perform a task that tests their attention, short-term memory and decision-making skills. First, the monkeys were shown a random image from a pool of 5000. The image was then blanked out for an interval of 1 to 90 seconds, before reappearing in a different position, alongside up to seven other images. If the monkey selected the original image once it reappeared it was rewarded with juice. © Copyright Reed Business Information Ltd.
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 17283 - Posted: 09.22.2012
by Alex Stone In magic, choices are rarely what they seem. Magicians know how to manipulate us into a false sense of free will while really holding the puppet strings. Here’s a simple but clever example of a false choice used in magic. Imagine, if you will, the face of an analog clock and think of any hour on the dial (one, two, three … all the way to twelve). You have a totally free choice. You can even change your mind if you like. Now we’re going to inject some randomness into your decision. Imagine that your finger is the hour hand and, starting at midnight, spell out the hour you chose, moving your finger clockwise by one step for each letter. (For instance, if you thought of seven, you’d spell out s-e-v-e-n, moving the time forward a total of five hours.) After you’ve done that, your finger will be on a new number. Starting there, spell this number, following the same procedure as before, moving your finger around the dial until you land on yet another number. Repeat the procedure one last time, starting where you left off. Remember the hour on which your finger finally lands. This is your selection. You arrived at this number randomly after making a free choice, so I think it’s fair to say that it would be impossible for me to know where your finger ended up. And yet I’m getting an impression right now. In my third eye, a vision of an old mahogany grandfather clock with a swinging pendulum and hand-painted Roman numerals on the dial. The image is ghostly and pale. I can barely make out the face. The hour hand reads: One o’clock. This elementary ruse is known as a force. (Try starting with another number and you’ll see why it’s a force.) A force is a way to control a spectator’s selection, be it of a card, number, word, letter—just about anything—and it’s one of the most powerful weapons in magic. There are hundreds of methods. (See, for instance, 202 Methods of Forcing, by the great mentalist Ted Annemann.)
Forcing gets way more sophisticated, but the basic idea is always the same. © 2012, Kalmbach Publishing Co.
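The clock force above can be checked by brute force. The following is a minimal sketch (not from the article) that simulates the spell-and-step procedure for all twelve possible starting hours; the `force` function and `WORDS` table are illustrative names of my own, but the arithmetic follows the trick exactly: the lengths of the English number words funnel every path to the same final hour.

```python
# Simulate the clock "force": spell the chosen hour starting from midnight,
# step the hour hand once per letter, then repeat twice more from wherever
# the hand lands. Every starting hour converges to 1 o'clock.
WORDS = {1: "one", 2: "two", 3: "three", 4: "four", 5: "five", 6: "six",
         7: "seven", 8: "eight", 9: "nine", 10: "ten", 11: "eleven",
         12: "twelve"}

def force(start: int) -> int:
    pos = 12          # the hour hand begins at midnight
    hour = start      # the first word spelled is the freely chosen hour
    for _ in range(3):
        # advance one step per letter, wrapping on the 1..12 dial
        pos = (pos + len(WORDS[hour]) - 1) % 12 + 1
        hour = pos    # the landing hour is what gets spelled next round
    return pos

print({h: force(h) for h in range(1, 13)})
# -> {1: 1, 2: 1, 3: 1, 4: 1, 5: 1, 6: 1, 7: 1, 8: 1, 9: 1, 10: 1, 11: 1, 12: 1}
```

Running it shows why the magician can confidently "see" one o'clock: after the first spelling the hand can only be on a handful of positions, and the next two rounds collapse all of them onto the same hour.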
By Scicurious Scientists like to study choice behavior. It’s an important area of study for lots of different applications, including things like, say, marketing, but also things including mate choice, nutrition, drug addiction, and well…your life is FULL of choices. When you’re at the store facing that huge freaking WALL full of different kinds of cereal? When you decide to hit snooze on your alarm? When you decide to see the dessert menu after dinner? All of these are different kinds of choices, and our brain has different ways of calculating the costs and benefits of each one (or, in my case, going into complete shutdown at the sight of that gigantic cereal aisle. I hate that thing). But when scientists study choice and decision making, they often study it in something of a vacuum. Not a literal vacuum, but in an environment with very few variables. You have a rat with a choice of levers or in a maze with a choice of directions. You have a human in a scanner making a choice of two different objects or how much to wager. This is really great for studying how different kinds of decisions are made, but as we get to know more about choice, we have to begin adding more variables. And with choice in real life comes something else: competition. A lot of the most important decisions are made in the presence of competition, like decisions for resources. Find a good patch of berries? Someone was probably there before you. Come across a lovely lady or boy vole you’d like to woo? There’s probably another suitor knocking at the door. So the question now becomes, how does the brain deal with decision making in the presence of competition? © 2012 Scientific American
Related chapters from BP7e: Chapter 15: Emotions, Aggression, and Stress; Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 11: Emotions, Aggression, and Stress; Chapter 14: Attention and Consciousness
Link ID: 17247 - Posted: 09.11.2012
Analysis by Sheila Eldred Behavioral control and decision-making take place in different regions of the brain's frontal lobe, new research shows. The study effectively created a map of the frontal lobes, making it possible for patients with brain injuries to get an accurate prognosis early in treatment. "That knowledge will be tremendously useful for prognosis after brain injury," Ralph Adolphs, Bren Professor of Psychology and Neuroscience at Caltech and a coauthor of the study published in this week's issue of the Proceedings of the National Academy of Sciences (PNAS), said in a press release. "Many people suffer injury to their frontal lobes -- for instance, after a head injury during an automobile accident -- but the precise pattern of the damage will determine their eventual impairment," he added. When you're making a decision, several different parts of the brain might be activated. How a person functions after a brain injury depends on precisely where the injury occurs. Other parts of the brain might compensate, allowing the person to function typically, or the person might be left with a lifelong hardship in making decisions. "We can use our lesion maps and compare the location of damaged brain areas in new patients," Jan Glascher, lead author of the study and a visiting associate in psychology at Caltech, said in an email interview. "This way we can predict what impairments these new patients will likely have. This can facilitate medical diagnoses and spark ideas for treatment strategies." © 2012 Discovery Communications, LLC.
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 15: Language and Our Divided Brain
Link ID: 17192 - Posted: 08.22.2012