Links for Keyword: Attention




By C. NATHAN DeWALL

How many words does it take to know you’re talking to an adult? In “Peter Pan,” J. M. Barrie needed just five: “Do you believe in fairies?” Such belief requires magical thinking. Children suspend disbelief. They trust that events happen with no physical explanation, and they equate an image of something with its existence. Magical thinking was Peter Pan’s key to eternal youth.

The ghouls and goblins that will haunt All Hallows’ Eve on Friday also require people to take a leap of faith. Zombies wreak terror because children believe that the once-dead can reappear. At haunted houses, children dip their hands in buckets of cold noodles and spaghetti sauce. Even if you tell them what they touched, they know they felt guts. And children surmise that with the right Halloween makeup, costume and demeanor, they can frighten even the most skeptical adult.

We do grow up. We get jobs. We have children of our own. Along the way, we lose our tendencies toward magical thinking. Or at least we think we do. Several streams of research in psychology, neuroscience and philosophy are converging on an uncomfortable truth: We’re more susceptible to magical thinking than we’d like to admit.

Consider the quandary facing college students in a clever demonstration of magical thinking. An experimenter hands you several darts and instructs you to throw them at different pictures. Some depict likable objects (for example, a baby); others are neutral (for example, a face-shaped circle). Would your performance differ if you lobbed darts at a baby? It would. Performance plummeted when people threw the darts at the baby. Laura A. King, a psychologist at the University of Missouri who led this investigation, notes that research participants have a “baseless concern that a picture of an object shares an essential relationship with the object itself.”

Paul Rozin, a psychology professor at the University of Pennsylvania, argues that these studies demonstrate the magical law of similarity. Our minds subconsciously associate an image with an object. When something happens to the image, we experience a gut-level intuition that the object has changed as well. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20253 - Posted: 10.28.2014

By GABRIELE OETTINGEN

Many people think that the key to success is to cultivate and doggedly maintain an optimistic outlook. This belief in the power of positive thinking, expressed with varying degrees of sophistication, informs everything from affirmative pop anthems like Katy Perry’s “Roar” to the Mayo Clinic’s suggestion that you may be able to improve your health by eliminating “negative self-talk.” But the truth is that positive thinking often hinders us.

More than two decades ago, I conducted a study in which I presented women enrolled in a weight-reduction program with several short, open-ended scenarios about future events — and asked them to imagine how they would fare in each one. Some of these scenarios asked the women to imagine that they had successfully completed the program; others asked them to imagine situations in which they were tempted to cheat on their diets. I then asked the women to rate how positive or negative their resulting thoughts and images were. A year later, I checked in on these women. The results were striking: The more positively women had imagined themselves in these scenarios, the fewer pounds they had lost.

My colleagues and I have since performed many follow-up studies, observing a range of people, including children and adults; residents of different countries (the United States and Germany); and people with various kinds of wishes — college students wanting a date, hip-replacement patients hoping to get back on their feet, graduate students looking for a job, schoolchildren wishing to get good grades. In each of these studies, the results have been clear: Fantasizing about happy outcomes — about smoothly attaining your wishes — didn’t help. Indeed, it hindered people from realizing their dreams. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 15: Emotions, Aggression, and Stress; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM: Chapter 11: Emotions, Aggression, and Stress; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 20244 - Posted: 10.27.2014

By KONIKA BANERJEE and PAUL BLOOM

On April 15, 2013, James Costello was cheering on a friend near the finish line at the Boston Marathon when the bombs exploded, severely burning his arms and legs and sending shrapnel into his flesh. During the months of surgery and rehabilitation that followed, Mr. Costello developed a relationship with one of his nurses, Krista D’Agostino, and they soon became engaged. Mr. Costello posted a picture of the ring on Facebook. “I now realize why I was involved in the tragedy,” he wrote. “It was to meet my best friend, and the love of my life.”

Mr. Costello is not alone in finding meaning in life events. People regularly do so for both terrible incidents, such as being injured in an explosion, and positive ones, like being cured of a serious disease. As the phrase goes, everything happens for a reason. Where does this belief come from? One theory is that it reflects religious teachings — we think that events have meaning because we believe in a God that plans for us, sends us messages, rewards the good and punishes the bad. But research from the Yale Mind and Development Lab, where we work, suggests that this can’t be the whole story.

In one series of studies, recently published in the journal Cognition, we asked people to reflect on significant events from their own lives, such as graduations, the births of children, falling in love, the deaths of loved ones and serious illnesses. Unsurprisingly, a majority of religious believers said they thought that these events happened for a reason and that they had been purposefully designed (presumably by God). But many atheists did so as well, and a majority of atheists in a related study also said that they believed in fate — defined as the view that life events happen for a reason and that there is an underlying order to life that determines how events turn out. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20219 - Posted: 10.20.2014

By Laura Starecheski

From the self-affirmations of Stuart Smalley on Saturday Night Live to countless videos on YouTube, saying nice things to your reflection in the mirror is a self-help trope that's been around for decades, and seems most often aimed at women. The practice, we're told, can help us like ourselves and our bodies more, and even make us more successful — allow us to chase our dreams!

Impressed, but skeptical, I took this self-talk idea to one of the country's leading researchers on body image to see if it's actually part of clinical practice. David Sarwer is a psychologist and clinical director at the Center for Weight and Eating Disorders at the University of Pennsylvania. He says that, in fact, a mirror is one of the first tools he uses with some new patients. He stands them in front of a mirror and coaches them to use gentler, more neutral language as they evaluate their bodies. "Instead of saying, 'My abdomen is disgusting and grotesque,' " Sarwer explains, he'll prompt a patient to say, " 'My abdomen is round, my abdomen is big; it's bigger than I'd like it to be.' " The goal, he says, is to remove "negative and pejorative terms" from the patient's self-talk. The underlying notion is that it's not enough for a patient to lose physical weight — or gain it, as some women need to — if she doesn't also change the way her body looks in her mind's eye.

This may sound weird. You're either a size 4 or a size 8, right? Not mentally, apparently. In a 2013 study from the Netherlands, scientists watched women with anorexia walk through doorways in a lab. The women, they noticed, turned their shoulders and squeezed sideways, even when they had plenty of room. © 2014 NPR

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 20178 - Posted: 10.08.2014

By ROBERT KOLKER

Reggie Shaw is the man responsible for the most moving portion of “From One Second to the Next,” the director Werner Herzog’s excruciating (even by Werner Herzog standards) 35-minute public service announcement, released last year as part of AT&T’s “It Can Wait” campaign against texting and driving. In the film, Shaw, now in his 20s, recounts the rainy morning in September 2006 that he crossed the line of a Utah highway, knocking into a car containing two scientists, James Furfaro and Keith O’Dell, who were heading to work nearby. Both men were killed. Shaw says he was texting a girlfriend at the time, adding in unmistakable anguish that he can’t even remember what he was texting about. He is next seen taking part in something almost inconceivable: He enters the scene where one of the dead men’s daughters is being interviewed, and receives from that woman a warm, earnest, tearful, cathartic hug.

Reggie Shaw’s redemptive journey — from thoughtless, inadvertent killer to denier of his own culpability to one of the nation’s most powerful spokesmen on the dangers of texting while behind the wheel — was first brought to national attention by Matt Richtel, a reporter for The New York Times, whose series of articles about distracted driving won a Pulitzer Prize in 2010. Now, five years later, in “A Deadly Wandering,” Richtel gives Shaw’s story the thorough, emotional treatment it is due, interweaving a detailed chronicle of the science behind distracted driving. As an instructive social parable, Richtel’s densely reported, at times forced yet compassionate and persuasive book deserves a spot next to “Fast Food Nation” and “To Kill a Mockingbird” in America’s high school curriculums. To say it may save lives is self-evident.

What makes the deaths in this book so affecting is how ordinary they are. Two men get up in the morning. They get behind the wheel. A stranger loses track of his car. They crash. The two men die. The temptation is to make the tragedy bigger than it is, to invest it with meaning. Which may explain why Richtel wonders early on if Reggie Shaw lied about texting and driving at first because he was in denial, or because technology “can hijack the brain,” polluting his memory. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20124 - Posted: 09.27.2014

By Melissa Dahl

If you are the sort of person who has a hard time just watching TV — if you’ve got to be simultaneously using your iPad or laptop or smartphone — here’s some bad news. New research shows a link between juggling multiple digital devices and a lower-than-usual amount of gray matter, the stuff that’s made up of brain cells, in the region of the brain associated with cognitive and emotional control. More details, via the press release:

The researchers at the University of Sussex's Sackler Centre for Consciousness Science used functional magnetic resonance imaging (fMRI) to look at the brain structures of 75 adults, who had all answered a questionnaire regarding their use and consumption of media devices, including mobile phones and computers, as well as television and print media. They found that, independent of individual personality traits, people who used a higher number of media devices concurrently also had smaller grey matter density in the part of the brain known as the anterior cingulate cortex (ACC), the region notably responsible for cognitive and emotional control functions.

But a predilection for using several devices at once isn’t necessarily causing a decrease in gray matter, the authors note — this is a purely correlational finding. As Earl Miller, a neuroscientist at MIT who was not involved in this research, wrote in an email, “It could be (in fact, is possibly more likely) that the relationship is the other way around.” In other words, the people who are least content using just one device at a time may have less gray matter in the first place.
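Miller's caveat is built into the statistic itself: the Pearson correlation coefficient is symmetric in its two arguments, so the data alone cannot say which variable drives the other. A minimal illustration in Python, using synthetic numbers rather than the study's data:

```python
# Pearson correlation is symmetric: r(x, y) == r(y, x), so a gray-matter /
# multitasking correlation is equally consistent with either causal direction
# (or with a third factor driving both). Synthetic data for illustration only.
import numpy as np

rng = np.random.default_rng(1)
multitasking_score = rng.normal(5, 2, 75)   # hypothetical questionnaire scores
acc_density = 1.0 - 0.05 * multitasking_score + rng.normal(0, 0.2, 75)

r_xy = np.corrcoef(multitasking_score, acc_density)[0, 1]
r_yx = np.corrcoef(acc_density, multitasking_score)[0, 1]
print(r_xy, r_yx)  # identical values; the statistic carries no directional information
```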

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 20123 - Posted: 09.27.2014

By Helen Thomson

My, what big eyes you have – you must be trying really hard. A study of how pupils dilate with physical effort could allow us to make strenuous tasks seem easier by zapping specific areas of the brain.

We know pupils dilate with mental effort, when we think about a difficult maths problem, for example. To see if this was also true of physical exertion, Alexandre Zenon, at the Catholic University of Louvain in Belgium, measured the pupils of 18 volunteers as they squeezed a device that reads grip strength. Sure enough, the more force they exerted, the larger their pupils.

To see whether pupil size was related to actual or perceived effort, the volunteers were asked to squeeze the device with four different grip strengths. Various tests enabled the researchers to tell how much effort participants felt they used, from none at all to the most effort possible. Comparing the results from both sets of experiments suggested that pupil dilation correlated more closely with perceived effort than actual effort.

The fact that both mental effort and perceived physical effort are reflected in pupil size suggests there is a common representation of effort in the brain, says Zenon. To see where in the brain this might be, the team looked at which areas were active while similar grip tasks were being performed. Zenon says they were able to identify areas within the supplementary motor cortex – which plays a role in movement – associated with how effortful a task is perceived to be. © Copyright Reed Business Information Ltd.
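The study's key comparison, whether pupil size tracks perceived or actual effort more closely, amounts to comparing two correlation coefficients. Below is a minimal sketch of that comparison; the grip levels, ratings, and effect sizes are all made-up assumptions, not the study's data:

```python
# Compare how well pupil diameter tracks actual force vs. rated (perceived)
# effort. All numbers are synthetic; only the logic of the comparison is real.
import numpy as np

rng = np.random.default_rng(2)
n = 72                                                  # e.g. 18 volunteers x 4 grip levels
actual_force = np.tile([0.25, 0.5, 0.75, 1.0], 18)      # fraction of maximum grip
perceived = actual_force + rng.normal(0, 0.15, n)       # ratings drift from true force
pupil = 2.0 + 1.5 * perceived + rng.normal(0, 0.1, n)   # pupil follows perception here

r_actual = np.corrcoef(pupil, actual_force)[0, 1]
r_perceived = np.corrcoef(pupil, perceived)[0, 1]
print(f"pupil~actual r={r_actual:.2f}, pupil~perceived r={r_perceived:.2f}")
# In the study, the perceived-effort correlation was the stronger one.
```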

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 20121 - Posted: 09.27.2014

Some people don't just work — they text, Snapchat, check Facebook and Tinder, listen to music and work. And a new study reveals those multitaskers have brains that look different than those of people who stick to one task.

Researchers at the University of Sussex scanned 75 adults using fMRI to examine their gray matter. Those who admitted to multitasking with a variety of electronic devices at once had less dense gray matter in the anterior cingulate cortex (ACC). This region controls executive function, such as working memory, reasoning, planning and execution.

There is no way of knowing if people with smaller anterior cingulate cortexes are more likely to multitask or if multitaskers are shrinking their gray matter. It could even show that our brains become more efficient from multitasking, said Dr. Gary Small, director of UCLA’s Memory and Aging Research Center at the Semel Institute for Neuroscience and Human Behavior, who was not involved in the study. “When you exercise the brain … it becomes effective at performing a mental task,” he said.

While previous research has shown that multitasking leads to more mistakes, Small said such research remains important to our understanding of something we’re all guilty of doing.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20115 - Posted: 09.25.2014

By Katy Waldman

In the opening chapter of Book 1 of My Struggle, by Karl Ove Knausgaard, the 8-year-old narrator sees a ghost in the waves. He is watching a televised report of a rescue effort at sea—“the sky is overcast, the gray-green swell heavy but calm”—when suddenly, on the surface of the water, “the outline of a face emerges.” We might guess from this anecdote that Karl, our protagonist, is both creative and troubled. His limber mind discerns patterns in chaos, but the patterns are illusions. “The lunatic, the lover, and the poet,” Shakespeare wrote, “have such seething brains, such shaping fantasies.” Their imaginations give “to airy nothing a local habitation and a name.” A seething brain can be a great asset for an artist, but, like Knausgaard’s churning, gray-green swell, it can be dangerous too.

Inspired metaphors, paranormal beliefs, conspiracy theories, and delusional episodes may all exist on a single spectrum, recent research suggests. The name for the concept that links them is apophenia. A German scientist named Klaus Conrad coined apophanie (from the Greek apo, away, and phaenein, to show) in 1958. He was describing the acute stage of schizophrenia, during which unrelated details seem saturated in connections and meaning. Unlike an epiphany—a true intuition of the world’s interconnectedness—an apophany is a false realization.

Swiss psychologist Peter Brugger introduced the term into English when he penned a chapter in a 2001 book on hauntings and poltergeists. Apophenia, he said, was a weakness of human cognition: the “pervasive tendency … to see order in random configurations,” an “unmotivated seeing of connections,” the experience of “delusion as revelation.” On the phone he unveiled his favorite formulation yet: “the tendency to be overwhelmed by meaningful coincidences.” © 2014 The Slate Group LLC.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20088 - Posted: 09.18.2014

By Jena McGregor

We've all heard the conventional wisdom for better managing our time and organizing our professional and personal lives. Don't try to multitask. Turn the email and Facebook alerts off to help stay focused. Make separate to-do lists for tasks that require a few minutes, a few hours and long-term planning. But what's grounded in real evidence and what's not?

In his new book The Organized Mind, Daniel Levitin — a McGill University professor of psychology and behavioral neuroscience — explores how having a basic understanding of the way the brain works can help us think about organizing our homes, our businesses, our time and even our schools in an age of information overload. We spoke with Levitin about why multitasking never works, what images of good leaders' brains actually look like, and why email and Twitter are so incredibly addicting. The following transcript of our conversation has been edited for length and clarity.

Q. What was your goal in writing this book?

A. Neuroscientists have learned a lot in the last 10 or 15 years about how the brain organizes information, and why we pay attention to some things and forget others. But most of this information hasn't trickled down to the average reader. There are a lot of books about how to get organized and a lot of books about how to be better and more productive at business, but I don't know of one that grounds any of these in the science.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20049 - Posted: 09.09.2014

By Gary Stix

A gamma wave is a rapid, electrical oscillation in the brain. A scan of the academic literature shows that gamma waves may be involved with learning, memory and attention—and, when perturbed, may play a part in schizophrenia, epilepsy, Alzheimer’s, autism and ADHD. Quite a list, and one of the reasons that these brainwaves, cycling at 25 to 80 times per second, persist as an object of fascination to neuroscientists. Despite lingering interest, much remains elusive when trying to figure out how gamma waves are produced by specific molecules within neurons—and what the oscillations do to facilitate communication along the brain’s trillions and trillions of connections.

A group of researchers at the Salk Institute in La Jolla, California, has looked beyond the preeminent brain cell—the neuron—to achieve new insights about gamma waves. At one time, neuroscience textbooks depicted astrocytes as a kind of pit crew for neurons, providing metabolic support and other functions for the brain’s rapid-firing information-processing components. In recent years, that picture has changed as new studies have found that astrocytes, like neurons, also have an alternate identity as information processors. This research demonstrates astrocytes’ ability to spritz chemicals known as neurotransmitters that communicate with other brain cells. Given that both neurons and astrocytes perform some of the same functions, it has been difficult to tease out what specifically astrocytes are up to. Hard evidence for what these nominal cellular support players might contribute in forming memories or focusing attention has been lacking. © 2014 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 19939 - Posted: 08.12.2014

By DANIEL J. LEVITIN

This month, many Americans will take time off from work to go on vacation, catch up on household projects and simply be with family and friends. And many of us will feel guilty for doing so. We will worry about all of the emails piling up at work, and in many cases continue to compulsively check email during our precious time off. But beware the false break. Make sure you have a real one. The summer vacation is more than a quaint tradition. Along with family time, mealtime and weekends, it is an important way that we can make the most of our beautiful brains.

Every day we’re assaulted with facts, pseudofacts, news feeds and jibber-jabber, coming from all directions. According to a 2011 study, on a typical day, we take in the equivalent of about 174 newspapers’ worth of information, five times as much as we did in 1986. As the world’s 21,274 television stations produce some 85,000 hours of original programming every day (by 2003 figures), we watch an average of five hours of television per day. For every hour of YouTube video you watch, there are 5,999 hours of new video just posted!

If you’re feeling overwhelmed, there’s a reason: The processing capacity of the conscious mind is limited. This is a result of how the brain’s attentional system evolved. Our brains have two dominant modes of attention: the task-positive network and the task-negative network (they’re called networks because they comprise distributed networks of neurons, like electrical circuits within the brain). The task-positive network is active when you’re actively engaged in a task, focused on it, and undistracted; neuroscientists have taken to calling it the central executive. The task-negative network is active when your mind is wandering; this is the daydreaming mode. These two attentional networks operate like a seesaw in the brain: when one is active the other is not. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 19936 - Posted: 08.11.2014

Ian Sample, science correspondent

The human brain can judge the apparent trustworthiness of a face from a glimpse so fleeting, the person has no idea they have seen it, scientists claim. Researchers in the US found that brain activity changed in response to how trustworthy a face appeared to be when the face in question had not been consciously perceived. Scientists made the surprise discovery during a series of experiments that were designed to shed light on the neural processes that underpin the snap judgments people make about others. The findings suggest that parts of our brains are doing more complex subconscious processing of the outside world than many researchers thought. Jonathan Freeman at New York University said the results built on previous work that shows "we form spontaneous judgments of other people that can be largely outside awareness."

The study focused on the activity of the amygdala, a small almond-shaped region deep inside the brain. The amygdala is intimately involved with processing strong emotions, such as fear. Its central nucleus sends out the signals responsible for the famous and evolutionarily crucial "fight-or-flight" response. Prior to the study, Freeman asked a group of volunteers to rate the trustworthiness of a series of faces. People tend to agree when they rank trustworthiness – faces with several key features, such as more furrowed brows and shallower cheekbones, are consistently rated as less trustworthy.

Freeman then invited a different group of people to take part in the experiments. Each lay in an MRI scanner while images of faces flashed up on a screen before them. Each trustworthy or untrustworthy face flashed up for a matter of milliseconds. Though their eyes had glimpsed the images, the participants were not aware they had seen the faces. © 2014 Guardian News and Media Limited

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 19924 - Posted: 08.07.2014

By David Robson

It’s not often that you look at your meal to find it staring back at you. But when Diane Duyser picked up her cheese toastie, she was in for a shock. “I went to take a bite out of it, and then I saw this lady looking back at me,” she told the Chicago Tribune. “It scared me at first.” As word got around, it soon began to spark more attention, and eventually a casino paid Duyser $28,000 to exhibit the toasted sandwich.

For many, the woman’s soft, full features and serene expression recall famous depictions of the Virgin Mary. But I’ve always thought the curled hair, parted lips and heavy eyelids evoke a more modern idol. Whichever Madonna you think you can see, she joins good company; Jesus has also been seen in toast, as well as a taco, a pancake and a banana peel, while Buzzfeed recently ran photos of peppers that look like British politicians. “If someone reports seeing Jesus in a piece of toast, you’d think they must be nuts,” says Kang Lee, at the University of Toronto, Canada. “But it’s very pervasive... We are primed to see faces in every corner of the visual world.” Lee has shown that rather than being a result of divine intervention, these experiences reflect the powerful influence of our imagination over our perception. Indeed, his explanation may mean that you never trust your eyes again.

Pareidolia, as this experience is known, is by no means a recent phenomenon. Leonardo da Vinci described seeing characters in natural markings on stone walls, which he believed could help inspire his artworks. In the 1950s, the Bank of Canada had to withdraw a series of banknotes because a grinning devil leapt from the random curls of the Queen’s hair (although I can’t, for the life of me, see the merest hint of a horn in Her Majesty’s locks). The Viking I spacecraft, meanwhile, appeared to photograph a carved face in the rocky landscape of Mars. BBC © 2014

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 19912 - Posted: 08.02.2014

By Nathan Collins

Time, space and social relationships share a common language of distance: we speak of faraway places, close friends and the remote past. Maybe that is because all three share common patterns of brain activity, according to a January study in the Journal of Neuroscience.

Curious to understand why the distance metaphor works across conceptual domains, Dartmouth College psychologists used functional MRI scans to analyze the brains of 15 people as they viewed pictures of household objects taken at near or far distances, looked at photographs of friends or acquaintances, and read phrases such as “in a few seconds” or “a year from now.” Patterns of activity in the right inferior parietal lobule, a region thought to handle distance information, robustly predicted whether a participant was thinking about near versus far in any of the categories—indicating that certain aspects of time, space and relationships are all processed in a similar way in the brain. The results, the researchers say, suggest that higher-order brain functions are organized more around computations such as near versus far than conceptual domains such as time or social relationships. © 2014 Scientific American
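The analysis described here, decoding "near" versus "far" from multivoxel activity patterns, is the kind of thing typically done with a cross-validated classifier (multivoxel pattern analysis). The excerpt does not specify the Dartmouth group's pipeline, so the following is only a generic sketch on synthetic data; every name and parameter is an illustrative assumption:

```python
# Minimal sketch of cross-validated MVPA decoding ("near" vs. "far").
# The actual pipeline used in the study is not given in the excerpt;
# data here are synthetic and every parameter is an assumption.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

n_trials, n_voxels = 120, 200           # hypothetical trials x voxels in one region
labels = rng.integers(0, 2, n_trials)   # 0 = "near", 1 = "far"

# Synthetic patterns: a weak, consistent signal added to noise on "far" trials.
signal = rng.normal(0, 1, n_voxels)
patterns = rng.normal(0, 1, (n_trials, n_voxels)) + 0.3 * np.outer(labels, signal)

# Cross-validated linear classifier: above-chance accuracy is the evidence
# that the region's activity patterns carry near-vs.-far information.
clf = make_pipeline(StandardScaler(), LinearSVC())
scores = cross_val_score(clf, patterns, labels, cv=5)
print(f"decoding accuracy: {scores.mean():.2f} (chance = 0.50)")
```

The cross-domain claim would then correspond to training on one category (say, spatial pictures) and testing on another (temporal phrases): in this sketch, `clf.fit(X_space, y_space)` followed by `clf.score(X_time, y_time)`.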

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 19860 - Posted: 07.21.2014

By Ferris Jabr

You know the exit is somewhere along this stretch of highway, but you have never taken it before and do not want to miss it. As you carefully scan the side of the road for the exit sign, numerous distractions intrude on your visual field: billboards, a snazzy convertible, a cell phone buzzing on the dashboard. How does your brain focus on the task at hand?

To answer this question, neuroscientists generally study the way the brain strengthens its response to what you are looking for—jolting itself with an especially large electrical pulse when you see it. Another mental trick may be just as important, according to a study published in April in the Journal of Neuroscience: the brain deliberately weakens its reaction to everything else so that the target seems more important in comparison.

Cognitive neuroscientists John Gaspar and John McDonald, both at Simon Fraser University in British Columbia, arrived at the conclusion after asking 48 college students to take attention tests on a computer. The volunteers had to quickly spot a lone yellow circle among an array of green circles without being distracted by an even more eye-catching red circle. All the while the researchers monitored electrical activity in the students' brains using a net of electrodes attached to their scalps. The recorded patterns revealed that their brains consistently suppressed reactions to all circles except the one they were looking for—the first direct evidence of this particular neural process in action. © 2014 Scientific American
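A claim like this usually rests on event-related potential (ERP) averaging: EEG epochs time-locked to each stimulus are averaged so that consistent responses stand out from noise, and mean amplitudes in a post-stimulus window are compared across conditions. The excerpt does not give the authors' exact pipeline, so this is only a generic sketch on synthetic data:

```python
# Generic ERP averaging: average EEG epochs time-locked to stimulus onset,
# then compare mean amplitude in a post-stimulus window across conditions.
# Synthetic data; the window, sampling rate, and effect are assumptions.
import numpy as np

rng = np.random.default_rng(3)
n_trials, n_samples = 200, 300          # 300 samples ~ 600 ms at 500 Hz
t = np.arange(n_samples) / 500.0        # seconds from stimulus onset

def make_epochs(amplitude):
    """Noise plus a bump 200-300 ms post-stimulus, scaled by `amplitude`."""
    bump = amplitude * np.exp(-((t - 0.25) ** 2) / (2 * 0.03 ** 2))
    return rng.normal(0, 5, (n_trials, n_samples)) + bump

target_erp = make_epochs(amplitude=4.0).mean(axis=0)       # attended stimulus
distractor_erp = make_epochs(amplitude=-2.0).mean(axis=0)  # suppressed stimulus

window = (t >= 0.2) & (t <= 0.3)
print("target mean amplitude:    ", target_erp[window].mean())
print("distractor mean amplitude:", distractor_erp[window].mean())
# Opposite-signed amplitudes in the same window are the kind of pattern read
# as enhancement of the target and active suppression of the distractor.
```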

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 19788 - Posted: 07.03.2014

By MARIA KONNIKOVA

The absurdity of having had to ask for an extension to write this article isn’t lost on me: It is, after all, a piece on time and poverty, or, rather, time poverty — about what happens when we find ourselves working against the clock to finish something. In the case of someone who isn’t otherwise poor, poverty of time is an unpleasant inconvenience. But for someone whose lack of time is just one of many pressing concerns, the effects compound quickly. We make a mistake when we look at poverty as simply a question of financial constraint.

Take what happened with my request for an extension. It was granted, and the immediate time pressure was relieved. But even though I met the new deadline (barely), I’m still struggling to dig myself out from the rest of the work that accumulated in the meantime. New deadlines that are about to whoosh by, a growing list of ignored errands, a rent check and insurance payment that I just realized I haven’t mailed. And no sign of that promised light at the end of the tunnel. My experience is the time equivalent of a high-interest loan cycle, except instead of money, I borrow time. But this kind of borrowing comes with an interest rate of its own: By focusing on one immediate deadline, I neglect not only future deadlines but the mundane tasks of daily life that would normally take up next to no time or mental energy. It’s the same type of problem poor people encounter every day, multiple times: The demands of the moment override the demands of the future, making that future harder to reach.

When we think of poverty, we tend to think about money in isolation: How much does she earn? Is that above or below the poverty line? But the financial part of the equation may not be the single most important factor. “The biggest mistake we make about scarcity,” Sendhil Mullainathan, an economist at Harvard who is a co-author of the book “Scarcity: Why Having Too Little Means So Much,” tells me, “is we view it as a physical phenomenon. It’s not.” © 2014 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 19735 - Posted: 06.16.2014

By Indre Viskontas and Chris Mooney

We've all been mesmerized by them—those beautiful brain scan images that make us feel like we're on the cutting edge of scientifically decoding how we think. But as soon as one neuroscience study purports to show which brain region lights up when we are enjoying Coca-Cola, or looking at cute puppies, or thinking we have souls, some other expert claims that "it's just a correlation," and you wonder whether researchers will ever get it right.

But there's another approach to understanding how our minds work. In his new book, The Tale of the Dueling Neurosurgeons, Sam Kean tells the story of a handful of patients whose unique brains—rendered that way by surgical procedures, rare diseases, and unfortunate, freak accidents—taught us much more than any set of colorful scans. Kean recounts some of their unforgettable stories on the latest episode of the Inquiring Minds podcast. "As I was reading these [case studies] I said, 'That's baloney! There's no way that can possibly be true,'" Kean remembers, referring to one particularly surprising case in which a woman's brain injury left her unable to recognize and distinguish between different kinds of animals. "But then I looked into it, and I realized that, not only is it true, it actually reveals some important things about how the brain works."

Here are five patients, from Kean's book, whose stories transformed neuroscience:

1. The man who could not imagine the future: Kent Cochrane (KC) was a '70s wild child, playing in a rock band, getting into bar fights, and zooming around Toronto on his motorcycle. But in 1981, a motorcycle accident left him without two critical brain structures. Both of his hippocampi, the parts of the brain that allow us to form new long-term memories for facts and events in our lives, were lost. That's quite different from other amnesiacs, whose damage is either restricted to only one brain hemisphere, or includes large portions of regions outside of the hippocampus. Copyright ©2014 Mother Jones

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 14: Attention and Consciousness
Link ID: 19727 - Posted: 06.14.2014

By Jane J. Lee

Could've, should've, would've. Everyone has made the wrong choice at some point in life and suffered regret because of it. Now a new study shows we're not alone in our reaction to incorrect decisions. Rats too can feel regret.

Regret is thinking about what you should have done, says David Redish, a neuroscientist at the University of Minnesota in Minneapolis. It differs from disappointment, which you feel when you don't get what you expected. And it affects how you make decisions in the future. (See "Hand Washing Wipes Away Regrets?")

If you really want to study emotions or feelings like regret, says Redish, you can't just ask people how they feel. So when psychologists and economists study regret, they look for behavioral and neural manifestations of it. Using rats is one way to get down into the feeling's neural mechanics. Redish and colleague Adam Steiner, also at the University of Minnesota, found that rats expressed regret through both their behavior and their neural activity. Those signals, researchers report today in the journal Nature Neuroscience, were specific to situations the researchers set up to induce regret, which led to specific neural patterns in the brain and in behavior.

When Redish and Steiner looked for neural activity, they focused on two areas known in people—and in some animals—to be involved in decision-making and the evaluation of expected outcomes: the orbitofrontal cortex and the ventral striatum. Brain scans have revealed that people with a damaged orbitofrontal cortex, for instance, don't express regret. To record nerve-cell activity, the researchers implanted electrodes in the brains of four rats—a typical sample size in this kind of experiment—then trained them to run a "choice" maze. © 1996-2014 National Geographic Society

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 19713 - Posted: 06.09.2014

By JOHN COATES

Six years after the financial meltdown there is once again talk about market bubbles. Are stocks succumbing to exuberance? Is real estate? We thought we had exorcised these demons. It is therefore with something close to despair that we ask: What is it about risk taking that so eludes our understanding, and our control?

Part of the problem is that we tend to view financial risk taking as a purely intellectual activity. But this view is incomplete. Risk is more than an intellectual puzzle — it is a profoundly physical experience, and it involves your body. Risk by its very nature threatens to hurt you, so when confronted by it your body and brain, under the influence of the stress response, unite as a single functioning unit. This occurs in athletes and soldiers, and it occurs as well in traders and people investing from home. The state of your body predicts your appetite for financial risk just as it predicts an athlete’s performance. If we understand how a person’s body influences risk taking, we can learn how to better manage risk takers. We can also recognize that mistakes governments have made have contributed to excessive risk taking.

Consider the most important risk manager of them all — the Federal Reserve. Over the past 20 years, the Fed has pioneered a new technique of influencing Wall Street. Where before the Fed shrouded its activities in secrecy, it now informs the street in as clear terms as possible of what it intends to do with short-term interest rates, and when. Janet L. Yellen, the chairwoman of the Fed, declared this new transparency, called forward guidance, a revolution; Ben S. Bernanke, her predecessor, claimed it reduced uncertainty and calmed the markets. But does it really calm the markets? Or has eliminating uncertainty in policy spread complacency among the financial community and actually helped inflate market bubbles? We get a fascinating answer to these questions if we turn from economics and look into the biology of risk taking. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 15: Emotions, Aggression, and Stress; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 11: Emotions, Aggression, and Stress; Chapter 14: Attention and Consciousness
Link ID: 19711 - Posted: 06.09.2014