Chapter 18. Attention and Higher Cognition


By Dominic Basulto It turns out that the human brain may not be as mysterious as it has always seemed to be. Researchers at George Washington University, led by Mohamad Koubeissi, may have found a way to turn human consciousness on and off by targeting a specific region of the brain with electrical currents. For brain researchers, unlocking the mystery of human consciousness has always been viewed as one of the keys to eventually building an artificial brain, and so this could be a big win for the future of brain research. What the researchers did was deliver a series of high-frequency electrical impulses to the claustrum region of the brain in a woman suffering from epilepsy. Before the stimulation, the woman was capable of writing and talking. During the stimulation, the woman faded out of consciousness and started staring blankly into space, incapable of even the most basic sensory functions. Even her breathing slowed. As soon as the stimulation stopped, the woman immediately regained her sensory skills with no memory of the event. The researchers claim that this test case is evidence of being able to turn consciousness on and off. Granted, there’s a lot still to be done. The George Washington test, for example, has been performed on only one person. And that woman had already had part of her hippocampus removed, so at least one researcher says the whole experiment must be interpreted carefully. There have been plenty of scientific experiments that have been “one and done,” so it remains to be seen whether these results can be replicated.

Keyword: Consciousness
Link ID: 19817 - Posted: 07.12.2014

By ALEX HALBERSTADT Dr. Vint Virga likes to arrive at a zoo several hours before it opens, when the sun is still in the trees and the lanes are quiet and the trash cans empty. Many of the animals haven’t yet slipped into their afternoon malaise, when they retreat, appearing to wait out the heat and the visitors and not do much of anything. Virga likes to creep to the edge of their enclosures and watch. He chooses a spot and tries not to vary it, he says, “to give the animals a sense of control.” Sometimes he watches an animal for hours, hardly moving. That’s because what to an average zoo visitor looks like frolicking or restlessness or even boredom looks to Virga like a lot more — looks, in fact, like a veritable Russian novel of truculence, joy, sociability, horniness, ire, protectiveness, deference, melancholy and even humor. The ability to interpret animal behavior, Virga says, is a function of temperament, curiosity and, mostly, decades of practice. It is not, it turns out, especially easy. Do you know what it means when an elephant lowers her head and folds her trunk underneath it? Or when a zebra wuffles, softly blowing air between her lips; or when a colobus monkey snuffles, sounding a little like a hog rooting in the mud; or when a red fox screams, sounding disconcertingly like an infant; or when red fox kits chatter at one another; or when an African wild dog licks and nibbles at the lips of another; or when a California sea lion resting on the water’s surface stretches a fore flipper and one or both rear flippers in the air, like a synchronized swimmer; or when a hippopotamus “dung showers” by defecating while rapidly flapping its tail? Virga knows, because it is his job to know. He is a behaviorist, and what he does, expressed plainly, is see into the inner lives of animals. The profession is an odd one: It is largely unregulated, and declaring that you are an expert is sometimes enough to be taken for one. Most behaviorists are former animal trainers; some come from other fields entirely. Virga happens to be a veterinarian, very likely the only one in the country whose full-time job is tending to the psychological welfare of animals in captivity. © 2014 The New York Times Company

Keyword: Animal Rights
Link ID: 19796 - Posted: 07.04.2014

By Ferris Jabr You know the exit is somewhere along this stretch of highway, but you have never taken it before and do not want to miss it. As you carefully scan the side of the road for the exit sign, numerous distractions intrude on your visual field: billboards, a snazzy convertible, a cell phone buzzing on the dashboard. How does your brain focus on the task at hand? To answer this question, neuroscientists generally study the way the brain strengthens its response to what you are looking for—jolting itself with an especially large electrical pulse when you see it. Another mental trick may be just as important, according to a study published in April in the Journal of Neuroscience: the brain deliberately weakens its reaction to everything else so that the target seems more important in comparison. Cognitive neuroscientists John Gaspar and John McDonald, both at Simon Fraser University in British Columbia, arrived at the conclusion after asking 48 college students to take attention tests on a computer. The volunteers had to quickly spot a lone yellow circle among an array of green circles without being distracted by an even more eye-catching red circle. All the while the researchers monitored electrical activity in the students' brains using a net of electrodes attached to their scalps. The recorded patterns revealed that their brains consistently suppressed reactions to all circles except the one they were looking for—the first direct evidence of this particular neural process in action. © 2014 Scientific American

Keyword: Attention
Link ID: 19788 - Posted: 07.03.2014

by Helen Thomson ONE moment you're conscious, the next you're not. For the first time, researchers have switched off consciousness by electrically stimulating a single brain area. Scientists have been probing individual regions of the brain for over a century, exploring their function by zapping them with electricity and temporarily putting them out of action. Despite this, they have never been able to turn off consciousness – until now. Although only tested in one person, the discovery suggests that a single area – the claustrum – might be integral to combining disparate brain activity into a seamless package of thoughts, sensations and emotions. It takes us a step closer to answering a problem that has confounded scientists and philosophers for millennia – namely how our conscious awareness arises. Theories abound, but most agree that consciousness has to involve the integration of activity from several brain networks, allowing us to perceive our surroundings as one single unifying experience rather than isolated sensory perceptions. One proponent of this idea was Francis Crick, a pioneering neuroscientist who earlier in his career had identified the structure of DNA. Just days before he died in July 2004, Crick was working on a paper that suggested our consciousness needs something akin to an orchestra conductor to bind all of our different external and internal perceptions together. With his colleague Christof Koch, at the Allen Institute for Brain Science in Seattle, he hypothesised that this conductor would need to rapidly integrate information across distinct regions of the brain and bind together information arriving at different times. For example, information about the smell and colour of a rose, its name, and a memory of its relevance can be bound into one conscious experience of being handed a rose on Valentine's day. © Copyright Reed Business Information Ltd.

Keyword: Consciousness
Link ID: 19787 - Posted: 07.03.2014

Claire McCarthy I have many patients with ADHD (Attention Deficit Hyperactivity Disorder) and it seems like I have the same conversation over and over again with their parents: to medicate or not to medicate. I completely understand the hesitation I hear from so many parents. I have to admit, I'm not entirely happy myself about prescribing a medication that has side effects and can be abused or misused, and one for which there is a black market. I also worry that too often when a child is on medication and so learning and behaving better, parents and teachers lose the incentive to help the child learn the organizational and other skills that could make all the difference later in life. Since ADHD often persists into adulthood, we have to have the long view with these kids. But the long view works the other way, too. Not treating ADHD with medication can lead to problems. Like drug abuse. ADHD is really common. It affects 8 percent of children and youth; that's about 2 in every classroom of 20. Kids with ADHD can have real problems with both learning and behavior, problems that can haunt them for a lifetime (if you end up dropping out of high school because of poor grades or behavior, or end up getting arrested, it has a way of interfering with your future income and quality of life). But another thing we know is that kids with ADHD have a higher risk of drug abuse. We don't know exactly why this is the case. Some of it is likely the impulsivity that is so common in people with ADHD; they don't always make the best decisions. It may also be that people with ADHD are more prone to addiction. Whatever it is, the risk is very real. Not only are kids with ADHD 2.5 times more likely to abuse drugs, they are more likely to start earlier, use more types of drugs, and continue into adulthood. ©2014 Boston Globe Media Partners, LLC

Keyword: ADHD
Link ID: 19782 - Posted: 07.02.2014

By HELENE STAPINSKI A few months ago, my 10-year-old daughter, Paulina, was suffering from a bad headache right before bedtime. She went to lie down and I sat beside her, stroking her head. After a few minutes, she looked up at me and said, “Everything in the room looks really small.” And I suddenly remembered: When I was young, I too would “see things far away,” as I once described it to my mother — as if everything in the room were at the wrong end of a telescope. The episodes could last anywhere from a few minutes to an hour, but they eventually faded as I grew older. I asked Paulina if this was the first time she had experienced such a thing. She shook her head and said it happened every now and then. When I was a little girl, I told her, it would happen to me when I had a fever or was nervous. I told her not to worry and that it would go away on its own. Soon she fell asleep, and I ran straight to my computer. Within minutes, I discovered that there was an actual name for what turns out to be a very rare affliction — Alice in Wonderland Syndrome. Episodes usually include micropsia (objects appear small) or macropsia (objects appear large). Some sufferers perceive their own body parts to be larger or smaller. For me, and Paulina, furniture a few feet away seemed small enough to fit inside a dollhouse. Dr. John Todd, a British psychiatrist, gave the disorder its name in a 1955 paper, noting that the misperceptions resemble Lewis Carroll’s descriptions of what happened to Alice. It’s also known as Todd’s Syndrome. Alice in Wonderland Syndrome is not an optical problem or a hallucination. Instead, it is most likely caused by a change in a portion of the brain, probably the parietal lobe, that processes perceptions of the environment. Some specialists consider it a type of aura, a sensory warning preceding a migraine. And doctors confirmed that it usually goes away by adulthood. © 2014 The New York Times Company

Keyword: Vision
Link ID: 19766 - Posted: 06.24.2014

Nicola Davis The old adage that we eat with our eyes appears to be correct, according to research that suggests diners rate an artistically arranged meal as more tasty – and are prepared to pay more for it. The team at Oxford University tested the idea by gauging the reactions of diners to food presented in different ways. Inspired by Wassily Kandinsky's "Painting Number 201", Franco-Colombian chef Charles Michel, one of the authors of the study, designed a salad resembling the abstract artwork to explore how the presentation of food affects the dining experience. "A number of chefs now are realising that they are being judged by how their foods photograph – be it in the fancy cookbooks [or], more often than not, when diners instagram their friends," explains Professor Charles Spence, experimental psychologist at the University of Oxford and a co-author of the study. Thirty men and 30 women were each presented with one of three salads containing identical ingredients, arranged either to resemble the Kandinsky painting, a regular tossed salad, or a "neat" formation in which each component was spaced away from the others. Seated alone at a table mimicking a restaurant setting, and unaware that other versions of the salad were on offer, each participant was given two questionnaires asking them to rate various aspects of the dish on a 10-point scale, before and after tucking into the salad. Before participants sampled their plateful, the Kandinsky-inspired dish was rated higher for complexity, artistic presentation and general liking. Participants were prepared to pay twice as much for the meal as for either the regular or "neat" arrangement. © 2014 Guardian News and Media Limited

Keyword: Chemical Senses (Smell & Taste)
Link ID: 19759 - Posted: 06.23.2014

By Gary Stix James DiCarlo: We all have this intuitive feel for what object recognition is. It’s the ability to discriminate your face from other faces, a car from other cars, a dog from a camel, that ability we all intuitively feel. But making progress in understanding how our brains are able to accomplish that is a very challenging problem, and part of the reason is that it’s challenging to define what it is and isn’t. We take this problem for granted because it seems effortless to us. However, a computer vision person would tell you that this is an extremely challenging problem, because each object presents an essentially infinite number of images to your retina, so you never see the same image of an object twice. SA: It seems like object recognition is actually one of the big problems both in neuroscience and in the computational science of machine learning? DiCarlo: That’s right. Not only in machine learning but also in psychology or cognitive science, because the objects that we see are the sources in the world of what we use to build higher cognition, things like memory and decision-making. Should I reach for this, should I avoid it? Our brains can’t do what you would call higher cognition without these foundational elements that we often take for granted. SA: Maybe you can talk about what’s actually happening in the brain during this process. DiCarlo: It’s been known for several decades that there’s a portion of the brain, the temporal lobe down the sides of our head, that, when lost or damaged in humans and non-human primates, leads to deficits of recognition. So we had clues that that’s where these algorithms for object recognition are living. But just saying that part of your brain solves the problem is not really specific. It’s still a very large piece of tissue. Anatomy tells us that there’s a whole network of areas that exist there, and now the tools of neurophysiology and still more advanced tools allow us to go in and look more closely at the neural activity, especially in non-human primates. We can then begin to decipher the actual computations to the level that an engineer might, for instance, in order to emulate what’s going on in our heads. © 2014 Scientific American

Keyword: Vision
Link ID: 19754 - Posted: 06.21.2014

by Tania Lombrozo Science doesn't just further technology and help us predict and control our environment. It also changes the way we understand ourselves and our place in the natural world. This understanding can inspire awe and a sense of wonder. But it can also be unsettling, especially when it calls into question our basic assumptions about the kinds of creatures we are and the universe we inhabit. Current developments in neuroscience seem to be triggering precisely this jumble of reactions: wonder alongside disquiet, hope alongside alarm. A recent article at Salon.com, for example, promises an explanation for "how neuroscience could save addicts from relapse," while an article by Nathan Greenslit at The Atlantic, published less than a week later, raises worries that neuroscience is being used to reinforce racist drug policy. Obama's BRAIN Initiative heralds rapid advances in our understanding of the brain, but with it comes the need to rapidly work out the implications of what we're learning about the brain and about ourselves. We're learning more all the time, but we're not always sure what to make of it. In a recent paper in the journal Psychological Science, psychologists Azim Shariff, Joshua Greene and six of their colleagues bring these heady issues down to earth by considering whether learning about neuroscience can influence judgments in a real-world situation: deciding how someone who commits a crime should be punished. The motivating intuition is this: to hold someone responsible for her actions, she must have acted with free will. ©2014 NPR

Keyword: Consciousness
Link ID: 19737 - Posted: 06.17.2014

By MARIA KONNIKOVA THE absurdity of having had to ask for an extension to write this article isn’t lost on me: It is, after all, a piece on time and poverty, or, rather, time poverty — about what happens when we find ourselves working against the clock to finish something. In the case of someone who isn’t otherwise poor, poverty of time is an unpleasant inconvenience. But for someone whose lack of time is just one of many pressing concerns, the effects compound quickly. We make a mistake when we look at poverty as simply a question of financial constraint. Take what happened with my request for an extension. It was granted, and the immediate time pressure was relieved. But even though I met the new deadline (barely), I’m still struggling to dig myself out from the rest of the work that accumulated in the meantime. New deadlines that are about to whoosh by, a growing list of ignored errands, a rent check and insurance payment that I just realized I haven’t mailed. And no sign of that promised light at the end of the tunnel. My experience is the time equivalent of a high-interest loan cycle, except instead of money, I borrow time. But this kind of borrowing comes with an interest rate of its own: By focusing on one immediate deadline, I neglect not only future deadlines but the mundane tasks of daily life that would normally take up next to no time or mental energy. It’s the same type of problem poor people encounter every day, multiple times: The demands of the moment override the demands of the future, making that future harder to reach. When we think of poverty, we tend to think about money in isolation: How much does she earn? Is that above or below the poverty line? But the financial part of the equation may not be the single most important factor. “The biggest mistake we make about scarcity,” Sendhil Mullainathan, an economist at Harvard who is a co-author of the book “Scarcity: Why Having Too Little Means So Much,” tells me, “is we view it as a physical phenomenon. It’s not.” © 2014 The New York Times Company

Keyword: Attention
Link ID: 19735 - Posted: 06.16.2014

By Brian Palmer Maureen Dowd, a 62-year-old Pulitzer Prize–winning columnist for the New York Times, had a bad marijuana trip earlier this year. As part of her research into the legalization of recreational cannabis in Colorado, she ate a few too many bites of a pot-infused candy bar, entered a “hallucinatory state,” and spent eight paranoid hours curled up on her hotel room bed. Dowd used the experience as a jumping-off point to discuss the risks of overdosing on edible marijuana, which has become a major issue in pot-friendly states. It’s also possible, however, that Dowd just doesn’t handle cannabis very well. While pot mellows most people out, everyone has heard of someone who barricaded himself or herself in a dorm room after a few bong hits in college. (Or maybe that someone is you.) Why do people react so differently to the same drug? The question itself may be something of a fallacy. Cannabis is not a single drug—it contains dozens of compounds, and they appear to have varying, and sometimes opposing, effects on the brain. Tetrahydrocannabinol, or THC, and cannabidiol, or CBD, have been the subject of some intriguing research. In 2010, researchers showed that pretreating people with a dose of CBD can protect against the less pleasant effects of THC, such as paranoia. In a similar 2012 study, participants took pills that contained only one of the two chemicals, rather than the combination that you receive in cannabis. The subjects who took THC pills were more likely to suffer paranoia and delusion than those who took CBD. The researchers went one step further to investigate which specific cognitive effects of THC are likely to lead to paranoia and other symptoms of psychosis. After taking either THC or CBD, participants watched a series of arrows appear on a screen and responded by indicating which direction the arrows were pointing. Most of the arrows pointed directly left or right, but occasionally a tilted arrow appeared. (Researchers called the tilted arrows “oddballs.”) Subjects who took the CBD had a heightened brain activity response to the oddballs. That’s the way a nondrugged person typically reacts—repetitions of the same stimulus don’t interest us, but a sudden change grabs our attention. The THC-takers had an abnormal response: They found the left and right arrows, which constituted the overwhelming majority of the images, more noteworthy than the oddballs. © 2014 The Slate Group LLC

Keyword: Drug Abuse
Link ID: 19734 - Posted: 06.16.2014

—By Indre Viskontas and Chris Mooney We've all been mesmerized by them—those beautiful brain scan images that make us feel like we're on the cutting edge of scientifically decoding how we think. But as soon as one neuroscience study purports to show which brain region lights up when we are enjoying Coca-Cola, or looking at cute puppies, or thinking we have souls, some other expert claims that "it's just a correlation," and you wonder whether researchers will ever get it right. But there's another approach to understanding how our minds work. In his new book, The Tale of the Dueling Neurosurgeons, Sam Kean tells the story of a handful of patients whose unique brains—rendered that way by surgical procedures, rare diseases, and unfortunate, freak accidents—taught us much more than any set of colorful scans. Kean recounts some of their unforgettable stories on the latest episode of the Inquiring Minds podcast. "As I was reading these [case studies] I said, 'That's baloney! There's no way that can possibly be true,'" Kean remembers, referring to one particularly surprising case in which a woman's brain injury left her unable to recognize and distinguish between different kinds of animals. "But then I looked into it, and I realized that, not only is it true, it actually reveals some important things about how the brain works." Here are five patients, from Kean's book, whose stories transformed neuroscience: 1. The man who could not imagine the future: Kent Cochrane (KC) was a '70s wild child, playing in a rock band, getting into bar fights, and zooming around Toronto on his motorcycle. But in 1981, a motorcycle accident left him without two critical brain structures. Both of his hippocampi, the parts of the brain that allow us to form new long-term memories for facts and events in our lives, were lost. That's quite different from other amnesiacs, whose damage is either restricted to only one brain hemisphere, or includes large portions of regions outside of the hippocampus. Copyright ©2014 Mother Jones

Keyword: Attention
Link ID: 19727 - Posted: 06.14.2014

Claudia M. Gold Tom Insel, director of the National Institute of Mental Health (NIMH), in his recent blog post "Are Children Overmedicated?" seems to suggest that perhaps more psychiatric medication is in order. Comparing mental illness in children to food allergies, he dismisses the "usual" explanations given for the increase in medication prescribing. In his view, these explanations are: blaming psychiatrists who are too busy to provide therapy, parents who are too busy to provide a stable home environment, drug companies for marketing their products, and schools for lack of recess. By concluding that perhaps the explanation for the increase in prescribing of psychiatric medication to children is a greater number of children with serious psychiatric illness, Insel shows a lack of recognition of the complexity of the situation. When a recent New York Times article, which Insel refers to, reported on the rise in prescribing of psychiatric medication for toddlers diagnosed with ADHD, with a disproportionate number coming from families in poverty, one clinician remarked that if this is an attempt to medicate social and economic issues, then we have a huge problem. He was on to something. In conversations with pediatricians (the main prescribers of these medications) and child psychiatrists on the front lines, I find many in a reactive stance. When people feel overwhelmed, they go into survival mode, with their immediate aim just to get through the day. They find themselves prescribing medication because they have no other options.

Keyword: ADHD
Link ID: 19715 - Posted: 06.10.2014

Jane J. Lee Could've, should've, would've. Everyone has made the wrong choice at some point in life and suffered regret because of it. Now a new study shows we're not alone in our reaction to incorrect decisions. Rats too can feel regret. Regret is thinking about what you should have done, says David Redish, a neuroscientist at the University of Minnesota in Minneapolis. It differs from disappointment, which you feel when you don't get what you expected. And it affects how you make decisions in the future. (See "Hand Washing Wipes Away Regrets?") If you really want to study emotions or feelings like regret, says Redish, you can't just ask people how they feel. So when psychologists and economists study regret, they look for behavioral and neural manifestations of it. Using rats is one way to get down into the feeling's neural mechanics. Redish and colleague Adam Steiner, also at the University of Minnesota, found that rats expressed regret through both their behavior and their neural activity. Those signals, the researchers report today in the journal Nature Neuroscience, were specific to situations set up to induce regret, appearing as distinct patterns in both brain activity and behavior. When Redish and Steiner looked for neural activity, they focused on two areas known in people—and in some animals—to be involved in decision-making and the evaluation of expected outcomes: the orbitofrontal cortex and the ventral striatum. Brain scans have revealed that people with a damaged orbitofrontal cortex, for instance, don't express regret. To record nerve-cell activity, the researchers implanted electrodes in the brains of four rats—a typical sample size in this kind of experiment—then trained them to run a "choice" maze. © 1996-2014 National Geographic Society

Keyword: Attention
Link ID: 19713 - Posted: 06.09.2014

By JOHN COATES SIX years after the financial meltdown there is once again talk about market bubbles. Are stocks succumbing to exuberance? Is real estate? We thought we had exorcised these demons. It is therefore with something close to despair that we ask: What is it about risk taking that so eludes our understanding, and our control? Part of the problem is that we tend to view financial risk taking as a purely intellectual activity. But this view is incomplete. Risk is more than an intellectual puzzle — it is a profoundly physical experience, and it involves your body. Risk by its very nature threatens to hurt you, so when confronted by it your body and brain, under the influence of the stress response, unite as a single functioning unit. This occurs in athletes and soldiers, and it occurs as well in traders and people investing from home. The state of your body predicts your appetite for financial risk just as it predicts an athlete’s performance. If we understand how a person’s body influences risk taking, we can learn how to better manage risk takers. We can also recognize that mistakes governments have made have contributed to excessive risk taking. Consider the most important risk manager of them all — the Federal Reserve. Over the past 20 years, the Fed has pioneered a new technique of influencing Wall Street. Where before the Fed shrouded its activities in secrecy, it now informs the street in as clear terms as possible of what it intends to do with short-term interest rates, and when. Janet L. Yellen, the chairwoman of the Fed, declared this new transparency, called forward guidance, a revolution; Ben S. Bernanke, her predecessor, claimed it reduced uncertainty and calmed the markets. But does it really calm the markets? Or has eliminating uncertainty in policy spread complacency among the financial community and actually helped inflate market bubbles? We get a fascinating answer to these questions if we turn from economics and look into the biology of risk taking. © 2014 The New York Times Company

Keyword: Attention
Link ID: 19711 - Posted: 06.09.2014

By Meeri Kim Many of us find ourselves swimming along in the tranquil sea of life when suddenly a crisis hits — a death in the family, the loss of a job, a bad breakup. Some power through and find calm waters again, while others drown in depression. Scientists continue to search for the underlying genes and neurobiology that dictate our reactions to stress. Now, a study using mice has found a switch-like mechanism between resilience and defeat in an area of the brain that plays an important role in regulating emotions and has been linked with mood and anxiety disorders. [Image: neurons in the medial prefrontal cortex, shown in green with activation in yellow, that become hyperactive in depressed mice. Credit: Bo Li/Cold Spring Harbor Laboratory] After artificially enhancing the activity of neurons in that part of the brain — the medial prefrontal cortex — mice that previously fought to avoid electric shocks started to act helpless. Rather than leaping for an open escape route, they sat in a corner taking the pain — presumably out of a belief that nothing they could do would change their circumstances. “This helpless behavior is quite similar to what clinicians see in depressed individuals — an inability to take action to avoid or correct a difficult situation,” said study author and neuroscientist Bo Li of the Cold Spring Harbor Laboratory in New York. The results were published online May 27 in the Journal of Neuroscience. © 1996-2014 The Washington Post

Keyword: Depression
Link ID: 19704 - Posted: 06.06.2014

Neil Levy Can human beings still be held responsible in the age of neuroscience? Some people say no: they say once we understand how the brain processes information and thereby causes behaviour, there’s nothing left over for the person to do. This argument has not impressed philosophers, who say there doesn’t need to be anything left for the person to do in order to be responsible. People are not anything over and above the causal systems involved in information processing; we are our brains (plus some other, equally physical stuff). We are responsible if our information processing systems are suitably attuned to reasons, most philosophers think. There are big philosophical debates concerning what it takes to be suitably attuned to reasons, and whether this is really enough for responsibility. But I want to set those debates aside here. It’s more interesting to ask what we can learn from neuroscience about the nature of responsibility and about when we’re responsible. Even if neuroscience doesn’t tell us that no one is ever responsible, it might be able to tell us if particular people are responsible for particular actions. A worthy case study: consider a case like this. Early one morning in 1987, a Canadian man named Ken Parks got up from the sofa where he had fallen asleep and drove to his parents-in-law’s house. There he stabbed them both before driving to the police station, where he told police he thought he had killed someone. He had: his mother-in-law died from her injuries. © 2010–2014, The Conversation Trust (UK)

Keyword: Consciousness; Aggression
Link ID: 19702 - Posted: 06.06.2014

By Sadie Dingfelder Want to become famous in the field of neuroscience? You could go the usual route, spending decades collecting advanced degrees, slaving away in science labs and publishing your results. Or you could simply fall victim to a freak accident. The stars of local science writer Sam Kean’s new book, “The Tale of the Dueling Neurosurgeons” (which he’ll discuss Saturday at Politics and Prose), took the latter route. Be it challenging the wrong guy to a joust, spinning out on a motorcycle, or suffering from a stroke, these folks sustained brain injuries with bizarre and fascinating results. One man, for instance, lost the ability to identify different kinds of animals but had no trouble naming plants and objects. Another man lost his short-term memory. The result? A diary filled with entries like: “I am awake for the very first time.” “Now, I’m really awake.” “Now, I’m really, completely awake.” Unfortunate mishaps like these have advanced our understanding of how the gelatinous gray mass that (usually) stays hidden inside our skulls gives rise to thoughts, feelings and ideas, Kean says. “Traditionally, every major discovery in the history of neuroscience came about this way,” he says. “We had no other way of looking at the brain for centuries and centuries, because we didn’t have things like MRI machines.” Rather than covering the case studies textbook-style, Kean provides all the gory details. Consider Phineas Gage. You may remember from Psych 101 that Gage, a railroad worker, survived having a metal rod launched through his skull. You might not know, however, that one doctor “shaved Gage’s scalp and peeled off the dried blood and gelatinous brains. He then extracted skull fragments from the wound by sticking his fingers in from both ends, Chinese-finger-trap-style,” as Kean writes in his new book. © 1996-2014 The Washington Post

Keyword: Learning & Memory
Link ID: 19701 - Posted: 06.06.2014

By Denali Tietjen Meditation has long been known for its mental health benefits, but new research shows that just a few minutes of mindfulness can improve physical health and personal life as well. A recent study conducted by researchers at INSEAD and The Wharton School found that 15 minutes of mindful meditation can help you make better decisions. The research, published in the Association for Psychological Science’s journal Psychological Science, comes from four studies (varying in sample size from 69 to 178 adults) in which participants responded to sunk-cost scenarios at different degrees of mindful awareness. The results consistently showed that increased mindfulness decreases the sunk-cost bias. WOAH, hold the phone. What’s a sunk cost and what’s a sunk-cost bias?? Sunk cost is an economics term that psychologists have adopted. In economics, sunk costs are defined as non-recoverable investment costs like the cost of employee training or a lease on office space. In psychology, sunk costs are basically the same thing: The time and energy we put into our personal lives. Though we might not sit down with a calculator at the kitchen table when deciding who to take as our plus one to our second cousin’s wedding next weekend, we do a cost-benefit analysis every time we make a decision. And we take these sunk costs into account. The sunk-cost bias, then, is the tendency to allow sunk costs to overly influence current decisions. Mindfulness meditation can provide improved clarity, which helps you stay present and make better decisions, the study says. This protects you from that manipulative sunk-cost bias.

Keyword: Stress
Link ID: 19693 - Posted: 06.05.2014

By Matthew R. Francis Possibly no subject in science has inspired more nonsense than quantum mechanics. Sure, it’s a complicated field of study, with a few truly mysterious facets that are not settled to everyone’s satisfaction after nearly a century of work. At the same time, though, using quantum to mean “we just don’t know” is ridiculous—and simply wrong. Quantum mechanics is the basis for pretty much all our modern technology, from smartphones to fluorescent lights, digital cameras to fiber-optic communications. If I had to pick a runner-up in the nonsense sweepstakes, it would be human consciousness, another subject with a lot of mysterious aspects. We are made of ordinary matter yet are self-aware, capable of abstractly thinking about ourselves and of recognizing others (including nonhumans) as separate entities with their own needs. As a physicist, I’m fascinated by the notion that our consciousness can imagine realities other than our own: The universe is one way, but we are perfectly happy to think of how it might be otherwise. I hold degrees in physics and have spent a lot of time learning and teaching quantum mechanics. Nonphysicists seem to have the impression that quantum physics is really esoteric, with those who study it spending their time debating the nature of reality. In truth, most of a quantum mechanics class is lots and lots of math, in the service of using a particle’s quantum state—the bundle of physical properties such as position, energy, spin, and the like—to describe the outcomes of experiments. Sure, there’s some weird stuff and it’s fun to talk about, but quantum mechanics is aimed at being practical (ideally, at least). © 2014 The Slate Group LLC.

Keyword: Consciousness
Link ID: 19671 - Posted: 05.31.2014