Chapter 14. Attention and Consciousness


By Stephen L. Macknik, Susana Martinez-Conde

The renowned Slydini holds up an empty box for all to see. It is not really a box—just four connected cloth-covered cardboard walls, forming a floppy parallelogram with no bottom or top. Yet when the magician sets it down on a table, it looks like an ordinary container. Now he begins to roll large yellow sheets of tissue paper into balls. He claps his hands—SMACK!—as he crumples each new ball in a fist and then straightens his arm, wordlessly compelling the audience to gaze after his closed hand. He opens it, and ... the ball is still there. Nothing happened. Huh.

Slydini's hand closes once more around the tissue, and it starts snaking around, slowly and gracefully, like a belly dancer's. The performance is mesmerizing. With his free hand, he grabs an imaginary pinch of pixie dust from the box to sprinkle on top of the other hand. This time he opens his hand to reveal that the tissue is gone! Four balls disappear in this fashion. Then, for the finale, Slydini tips the box forward and shows the impossible: all four balls have mysteriously reappeared inside.

Slydini famously performed this act on The Dick Cavett Show in 1978. It was one of his iconic tricks. Despite the prestidigitator's incredible showmanship, though, the sleight only works because your brain cannot multitask.

© 2016 Scientific American

Keyword: Attention
Link ID: 22114 - Posted: 04.19.2016

By Jeffrey M. Zacks and Rebecca Treiman

Our favorite Woody Allen joke is the one about taking a speed-reading course. “I read ‘War and Peace’ in 20 minutes,” he says. “It’s about Russia.” The promise of speed reading — to absorb text several times faster than normal, without any significant loss of comprehension — can indeed seem too good to be true. Nonetheless, it has long been an aspiration for many readers, as well as the entrepreneurs seeking to serve them. And as the production rate for new reading matter has increased, and people read on a growing array of devices, the lure of speed reading has only grown stronger.

The first popular speed-reading course, introduced in 1959 by Evelyn Wood, was predicated on the idea that reading was slow because it was inefficient. The course focused on teaching people to make fewer back-and-forth eye movements across the page, taking in more information with each glance. Today, apps like SpeedRead With Spritz aim to minimize eye movement even further by having a digital device present you with a stream of single words one after the other at a rapid rate.

Unfortunately, the scientific consensus suggests that such enterprises should be viewed with suspicion. In a recent article in Psychological Science in the Public Interest, one of us (Professor Treiman) and colleagues reviewed the empirical literature on reading and concluded that it’s extremely unlikely you can greatly improve your reading speed without missing out on a lot of meaning.

Certainly, readers are capable of rapidly scanning a text to find a specific word or piece of information, or to pick up a general idea of what the text is about. But this is skimming, not reading. We can definitely skim, and it may be that speed-reading systems help people skim better. Some speed-reading systems, for example, instruct people to focus only on the beginnings of paragraphs and chapters. This is probably a good skimming strategy. Participants in a 2009 experiment read essays that had half the words covered up — either the beginning of the essay, the end of the essay, or the beginning or end of each individual paragraph. Reading half-paragraphs led to better performance on a test of memory for the passage’s meaning than did reading only the first or second half of the text, and it worked as well as skimming under time pressure.

© 2016 The New York Times Company
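The single-word "stream" that apps like Spritz display is a technique known as rapid serial visual presentation (RSVP): one word at a time, flashed at a fixed rate, so the eyes never have to move. As a minimal sketch only (the pacing and display details below are assumptions for illustration, not Spritz's actual algorithm), the core idea fits in a few lines of Python:

```python
import sys
import time

def rsvp(text, wpm=400):
    """Flash one word at a time at a fixed rate, RSVP-style."""
    delay = 60.0 / wpm                      # seconds per word
    for word in text.split():
        # Overwrite the previous word in place so the eyes stay put.
        sys.stdout.write("\r" + word.center(20))
        sys.stdout.flush()
        time.sleep(delay)
    print()

rsvp("The promise of speed reading can seem too good to be true", wpm=400)
```

At 400 words per minute, each word is visible for just 150 milliseconds, which hints at why comprehension suffers: there is no opportunity to regress to an earlier word when something fails to parse.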

Keyword: Language; Attention
Link ID: 22113 - Posted: 04.18.2016

By Matthew Hutson

Bad news for believers in clairvoyance. Our brains appear to rewrite history so that the choices we make after an event seem to precede it. In other words, we add loops to our mental timeline that let us feel we can predict things that in reality have already happened.

Adam Bear and Paul Bloom at Yale University conducted some simple tests on volunteers. In one experiment, subjects looked at white circles and silently guessed which one would turn red. Once one circle had changed colour, they reported whether or not they had predicted correctly. Over many trials, their reported accuracy was significantly better than the 20 per cent expected by chance, indicating that the volunteers either had psychic abilities or had unwittingly played a mental trick on themselves.

The researchers’ study design helped explain what was really going on. They placed different delays between the white circles’ appearance and one of the circles turning red, ranging from 50 milliseconds to one second. Participants’ reported accuracy was highest – surpassing 30 per cent – when the delays were shortest. That’s what you would expect if the appearance of the red circle was actually influencing decisions still in progress. This suggests it’s unlikely that the subjects were merely lying about their predictive abilities to impress the researchers.

The mechanism behind this behaviour is still unclear. It’s possible, the researchers suggest, that we perceive the order of events correctly – one circle changes colour before we have actually made our prediction – but then we subconsciously swap the sequence in our memories so the prediction seems to come first. Such a switcheroo could be motivated by a desire to feel in control of our lives.

© Copyright Reed Business Information Ltd.
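The reported accuracy figures are easy to reason about with a small simulation. A minimal sketch, assuming five circles (implied by the 20 per cent chance rate) and a made-up "leak" probability that the red circle's appearance settles a choice still in progress:

```python
import random

def run_trials(n_trials, p_leak, n_circles=5):
    """Fraction of trials the subject reports 'predicted correctly'."""
    hits = 0
    for _ in range(n_trials):
        red = random.randrange(n_circles)        # circle that turns red
        if random.random() < p_leak:
            guess = red                          # outcome shapes the "guess"
        else:
            guess = random.randrange(n_circles)  # genuine advance guess
        hits += (guess == red)
    return hits / n_trials

print(run_trials(100_000, 0.00))   # ~0.20: pure prediction, chance level
print(run_trials(100_000, 0.15))   # ~0.32: outcome leaks in on short delays
```

Even a modest leak rate pushes reported accuracy past 30 per cent, matching what the researchers saw at the shortest delays, with no dishonesty required.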

Keyword: Consciousness
Link ID: 22109 - Posted: 04.16.2016

By Simon Makin

Everyone's brain is different. Until recently, neuroscience has tended to gloss this over by averaging results from many brain scans in trying to elicit general truths about how the organ works. But in a major development within the field, researchers have begun documenting how brain activity differs between individuals. Such differences had been largely thought of as transient and uninteresting, but studies are starting to show that they are innate properties of people's brains, and that knowing them better might ultimately help treat neurological disorders.

The latest study, published April 8 in Science, found that the brain activity of individuals who were just biding their time in a brain scanner contained enough information to predict how their brains would function during a range of ordinary activities. The researchers used these at-rest signatures to predict which regions would light up—which groups of brain cells would switch on—during gambling, reading and other tasks participants were asked to perform in the scanner. The technique might be used one day to assess whether certain areas of the brains of people who are paralyzed or in a comatose state are still functional, the authors say.

The study capitalizes on a relatively new method of brain imaging that looks at what is going on when a person essentially does nothing. The technique stems from the mid-1990s work of biomedical engineer Bharat Biswal, now at New Jersey Institute of Technology. Biswal noticed that scans he had taken while participants were resting in a functional magnetic resonance imaging (fMRI) scanner displayed orderly, low-frequency oscillations. He had been looking for ways to remove background noise from fMRI signals but quickly realized these oscillations were not noise. His work paved the way for a new approach known as resting-state fMRI.

© 2016 Scientific American
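At its core, the prediction step in studies like this is a regression from each brain location's at-rest signature to its task activation. A toy sketch on synthetic data (the study's real features, model and validation are far richer; every number below is illustrative):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_regions, n_features = 200, 10

# Each region gets a vector of resting-state features (e.g. connectivity).
rest = rng.normal(size=(n_regions, n_features))
true_w = rng.normal(size=n_features)
task = rest @ true_w + rng.normal(scale=0.1, size=n_regions)  # task map

# Fit on some regions, then predict activation for held-out regions.
model = LinearRegression().fit(rest[:150], task[:150])
pred = model.predict(rest[150:])
print(np.corrcoef(task[150:], pred)[0, 1])   # close to 1 on this toy data
```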

Keyword: Brain imaging; Consciousness
Link ID: 22105 - Posted: 04.14.2016

Zoe Cormier

Researchers have published the first images showing the effects of LSD on the human brain, as part of a series of studies to examine how the drug causes its characteristic hallucinogenic effects [1]. David Nutt, a neuropsychopharmacologist at Imperial College London who has previously examined the neural effects of mind-altering drugs such as the hallucinogen psilocybin, found in magic mushrooms, was one of the study's leaders. He tells Nature what the research revealed, and how he hopes LSD (lysergic acid diethylamide) might ultimately be useful in therapies.

Why study the effects of LSD on the brain?

For brain researchers, studying how psychedelic drugs such as LSD alter the ‘normal’ brain state is a way to study the biological phenomenon that is consciousness. We ultimately would also like to see LSD deployed as a therapeutic tool. The idea has old roots. In the 1950s and 60s thousands of people took LSD for alcoholism; in 2012, a retrospective analysis of some of these studies suggested that it helped cut down on drinking. Since the 1970s there have been lots of studies with LSD on animals, but not on the human brain. We need that data to validate the trial of this drug as a potential therapy for addiction or depression.

Why hasn’t anyone done brain scans before?

Before the 1960s, LSD was studied for its potential therapeutic uses, as were other hallucinogens. But the drug was heavily restricted in the UK, the United States and around the world after 1967 — in my view, due to unfounded hysteria over its potential dangers. The restrictions vary worldwide, but in general, countries have insisted that LSD has ‘no medical value’, making it tremendously difficult to work with.

© 2016 Nature Publishing Group

Keyword: Drug Abuse; Brain imaging
Link ID: 22099 - Posted: 04.12.2016

By Sandhya Somashekhar

African Americans are routinely under-treated for their pain compared with whites, according to research. A study released Monday sheds some disturbing light on why that might be the case.

Researchers at the University of Virginia quizzed white medical students and residents to see how many believed inaccurate and at times "fantastical" differences about the two races -- for example, that blacks have less sensitive nerve endings than whites or that black people's blood coagulates more quickly. They found that fully half thought at least one of the false statements presented was possibly, probably or definitely true. Moreover, those who held false beliefs often rated black patients' pain as lower than that of white patients and made less appropriate recommendations about how they should be treated.

The study, published in the Proceedings of the National Academy of Sciences, could help illuminate one of the most vexing problems in pain treatment today: That whites are more likely than blacks to be prescribed strong pain medications for equivalent ailments. A 2000 study out of Emory University found that at a hospital emergency department in Atlanta, 74 percent of white patients with bone fractures received painkillers compared with 50 percent of black patients. Similarly, a paper last year found that black children with appendicitis were less likely to receive pain medication than their white counterparts. And a 2007 study found that physicians were more likely to underestimate the pain of black patients compared with other patients.

Keyword: Pain & Touch; Attention
Link ID: 22074 - Posted: 04.06.2016

Noah Smith

How do human beings behave in response to risk? That is one of the most fundamental unanswered questions of our time. A general theory of decision-making amid uncertainty would be the kind of scientific advance that comes only a few times a century. Risk is central to financial and insurance markets. It affects the consumption, saving and business investment that moves the global economy. Understanding human behavior in the face of risk would let us reduce accidents, retire more comfortably, get cheaper health insurance and maybe even avoid recessions.

A number of our smartest scientists have tried to develop a general theory of risk behavior. John von Neumann, the pioneering mathematician and physicist, took a crack at it back in 1944, when he developed the theory of expected utility along with Oskar Morgenstern. According to this simple theory, people value a risky prospect by weighting the utility of each possible outcome by the probability that it happens. This beautiful idea underlies much of modern economic theory, but unfortunately it doesn't work well in most situations.

Alternative theories have been developed for specific applications. The psychologist Daniel Kahneman won a Nobel Prize for the creation of prospect theory, which says -- among other things -- that people measure outcomes relative to a reference point. That theory does a great job of explaining the behavior of subjects in certain lab experiments, and can help account for the actions of certain inexperienced consumers. But it is very difficult to apply generally, because the reference points are hard to predict in advance and may shift in unpredictable ways.
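To make the von Neumann-Morgenstern idea concrete: an agent values a gamble by summing each outcome's utility weighted by its probability. A minimal sketch (the square-root utility function is an illustrative assumption; any concave utility produces risk aversion):

```python
def expected_utility(outcomes, utility):
    """outcomes: list of (probability, payoff) pairs summing to 1."""
    return sum(p * utility(x) for p, x in outcomes)

u = lambda x: x ** 0.5                 # concave utility -> risk aversion

sure_thing = [(1.0, 50)]               # $50 for certain
gamble = [(0.5, 120), (0.5, 0)]        # expected value $60

print(expected_utility(sure_thing, u))  # ~7.07
print(expected_utility(gamble, u))      # ~5.48
```

The gamble has the higher expected value ($60 versus $50), yet the concave-utility agent prefers the sure thing; that gap between expected value and expected utility is exactly the risk aversion the theory was built to capture. Prospect theory modifies this picture by, among other things, evaluating payoffs relative to a reference point rather than in absolute terms.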

Keyword: Attention; Emotions
Link ID: 22058 - Posted: 04.01.2016

By Pam Belluck

When people make risky decisions, like doubling down in blackjack or investing in volatile stocks, what happens in the brain? Scientists have long tried to understand what makes some people risk-averse and others risk-taking. Answers could have implications for how to treat, curb or prevent destructively risky behavior, like pathological gambling or drug addiction.

Now, a study by Dr. Karl Deisseroth, a prominent Stanford neuroscientist and psychiatrist, and his colleagues gives some clues. The study, published Wednesday in the journal Nature, reports that a specific type of neuron, or nerve cell, in a certain brain region helps determine whether or not a risky choice is made. The study was conducted in rats, but experts said it built on research suggesting the findings could be similar in humans. If so, they said, it could inform approaches to addiction, which involves some of the same neurons and brain areas, as well as treatments for Parkinson’s disease, because one class of Parkinson’s medications turns some patients into problem gamblers.

In a series of experiments led by Kelly Zalocusky, a doctoral student, researchers found that a risk-averse rat made decisions based on whether its previous choice involved a loss (in this case, of food). Rats whose previous decision netted them less food were prompted to behave conservatively next time by signals from certain receptors in a brain region called the nucleus accumbens, the scientists discovered. These receptors, which are proteins attached to neurons, are part of the dopamine system, a neurochemical important to emotion, movement and thinking. In risk-taking rats, however, those receptors sent a much fainter signal, so the rats kept making high-stakes choices even if they lost out. But by employing optogenetics, a technique that uses light to manipulate neurons, the scientists stimulated brain cells with those receptors, heightening the “loss” signal and turning risky rats into safer rats.

© 2016 The New York Times Company

Keyword: Attention; Emotions
Link ID: 22025 - Posted: 03.24.2016

By Daniel Barron

It’s unnerving when someone with no criminal record commits a disturbingly violent crime. Perhaps he stabs his girlfriend 40 times and dumps her body in the desert. Perhaps he climbs to the top of a clock tower and guns down innocent passers-by. Or perhaps he climbs out of a car at a stoplight and nearly decapitates an unsuspecting police officer with 26 rounds from an assault rifle. Perhaps he even drowns his own children. Or shoots the President of the United States. The shock is palpable (NB: those are all actual cases).

The very notion that someone—our neighbor, the guy ahead of us in the check-out line, we (!)—could do something so terrible rubs at our minds. We wonder, “What happened? What in this guy snapped?” After all, for the last 20 years, the accused went home to his family after work—why did he go rob that liquor store? What made him pull that trigger?

The subject hit home for me this week when I was called to jury duty. As I made my way to the county courthouse, I wondered whether I would be asked to decide a capital murder case like the ones above. As a young neuroscientist, the prospect made me uneasy. At the trial, the accused’s lawyers would probably argue that, at the time of the crime, he had diminished capacity to make decisions, that somehow he wasn’t entirely free to choose whether or not to commit the crime. They might cite some form of neuroscientific evidence to argue that, at the time of the crime, his brain wasn’t functioning normally. And the jury and judge would have to decide what to make of it.

© 2016 Scientific American

Keyword: Consciousness
Link ID: 22024 - Posted: 03.24.2016

Giant manta rays have been filmed checking out their reflections in a way that suggests they are self-aware. Only a small number of animals, mostly primates, have passed the mirror test, widely used as a tentative test of self-awareness.

“This new discovery is incredibly important,” says Marc Bekoff, of the University of Colorado in Boulder. “It shows that we really need to expand the range of animals we study.” But not everyone is convinced that the new study proves conclusively that manta rays, which have the largest brains of any fish, can do this – or indeed, that the mirror test itself is an appropriate measure of self-awareness.

Csilla Ari, of the University of South Florida in Tampa, filmed two giant manta rays in a tank, with and without a mirror inside. The fish changed their behaviour in a way that suggested that they recognised the reflections as themselves, as opposed to another manta ray. They did not show signs of social interaction with the image, which is what you would expect if they perceived it to be another individual. Instead, the rays repeatedly moved their fins and circled in front of the mirror. This suggests they could see whether their reflection moved when they moved. The frequency of these movements was much higher when the mirror was in the tank than when it was not.

© Copyright Reed Business Information Ltd.

Keyword: Consciousness; Evolution
Link ID: 22015 - Posted: 03.22.2016

By Barbara K. Lipska

As the director of the human brain bank at the National Institute of Mental Health, I am surrounded by brains, some floating in jars of formalin and others icebound in freezers. As part of my work, I cut these brains into tiny pieces and study their molecular and genetic structure. My specialty is schizophrenia, a devastating disease that often makes it difficult for the patient to discern what is real and what is not. I examine the brains of people with schizophrenia whose suffering was so acute that they committed suicide. I had always done my work with great passion, but I don’t think I really understood what was at stake until my own brain stopped working.

In the first days of 2015, I was sitting at my desk when something freakish happened. I extended my arm to turn on the computer, and to my astonishment realized that my right hand disappeared when I moved it to the right lower quadrant of the keyboard. I tried again, and the same thing happened: The hand disappeared completely as if it were cut off at the wrist. It felt like a magic trick — mesmerizing, and totally inexplicable. Stricken with fear, I kept trying to find my right hand, but it was gone.

I had battled breast cancer in 2009 and melanoma in 2012, but I had never considered the possibility of a brain tumor. I knew immediately that this was the most logical explanation for my symptoms, and yet I quickly dismissed the thought. Instead I headed to a conference room. My colleagues and I had a meeting scheduled to review our new data on the molecular composition of schizophrenia patients’ frontal cortex, a brain region that shapes who we are — our thoughts, emotions, memories. But I couldn’t focus on the meeting because the other scientists’ faces kept vanishing. Thoughts about a brain tumor crept quietly into my consciousness again, then screamed for attention.

© 2016 The New York Times Company

Keyword: Consciousness; Vision
Link ID: 21984 - Posted: 03.14.2016

How is the brain able to use past experiences to guide decision-making? A few years ago, researchers supported by the National Institutes of Health discovered in rats that awake mental replay of past experiences is critical for learning and making informed choices. Now, the team has discovered key secrets of the underlying brain circuitry – including a unique system that encodes location during inactive periods.

“Advances such as these in understanding cellular and circuit-level processes underlying such basic functions as executive function, social cognition, and memory fit into NIMH’s mission of discovering the roots of complex behaviors,” said NIMH acting director Bruce Cuthbert, Ph.D.

While a rat is moving through a maze — or just mentally replaying the experience — an area in the brain’s memory hub, or hippocampus, specialized for locations, called CA1, communicates with a decision-making area in the executive hub, or prefrontal cortex (PFC). A distinct subset of PFC neurons excited during mental replay of the experience are activated during movement, while another distinct subset, less engaged during movement in the maze – and therefore potentially distracting – are inhibited during replay.

“Such strongly coordinated activity within this CA1-PFC circuit during awake replay is likely to optimize the brain’s ability to consolidate memories and use them to decide on future action,” explained Shantanu Jadhav, Ph.D., now an assistant professor at Brandeis University in Waltham, Mass., the study’s co-first author. His contributions to this line of research were made possible, in part, by a Pathway to Independence award from the Office of Research Training and Career Development of the NIH’s National Institute of Mental Health (NIMH).

Keyword: Learning & Memory; Attention
Link ID: 21978 - Posted: 03.12.2016

By KJ Dell’Antonia

New research shows that the youngest students in a classroom are more likely to be given a diagnosis of attention deficit hyperactivity disorder than the oldest. The findings raise questions about how we regard those wiggly children who just can’t seem to sit still – and who also happen to be the youngest in their class.

Researchers in Taiwan looked at data from 378,881 children ages 4 to 17 and found that students born in August, the cut-off month for school entry in that country, were more likely to be given diagnoses of A.D.H.D. than students born in September. The children born in September would have missed the previous year’s cut-off date for school entry, and thus had nearly a full extra year to mature before entering school. The findings were published Thursday in The Journal of Pediatrics.

While few dispute that A.D.H.D. is a legitimate disability that can impede a child’s personal and school success and that treatment can be effective, “our findings emphasize the importance of considering the age of a child within a grade when diagnosing A.D.H.D. and prescribing medication for treating A.D.H.D.,” the authors concluded. Dr. Mu-Hong Chen, a member of the department of psychiatry at Taipei Veterans General Hospital in Taiwan and the lead author of the study, hopes that a better understanding of the data linking relative age at school entry to an A.D.H.D. diagnosis will encourage parents, teachers and clinicians to give the youngest children in a grade enough time and help to allow them to prove their ability.

Other research has shown similar results. An earlier study in the United States, for example, found that roughly 8.4 percent of children born in the month before their state’s cutoff date for kindergarten eligibility are given A.D.H.D. diagnoses, compared to 5.1 percent of children born in the month immediately afterward.

© 2016 The New York Times Company

Keyword: ADHD; Development of the Brain
Link ID: 21977 - Posted: 03.12.2016

By Daniel Engber

Nearly 20 years ago, psychologists Roy Baumeister and Dianne Tice, a married couple at Case Western Reserve University, devised a foundational experiment on self-control. “Chocolate chip cookies were baked in the room in a small oven,” they wrote in a paper that has been cited more than 3,000 times. “As a result, the laboratory was filled with the delicious aroma of fresh chocolate and baking.”

Here’s how that experiment worked. Baumeister and Tice stacked their fresh-baked cookies on a plate, beside a bowl of red and white radishes, and brought in a parade of student volunteers. They told some of the students to hang out for a while unattended, eating only from the bowl of radishes, while another group ate only cookies. Afterward, each volunteer tried to solve a puzzle, one that was designed to be impossible to complete.

Baumeister and Tice timed the students in the puzzle task, to see how long it took them to give up. They found that the ones who’d eaten chocolate chip cookies kept working on the puzzle for 19 minutes, on average—about as long as people in a control condition who hadn’t snacked at all. The group of kids who noshed on radishes flubbed the puzzle test. They lasted just eight minutes before they quit in frustration. The authors called this effect “ego depletion” and said it revealed a fundamental fact about the human mind: We all have a limited supply of willpower, and it decreases with overuse.

© 2016 The Slate Group LLC.

Keyword: Attention
Link ID: 21965 - Posted: 03.08.2016

Angus Chen

We know we should put the cigarettes away or make use of that gym membership, but in the moment, we just don't do it. There is a cluster of neurons in our brain critical for motivation, though. What if you could hack them to motivate yourself?

These neurons are located in the middle of the brain, in a region called the ventral tegmental area. A paper published Thursday in the journal Neuron suggests that we can activate the region with a little bit of training. The researchers stuck 73 people into an fMRI, a scanner that can detect what part of the brain is most active, and focused on that area associated with motivation. When the researchers said "motivate yourself and make this part of your brain light up," people couldn't really do it. "They weren't that reliable when we said, 'Go! Get psyched. Turn on your VTA,' " says Dr. Alison Adcock, a psychiatrist at Duke and senior author on the paper.

That changed when the participants were allowed to watch a neurofeedback meter that displayed activity in their ventral tegmental area. When activity ramps up, the participants see the meter heat up while they're in the fMRI tube. "Your whole mind is allowed to speak to a specific part of your brain in a way you never imagined before. Then you get feedback that helps you discover how to turn that part of the brain up or down," says John Gabrieli, a neuroscientist at the Massachusetts Institute of Technology who was not involved with the work.

© 2016 npr
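Conceptually, a neurofeedback loop like this is simple: estimate the region's activity on each scan update, compare it with a resting baseline, and redraw a "thermometer" for the participant. A toy sketch of that loop (the signal source here is simulated noise; every name and number is an assumption, not the study's actual pipeline):

```python
import random
import time

def read_vta_signal():
    # Stand-in for extracting the mean BOLD signal from a VTA mask;
    # here it is just simulated noise.
    return random.gauss(0.0, 1.0)

# Establish a resting baseline, then show activity relative to it.
baseline = sum(read_vta_signal() for _ in range(20)) / 20

for trial in range(10):
    level = read_vta_signal() - baseline           # activity vs. rest
    bars = min(10, max(0, int(level * 2.5) + 5))   # map signal to meter
    print(f"trial {trial:2d} |{'#' * bars:<10}|")  # thermometer display
    time.sleep(1.0)                                # ~one update per scan
```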

Keyword: Attention
Link ID: 21954 - Posted: 03.05.2016

Monya Baker

Is psychology facing a ‘replication crisis’? Last year, a crowdsourced effort that was able to validate fewer than half of 98 published findings [1] rang alarm bells about the reliability of psychology papers. Now a team of psychologists has reassessed the study and say that it provides no evidence for a crisis. “Our analysis completely invalidates the pessimistic conclusions that many have drawn from this landmark study,” says Daniel Gilbert, a psychologist at Harvard University in Cambridge, Massachusetts, and a co-author of the reanalysis, published on 2 March in Science [2].

But a response [3] in the same issue of Science counters that the reanalysis itself depends on selective assumptions. And others say that psychology still urgently needs to improve its research practices.

Statistical criticism

In August 2015, a team of 270 researchers reported the largest ever single-study audit of the scientific literature. Led by Brian Nosek, executive director of the Center for Open Science in Charlottesville, Virginia, the Reproducibility Project attempted to replicate studies in 100 psychology papers. (It ended up with 100 replication attempts for 98 papers because of problems assigning teams to two papers.) According to one of several measures of reproducibility, just 36% could be confirmed; by another statistical measure, 47% could [1]. Either way, the results looked worryingly feeble.

“Both optimistic and pessimistic conclusions about reproducibility are possible, and neither are yet warranted.”

Not so fast, says Gilbert.

© 2016 Nature Publishing Group

Keyword: Attention
Link ID: 21953 - Posted: 03.05.2016

By Christian Jarrett

Most of us like to think that we’re independent-minded — we tell ourselves we like Adele’s latest album because it suits our taste, not because millions of other people bought it, or that we vote Democrat because we’re so enlightened, not because all our friends vote that way. The reality, of course, is that humans are swayed in all sorts of different ways — some of them quite subtle — by other people’s beliefs and expectations. Our preferences don’t form in a vacuum, but rather in something of a social pressure-cooker.

This has been demonstrated over and over, perhaps most famously in the classic Asch conformity studies from the ‘50s. In those experiments, many participants went along with a blatantly wrong majority judgment about the lengths of different lines — simply, it seems, to fit in. (Although the finding is frequently exaggerated, the basic point about the power of social influence holds true.) But that doesn’t mean all humans are susceptible to peer pressure in the same way. You only have to look at your own friends and family to know that some people always seem to roll with the crowd, while others are much more independent-minded. What accounts for these differences?

A new study in Frontiers in Human Neuroscience led by Dr. Juan Dominguez of Monash University in Melbourne, Australia, offers the first hint that part of the answer may come down to certain neural mechanisms. In short, the study suggests that people have a network in their brains that is attuned to disagreement with other people. When this network is activated, it makes us feel uncomfortable (we experience “cognitive dissonance,” to use the psychological jargon) and it’s avoiding this state that motivates us to switch our views as much as possible. It appears the network is more sensitive in some people than in others, and that this might account for varying degrees of pushover-ness.

© 2016, New York Media LLC.

Keyword: Emotions; Attention
Link ID: 21948 - Posted: 03.03.2016

By Meeri Kim

Teenagers tend to have a bad reputation in our society, and perhaps rightly so. When compared to children or adults, adolescents are more likely to engage in binge drinking, drug use, unprotected sex, criminal activity, and reckless driving. Risk-taking is like second nature to youth of a certain age, leading health experts to cite preventable and self-inflicted causes as the biggest threats to adolescent well-being in industrialized societies.

But before going off on a tirade about groups of reckless young hooligans, consider that a recent study may have revealed a silver lining to all that misbehavior. While adolescents will take more risks in the presence of their peers than when alone, it turns out that peers can also encourage them to learn faster and engage in more exploratory acts.

A group of 101 late adolescent males were randomly assigned to play the Iowa Gambling Task, a psychological game used to assess decision making, either alone or observed by their peers. The task involves four decks of cards: two are “lucky” decks that will generate long-term gain if the player continues to draw from them, while the other two are “unlucky” decks that have the opposite effect. The player chooses to play or pass cards drawn from one of these decks, eventually catching on to which of the decks are lucky or unlucky — and subsequently only playing from the lucky ones.
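The deck structure is straightforward to simulate. A minimal sketch of an Iowa-Gambling-Task-style payoff scheme (the win and loss amounts and the loss frequency below are made up for illustration; the real task uses fixed payoff schedules):

```python
import random

# Per deck: (win per draw, possible loss, probability of the loss).
DECKS = {
    "A": (100, -250, 0.5),   # "unlucky": big wins, bigger average losses
    "B": (100, -250, 0.5),
    "C": (50, -25, 0.5),     # "lucky": modest wins, net gain over time
    "D": (50, -25, 0.5),
}

def draw(deck):
    win, loss, p_loss = DECKS[deck]
    return win + (loss if random.random() < p_loss else 0)

# A player sampling every deck equally loses on A/B and gains on C/D:
totals = {d: sum(draw(d) for _ in range(100)) for d in DECKS}
print(totals)   # A/B average -25 per draw; C/D average +37.5 per draw
```

Learning the task amounts to inferring these long-run averages from noisy feedback and shifting play toward the lucky decks; what the study measured was how quickly that shift happened with and without peers watching.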

Keyword: Development of the Brain; Attention
Link ID: 21929 - Posted: 02.24.2016

By David Z. Hambrick

We all make stupid mistakes from time to time. History is replete with examples. Legend has it that the Trojans accepted the Greeks’ “gift” of a huge wooden horse, which turned out to be hollow and filled with a crack team of Greek commandos. The Tower of Pisa started to lean even before construction was finished—and is not even the world’s farthest leaning tower. NASA taped over the original recordings of the moon landing, and operatives for Richard Nixon’s re-election committee were caught breaking into a Watergate office, setting in motion the greatest political scandal in U.S. history. More recently, the French government spent $15 billion on a fleet of new trains, only to discover that they were too wide for some 1,300 station platforms.

We readily recognize these incidents as stupid mistakes—epic blunders. On a more mundane level, we invest in get-rich-quick schemes, drive too fast, and make posts on social media that we later regret. But what, exactly, drives our perception of these actions as stupid mistakes, as opposed to bad luck? Their seeming mindlessness? The severity of the consequences? The responsibility of the people involved? Science can help us answer these questions.

In a study just published in the journal Intelligence, using search terms such as “stupid thing to do”, Balazs Aczel and his colleagues compiled a collection of stories describing stupid mistakes from sources such as The Huffington Post and TMZ. One story described a thief who broke into a house and stole a TV and later returned for the remote; another described burglars who intended to steal cell phones but instead stole GPS tracking devices that were turned on and gave police their exact location. The researchers then had a sample of university students rate each story on the responsibility of the people involved, the influence of the situation, the seriousness of the consequences, and other factors.

© 2016 Scientific American

Keyword: Attention
Link ID: 21928 - Posted: 02.24.2016

Alison Abbott

More than 50 years after a controversial psychologist shocked the world with studies that revealed people’s willingness to harm others on order, a team of cognitive scientists has carried out an updated version of the iconic ‘Milgram experiments’. Their findings may offer some explanation for Stanley Milgram's uncomfortable revelations: when following commands, they say, people genuinely feel less responsibility for their actions — whether they are told to do something evil or benign.

“If others can replicate this, then it is giving us a big message,” says neuroethicist Walter Sinnott-Armstrong of Duke University in Durham, North Carolina, who was not involved in the work. “It may be the beginning of an insight into why people can harm others if coerced: they don’t see it as their own action.”

The study may feed into a long-running legal debate about the balance of personal responsibility between someone acting under instruction and their instructor, says Patrick Haggard, a cognitive neuroscientist at University College London, who led the work, published on 18 February in Current Biology [1].

Milgram’s original experiments were motivated by the trial of Nazi Adolf Eichmann, who famously argued that he was ‘just following orders’ when he sent Jews to their deaths. The new findings don’t legitimize harmful actions, Haggard emphasizes, but they do suggest that the ‘only obeying orders’ excuse betrays a deeper truth about how a person feels when acting under command.

© 2016 Nature Publishing Group

Keyword: Attention; Emotions
Link ID: 21915 - Posted: 02.19.2016