Chapter 14. Attention and Consciousness


By KATE MURPHY Eavesdrop on any conversation or pay close attention to your own and you’ll hear laughter. From explosive bursts to muffled snorts, some form of laughter punctuates almost all verbal communication. Electronic communication, too, LOL. You’ll probably also notice that, more often than not, the laughter is in response to something that wasn’t very funny — or wasn’t funny at all. Observational studies suggest this is the case 80 percent to 90 percent of the time. Take Hillary Clinton’s strategic laughter during heated exchanges with Donald J. Trump during the presidential debates. Or Jimmy Fallon’s exaggerated laughter when interviewing guests on “The Tonight Show.” Or employees at Fox News reporting that they tried to “laugh off” unwanted sexual advances by Roger Ailes and others within the organization. How laughter went from a primal signal of safety (the opposite of a menacing growl) to an odd assortment of vocalizations that smooth as much as confuse social interactions is poorly understood. But researchers who study laughter say reflecting on when and why you titter, snicker or guffaw is a worthy exercise, given that laughter can harm as much as help you. “It’s a hall of mirrors of inferences and intentions every time you encounter laughter,” said Sophie Scott, a neuroscientist at University College London who studies how the brain produces and processes laughter. “You think it’s so simple. It’s just jokes and ha-ha but laughter is really sophisticated and complicated.” Laughter at its purest and most spontaneous is affiliative and bonding. To our forebears it meant, “We’re not going to kill each other! What a relief!” But as we’ve developed as humans so has our repertoire of laughter, unleashed to achieve ends quite apart from its original function of telling friend from foe. Some of it is social lubrication — the warm chuckles we give one another to be amiable and polite. 
Darker manifestations include dismissive laughter, which makes light of something someone said sincerely, and derisive laughter, which shames. © 2016 The New York Times Company

Keyword: Emotions; Attention
Link ID: 22781 - Posted: 10.24.2016

By Kensy Cooperrider, Rafael Núñez “What is the difference between yesterday and tomorrow?” The Yupno man we were interviewing, Danda, paused to consider his answer. A group of us sat on a hillside in the Yupno Valley, a remote nook high in the mountains of Papua New Guinea. Only days earlier we had arrived on a single-engine plane. After a steep hike from the grass airstrip, we found ourselves in the village of Gua, one of about 20 Yupno villages dotting the rugged terrain. We came all the way here because we are interested in time—in how Yupno people understand concepts such as past, present and future. Are these ideas universal, or are they products of our language, our culture and our environment? As we interviewed Danda and others in the village, we listened to what they said about time, but we paid even closer attention to what they did with their hands as they spoke. Gestures can be revealing. Ask English speakers about the difference between yesterday and tomorrow, and they might thrust a hand over the shoulder when referring to the past and then forward when referring to the future. Such unreflective movements reveal a fundamental way of thinking in which the past is at our backs, something that we “leave behind,” and the future is in front of us, something to “look forward” to. Would a Yupno speaker do the same? Danda was making just the kinds of gestures we were hoping for. As he explained the Yupno word for “yesterday,” his hand swept backward; as he mentioned “tomorrow,” it leaped forward. We all sat looking up a steep slope toward a jagged ridge, but as the light faded, we changed the camera angle, spinning around so that we and Danda faced in the opposite direction, downhill. With our backs now to the ridge, we looked over the Yupno River meandering toward the Bismarck Sea. “Let's go over that one more time,” we suggested. © 2016 Scientific American,

Keyword: Attention
Link ID: 22778 - Posted: 10.22.2016

By Catherine Caruso Imagine you are faced with the classic thought experiment dilemma: You can take a pile of money now or wait and get an even bigger stash of cash later on. Which option do you choose? Your level of self-control, researchers have found, may have to do with a region of the brain that lets us take the perspective of others—including that of our future self. A study, published today in Science Advances, found that when scientists used noninvasive brain stimulation to disrupt a brain region called the temporoparietal junction (TPJ), people appeared less able to see things from the point of view of their future selves or of another person, and consequently were less likely to share money with others and more inclined to opt for immediate cash instead of waiting for a larger bounty at a later date. The TPJ, which is located where the temporal and parietal lobes meet, plays an important role in social functioning, particularly in our ability to understand situations from the perspectives of other people. However, according to Alexander Soutschek, an economist at the University of Zurich and lead author on the study, previous research on self-control and delayed gratification has focused instead on the prefrontal brain regions involved in impulse control. “When you have a closer look at the literature, you sometimes find in the neuroimaging data that the TPJ is also active during delay of gratification,” Soutschek says, “but it's never interpreted.” © 2016 Scientific American

Keyword: Attention
Link ID: 22772 - Posted: 10.20.2016

By DONNA DE LA CRUZ Some of the most troubling images of the opioid crisis involve parents buying or using drugs with their children in tow. Now new research offers a glimpse into the addicted brain, finding that the drugs appear to blunt a person’s natural parenting instincts. Researchers at the Perelman School of Medicine at the University of Pennsylvania scanned the brains of 47 men and women before and after they underwent treatment for opioid dependence. While in the scanner, the study subjects looked at various images of babies, and the researchers measured the brain’s response. The brain scans were compared with the responses of 25 healthy people. What the study subjects didn’t know was that the photos had been manipulated to adjust the “baby schema,” the term used to describe the set of facial and other features like round faces and big eyes that make our brains register babies as irresistible, kicking in our instinct to care for them. Sometimes the babies’ features were exaggerated to make them even more adorable; in others, the chubby cheeks and big eyes were reduced, making the faces less appealing. Studies show that a higher baby schema activates the part of the brain called the ventral striatum, a key component of the brain reward pathway. Compared with the brains of healthy people, the brains of people with opioid dependence didn’t produce strong responses to the cute baby pictures. But once the opioid-dependent people received a drug called naltrexone, which blocks the effects of opioids, their brains produced a more normal response. “When the participants were given an opioid blocker, their baby schema became more similar to that of healthy people,” said Dr. Daniel D. Langleben, one of the researchers. “The data also raise the question of whether opioid medications may affect social cognition in general.” © 2016 The New York Times Company

Keyword: Drug Abuse; Attention
Link ID: 22760 - Posted: 10.15.2016

By CASEY SCHWARTZ Have you ever been to Enfield? I had never even heard of it until I was 23 and living in London for graduate school. One afternoon, I received notification that a package whose arrival I had been anticipating for days had been bogged down in customs and was now in a FedEx warehouse in Enfield, an unremarkable London suburb. I was outside my flat within minutes of receiving this news and on the train to Enfield within the hour, staring through the window at the gray sky. The package in question, sent from Los Angeles, contained my monthly supply of Adderall. Adderall, the brand name for a mixture of amphetamine salts, is more strictly regulated in Britain than in the United States, where, the year before, in 2005, I became one of the millions of Americans to be prescribed a stimulant medication. The train to Enfield was hardly the greatest extreme to which I would go during the decade I was entangled with Adderall. I would open other people’s medicine cabinets, root through trash cans where I had previously disposed of pills, write friends’ college essays for barter. Once, while living in New Hampshire, I skipped a day of work to drive three hours each way to the health clinic where my prescription was still on file. Never was I more resourceful or unswerving than when I was devising ways to secure more Adderall. Adderall is prescribed to treat Attention Deficit Hyperactivity Disorder, a neurobehavioral condition marked by inattention, hyperactivity and impulsivity that was first included in the D.S.M. in 1987 and predominantly seen in children. That condition, which has also been called Attention Deficit Disorder, has been increasingly diagnosed over recent decades: In the 1990s, an estimated 3 to 5 percent of school-age American children were believed to have A.D.H.D., according to the Centers for Disease Control and Prevention; by 2013, that figure was 11 percent. It continues to rise. 
And the increase in diagnoses has been followed by an increase in prescriptions. In 1990, 600,000 children were on stimulants, usually Ritalin, an older medication that often had to be taken multiple times a day. By 2013, 3.5 million children were on stimulants, and in many cases, the Ritalin had been replaced by Adderall, officially brought to market in 1996 as the new, upgraded choice for A.D.H.D. — more effective, longer lasting. © 2016 The New York Times Company

Keyword: ADHD; Drug Abuse
Link ID: 22748 - Posted: 10.12.2016

Bruce Bower Apes understand what others believe to be true. What’s more, they realize that those beliefs can be wrong, researchers say. To make this discovery, researchers devised experiments involving a concealed, gorilla-suited person or a squirreled-away rock that had been moved from their original hiding places — something the apes knew, but a person looking for King Kong or the stone didn’t. “Apes anticipated that an individual would search for an object where he last saw it, even though the apes knew that the object was no longer there,” says evolutionary anthropologist Christopher Krupenye. If this first-of-its-kind finding holds up, it means that chimpanzees, bonobos and orangutans can understand that others’ actions sometimes reflect mistaken assumptions about reality. Apes’ grasp of others’ false beliefs roughly equals that of human 2-year-olds tested in much the same way, say Krupenye of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, and his colleagues. Considering their targeted gazes during brief experiments, apes must rapidly assess others’ beliefs about the world in wild and captive communities, the researchers propose in the October 7 Science. Understanding the concept of false beliefs helps wild and captive chimps deceive their comrades, such as hiding food from those who don’t share, Krupenye suggests. © Society for Science & the Public 2000 - 2016.

Keyword: Intelligence; Evolution
Link ID: 22733 - Posted: 10.08.2016

In his memoir Do No Harm, Henry Marsh confesses to the uncertainties he's dealt with as a surgeon and reflects on the enigmas of the brain and consciousness. Originally broadcast May 26, 2015. DAVE DAVIES, HOST: This is FRESH AIR. I'm Dave Davies, sitting in for Terry Gross. Our guest has opened heads and cut into brains, performing delicate and risky surgery on the part of the body that controls everything - breathing, movement, memory, and consciousness. In his work as a neurosurgeon, Dr. Henry Marsh has fixed aneurysms and spinal problems and spent many years operating on brain tumors. In his memoir, Dr. Marsh discusses some of his most challenging cases, triumphs and failures and confesses to the fears and uncertainties he's dealt with. He explains the surgical instruments he uses and how procedures have changed since he started practicing. And he reflects on the state of his profession and the mysteries of the brain and consciousness. Last year, he retired as the senior consulting neurosurgeon at St. George's Hospital in London, where he practiced for 28 years. He was the subject of the Emmy Award-winning 2007 documentary "The English Surgeon," which followed him in Ukraine, trying to help patients and improve conditions at a rundown hospital. Marsh's book, "Do No Harm," is now out in paperback. Terry spoke to him when it was published in hardback. © 2016 npr

Keyword: Consciousness
Link ID: 22732 - Posted: 10.08.2016

Emily Badger One of the newest chew toys in the presidential campaign is “implicit bias,” a term Mike Pence repeatedly took exception to in the vice-presidential debate on Tuesday. Police officers hear all this badmouthing, said Mr. Pence, Donald J. Trump’s running mate, in response to a question about whether society demands too much of law enforcement. They hear politicians painting them with one broad brush, with disdain, with automatic cries of implicit bias. He criticized Hillary Clinton for saying, in the first presidential debate, that everyone experiences implicit bias. He suggested a black police officer who shoots a black civilian could not logically experience such bias. “Senator, please,” Mr. Pence said, addressing his Democratic opponent, Tim Kaine, “enough of this seeking every opportunity to demean law enforcement broadly by making the accusation of implicit bias every time tragedy occurs.” The concept, in his words, came across as an insult, a put-down on par with branding police as racists. Many Americans may hear it as academic code for “racist.” But that connotation does not line up with scientific research on what implicit bias is and how it really operates. Researchers in this growing field say it isn’t just white police officers, but all of us, who have biases that are subconscious, hidden even to ourselves. Implicit bias is the mind’s way of making uncontrolled and automatic associations between two concepts very quickly. In many forms, implicit bias is a healthy human adaptation — it’s among the mental tools that help you mindlessly navigate your commute each morning. It crops up in contexts far beyond policing and race (if you make the rote assumption that fruit stands have fresher produce, that’s implicit bias). But the same process can also take the form of unconsciously associating certain identities, like African-American, with undesirable attributes, like violence. © 2016 The New York Times Company

Keyword: Attention
Link ID: 22730 - Posted: 10.08.2016

James Gorman When the leader of a flock goes the wrong way, what will the flock do? With human beings, nobody can be sure. But with homing pigeons, the answer is that they find their way home anyway. Either the lead pigeon recognizes that it has no clue and falls back into the flock, letting birds that know where they are going take over, or the flock collectively decides that the direction that it is taking just doesn’t feel right, and it doesn’t follow. Several European scientists report these findings in a stirring report in Biology Letters titled, “Misinformed Leaders Lose Influence Over Pigeon Flocks.” Isobel Watts, a doctoral student in zoology at Oxford, conducted the study with her advisers, Theresa Burt de Perera and Dora Biro, and with the participation of Mate Nagy, a statistical physicist from Hungary, who is affiliated with several institutions, including Oxford and the Hungarian Academy of Sciences. Dr. Biro, who studies social behavior in primates as well as pigeons, said that the common questions that ran through her work were “about group living and what types of challenges and opportunities it brings.” She and her colleagues at Oxford have pioneered a method of studying flock behavior that uses very-fine-resolution GPS units, which the birds wear in pigeon-size backpacks. The devices record a detailed position for each bird a number of times a second. Researchers in Budapest and Oxford developed software to analyze small movements and responses of every bird in a flock. With this method, the scientists can identify which pigeons are leading the way. They can build a picture of how each bird responds to changes in the flight of other birds. © 2016 The New York Times Company

Keyword: Animal Migration; Attention
Link ID: 22696 - Posted: 09.26.2016

George Paxinos Many people today believe they possess a soul. While conceptions of the soul differ, many would describe it as an “invisible force that appears to animate us”. It’s often believed the soul can survive death and is intimately associated with a person’s memories, passions and values. Some argue the soul has no mass, takes no space and is localised nowhere. But as a neuroscientist and psychologist, I have no use for the soul. On the contrary, all functions attributable to this kind of soul can be explained by the workings of the brain. Psychology is the study of behaviour. To carry out their work of modifying behaviour, such as in treating addiction, phobia, anxiety and depression, psychologists do not need to assume people have souls. For the psychologists, it is not so much that souls do not exist, it is that there is no need for them. It is said psychology lost its soul in the 1930s. By this time, the discipline fully became a science, relying on experimentation and control rather than introspection. What is the soul? It is not only religious thinkers who have proposed that we possess a soul. Some of the most notable proponents have been philosophers, such as Plato (424-348 BCE) and René Descartes in the 17th century. Plato believed we do not learn new things but recall things we knew before birth. For this to be so, he concluded, we must have a soul. Centuries later, Descartes wrote his thesis Passions of the Soul, where he argued there was a distinction between the mind, which he described as a “thinking substance”, and the body, “the extended substance”. He wrote: © 2010–2016, The Conversation US, Inc.

Keyword: Consciousness
Link ID: 22692 - Posted: 09.26.2016

By CATHERINE SAINT LOUIS Attention deficit disorder is the most common mental health diagnosis among children under 12 who die by suicide, a new study has found. Very few children aged 5 to 11 take their own lives, and little is known about these deaths. The new study, which included deaths in 17 states from 2003 to 2012, compared 87 children aged 5 to 11 who committed suicide with 606 adolescents aged 12 to 14 who did, to see how they differed. The research was published on Monday in the journal Pediatrics. About a third of the children of each group had a known mental health problem. The very young who died by suicide were most likely to have had attention deficit disorder, or A.D.D., with or without accompanying hyperactivity. By contrast, nearly two-thirds of early adolescents who took their lives struggled with depression. Suicide prevention has focused on identifying children struggling with depression; the new study provides an early hint that this strategy may not help the youngest suicide victims. “Maybe in young children, we need to look at behavioral markers,” said Jeffrey Bridge, the paper’s senior author and an epidemiologist at the Research Institute at Nationwide Children’s Hospital in Columbus, Ohio. Jill Harkavy-Friedman, the vice president of research at the American Foundation for Suicide Prevention, agreed. “Not everybody who is at risk for suicide has depression,” even among adults, said Dr. Harkavy-Friedman, who was not involved in the new research. Yet the new research does not definitively establish that attention deficit disorder and attention deficit hyperactivity disorder, or A.D.H.D., are causal risk factors for suicide in children, Dr. Bridge said. Instead, the findings suggest that “suicide is potentially a more impulsive act among children.” © 2016 The New York Times Company

Keyword: ADHD; Depression
Link ID: 22668 - Posted: 09.19.2016

By DAVID Z. HAMBRICK and ALEXANDER P. BURGOYNE ARE you intelligent — or rational? The question may sound redundant, but in recent years researchers have demonstrated just how distinct those two cognitive attributes actually are. It all started in the early 1970s, when the psychologists Daniel Kahneman and Amos Tversky conducted an influential series of experiments showing that all of us, even highly intelligent people, are prone to irrationality. Across a wide range of scenarios, the experiments revealed, people tend to make decisions based on intuition rather than reason. In one study, Professors Kahneman and Tversky had people read the following personality sketch for a woman named Linda: “Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.” Then they asked the subjects which was more probable: (A) Linda is a bank teller or (B) Linda is a bank teller and is active in the feminist movement. Eighty-five percent of the subjects chose B, even though logically speaking, A is more probable. (All feminist bank tellers are bank tellers, though some bank tellers may not be feminists.) In the Linda problem, we fall prey to the conjunction fallacy — the belief that the co-occurrence of two events is more likely than the occurrence of one of the events. In other cases, we ignore information about the prevalence of events when judging their likelihood. We fail to consider alternative explanations. We evaluate evidence in a manner consistent with our prior beliefs. And so on. Humans, it seems, are fundamentally irrational. But starting in the late 1990s, researchers began to add a significant wrinkle to that view. As the psychologist Keith Stanovich and others observed, even the Kahneman and Tversky data show that some people are highly rational. 
In other words, there are individual differences in rationality, even if we all face cognitive challenges in being rational. So who are these more rational people? Presumably, the more intelligent people, right? © 2016 The New York Times Company
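The logic behind the Linda problem can be checked with a few lines of arithmetic. Here is a minimal sketch in Python; the specific probability values are made up for illustration (the article gives none), but the conjunction rule holds whatever numbers are used:

```python
# Conjunction rule behind the Linda problem: P(A and B) can never
# exceed P(A), no matter how "representative" B feels.
# The values below are illustrative assumptions, not data from the study.

p_teller = 0.05                  # assumed P(Linda is a bank teller)
p_feminist_given_teller = 0.95   # assumed P(feminist | bank teller), even if very high

# P(teller AND feminist) = P(teller) * P(feminist | teller)
p_both = p_teller * p_feminist_given_teller

# The conjunction is never more probable than either conjunct alone.
assert p_both <= p_teller

print(f"P(teller)={p_teller:.2f}, P(teller and feminist)={p_both:.4f}")
```

However high the conditional probability is assumed to be, the product can never exceed P(teller), which is why option A is always at least as probable as option B.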

Keyword: Intelligence; Attention
Link ID: 22666 - Posted: 09.19.2016

By Colin Barras Subtract 8 from 52. Did you see the calculation in your head? While a leading theory suggests our visual experiences are linked to our understanding of numbers, a study of people who have been blind from birth suggests the opposite. The link between vision and number processing is strong. Sighted people can estimate the number of people in a crowd just by looking, for instance, while children who can mentally rotate an object and correctly imagine how it might look from a different angle often develop better mathematical skills. “It’s actually hard to think of a situation when you might process numbers through any modality other than vision,” says Shipra Kanjlia at Johns Hopkins University in Baltimore, Maryland. But blind people can do maths too. To understand how they might compensate for their lack of visual experience, Kanjlia and her colleagues asked 36 volunteers – 17 of whom had been blind at birth – to do simple mental arithmetic inside an fMRI scanner. To level the playing field, the sighted participants wore blindfolds. We know that a region of the brain called the intraparietal sulcus (IPS) is active when sighted people process numbers, and brain scans revealed that the same area is similarly active in blind people too. “It’s really surprising,” says Kanjlia. “It turns out brain activity is remarkably similar, at least in terms of classic number processing.” This may mean we have a deep understanding of how to handle numbers that is entirely independent of visual experience. This suggests we are all born with a natural understanding of numbers – an idea many researchers find difficult to accept. © Copyright Reed Business Information Ltd.

Keyword: Vision; Attention
Link ID: 22664 - Posted: 09.17.2016

Dean Burnett You remember that time a children’s TV presenter, one who has been working in children’s television for decades and is now employed on a channel aimed at under-8-year-olds, decided to risk it all and say one of the worst possible swear words on a show for pre-schoolers that he is famous for co-hosting? Remember how he took a huge risk for no appreciable gain and uttered a context-free profanity to an audience of toddlers? How he must have wanted to swear on children’s TV but paradoxically didn’t want anyone to notice so “snuck it in” as part of a song, where it would be more ambiguous? How all the editors and regulators at the BBC happened to completely miss it and allow it to be aired? Remember this happening? Well you shouldn’t, because it clearly didn’t. No presenter and/or channel would risk their whole livelihood in such a pointless, meaningless way, especially not the ever-pressured BBC. And, yet, an alarming number of people do think it happened. Apparently, there have been some “outraged parents” who are aghast at the whole thing. This seems reasonable in some respects; if your toddler was subjected to extreme cursing then as a parent you probably would object. On the other hand, if your very small child is able to recognise strong expletives, then perhaps misheard lyrics on cheerful TV shows aren’t the most pressing issue in their life. Regardless, a surprising number of people report that they did genuinely “hear” the c-word. This is less likely to be due to a TV presenter having some sort of extremely-fleeting breakdown, and more likely due to the quirks and questionable processing of our senses by our powerful yet imperfect brains. © 2016 Guardian News and Media Limited

Keyword: Hearing; Attention
Link ID: 22662 - Posted: 09.17.2016

André Corrêa d’Almeida and Amanda Sue Grossi Development. Poverty. Africa. These are just three words on a page – almost no information at all – but how many realities did our readers just conjure? And how many thoughts filled the spaces in-between? Cover yourselves. Your biases are showing. In the last few decades, groundbreaking work by psychologists and behavioural economists has exposed unconscious biases in the way we think. And as the World Bank’s 2015 World Development Report points out, development professionals are not immune to these biases. There is a real possibility that seemingly unbiased and well-intentioned development professionals are capable of making consequential mistakes, with significant impacts upon the lives of others, namely the poor. The problem arises when mindsets are just that – set. As the work of Daniel Kahneman and Amos Tversky has shown, development professionals – like people generally – have two systems of thinking; the automatic and the deliberative. For the automatic, instead of performing complex rational calculations every time we need to make a decision, much of our thinking relies on pre-existing mental models and shortcuts. These are based on assumptions we create throughout our lives and that stem from our experiences and education. More often than not, these mental models are incomplete and shortcuts can lead us down the wrong path. Thinking automatically then becomes thinking harmfully. © 2016 Guardian News and Media Limited

Keyword: Attention
Link ID: 22653 - Posted: 09.15.2016

Laura Sanders By sneakily influencing brain activity, scientists changed people’s opinions of faces. This covert neural sculpting relied on a sophisticated brain training technique in which people learn to direct their thoughts in specific ways. The results, published September 8 in PLOS Biology, support the idea that neurofeedback methods could help reveal how the brain’s behavior gives rise to perceptions and emotions. What’s more, the technique may ultimately prove useful for easing traumatic memories and treating disorders such as depression. The research is still at an early stage, says neurofeedback researcher Michelle Hampson of Yale University, but, she notes, “I think it has great promise.” Takeo Watanabe of Brown University and colleagues used functional MRI to measure people’s brain activity in an area called the cingulate cortex as participants saw pictures of faces. After participants had rated each face, a computer algorithm sorted their brain responses into patterns that corresponded to faces they liked and faces they disliked. With this knowledge in hand, the researchers then attempted to change people’s face preferences by subtly nudging brain activity in the cingulate cortex. In step 2 of the experiment, returning to the fMRI scanner, participants saw an image of a face that they had previously rated as neutral. Just after that, they were shown a disk. The goal, the participants were told, was simple: make the disk bigger by using their brains. They had no idea that the only way to make the disk grow was to think in a very particular way. © Society for Science & the Public 2000 - 2016.

Keyword: Attention; Learning & Memory
Link ID: 22646 - Posted: 09.12.2016

By Karen Zusi At least one type of social learning, or the ability to learn from observing others’ actions, is processed by individual neurons within a region of the human brain called the rostral anterior cingulate cortex (rACC), according to a study published today (September 6) in Nature Communications. The work is the first direct analysis in humans of the neuronal activity that encodes information about others’ behavior. “The idea [is] that there could be an area that’s specialized for processing things about other people,” says Matthew Apps, a neuroscientist at the University of Oxford who was not involved with the study. “How we think about other people might use distinct processes from how we might think about ourselves.” During the social learning experiments, the University of California, Los Angeles (UCLA) and Caltech-based research team recorded the activity of individual neurons in the brains of epilepsy patients. The patients were undergoing a weeks-long procedure at the Ronald Reagan UCLA Medical Center in which their brains were implanted with electrodes to locate the origin of their epileptic seizures. Access to this patient population was key to the study. “It’s a very rare dataset,” says Apps. “It really does add a lot to the story.” With data streaming out of the patients’ brains, the researchers taught the subjects to play a card game on a laptop. Each turn, the patients could select from one of two decks of face-down cards: the cards either gave $10 or $100 in virtual winnings, or subtracted $10 or $100. In one deck, 70 percent of the cards were winning cards, while in the other only 30 percent were. The goal was to rack up the most money. © 1986-2016 The Scientist
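The two-deck task described above can be sketched as a quick simulation. This is an illustrative sketch only: the 70/30 win rates and the $10/$100 amounts come from the article, but the assumption that the two amounts are equally likely is mine:

```python
import random

random.seed(0)

def draw(win_rate):
    """Draw one card: win or lose $10 or $100 (amounts assumed equally likely)."""
    amount = random.choice([10, 100])
    return amount if random.random() < win_rate else -amount

# One deck wins 70 percent of the time, the other only 30 percent.
trials = 100_000
avg_good = sum(draw(0.70) for _ in range(trials)) / trials
avg_bad = sum(draw(0.30) for _ in range(trials)) / trials

# Expected value per draw is (2 * win_rate - 1) * mean(|amount|):
# good deck: (2*0.70 - 1) * 55 = +$22; bad deck: (2*0.30 - 1) * 55 = -$22.
print(f"good deck: {avg_good:+.2f} per draw, bad deck: {avg_bad:+.2f} per draw")
```

Under these assumptions, a patient who learns which deck is which gains about $22 per draw on average, which is the regularity the recorded rACC neurons would have to track from observed outcomes.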

Keyword: Learning & Memory; Attention
Link ID: 22640 - Posted: 09.10.2016

Chris Chambers One of the most compelling impressions in everyday life is that wherever we look, we “see” everything that is happening in front of us – much like a camera. But this impression is deceiving. In reality our senses are bombarded by continual waves of stimuli, triggering an avalanche of sensations that far exceed the brain’s capacity. To make sense of the world, the brain needs to determine which sensations are the most important for our current goals, focusing resources on the ones that matter and throwing away the rest. These computations are astonishingly complex, and what makes attention even more remarkable is just how effortless it is. The mammalian attention system is perhaps the most efficient and precisely tuned junk filter we know of, refined through millions of years of annoying siblings (and some evolution). Attention is amazing but no system is ever perfect. Our brain’s computational reserves are large but not infinite, and under the right conditions we can “break it” and peek behind the curtain. This isn’t just a fun trick – understanding these limits can yield important insights into psychology and neurobiology, helping us to diagnose and treat impairments that follow brain injury and disease. Thanks to over a hundred years of psychology research, it’s relatively easy to reveal attention in action. One way is through the phenomenon of change blindness. Try it yourself by following the instructions in the short video below (no sound). When we think of the term “blindness” we tend to assume a loss of vision caused by damage to the eye or optic nerves. But as you saw in the video, change blindness is completely normal and is caused by maxing out your attentional capacity. © 2016 Guardian News and Media Limited

Keyword: Attention; Vision
Link ID: 22633 - Posted: 09.06.2016

A new study by investigators at Brigham and Women's Hospital in collaboration with researchers at the Universities of York and Leeds in the UK and MD Anderson Cancer Center in Texas puts to the test anecdotes about experienced radiologists' ability to sense when a mammogram is abnormal. In a paper published August 29 in the Proceedings of the National Academy of Sciences, visual attention researchers showed radiologists mammograms for half a second and found that they could identify abnormal mammograms at better than chance levels. They further tested this ability through a series of experiments to explore what signal may alert radiologists to the presence of a possible abnormality, in the hopes of using these insights to improve breast cancer screening and early detection. "Radiologists can have 'hunches' after a first look at a mammogram. We found that these hunches are based on something real in the images. It's really striking that in the blink of an eye, an expert can pick up on something about that mammogram that indicates abnormality," said Jeremy Wolfe, PhD, senior author of the study and director of the Visual Attention Laboratory at BWH. "Not only that, but they can detect something abnormal in the other breast, the breast that does not contain a lesion." In the clinic, radiologists carefully evaluate mammograms and may use computer automated systems to help screen the images. Although they would never assess an image in half a second in the clinic, the ability of experts to extract the "gist" of an image quickly suggests that there may be detectable signs of breast cancer that radiologists are rapidly picking up. Copyright 2016 ScienceDaily

Keyword: Attention; Vision
Link ID: 22627 - Posted: 09.05.2016
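In a two-category judgment like this (abnormal vs. normal), "better than chance" is typically established statistically, for example with a one-sided binomial test against 50 percent accuracy. The numbers below are hypothetical, purely to show the shape of such a test; the paper's actual trial counts and analysis are not reproduced here.

```python
from math import comb

def p_above_chance(correct, trials, chance=0.5):
    """One-sided binomial test: probability of >= `correct` hits if guessing."""
    return sum(comb(trials, k) * chance**k * (1 - chance)**(trials - k)
               for k in range(correct, trials + 1))

# Hypothetical: a reader scores 60 correct gist judgments out of 100 flashes
p = p_above_chance(60, 100)
print(f"p = {p:.4f}")   # a small p means accuracy is unlikely to be guessing
```

In the gist-perception literature, performance of this kind is more often summarized with signal-detection measures such as d′ or ROC area, but the underlying logic (accuracy reliably exceeding chance) is the same.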

By STEVE SILBERMAN In the late 1930s, Charles Bradley, the director of a home for “troublesome” children in Rhode Island, had a problem. The field of neuroscience was still in its infancy, and one of the few techniques available to allow psychiatrists like Bradley to ponder the role of the brain in emotional disorders was a procedure that required replacing a volume of cerebrospinal fluid in the patient’s skull with air. This painstaking process allowed any irregularities to stand out clearly in X-ray images, but many patients suffered excruciating headaches that lasted for weeks afterward. Meanwhile, a pharmaceutical company called Smith, Kline & French was facing a different sort of problem. The firm had recently acquired the rights to sell a powerful stimulant then called “benzedrine sulfate” and was trying to create a market for it. Toward that end, the company made quantities of the drug available at no cost to doctors who volunteered to run studies on it. Bradley was a firm believer that struggling children needed more than a handful of pills to get better; they also needed psychosocial therapy and the calming and supportive environment that he provided at the home. But he took up the company’s offer, hoping that the drug might eliminate his patients’ headaches. It did not. But the Benzedrine did have an effect that was right in line with Smith, Kline & French’s aspirations for its new product: The drug seemed to boost the children’s eagerness to learn in the classroom while making them more amenable to following the rules. The drug seemed to calm the children’s mood swings, allowing them to become, in the words of their therapists, more “attentive” and “serious,” able to complete their schoolwork and behave. Bradley was amazed that Benzedrine, a forerunner of Ritalin and Adderall, was such a great normalizer, turning typically hard-to-manage kids into models of compliance and decorum. 
But even after marveling at the effects of the drug, he maintained that medication should be considered for children only in addition to other forms of therapy. © 2016 The New York Times Company

Keyword: ADHD; Drug Abuse
Link ID: 22612 - Posted: 08.30.2016