Chapter 14. Attention and Consciousness

By Esther Landhuis As we age, we seem to get worse at ignoring irrelevant stimuli. It's what makes restaurant conversations challenging—having to converse while also shutting out surrounding chatter. New research bears out the aging brain's distractibility but also suggests that training may help us tune out interference. Scientists at Brown University recruited seniors and twentysomethings for a visual experiment. Presented with a sequence of letters and numbers, participants were asked to report back only the numbers—all the while disregarding a series of meaningless dots. Sometimes the dots moved randomly, but other times they traveled in a clear direction, making them harder to ignore. Older participants ended up accidentally learning the dots' patterns, based on the accuracy of their answers when asked which way the dots were moving, whereas young adults seemed able to suppress that information and focus on the numbers, the researchers reported last November in Current Biology. In a separate study published in Neuron, scientists at the University of California, San Francisco, showed they could train aging brains to become less distractible. Their regimen helped aging rats as well as older people. The researchers played three different sounds and rewarded trainees for identifying a target tone while ignoring distracter frequencies. As the subjects improved, the task grew more challenging—the distracting tone became harder to discriminate from the target. © 2015 Scientific American

Keyword: Attention; Alzheimers
Link ID: 20681 - Posted: 03.12.2015

Robin Tricoles The first time it happened, I was 8. I was tucked in bed reading my favorite book when my tongue swelled up to the size of a cow’s, like the giant tongues I had seen in the glass display case at the neighborhood deli. At the same time, the far wall of my bedroom began to recede, becoming a tiny white rectangle floating somewhere in the distance. In the book I was holding, the typeface grew vast on the page. I was intrigued, I remember, but not afraid. Over the next six years, the same thing happened to me dozens of times. Forty years later, while working as a science writer, I stumbled on a scientific paper describing almost exactly what I had experienced. The paper attributed those otherworldly sensations to something called Alice in Wonderland syndrome, or its close cousin, Alice in Wonderland-like syndrome. People with Alice in Wonderland syndrome (AWS) perceive parts of their body to be changing size. For example, their feet may suddenly appear smaller and more distant, or their hands larger than they had been moments before. Those with the closely related Alice in Wonderland-like syndrome (AWLS) misperceive the size and distance of objects, seeing them as startlingly larger, smaller, fatter, or thinner than their natural state. People who experience both sensations, like I did, are classified as having AWLS. The syndrome’s name is commonly attributed to English psychiatrist John Todd, who in 1955 described his adult patients’ illusions of corporal and objective distortions in a paper in the Canadian Medical Association Journal. © 2015 by The Atlantic Monthly Group.

Keyword: Attention
Link ID: 20672 - Posted: 03.10.2015

By TIMOTHY WILLIAMS In January 1972, Cecil Clayton was cutting wood at his family’s sawmill in southeastern Missouri when a piece of lumber flew off the circular saw blade and struck him in the forehead. The impact caved in part of Mr. Clayton’s skull, driving bone fragments into his brain. Doctors saved his life, but in doing so had to remove 20 percent of his frontal lobe, which psychiatrists say led Mr. Clayton to be tormented for years by violent impulses, schizophrenia and extreme paranoia. In 1996, his lawyers say, those impulses drove Mr. Clayton to kill a law enforcement officer. Today, as Mr. Clayton, 74, sits on death row, his lawyers have returned to that 1972 sawmill accident in a last-ditch effort to save his life, arguing that Missouri’s death penalty law prohibits the execution of severely brain-damaged people. Lawyers for Mr. Clayton, who has an I.Q. of 71, say he should be spared because his injury has made it impossible for him to grasp the significance of his death sentence, scheduled for March 17. “There was a profound change in him that he doesn’t understand, and neither did his family,” said Elizabeth Unger Carlyle, one of Mr. Clayton’s lawyers. While several rulings by the United States Supreme Court in recent years have narrowed the criteria for executing people who have a mental illness, states continue to hold wide sway in establishing who is mentally ill. The debate surrounding Mr. Clayton involves just how profoundly his impairment has affected his ability to understand what is happening to him. Mr. Clayton is missing about 7.7 percent of his brain. © 2015 The New York Times Company

Keyword: Aggression; Attention
Link ID: 20669 - Posted: 03.09.2015

By Christof Koch In the Dutch countryside, a tall, older man, dressed in a maroon sports coat, his back slightly stooped, stands out because of his height and a pair of extraordinarily bushy eyebrows. His words, inflected by a British accent, are directed at a middle-aged man with long, curly brown hair, penetrating eyes and a dark, scholarly gown, who talks in only a halting English that reveals his native French origins. Their strangely clashing styles of speaking and mismatched clothes do not seem to matter to them as they press forward, with Eyebrows peering down intently at the Scholar. There is something distinctly odd about the entire meeting—a crossing of time, place and disciplines. Eyebrows: So I finally meet the man who doubts everything. The Scholar: (not missing a beat) At this time, I admit nothing that is not necessarily true. I'm famous for that! Eyebrows: Is there anything that you are certain of? (sotto voce) Besides your own fame? The Scholar: (evading the sarcastic jibe) I can't be certain of my fame. Indeed, I can't even be certain that there is a world out there, for I could be dreaming or hallucinating it. I can't be certain about the existence of my own body, its shape and extension, its corporality, for again I might be fooling myself. But now what am I, when I suppose that there is some supremely powerful and, if I may be permitted to say so, malicious deceiver who deliberately tries to fool me in any way he can? Given this evil spirit, how do I know that my sensations about the outside world—that is, it looks, feels and smells in a particular way—are not illusions, conjured up by Him to deceive me? It seems to me that therefore I can never know anything truly about the world. Nothing, rien du tout. I have to doubt everything. © 2015 Scientific American

Keyword: Consciousness
Link ID: 20640 - Posted: 03.03.2015

By Neuroskeptic In an interesting short paper just published in Trends in Cognitive Sciences, Caltech neuroscientist Ralph Adolphs offers his thoughts on The Unsolved Problems of Neuroscience. Here’s Adolphs’ list of the top 23 questions (including 3 “meta” issues), which, he says, was inspired by Hilbert’s famous set of 23 mathematical problems:

Problems that are solved, or soon will be:
I. How do single neurons compute?
II. What is the connectome of a small nervous system, like that of Caenorhabditis elegans (300 neurons)?
III. How can we image a live brain of 100,000 neurons at cellular and millisecond resolution?
IV. How does sensory transduction work?

Problems that we should be able to solve in the next 50 years:
V. How do circuits of neurons compute?
VI. What is the complete connectome of the mouse brain (70,000,000 neurons)?
VII. How can we image a live mouse brain at cellular and millisecond resolution?
VIII. What causes psychiatric and neurological illness?
IX. How do learning and memory work?
X. Why do we sleep and dream?
XI. How do we make decisions?
XII. How does the brain represent abstract ideas?

Problems that we should be able to solve, but who knows when:
XIII. How does the mouse brain compute?
XIV. What is the complete connectome of the human brain (80,000,000,000 neurons)?
XV. How can we image a live human brain at cellular and millisecond resolution?
XVI. How could we cure psychiatric and neurological diseases?
XVII. How could we make everybody’s brain function best?

Problems we may never solve:
XVIII. How does the human brain compute?
XIX. How can cognition be so flexible and generative?
XX. How and why does conscious experience arise?

Meta-questions:
XXI. What counts as an explanation of how the brain works? (and which disciplines would be needed to provide it?)
XXII. How could we build a brain? (how do evolution and development do it?)
XXIII. What are the different ways of understanding the brain? (what is function, algorithm, implementation?)

Adolphs R (2015). The unsolved problems of neuroscience. Trends in Cognitive Sciences. PMID: 25703689

Keyword: Consciousness
Link ID: 20637 - Posted: 03.02.2015

By Adam Rogers The fact that a single image could polarize the entire Internet into two aggressive camps is, let’s face it, just another Thursday. But for the past half-day, people across social media have been arguing about whether a picture depicts a perfectly nice bodycon dress as blue with black lace fringe or white with gold lace fringe. And neither side will budge. This fight is about more than just social media—it’s about primal biology and the way human eyes and brains have evolved to see color in a sunlit world. Light enters the eye through the lens—different wavelengths corresponding to different colors. The light hits the retina in the back of the eye where pigments fire up neural connections to the visual cortex, the part of the brain that processes those signals into an image. Critically, though, that first burst of light is made of whatever wavelengths are illuminating the world, reflecting off whatever you’re looking at. Without you having to worry about it, your brain figures out what color light is bouncing off the thing your eyes are looking at, and essentially subtracts that color from the “real” color of the object. “Our visual system is supposed to throw away information about the illuminant and extract information about the actual reflectance,” says Jay Neitz, a neuroscientist at the University of Washington. “But I’ve studied individual differences in color vision for 30 years, and this is one of the biggest individual differences I’ve ever seen.” (Neitz sees white-and-gold.) Usually that system works just fine. This image, though, hits some kind of perceptual boundary. That might be because of how people are wired. Human beings evolved to see in daylight, but daylight changes color. WIRED.com © 2015 Condé Nast
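
The illuminant-discounting step Neitz describes can be sketched as a simple von Kries-style white balance: divide each channel of the observed color by whatever illuminant the visual system assumes. This is only a toy illustration of the idea, not a model from the article; the pixel and illuminant values below are invented for the example.

```python
# Minimal sketch of illuminant discounting (von Kries-style white balance).
# The pixel and illuminant values are invented purely for illustration.

def discount_illuminant(pixel_rgb, illuminant_rgb):
    """Estimate surface reflectance by dividing out the assumed illuminant."""
    return tuple(p / max(i, 1e-6) for p, i in zip(pixel_rgb, illuminant_rgb))

dress_pixel = (0.55, 0.55, 0.70)        # a bluish-looking patch (hypothetical values)
bluish_daylight = (0.60, 0.65, 0.90)    # assume the photo is lit by blue sky
warm_indoor = (0.95, 0.80, 0.55)        # assume the photo is lit by warm indoor light

# The same pixel yields different "surface colors" under the two assumptions:
print(discount_illuminant(dress_pixel, bluish_daylight))  # roughly whitish/gold (R >= G >= B)
print(discount_illuminant(dress_pixel, warm_indoor))      # distinctly blue (B dominates)
```

Which illuminant a given brain assumes is, on this account, what pushes viewers toward the white-and-gold or blue-and-black camp.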

Keyword: Vision; Attention
Link ID: 20632 - Posted: 02.28.2015

by Helen Thomson We meet in a pub, we have a few drinks, some dinner and then you lean in for a kiss. You predict, based on our previous interactions, that the kiss will be reciprocated – rather than landing you with a slap in the face. All our social interactions require us to anticipate another person's undecided intentions and actions. Now, researchers have discovered specific brain cells that allow monkeys to do this. It is likely that the cells do the same job in humans. Keren Haroush and Ziv Williams at Harvard Medical School trained monkeys to play a version of the prisoner's dilemma, a game used to study cooperation. The monkeys sat next to each other and decided whether or not they wanted to cooperate with their companion, by moving a joystick to pick either option. Moving the joystick towards an orange circle meant cooperate, a blue triangle meant "not this time". Neither monkey could see the other's face, or receive any clues about their planned action. If the monkeys cooperated, both received four drops of juice. If one cooperated and the other decided not to, the one who cooperated received one drop, and the other received six drops of juice. If both declined to work together they both received two drops of juice. Once both had made their selections, they could see what the other monkey had chosen and hear the amount of juice their companion was enjoying. © Copyright Reed Business Information Ltd.
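
The juice rewards described above form a textbook prisoner's-dilemma payoff matrix, which is worth laying out explicitly. The sketch below is my own illustration; the function and dictionary names are not from the study.

```python
# Juice payoffs (in drops) from the monkey version of the prisoner's dilemma
# described above. Keys are (my choice, partner's choice); values are
# (my drops, partner's drops). Names are illustrative, not from the study.
PAYOFFS = {
    ("cooperate", "cooperate"): (4, 4),
    ("cooperate", "defect"):    (1, 6),
    ("defect",    "cooperate"): (6, 1),
    ("defect",    "defect"):    (2, 2),
}

def juice(my_choice, partner_choice):
    """Return (my_drops, partner_drops) for one round of the task."""
    return PAYOFFS[(my_choice, partner_choice)]

# Defecting always earns the individual more on a given round, yet mutual
# cooperation beats mutual defection -- the tension that defines the dilemma.
print(juice("cooperate", "cooperate"))  # (4, 4)
print(juice("defect", "cooperate"))     # (6, 1)
```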

Keyword: Attention
Link ID: 20627 - Posted: 02.27.2015

Elizabeth Gibney DeepMind, the Google-owned artificial-intelligence company, has revealed how it created a single computer algorithm that can learn how to play 49 different arcade games, including the 1970s classics Pong and Space Invaders. In more than half of those games, the computer became skilled enough to beat a professional human player. The algorithm — which has generated a buzz since publication of a preliminary version in 2013 (V. Mnih et al. Preprint at http://arxiv.org/abs/1312.5602; 2013) — is the first artificial-intelligence (AI) system that can learn a variety of tasks from scratch given only the same, minimal starting information. “The fact that you have one system that can learn several games, without any tweaking from game to game, is surprising and pretty impressive,” says Nathan Sprague, a machine-learning scientist at James Madison University in Harrisonburg, Virginia. DeepMind, which is based in London, says that the brain-inspired system could also provide insights into human intelligence. “Neuroscientists are studying intelligence and decision-making, and here’s a very clean test bed for those ideas,” says Demis Hassabis, co-founder of DeepMind. He and his colleagues describe the gaming algorithm in a paper published this week (V. Mnih et al. Nature 518, 529–533; 2015). Games are to AI researchers what fruit flies are to biology — a stripped-back system in which to test theories, says Richard Sutton, a computer scientist who studies reinforcement learning at the University of Alberta in Edmonton, Canada. “Understanding the mind is an incredibly difficult problem, but games allow you to break it down into parts that you can study,” he says. But so far, most human-beating computers — such as IBM’s Deep Blue, which beat chess world champion Garry Kasparov in 1997, and the recently unveiled algorithm that plays Texas Hold ’Em poker essentially perfectly (see Nature http://doi.org/2dw; 2015)—excel at only one game. © 2015 Nature Publishing Group
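
At the heart of the DeepMind system is reinforcement learning: the agent improves an action-value estimate from reward alone, with no game-specific tuning. The sketch below shows plain tabular Q-learning on an abstract environment; the published system replaces the table with a deep convolutional network and adds techniques such as experience replay, so treat this only as an illustration of the learning rule. The `env` interface (reset, step, actions) is an assumption of mine, not DeepMind's API.

```python
import random
from collections import defaultdict

def q_learning(env, episodes=500, alpha=0.1, gamma=0.99, epsilon=0.1):
    """Tabular Q-learning sketch. `env` is assumed to provide reset() -> state,
    step(action) -> (next_state, reward, done), and a list env.actions."""
    Q = defaultdict(float)  # Q[(state, action)] -> estimated long-run reward

    for _ in range(episodes):
        state, done = env.reset(), False
        while not done:
            # Epsilon-greedy: mostly exploit the best-known action, occasionally explore.
            if random.random() < epsilon:
                action = random.choice(env.actions)
            else:
                action = max(env.actions, key=lambda a: Q[(state, a)])

            next_state, reward, done = env.step(action)

            # Move the estimate toward reward plus the discounted best future value.
            best_next = max(Q[(next_state, a)] for a in env.actions)
            Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
            state = next_state
    return Q
```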

Keyword: Robotics; Learning & Memory
Link ID: 20626 - Posted: 02.27.2015

People with attention deficit hyperactivity disorder are about twice as likely to die prematurely as those without the disorder, say researchers. Researchers followed 1.92 million Danes, including 32,000 with ADHD, from birth through to 2013. "In this nationwide prospective cohort study with up to 32-year follow-up, children, adolescents and adults with ADHD had decreased life expectancy and more than double the risk of death compared with people without ADHD," Soren Dalsgaard, from Aarhus University in Denmark, and his co-authors concluded in Wednesday's online issue of The Lancet. "People diagnosed with ADHD in adulthood had a greater risk of death than did those diagnosed in childhood and adolescence. This finding could be caused by persistent ADHD being a more severe form of the disorder." Of the 107 individuals with ADHD who died, information on cause of death was available for 79. Of those, 25 died from natural causes and 54 from unnatural causes, including 42 from accidents. Being diagnosed with ADHD along with oppositional defiant disorder, conduct disorder and substance use disorder also increased the risk of death, the researchers found. Mortality risk was also higher for females than males, which led Dalsgaard to stress the need for early diagnosis, especially in girls and women, and to treat co-existing disorders. Although talk of premature death will worry parents and patients, they can seek solace in knowing the absolute risk of premature death at an individual level is low and can be greatly reduced with treatment, Stephen Faraone, a professor of psychiatry and director of child and adolescent psychiatry research at SUNY Upstate Medical University in New York, said in a journal commentary published with the study. ©2015 CBC/Radio-Canada.

Keyword: ADHD; Attention
Link ID: 20623 - Posted: 02.26.2015

By Matthew Hutson We like to think of our moral judgments as consistent, but they can be as capricious as moods. Research reveals that such judgments are swayed by incidental emotions and perceptions—for instance, people become more moralistic when they feel dirty or sense contamination, such as in the presence of moldy food. Now a series of studies shows that hippies, the obese and “trailer trash” suffer prejudicial treatment because they tend to elicit disgust. Researchers asked volunteers to read short paragraphs about people committing what many consider to be impure acts, such as watching pornography, swearing or being messy. Some of the paragraphs described the individuals as being a hippie, obese or trailer trash—and the volunteers judged these fictional sinners more harshly, according to the paper in the Journal of Experimental Psychology: General. Questionnaires revealed that feelings of disgust toward these groups were driving the volunteers' assessments. A series of follow-up studies solidified the link, finding that these groups also garnered greater praise for purity-related virtues, such as keeping a neat cubicle. If the transgression in question did not involve purity, such as not tipping a waiter, the difference in judgment disappeared. “The assumption people have is that we draw on values that are universal and important,” says social psychologist E. J. Masicampo of Wake Forest University, who led the study, “but something like mentioning that a person is overweight can really push that judgment around. It's triggering these gut-level emotions.” The researchers also looked for real-world effects. © 2015 Scientific American

Keyword: Emotions; Attention
Link ID: 20622 - Posted: 02.26.2015

By Christie Aschwanden Paul Offit likes to tell a story about how his wife, pediatrician Bonnie Offit, was about to give a child a vaccination when the kid was struck by a seizure. Had she given the injection a minute sooner, Paul Offit says, it would surely have appeared as though the vaccine had caused the seizure and probably no study in the world would have convinced the parent otherwise. (The Offits have such studies at the ready — Paul is the director of the Vaccine Education Center at the Children’s Hospital of Philadelphia and author of “Deadly Choices: How the Anti-Vaccine Movement Threatens Us All.”) Indeed, famous anti-vaxxer Jenny McCarthy has said her son’s autism and seizures are linked to “so many shots” because vaccinations preceded his symptoms. But, as Offit’s story suggests, the fact that a child became sick after a vaccine is not strong evidence that the immunization was to blame. Psychologists have a name for the cognitive bias that makes us prone to assigning a causal relationship to two events simply because they happened one after the other: the “illusion of causality.” A study recently published in the British Journal of Psychology investigates how this illusion influences the way we process new information. Its finding: Causal illusions don’t just cement erroneous ideas in the mind; they can also prevent new information from correcting them. Helena Matute, a psychologist at Deusto University in Bilbao, Spain, and her colleagues enlisted 147 college students to take part in a computer-based task in which they each played a doctor who specializes in a fictitious rare disease and assessed whether new medications could cure it. ©2015 ESPN Internet Ventures.
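
The fictitious-disease task is a classic contingency-learning design: whether the medicine actually works is captured by the contingency ΔP = P(recovery | drug) − P(recovery | no drug), whereas the causal illusion arises when people are swayed by the sheer number of drug-then-recovery coincidences. Below is a minimal sketch with counts I invented for illustration (not data from the study).

```python
# Delta-P contingency for a fictitious drug. The counts are invented for
# illustration and are not data from the study described above.
#   a: took drug & recovered      b: took drug & did not recover
#   c: no drug  & recovered       d: no drug  & did not recover
def delta_p(a, b, c, d):
    """P(recovery | drug) - P(recovery | no drug); near zero means no causal effect."""
    return a / (a + b) - c / (c + d)

# A high spontaneous recovery rate produces many drug-then-recovery coincidences
# (a = 60) even though the drug does nothing -- fuel for the illusion of causality.
print(delta_p(a=60, b=20, c=30, d=10))   # 0.0: recovery is 75% with or without the drug
```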

Keyword: Attention; Emotions
Link ID: 20595 - Posted: 02.19.2015

Tom Stafford Trusting your instincts may help you to make better decisions than thinking hard, a study suggests. It is a common misconception that we know our own minds. As I move around the world, walking and talking, I experience myself thinking thoughts. "What shall I have for lunch?", I ask myself. Or I think, "I wonder why she did that?" and try and figure it out. It is natural to assume that this experience of myself is a complete report of my mind. It is natural, but wrong. There's an under-mind, all psychologists agree – an unconscious which does a lot of the heavy lifting in the process of thinking. If I ask myself what is the capital of France the answer just comes to mind – Paris! If I decide to wiggle my fingers, they move back and forth in a complex pattern that I didn't consciously prepare, but which was delivered for my use by the unconscious. The big debate in psychology is exactly what is done by the unconscious, and what requires conscious thought. Or to use the title of a notable paper on the topic, 'Is the unconscious smart or dumb?' One popular view is that the unconscious can prepare simple stimulus-response actions, deliver basic facts, recognise objects and carry out practised movements. Complex cognition involving planning, logical reasoning and combining ideas, on the other hand, requires conscious thought. A recent experiment by a team from Israel scores points against this position. Ran Hassin and colleagues used a neat visual trick called Continuous Flash Suppression to put information into participants’ minds without them becoming consciously aware of it.

Keyword: Attention
Link ID: 20594 - Posted: 02.19.2015

Carl Zimmer In 2010, a graduate student named Tamar Gefen got to know a remarkable group of older people. They had volunteered for a study of memory at the Feinberg School of Medicine at Northwestern University. Although they were all over age 80, Ms. Gefen and her colleagues found that they scored as well on memory tests as people in their 50s. Some complained that they remembered too much. She and her colleagues referred to them as SuperAgers. Many were also friends. “A couple tried to set me up with their grandsons,” Ms. Gefen said. She was impressed by their resilience and humor: “It takes wisdom to a whole new level.” Recently, Ms. Gefen’s research has taken a sharp turn. At the outset of the study, the volunteers agreed to donate their brains for medical research. Some of them have died, and it has been Ms. Gefen’s job to look for anatomical clues to their extraordinary minds. “I had this enormous privilege I can’t even begin to describe,” she said. “I knew them and tested them in life and in death. At the end, I was the one looking at them through a microscope.” Ms. Gefen and her colleagues are now starting to publish the results of these post-mortem studies. Last month in The Journal of Neuroscience, the scientists reported that one of the biggest differences involves peculiar, oversize brain cells known as von Economo neurons. SuperAgers have almost five times as many of them as other people. Learning what makes these brains special could help point researchers to treatments for Alzheimer’s disease and other kinds of mental decline. But it is hard to say how an abundance of von Economo neurons actually helps the brain. © 2015 The New York Times Company

Keyword: Learning & Memory; Alzheimers
Link ID: 20577 - Posted: 02.13.2015

By Virginia Morell To prevent their hives from being attacked by invaders, wasps must quickly distinguish friend from foe. They typically do this by sniffing out foreigners, as outsiders tend to have a different scent than the home colony. Now researchers have discovered that, like a few other wasp species, a tiny social wasp (Liostenogaster flavolineata) from Malaysia employs an additional security measure: facial recognition. The wasps’ nests are typically found in large aggregations with as many as 150 built close together, and each colony faces persistent landing attempts by outsiders from these other nests. To find out why and how these wasps employ both vision and scent to determine if an incoming wasp is a comrade, scientists carried out a series of experiments on 50 colonies (see photo above) in the wild. Close to the nests, the researchers dangled lures made of captured and killed wasps. The lures had been given different treatments. For instance, some lures made from nest mates were coated with a foe’s scent, whereas outsiders were painted with the colony’s odor. The wasps, it turns out, pay more attention to facial markings than to scent when faced with a possible intruder, the team reports online today in the Proceedings of the Royal Society B. Indeed, in tests where the wasps could assess both an intruder’s face and scent, they relied solely on facial recognition and immediately attacked those whose faces they didn’t know, ignoring their odor. That’s the safest strategy, the scientists note, because the wasps can recognize another’s face at a distance, but need to actually touch another wasp to detect her scent—not a bad ploy for a tiny-brained insect. © 2015 American Association for the Advancement of Science

Keyword: Attention
Link ID: 20547 - Posted: 02.05.2015

By Katherine Ellison Dr. Mark Bertin is no A.D.H.D. pill-pusher. The Pleasantville, N.Y., developmental pediatrician won’t allow drug marketers in his office, and says he doesn’t always prescribe medication for children diagnosed with attention deficit hyperactivity disorder. Yet Dr. Bertin has recently changed the way he talks about medication, offering parents a powerful argument. Recent research, he says, suggests the pills may “normalize” the child’s brain over time, rewiring neural connections so that a child would feel more focused and in control, long after the last pill was taken. “There might be quite a profound neurological benefit,” he said in an interview. A growing number of doctors who treat the estimated 6.4 million American children diagnosed with A.D.H.D. are hearing that stimulant medications not only help treat the disorder but may actually be good for their patients’ brains. In an interview last spring with Psych Congress Network (http://www.psychcongress.com/video/are-A.D.H.D.-medications-neurotoxic-or-neuroprotective-16223), an Internet news site for mental health professionals, Dr. Timothy Wilens, chief of child and adolescent psychiatry at Massachusetts General Hospital, said “we have enough data to say they’re actually neuroprotective.” The pills, he said, help “normalize” the function and structure of brains in children with A.D.H.D., so that, “over years, they turn out to look more like non-A.D.H.D. kids.” Medication is already by far the most common treatment for A.D.H.D., with roughly 4 million American children taking the pills — mostly stimulants, such as amphetamines and methylphenidate. Yet the decision can be anguishing for parents who worry about both short-term and long-term side effects. If the pills can truly produce long-lasting benefits, more parents might be encouraged to start their children on these medications early and continue them for longer. Leading A.D.H.D. experts, however, warn the jury is still out. © 2015 The New York Times Company

Keyword: ADHD; Development of the Brain
Link ID: 20544 - Posted: 02.03.2015

By Esther Landhuis One in nine Americans aged 65 and older has Alzheimer's disease, a fatal brain disorder with no cure or effective treatment. Therapy could come in the form of new drugs, but some experts suspect drug trials have failed so far because compounds were tested too late in the disease's progression. By the time people show signs of dementia, their brains have lost neurons. No therapy can revive dead cells, and little can be done to create new ones. So researchers running trials now seek participants who still pass as cognitively normal but are on the verge of decline. These “preclinical” Alzheimer's patients may represent a window of opportunity for therapeutic intervention. How to identify such individuals before they have symptoms presents a challenge, however. Today most Alzheimer's patients are diagnosed after a detailed medical workup and extensive tests that gauge mental function. Other tests, such as spinal fluid analyses and positron-emission tomography (PET) scans, can detect signs of approaching disease and help pinpoint the preclinical window but are cumbersome or expensive. “There's no cheap, fast, noninvasive test that can identify people at risk of Alzheimer's,” says Brad Dolin, chief technology officer of Neurotrack in Palo Alto, Calif.—a company developing a computerized visual screening test for Alzheimer's. Unlike other cognitive batteries, the Neurotrack test requires no language or motor skills. Participants view images on a monitor while a camera tracks their eye movements. The test draws on research by co-founder Stuart Zola of Emory University, who studies learning and memory in monkeys. When presented with a pair of images—one novel, the other familiar—primates fixate longer on the novel one. But if the hippocampus is damaged, as it is in people with Alzheimer's, the subject does not show a clear preference for the novel images. © 2015 Scientific American
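
The eye-tracking measure described above reduces to a novelty-preference score: the fraction of total looking time spent on the novel image, with chance at 0.5. The sketch below is my own illustration of that idea; the numbers and function name are hypothetical, not Neurotrack's scoring method.

```python
# Novelty-preference score from looking times (in seconds). All values here are
# hypothetical; this is not Neurotrack's actual scoring procedure.
def novelty_preference(time_on_novel, time_on_familiar):
    """Fraction of total looking time spent on the novel image (0.5 = no preference)."""
    total = time_on_novel + time_on_familiar
    return time_on_novel / total if total > 0 else 0.5

intact = novelty_preference(time_on_novel=3.2, time_on_familiar=1.4)    # ~0.70, clear preference
impaired = novelty_preference(time_on_novel=2.1, time_on_familiar=2.0)  # ~0.51, near chance
print(round(intact, 2), round(impaired, 2))
```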

Keyword: Alzheimers; Attention
Link ID: 20541 - Posted: 02.02.2015

By ERICA GOODE Asked to picture the numbers from one to 10, most people will imagine a straight line with one at the left end and 10 at the right. This “mental number line,” as researchers have termed it, is so pervasive that some scientists have argued that the spatial representation of numbers is hard-wired into the brain, part of a primitive number system that underlies humans’ capacity for higher mathematics. Now a team of Italian researchers has found that newborn chicks, like humans, appear to map numbers spatially, associating smaller amounts with the left side and larger amounts with the right side. The chicks, trained to seek out mealworms behind white plastic panels printed with varying numbers of identical red squares, repeatedly demonstrated a preference for the left when the number of squares was small and for the right when the number was larger. The research, led by Rosa Rugani, a psychologist who at the time was at the University of Padova, will appear in Friday’s issue of the journal Science. In their report, the researchers said the findings supported the idea that the left-right orientation for numbers is innate rather than determined by culture or education — a possibility that was raised by some studies that found that in Arabic-speaking countries where letters and numbers are read right to left, the mental number scale was reversed. But the new research, Dr. Rugani and her colleagues wrote, indicates that orienting numbers in space may represent “a universal cognitive strategy available soon after birth.” Tyler Marghetis, a doctoral candidate in psychology at the University of California, San Diego, who has published research on the spatial association of numbers, called the researchers’ studies “very cool.” © 2015 The New York Times Company

Keyword: Attention; Laterality
Link ID: 20538 - Posted: 01.31.2015

by Clare Wilson Once only possible in an MRI scanner, vibrating pads and electrode caps could soon help locked-in people communicate on a day-to-day basis YOU wake up in hospital unable to move, to speak, to twitch so much as an eyelid. You hear doctors telling your relatives you are in a vegetative state – unaware of everything around you – and you have no way of letting anyone know this is not the case. Years go by, until one day, you're connected to a machine that allows you to communicate through your brain waves. It only allows yes or no answers, but it makes all the difference – now you can tell your carers if you are thirsty, if you'd like to sit up, even which TV programmes you want to watch. In recent years, breakthroughs in mind-reading technology have brought this story close to reality for a handful of people who may have a severe type of locked-in syndrome, previously diagnosed as being in a vegetative state. So far, most work has required a lab and a giant fMRI scanner. Now two teams are developing devices that are portable enough to be taken out to homes, to help people communicate on a day-to-day basis. The technology might also be able to identify people who have been misdiagnosed. People with "classic" locked-in syndrome are fully conscious but completely paralysed apart from eye movements. Adrian Owen of Western University in London, Canada, fears that there is another form of the condition where the paralysis is total. He thinks that a proportion of people diagnosed as being in a vegetative state – in which people are thought to have no mental awareness at all – are actually aware but unable to let anyone know. "The possibility is that we are missing people with some sort of complete locked-in syndrome," he says. © Copyright Reed Business Information Ltd.

Keyword: Consciousness; Brain imaging
Link ID: 20537 - Posted: 01.31.2015

Alison Abbott If you have to make a complex decision, will you do a better job if you absorb yourself in, say, a crossword puzzle instead of ruminating about your options? The idea that unconscious thought is sometimes more powerful than conscious thought is attractive, and echoes ideas popularized by books such as writer Malcolm Gladwell’s best-selling Blink. But within the scientific community, ‘unconscious-thought advantage’ (UTA) has been controversial. Now Dutch psychologists have carried out the most rigorous study yet of UTA — and find no evidence for it. Their conclusion, published this week in Judgment and Decision Making, is based on a large experiment that they designed to provide the best chance of capturing the effect should it exist, along with a sophisticated statistical analysis of previously published data [1]. The report adds to broader concerns about the quality of psychology studies and to an ongoing controversy about the extent to which unconscious thought in general can influence behaviour. “The bigger debate is about how clever our unconscious is,” says cognitive psychologist David Shanks of University College London. “This carefully constructed paper makes a great contribution.” Shanks published a review last year that questioned research claiming that various unconscious influences, including UTA, affect decision making [2]. © 2015 Nature Publishing Group

Keyword: Attention; Consciousness
Link ID: 20528 - Posted: 01.28.2015

By Christof Koch Faces are the glue that holds us together and that gives us our identity. All of us but the visually impaired and blind are experts at recognizing people's identity, gender, age and ethnicity from looking at their faces. First impressions of attractiveness or competence take but a brief glimpse of somebody's face. Newly born infants already tend to fixate on faces. This bias also turns up in art. Paintings and movies are filled with faces staring at the viewer. Who can forget the endless close-ups of the feuding husband and wife in Ingmar Bergman's Cimmerian masterpiece Scenes from a Marriage? Because recognizing a face is so vital to our social lives, it comes as no surprise that a lot of real estate in the cerebral cortex—the highly convoluted region that makes up the bulk of our brain—is devoted to the crucial task of processing faces and their identity. We note whether someone looks our way or not. We discern emotional expressions, whether they register joy, fear or anger. Indeed, functional brain imaging has identified a set of adjacent regions, referred to as the fusiform face area (FFA), that are situated on the left and the right sides of the brain, at the bottom of the temporal lobe of the cerebral cortex. The FFA turns up its activity when subjects look at portraits or close-ups of faces or even when they just think about these images. Two just-published studies of the brain's visual networks, including the FFA, enlarge what we know about the physical basis of face perception. Both explore the unique access to the brain afforded by patients whose epileptic seizures have proved resistant to drugs. Surgical treatment requires finding the location in the brain where the hypersynchronized activity that characterizes a seizure begins before spreading from its point of origin to engulf one or sometimes both hemispheres. If a single point—a focus where the seizure begins—can be found, it can be removed. After this procedure, a patient usually has significantly fewer seizures—and some remain seizure-free. To triangulate the location of the focus, neurosurgeons insert electrodes into the brain to monitor electrical activity that occurs during a seizure. © 2015 Scientific American

Keyword: Attention
Link ID: 20523 - Posted: 01.27.2015