Chapter 14. Attention and Consciousness




By Rachael Lallensack

A video game is helping researchers learn more about how tiny European starlings keep predators at bay. Their massive flocks, consisting of hundreds to thousands of birds, fly together in a mesmerizing, pulsating pattern called a murmuration. For a long time, researchers have suspected that the bigger the flock, the harder it is for predators like falcons and hawks to take down any one member, something known as the “confusion effect.” Now, researchers have analyzed that effect—in human hunters. Using the first 3D computer program to simulate a murmuration, scientists tested how well 25 players, acting as flying predators, could target and pursue virtual starlings, whose movements were simulated based on data from real starling flocks. The team’s findings reaffirmed the confusion effect: The larger the simulated flocks, the harder it was for the “predators” to single out and catch individual prey, the researchers report this week in Royal Society Open Science. So maybe sometimes, it’s not so bad to get lost in a crowd. © 2017 American Association for the Advancement of Science.

Keyword: Attention; Vision
Link ID: 23115 - Posted: 01.18.2017
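
The core finding above—that larger flocks make it harder for a pursuer to keep track of any one bird—can be caricatured in a few lines of code. The sketch below is not the study’s 3D murmuration simulation; it is a toy model in which every nearby look-alike adds a small per-step chance that the predator loses its chosen target, and all parameter values (the 5% neighborhood, the per-neighbor confusion rate) are invented for illustration.

```python
import random

def capture_success_rate(flock_size, trials=2000, pursuit_steps=30,
                         confusion_per_neighbor=0.002):
    """Toy model: each pursuit step, every nearby look-alike bird adds a
    small chance that the predator loses track of its chosen target."""
    successes = 0
    for _ in range(trials):
        # Assume ~5% of the flock is close enough to act as distractors
        # at any moment (an invented, purely illustrative figure).
        neighbors = max(1, int(flock_size * 0.05))
        p_lose_per_step = min(1.0, confusion_per_neighbor * neighbors)
        locked_on = True
        for _ in range(pursuit_steps):
            if random.random() < p_lose_per_step:
                locked_on = False
                break
        successes += locked_on
    return successes / trials

if __name__ == "__main__":
    for n in (50, 200, 1000, 5000):
        print(f"flock of {n:>4} birds: capture rate ~ {capture_success_rate(n):.2f}")
```

Running it shows capture rates falling steadily as the flock grows, which is the qualitative shape of the confusion effect the study reports.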

By Alan Burdick

Some nights—more than I like, lately—I wake to the sound of the bedside clock. The room is dark, without detail, and it expands in such a way that it seems as if I’m outdoors, under an empty sky, or underground, in a cavern. I might be falling through space. I might be dreaming. I could be dead. Only the clock moves, its tick steady, unhurried. At these moments I have the most chilling understanding that time moves in only one direction. I’m tempted to look at the clock, but I already know that it’s the same time it always is: 4 A.M., or 4:10 A.M., or once, for a disconcerting stretch of days, 4:27 A.M. Even without looking, I could deduce the time from the ping of the bedroom radiator gathering steam in winter or the infrequency of the cars passing by on the street outside. In 1917, the psychologist Edwin G. Boring and his wife, Lucy, described an experiment in which they woke people at intervals to see if they knew what time it was; the average estimate was accurate to within fifty minutes, although almost everyone thought it was later than it actually was. They found that subjects were relying on internal or external signals: their degree of sleepiness or indigestion (“The dark brown taste in your mouth is never bad when you have been asleep only a short time”), the moonlight, “bladder cues,” the sounds of cars or roosters. “When a man is asleep, he has in a circle round him the chain of the hours, the sequence of the years, the order of the heavenly bodies,” Proust wrote. “Instinctively he consults them when he awakes, and in an instant reads off his own position on the earth’s surface and the time that has elapsed during his slumbers.” © 2017 Condé Nast.

Keyword: Attention; Sleep
Link ID: 23109 - Posted: 01.16.2017

By Maggie Koerth-Baker

“The president can’t have a conflict of interest,” Donald Trump told The New York Times in November. He appears to have meant that in the legal sense — the president isn’t bound by the same conflict-of-interest laws that loom over other executive branch officials and employees. But that doesn’t mean the president’s interests can’t be in conflict. When he takes office Jan. 20, Trump will be tangled in a wide array of situations in which his personal connections and business coffers are pulling him in one direction while the interests of the American presidency and people pull him in another. For example, Trump is the president of a vineyard in Virginia that’s requesting foreign worker visas from the government he’ll soon lead. He’s also involved in an ongoing business partnership with the Philippines’ diplomatic trade envoy — a relationship that could predispose Trump to accepting deals that are more favorable to that country than he otherwise might. Once he’s in office, he will appoint some members of the labor board that could hear disputes related to his hotels. Neither Trump nor his transition team replied to interview requests for this article, but his comments to the Times suggest that he genuinely believes he can be objective and put the country first, despite financial and social pressures to do otherwise. Unfortunately, science says he’s probably wrong.

Keyword: Attention
Link ID: 23108 - Posted: 01.16.2017

By Ellen Hendriksen

Pop quiz: what’s the first thing that comes to mind when I say “ADHD”?

a. Getting distracted
b. Ants-in-pants
c. Elementary school boys
d. Women and girls

Most likely, you didn’t pick D. If that’s the case, you’re not alone. For most people, ADHD conjures a mental image of school-aged boys squirming at desks or bouncing off walls, not a picture of adults, girls, or especially adult women. Both scientists and society have long pinned ADHD on males, even though girls and women may be just as likely to suffer from this neurodevelopmental disorder. Back in 1987, the American Psychiatric Association stated that the male-to-female ratio for ADHD was 9 to 1. Twenty years later, however, an epidemiological study of almost 4,000 kids found the ratio was more like 1 to 1—half girls, half boys. © 2017 Scientific American

Keyword: ADHD; Sexual Behavior
Link ID: 23069 - Posted: 01.09.2017

By Drake Baer

Philosophers have been arguing about the nature of will for at least 2,000 years. It’s at the core of blockbuster social-psychology findings, from delayed gratification to ego depletion to grit. But it’s only recently, thanks to the tools of brain imaging, that the act of willing is starting to be captured at a mechanistic level. A primary example is “cognitive control,” or how the brain selects goal-serving behavior from competing processes like so many unruly third-graders with their hands in the air. It’s the rare neuroscience finding that’s immediately applicable to everyday life: By knowing the way the brain is disposed to behaving or misbehaving in accordance with your goals, it’s easier to get the results you’re looking for, whether it’s avoiding the temptation of chocolate cookies or the pull of darkly ruminative thoughts. Jonathan Cohen, who runs a neuroscience lab dedicated to cognitive control at Princeton, says that it underlies just about every other flavor of cognition that’s thought to “make us human,” whether it’s language, problem solving, planning, or reasoning. “If I ask you not to scratch the mosquito bite that you have, you could comply with my request, and that’s remarkable,” he says. Every other species — ape, dog, cat, lizard — will automatically indulge in the scratching of the itch. (Why else would a pup need a post-surgery cone?) It’s plausible that a rat or monkey could be taught not to scratch an itch, he says, but that would probably take thousands of trials. But any psychologically and physically able human has the capacity to do so. “It’s a hardwired reflex that is almost certainly coded genetically,” he says. “But with three words — don’t scratch it — you can override those millions of years of evolution. That’s cognitive control.” © 2017, New York Media LLC.

Keyword: Consciousness
Link ID: 23067 - Posted: 01.07.2017

By Michael Price

As we age, we get progressively better at recognizing and remembering someone’s face, eventually reaching peak proficiency at about 30 years old. A new study suggests that’s because brain tissue in a region dedicated to facial recognition continues to grow and develop throughout childhood and into adulthood, a process known as proliferation. The discovery may help scientists better understand the social evolution of our species, as speedy recollection of faces let our ancestors know at a glance whether to run, woo, or fight. The results are surprising because most scientists have assumed that brain development throughout one’s life depends almost exclusively on “synaptic pruning,” or the weeding out of unnecessary connections between neurons, says Brad Duchaine, a psychologist at Dartmouth College who was not involved with the study. “I expect these findings will lead to much greater interest in the role of proliferation in neural development.” Ten years ago, Kalanit Grill-Spector, a psychologist at Stanford University in Palo Alto, California, first noticed that several parts of the brain’s visual cortex, including a segment known as the fusiform gyrus that’s known to be involved in facial recognition, appeared to develop at different rates after birth. To get more detailed information on how the size of certain brain regions changes over time, she turned to a recently developed brain imaging technology known as quantitative magnetic resonance imaging (qMRI). The technique tracks how long it takes for protons, excited by the imaging machine’s strong magnetic field, to calm down. Like a top spinning on a crowded table, these protons will slow down more quickly if they’re surrounded by a lot of molecules—a proxy for measuring volume. © 2017 American Association for the Advancement of Science

Keyword: Development of the Brain; Attention
Link ID: 23063 - Posted: 01.06.2017

Alexander Fornito

The human brain is an extraordinarily complex network, comprising an estimated 86 billion neurons connected by 100 trillion synapses. A connectome is a comprehensive map of these links—a wiring diagram of the brain. With current technology, it is not possible to map a network of this size at the level of every neuron and synapse. Instead, researchers use techniques such as magnetic resonance imaging to map connections between areas of the human brain that span several millimeters and contain many thousands of neurons. At this macroscopic scale, each area comprises a specialized population of neurons that work together to perform particular functions that contribute to cognition. For example, different parts of your visual cortex contain cells that process specific types of information, such as the orientation of a line and the direction in which it moves. Separate brain regions process information from your other senses, such as sound, smell and touch, and other areas control your movements, regulate your emotional responses, and so on. These specialized functions are not processed in isolation but are integrated to provide a unitary and coherent experience of the world. This integration is hypothesized to occur when different populations of cells synchronize their activity. The fiber bundles that connect different parts of the brain—the wires of the connectome—provide the substrate for this communication. These connections ensure that brain activity unfolds through time as a rhythmic symphony rather than a disordered cacophony. © 2017 Scientific American

Keyword: Brain imaging
Link ID: 23049 - Posted: 01.03.2017
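
Because the excerpt above describes the connectome as a wiring diagram between regions, it can help to see the corresponding data structure spelled out. The sketch below uses a tiny, made-up graph of a handful of region names (not real connectivity data from any atlas), with a breadth-first search standing in for the kind of path query network neuroscientists run on much larger maps.

```python
from collections import deque

# A macroscale connectome can be treated as a graph: nodes are brain
# regions, edges are the fiber bundles linking them. The regions and
# connections below are an invented miniature example, not real data.
connectome = {
    "V1":         {"V2", "pulvinar"},
    "V2":         {"V1", "MT"},
    "MT":         {"V2", "parietal"},
    "pulvinar":   {"V1", "parietal"},
    "parietal":   {"MT", "pulvinar", "prefrontal"},
    "prefrontal": {"parietal"},
}

def shortest_path_length(graph, start, goal):
    """Breadth-first search: how many connections separate two regions?"""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == goal:
            return dist
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None  # no route through the network

print(shortest_path_length(connectome, "V1", "prefrontal"))  # -> 3
```

Real analyses use the same graph abstraction, just with hundreds of regions and weighted edges estimated from diffusion imaging.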

By Susana Martinez-Conde

Our perceptual and cognitive systems like to keep things simple. We describe line drawings built from short dashes as a circle and a square, even though their imagined contours consist—in reality—of discontinuous line segments. The Gestalt psychologists of the 19th and early 20th century branded this perceptual legerdemain as the Principle of Closure, by which we tend to recognize shapes and concepts as complete, even in the face of fragmentary information. Now at the end of the year, it is tempting to seek a cognitive kind of closure: we want to close the lid on 2016, wrap it with a bow and start a fresh new year from a blank slate. Of course, it’s just an illusion, the Principle of Closure in one of its many incarnations. The end of the year is just as arbitrary as the end of the month, or the end of the week, or any other date we choose to highlight in the earth’s recurrent journey around the sun. But it feels quite different. That’s why we have lists of New Year’s resolutions, or why we start new diets or exercise regimes on Mondays rather than Thursdays. Researchers have also found that, even though we measure time on a continuous scale, we assign special meaning to idiosyncratic milestones such as entering a new decade. What should we do about our brain’s oversimplification tendencies concerning the New Year—if anything? One strategy would be to fight our feelings of closure and rebirth as we (in truth) seamlessly move from the last day of 2016 to the first day of 2017. But that approach is likely to fail. Try as we might, the Principle of Closure is just too ingrained in our perceptual and cognitive systems. In fact, if you already have the feeling that the beginning of the year is somewhat special (hey, it only happens once a year!), you might as well decide that resistance is futile, and not just embrace the illusion, but do your best to channel it. © 2017 Scientific American

Keyword: Vision; Attention
Link ID: 23042 - Posted: 01.02.2017

Perry Link

People who study other cultures sometimes note that they benefit twice: first by learning about the other culture and second by realizing that certain assumptions of their own are arbitrary. In reading Colin McGinn’s fine recent piece, “Groping Toward the Mind,” in The New York Review, I was reminded of a question I had pondered in my 2013 book Anatomy of Chinese: whether some of the struggles in Western philosophy over the concept of mind—especially over what kind of “thing” it is—might be rooted in Western language. The puzzles are less puzzling in Chinese. Indo-European languages tend to prefer nouns, even when talking about things for which verbs might seem more appropriate. The English noun inflation, for example, refers to complex processes that were not a “thing” until language made them so. Things like inflation can even become animate, as when we say “we need to combat inflation” or “inflation is killing us at the check-out counter.” Modern cognitive linguists like George Lakoff at Berkeley call inflation an “ontological metaphor.” (The inflation example is Lakoff’s.) When I studied Chinese, though, I began to notice a preference for verbs. Modern Chinese does use ontological metaphors, such as fāzhǎn (literally “emit and unfold”) to mean “development” or xìnxīn (“believe mind”) for “confidence.” But these are modern words that derive from Western languages (mostly via Japanese) and carry a Western flavor with them. “I firmly believe that…” is a natural phrase in Chinese; you can also say “I have a lot of confidence that…” but the use of a noun in such a phrase is a borrowing from the West. © 1963-2016 NYREV, Inc

Keyword: Consciousness; Language
Link ID: 23031 - Posted: 12.28.2016

By Susana Martinez-Conde, Stephen L. Macknik

We think we know what we want—but do we, really? In 2005 Lars Hall and Petter Johansson, both at Lund University in Sweden, ran an experiment that transformed how cognitive scientists think about choice. The experimental setup looked deceptively simple. A study participant and researcher faced each other across a table. The scientist offered two photographs of young women deemed equally attractive by an independent focus group. The subject then had to choose which portrait he or she found more appealing. Next, the experimenter turned both pictures over, moved them toward the subject and asked them to pick up the photo they had just chosen. Subjects complied, unaware that the researcher had just performed a swap using a sleight-of-hand technique known to conjurers as black art. Because your visual neurons are built to detect and enhance contrast, it is very hard to see black on black: a magician dressed in black against a black velvet backdrop can look like a floating head. Hall and Johansson deliberately used a black tabletop in their experiment. The first photos their subjects saw all had black backs. Behind those, however, they hid a second picture of the opposite face with a red back. When the experimenter placed the first portrait face down on the table, he pushed the second photo toward the subject. When participants picked up the red-backed photos, the black-backed ones stayed hidden against the table’s black surface—that is, until the experimenter could surreptitiously sweep them into his lap. © 2016 Scientific American

Keyword: Consciousness; Attention
Link ID: 23021 - Posted: 12.26.2016

By Victoria Gill, Science reporter, BBC News

Direct recordings have revealed what is happening in our brains as we make sense of speech in a noisy room. Focusing on one conversation in a loud, distracting environment is called "the cocktail party effect". It is a common festive phenomenon and of interest to researchers seeking to improve speech recognition technology. Neuroscientists recorded from people's brains during a test that recreated the moment when unintelligible speech suddenly makes sense. A team measured people's brain activity as the words of a previously unintelligible sentence suddenly became clear when a subject was told the meaning of the "garbled speech". The findings are published in the journal Nature Communications. Lead researcher Christopher Holdgraf from the University of California, Berkeley, and his colleagues were able to work with epilepsy patients, who had had a portion of their skull removed and electrodes placed on the brain surface to track their seizures. First, the researchers played a very distorted, garbled sentence to each subject, which almost no-one was able to understand. They then played a normal, easy-to-understand version of the same sentence and immediately repeated the garbled version. "After hearing the intact sentence," the researchers explained in their paper, all the subjects understood the subsequent "noisy version". The brain recordings showed this moment of recognition as brain activity patterns in the areas of the brain that are known to be associated with processing sound and understanding speech. When the subjects heard the very garbled sentence, the scientists reported that they saw little activity in those parts of the brain. Hearing the clearly understandable sentence then triggered patterns of activity in those brain areas. © 2016 BBC.

Keyword: Attention; Hearing
Link ID: 23004 - Posted: 12.22.2016

A little over a decade ago, neuroscientists began using a new technique to inspect what was going on in the brains of their subjects. Rather than giving their subjects a task to complete and watching their brains to see which parts lit up, they’d tell them to lie back, let their minds wander, and try not to fall asleep for about six minutes. That technique is called resting state functional magnetic resonance imaging, and it shares a problem with other types of fMRI: It only tracks changes in the blood in the brain, not the neurons sending the signals in the first place. Researchers have recently called fMRI into question for its reliance on possibly faulty statistics. And things get even less certain when the brain isn’t engaged in any particular task. “These signals are, by definition, random,” says Elizabeth Hillman, a biomedical engineer at Columbia’s Zuckerman Institute. “And when you’re trying to measure something that’s random amidst a whole bunch of noise, it becomes very hard to tell what’s actually random and what isn’t.” Six years ago, Hillman, along with many others in the field, was deeply skeptical of resting state fMRI’s ability to measure what it promised to. But this week, in a paper in Proceedings of the National Academy of Sciences, she presents compelling evidence to the contrary: a comprehensive visualization of neural activity throughout the entire brain at rest, and evidence that the blood rushing around in your brain is actually a good indicator of what your neurons are doing. Ever since 1992, when researcher Bharat Biswal first started scanning people who were just sitting around, resting state fMRI has become increasingly popular. Partly, that’s because it’s just way simpler than regular, task-based fMRI.

Keyword: Brain imaging; Attention
Link ID: 22986 - Posted: 12.14.2016
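
The analysis at the heart of resting-state work is, at bottom, a correlation: do the spontaneous fluctuations recorded in one region rise and fall with those in another, despite the noise? A minimal sketch with fabricated signals standing in for BOLD time series (none of this is the PNAS study’s data or code) shows the kind of calculation being made.

```python
import math
import random

def pearson(x, y):
    """Pearson correlation between two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

random.seed(0)
timepoints = 300

# A shared slow fluctuation stands in for coordinated spontaneous activity.
shared = [random.gauss(0, 1) for _ in range(timepoints)]

# Two "regions" that both follow the shared signal, each with its own noise,
# plus a third region that fluctuates independently.
region_a = [s + random.gauss(0, 0.8) for s in shared]
region_b = [s + random.gauss(0, 0.8) for s in shared]
region_c = [random.gauss(0, 1.3) for _ in range(timepoints)]

print(f"A-B (coupled):     r = {pearson(region_a, region_b):.2f}")
print(f"A-C (independent): r = {pearson(region_a, region_c):.2f}")
```

Region pairs whose spontaneous signals stay correlated in this way are what resting-state studies report as functional connections.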

Answer by Paul King, Director of Data Science, on Quora:

There are hundreds of surprising, perspective-shifting insights about the nature of reality that come from neuroscience. Every bizarre neurological syndrome, every visual illusion, and every clever psychological experiment reveals something entirely unexpected about our experience of the world that we take for granted. Here are a few to give a flavor:

1. Perceptual reality is entirely generated by our brain. We hear voices and meaning from air pressure waves. We see colors and objects, yet our brain only receives signals about reflected photons. The objects we perceive are a construct of the brain, which is why optical illusions can fool the brain.

2. We see the world in narrow disjoint fragments. We think we see the whole world, but we are looking through a narrow visual portal onto a small region of space. You have to move your eyes when you read because most of the page is blurry. We don't see this, because as soon as we become curious about part of the world, our eyes move there to fill in the detail before we see it was missing. While our eyes are in motion, we should see a blank blur, but our brain edits this out.

3. Body image is dynamic and flexible. Our brain can be fooled into thinking a rubber arm or a virtual reality hand is actually a part of our body. In one syndrome, people believe one of their limbs does not belong to them. One man thought a cadaver limb had been sewn onto his body as a practical joke by doctors.

4. Our behavior is mostly automatic, even though we think we are controlling it.

Keyword: Attention
Link ID: 22980 - Posted: 12.13.2016

By Daniel A. Yudkin and Jay Van Bavel

During the first presidential debate, Hillary Clinton argued that “implicit bias is a problem for everyone, not just police.” Her comment moved to the forefront of public conversation an issue that scientists have been studying for decades: namely, that even well-meaning people frequently harbor hidden prejudices against members of other racial groups. Studies have shown that these subtle biases are widespread and associated with discrimination in legal, economic and organizational settings. Critics of this notion, however, protest what they see as a character smear — a suggestion that everybody, deep down, is racist. Vice President-elect Mike Pence has said that an “accusation of implicit bias” in cases where a white police officer shoots a black civilian serves to “demean law enforcement.” Writing in National Review, David French claimed that the concept of implicit bias lets people “indict entire communities as bigoted.” But implicit bias is not about bigotry per se. As new research from our laboratory suggests, implicit bias is grounded in a basic human tendency to divide the social world into groups. In other words, what may appear as an example of tacit racism may actually be a manifestation of a broader propensity to think in terms of “us versus them” — a prejudice that can apply, say, to fans of a different sports team. This doesn’t make the effects of implicit bias any less worrisome, but it does mean people should be less defensive about it. Furthermore, our research gives cause for optimism: Implicit bias can be overcome with rational deliberation. In a series of experiments whose results were published in The Journal of Experimental Psychology: General, we set out to determine how severely people would punish someone for stealing. Our interest was in whether a perpetrator’s membership in a particular group would influence the severity of the punishment he or she received. © 2016 The New York Times Company

Keyword: Attention; Emotions
Link ID: 22979 - Posted: 12.12.2016

Children don’t usually have the words to communicate even the darkest of thoughts. As a result, some children aged 5 to 11 take their own lives. It’s a rare and often overlooked phenomenon—and one that scientists are only just beginning to understand. A study published today in the journal Pediatrics reveals that attention deficit disorder (A.D.D.), not depression, may be the most common mental health diagnosis among children who die by suicide. By contrast, the researchers found that two-thirds of the 606 early adolescents studied (aged 12 to 14) had suffered from depression. While the finding isn’t necessarily causal, it does suggest that impulsive behavior might contribute to incidences of child suicide. Alternatively, some of these cases could be attributed to early-onset bipolar disorder, misdiagnosed as A.D.D. or A.D.H.D. Here’s Catherine Saint Louis, reporting for The New York Times: Suicide prevention has focused on identifying children struggling with depression; the new study provides an early hint that this strategy may not help the youngest suicide victims. “Maybe in young children, we need to look at behavioral markers,” said Jeffrey Bridge, the paper’s senior author and an epidemiologist at the Research Institute at Nationwide Children’s Hospital in Columbus, Ohio. Jill Harkavy-Friedman, the vice president of research at the American Foundation for Suicide Prevention, agreed. “Not everybody who is at risk for suicide has depression,” even among adults, said Dr. Harkavy-Friedman, who was not involved in the new research. © 1996-2016 WGBH Educational Foundation

Keyword: ADHD; Depression
Link ID: 22964 - Posted: 12.08.2016

Rosie Mestel

The 2016 US election was a powerful reminder that beliefs tend to come in packages: socialized medicine is bad, gun ownership is a fundamental right, and climate change is a myth — or the other way around. Stances that may seem unrelated can cluster because they have become powerful symbols of membership of a group, says Dan Kahan, who teaches law and psychology at Yale Law School in New Haven, Connecticut. And the need to keep believing can further distort people’s perceptions and their evaluation of evidence. Here, Kahan tells Nature about the real-world consequences of group affinity and cognitive bias, and about research that may point to remedies. This interview has been edited for length and clarity.

One measure is how individualistic or communitarian people are, and how egalitarian or hierarchical. Hierarchical and individualistic people tend to have confidence in markets and industry: those represent human ingenuity and power. People who are egalitarian and communitarian are suspicious of markets and industry. They see them as responsible for social disparity. It’s natural to see things you consider honourable as good for society, and things that are base, as bad. Such associations will motivate people’s assessment of evidence.

Can you give an example?

In a study, we showed people data from gun-control experiments and varied the results. People who were high in numeracy always saw when a study supported their view. If it didn’t support their view, they didn’t notice — or argued their way out of it. © 2016 Macmillan Publishers Limited

Keyword: Attention; Emotions
Link ID: 22946 - Posted: 12.03.2016

By Alison Howell

What could once only be imagined in science fiction is now increasingly coming to fruition: Drones can be flown by thought alone. Pharmaceuticals can help soldiers forget traumatic experiences or produce feelings of trust to encourage confession in interrogation. DARPA-funded research is working on everything from implanting brain chips to "neural dust" in an effort to alleviate the effects of traumatic experience in war. Invisible microwave beams produced by military contractors and tested on U.S. prisoners can produce the sensation of burning at a distance. What all these techniques and technologies have in common is that they're recent neuroscientific breakthroughs propelled by military research within a broader context of rapid neuroscientific development, driven by massive government-funded projects in both America and the European Union. Even while much about the brain remains mysterious, this research has contributed to the rapid and startling development of neuroscientific technology. And while we might marvel at these developments, it is also undeniably true that this state of affairs raises significant ethical questions. What is the proper role – if any – of neuroscience in national defense or war efforts? My research addresses these questions in the broader context of looking at how international relations, and specifically warfare, are shaped by scientific and medical expertise and technology. 2016 © U.S. News & World Report L.P.

Keyword: Attention; Sleep
Link ID: 22944 - Posted: 12.03.2016

Amanda Gefter

As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it wasn’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like. Not so, says Donald D. Hoffman, a professor of cognitive science at the University of California, Irvine. Hoffman has spent the past three decades studying perception, artificial intelligence, evolutionary game theory and the brain, and his conclusion is a dramatic one: The world presented to us by our perceptions is nothing like reality. What’s more, he says, we have evolution itself to thank for this magnificent illusion, as it maximizes evolutionary fitness by driving truth to extinction. Getting at questions about the nature of reality, and disentangling the observer from the observed, is an endeavor that straddles the boundaries of neuroscience and fundamental physics. On one side you’ll find researchers scratching their chins raw trying to understand how a three-pound lump of gray matter obeying nothing more than the ordinary laws of physics can give rise to first-person conscious experience. This is the aptly named “hard problem.”

Keyword: Consciousness
Link ID: 22937 - Posted: 12.01.2016

By Melissa Dahl

Considering its origin story, it’s not so surprising that hypnosis and serious medical science have often seemed at odds. The man typically credited with creating hypnosis, albeit in a rather primitive form, is Franz Mesmer, a doctor in 18th-century Vienna. (Mesmer, mesmerize. Get it?) Mesmer developed a general theory of disease he called “animal magnetism,” which held that every living thing carries within it an internal magnetic force, in liquid form. Illness arises when this fluid becomes blocked, and can be cured if it can be coaxed to flow again, or so Mesmer’s thinking went. To get that fluid flowing, as science journalist Jo Marchant describes in her recent book, Cure, Mesmer “simply waved his hands to direct it through his patients’ bodies” — the origin of those melodramatic hand motions that stage hypnotists use today. After developing a substantial following — “mesmerism” became “the height of fashion” in late 1780s Paris, writes Marchant — Mesmer became the subject of what was essentially the world’s first clinical trial. King Louis XVI pulled together a team of the world’s top scientists, including Benjamin Franklin, who tested mesmerism and found its capacity to “cure” was, essentially, a placebo effect. “Not a shred of evidence exists for any fluid,” Franklin wrote. “The practice … is the art of increasing the imagination by degrees.” Maybe so. But that doesn’t mean it doesn’t work. © 2016, New York Media LLC.

Keyword: Attention; Pain & Touch
Link ID: 22931 - Posted: 11.30.2016

By Yasemin Saplakoglu

Even if you don’t have rhythm, your pupils do. In a new study, neuroscientists played drumming patterns from Western music, including beats typical in pop and rock, while asking volunteers to focus on computer screens for an unrelated fast-paced task that involved pressing the space bar as quickly as possible in response to a signal on the screen. Unbeknownst to the participants, the music omitted strong and weak beats at random times. Eye scanners tracked the dilations of the subjects’ pupils as the music played. Their pupils enlarged when the rhythms dropped certain beats, even though the participants weren’t paying attention to the music. The biggest dilations matched the omissions of the beats in the most prominent locations in the music, usually the important first beat in a repeated set of notes. The results suggest that we may have an automatic sense of “hierarchical meter”—a pattern of strong and weak beats—that governs our expectations of music, the researchers write in the February 2017 issue of Brain and Cognition. Perhaps, the authors say, our eyes reveal clues about the importance that music and rhythm play in our lives. © 2016 American Association for the Advancement of Science

Keyword: Attention; Hearing
Link ID: 22920 - Posted: 11.29.2016