Chapter 14. Attention and Consciousness




By Abby Olena

Our brains quickly characterize everything we see as familiar or new, and scientists have been investigating this connection between vision and cognition for years. Now, research in Japanese macaques (Macaca fuscata) reveals that the activation of neurons in a part of the primate brain called the perirhinal cortex can cause monkeys to recognize new objects as familiar and vice versa. The study was published today (August 17) in Science.

“There are a lot of really exciting aspects to this paper,” says neuroscientist David Sheinberg of Brown University, who did not participate in the work. “This group continues to make advances that are helping us understand how we convert visual impressions into things we know.”

Primate brains process visual information through several brain structures that make up the ventral visual stream. The last stop in this stream is the perirhinal cortex, part of the medial temporal lobe. Scientists know that this brain structure plays roles in visual memory and object discrimination. But one open question is whether the perirhinal cortex represents objects’ physical traits or whether it might also communicate information about nonphysical attributes, such as whether an object has been seen before.

“In the primate, the perirhinal cortex is the link between the visual pathway and the limbic memory system,” coauthor and University of Tokyo neuroscientist Yasushi Miyashita writes in an email to The Scientist. “Therefore, the perirhinal cortex is one of the most likely candidates in the brain where visual information is transformed to subjective semantic values by referring to one’s own memory.”

© 1986-2017 The Scientist

Keyword: Attention
Link ID: 23979 - Posted: 08.19.2017

By Aylin Woodward

Two newly identified brain areas in rhesus monkeys seem to help the animals recognise familiar faces. Primates, Homo sapiens included, must be able to differentiate between faces and recognise friend from foe, because social hierarchies play a large role in daily life. But exactly how primate brains deal with faces is not completely clear.

One idea is that the same parts of the brain are involved in recognising both familiar and unfamiliar faces, just with varying efficiency. But Sofia Landi and Winrich Freiwald at Rockefeller University in New York have now cast doubt on that thinking. Their work shows that distinct brain areas are responsible for recognising the primates you know.

Many researchers have already shown that certain areas of the temporal and prefrontal cortex are involved in unfamiliar face perception in rhesus monkey brains. Using whole-brain fMRI scans of four monkeys, Landi and Freiwald have now identified two additional brain areas that play a role not only in unfamiliar face perception but also in recognising familiar faces. The two new areas are in the anterior temporal lobe – the part of our brains above and in front of our ears. One is in the perirhinal cortex and one is in the temporal pole. These regions lit up far more when the monkeys recognised a familiar face in a photograph, as opposed to when they were presented with images of a stranger.

© Copyright New Scientist Ltd.

Keyword: Attention
Link ID: 23949 - Posted: 08.11.2017

By Erin Blakemore

Do you talk to yourself? Don’t sweat it: Scientists say you’re not alone. And the ways in which you chatter to yourself, both in your head and out loud, are changing what neuroscientists know about the human brain. Writing in Scientific American, psychologist Charles Fernyhough reveals why we’re our best conversational partners.

Scientists have only recently learned how to study self-talk — and it’s opening up exciting new avenues of research. It turns out there are two ways of chatting yourself up. In “inner speech,” you speak to yourself without making sound. With “private speech,” you do the same thing, just out loud. This chatter serves varied purposes: It can help people control themselves and relate to others. But it’s notoriously hard to study. So Fernyhough and colleagues figured out some inventive ways to prompt people to talk to themselves as they lay inside a functional magnetic resonance imaging, or fMRI, scanner.

When they studied the brains of people who talked to themselves internally, the team noticed that spontaneous inner speech activates a different part of the brain than words that the participants were asked to say aloud. And people whose self-talk takes the form of a monologue seem to activate different brain areas than those who carry on a dialogue in their heads.

© 1996-2017 The Washington Post

Keyword: Consciousness; Language
Link ID: 23924 - Posted: 08.07.2017

By Amanda Onion

While driving and accelerating in his car, a patient in France suddenly had a bizarre sensation. He felt like he was outside his car, looking in at his physical self, which was still at the wheel. The patient was part of a new study that links problems of the inner ear with eerie "out-of-body" experiences. These experiences are curious, usually brief sensations in which a person's consciousness seems to exit the body and then view the body from the outside.

The study analyzed 210 patients who had visited their doctors with so-called vestibular disorders. The vestibular system, which is made up of several structures in the inner ear, provides the body with a sense of balance and spatial orientation. Problems with this system can cause dizziness or a floating sensation, among other symptoms.

Maya Elzière, an ear, nose and throat specialist at Hôpital Européen in Marseille, France, and co-author of the study, enlisted patients who had experienced a range of issues, from recurrent vertigo and tinnitus to infections in the ear. Among these patients, 14 percent reported out-of-body experiences, compared with only 5 percent of healthy people without vestibular disorders who said the same. "Out-of-body experiences were about three times more frequent" in patients with vestibular disorders, versus those without these disorders, said Christophe Lopez, lead author of the study and a neuroscientist at Aix-Marseille Université in France.

© 2017 Scientific American

Keyword: Attention
Link ID: 23922 - Posted: 08.07.2017

By Knvul Sheikh

The brain has evolved to recognize and remember many different faces. We can instantly identify a friend's countenance among dozens in a crowded restaurant or on a busy street. And a brief glance tells us whether that person is excited or angry, happy or sad. Brain-imaging studies have revealed that several blueberry-size regions in the temporal lobe—the area under the temple—specialize in responding to faces. Neuroscientists call these areas “face patches.” But neither brain scans nor clinical studies of patients with implanted electrodes explained exactly how the cells in these patches work.

Now, using a combination of brain imaging and single-neuron recording in macaques, biologist Doris Tsao and her colleagues at the California Institute of Technology appear to have finally cracked the neural code for primate face recognition. The researchers found the firing rate of each face patch cell corresponds to a separate facial feature. Like a set of dials, the cells can be fine-tuned to respond to bits of information, which they can then combine in various ways to create an image of every face the animal encounters. “This was mind-blowing,” Tsao says. “The values of each dial are so predictable that we can re-create the face that a monkey sees by simply tracking the electrical activity of its face cells.”

Previous studies had hinted at the specificity of these brain areas for encoding faces. In the early 2000s, when Tsao was a postdoctoral researcher at Harvard Medical School, she and electrophysiologist Winrich Freiwald showed that neurons in a monkey's face patches would fire electrical signals every time the animal saw pictures of a face. But the same brain cells showed little or no response to other objects, such as images of vegetables, radios or nonfacial body parts. Other experiments indicated that neurons in these regions could also distinguish among individual faces, even if they were cartoons.

© 2017 Scientific American
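The “set of dials” Tsao describes amounts to a linear population code: each cell’s firing rate reflects where a face falls along that cell’s preferred feature axis, so with enough cells the face can be recovered by solving a linear system. Below is a minimal sketch of that idea; the dimensions (200 cells, a 50-feature face space) and the random tuning axes are invented for illustration and are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells, n_features = 200, 50                  # hypothetical population and face-space sizes
axes = rng.normal(size=(n_cells, n_features))  # each cell's preferred feature axis ("dial")

def firing_rates(face):
    """Linear code: each cell's rate is the face's projection onto its axis."""
    return axes @ face

def decode(rates):
    """Recover the face from population activity by least squares."""
    face_hat, *_ = np.linalg.lstsq(axes, rates, rcond=None)
    return face_hat

face = rng.normal(size=n_features)   # a face, represented as a point in feature space
recovered = decode(firing_rates(face))
print(np.allclose(face, recovered))  # True: the linear code is invertible
```

Because the code is linear, decoding reduces to ordinary least squares, which is what makes it possible in principle to redraw the face a monkey is viewing directly from recorded firing rates.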

Keyword: Attention
Link ID: 23913 - Posted: 08.03.2017

By Karl Gruber

Are you good with faces? So is the Japanese rice fish – at least, it is if the faces are the right way up. Just like humans, the tiny fish has no problem recognising faces orientated the usual way, but, again like us, it struggles when they are inverted. The finding indicates that the fish may have developed a unique brain pathway for face recognition, just as humans have.

We have no problem identifying most objects in our environment – say, a chair – no matter what way up they are. But faces are different. It is relatively easy for us to spot the differences between two faces, even if they are physically similar, if we see them in photographs the right way up. But if the images are upside down, telling them apart gets a bit tricky. “This is because we have a specific brain area for processing faces, and when the face is upside down, we process the image through object processing pathways, and not the face-processing pathways any more,” says Mu-Yun Wang at the University of Tokyo, Japan.

Until now, this face-inversion effect was considered exclusive to mammals, as it had only been observed in primates and sheep. Enter the Japanese rice fish, also known as the medaka (Oryzias latipes), a 3.5-centimetre-long shoaling fish commonly found in rice paddies, marshes, ponds and slow-moving streams in East Asia. These fish are very social, so identifying the right individuals to associate with is important.

© Copyright New Scientist Ltd.

Keyword: Attention; Evolution
Link ID: 23891 - Posted: 07.28.2017

Sarah Zhang

In 1958, Robert Monroe floated out of his body for the first time. It began “without any apparent cause,” he wrote. His doctor, finding no physical ailment, prescribed tranquilizers. A psychologist friend, meanwhile, told him to try leaving his body again. After all, the friend said, “some of the fellows who practice yoga and those Eastern religions claim they can do it whenever they want to.” Monroe did try it again—and again and again. He recalls these experiences in his classic 1971 book Journeys Out of the Body, which launched the phrase “out-of-body experiences” into the public conversation.

Monroe died in 1995, but the fascination with out-of-body experiences endures. Out-of-body experiences can vary from person to person, but they often involve the sense of floating above one’s actual body and looking down. For neuroscientists, the phenomenon is a puzzle and an opportunity: understanding how the brain goes awry can also illuminate how it is supposed to work. Neuroscientists now think that out-of-body experiences involve the vestibular system—made up of canals in the inner ear that track a person’s location in space—and how that information gets integrated with other senses in the brain.

In a recent study from France, Christophe Lopez, a neuroscientist at Aix-Marseille Université, teamed up with Maya Elzière, a doctor who sees patients with vestibular disorders. Some of these patients complained of dizziness, with physical causes that ranged from fluid leaking out of the inner ear to an infection of a nearby nerve. Of 210 patients who reported dizziness, 14 percent said they had had out-of-body experiences. In contrast, only 5 percent of healthy participants in the study reported such sensations.

© 2017 by The Atlantic Monthly Group
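The “about three times more frequent” figure quoted in these two write-ups follows directly from the reported rates; here is a quick check of the arithmetic, using the percentages given in the articles:

```python
# Rates reported in the study: out-of-body experiences among patients
# with vestibular disorders versus healthy participants.
p_vestibular = 0.14
p_healthy = 0.05

relative_risk = p_vestibular / p_healthy
odds_ratio = (p_vestibular / (1 - p_vestibular)) / (p_healthy / (1 - p_healthy))

print(f"relative risk: {relative_risk:.1f}")  # 2.8
print(f"odds ratio:    {odds_ratio:.1f}")     # 3.1
```

Whether you compare raw proportions (2.8) or odds (3.1), the ratio rounds to roughly three, matching Lopez's description.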

Keyword: Attention
Link ID: 23884 - Posted: 07.27.2017

By Aylin Woodward

Keep your head up. Today, navigating the urban jungle can be challenging, with uneven sidewalks and errant kerbs presenting obstacles to easy walking. So why do we rarely trip up even though we hardly ever give walking our full attention? It seems that all we need is a brief glimpse of what’s coming next on the road in front of us, just one step ahead of time, to keep us upright.

Humans have a unique kind of locomotion – we’re bipedal, meaning we move around on two legs rather than four. Scientists are still struggling to unravel the mystery behind our shift to two legs – for instance, some suggest it freed up our hands to carry food. Others point out that our human gait is much more energetically efficient. Our walking style exploits external forces like gravity and inertia to use as little muscular energy as possible, so that we actually fall forward onto the lifted foot with each step.

Jonathan Samir Matthis at the University of Texas at Austin wanted to know how we aim and control this forward motion – particularly since the way ahead is rarely level and obstacle-free. “We have to be much more careful about where we place our feet than we would if we had four legs on the ground,” he says. “Because if we do it wrong, there’s serious consequences like breaking your leg.”

© Copyright New Scientist Ltd.

Keyword: Movement Disorders; Attention
Link ID: 23872 - Posted: 07.25.2017

By Sam Wong

Students who take Adderall to improve their test scores may get a slight benefit, but it’s mainly a placebo effect. The drug Adderall is a combination of the stimulants amphetamine and dextroamphetamine, and is used to treat attention deficit hyperactivity disorder (ADHD). But it’s growing in popularity as a study drug in the US, where around a third of college students are thought to try using prescription stimulants for non-medical reasons. But does it work?

Rachel Fargason, a psychiatrist at the University of Alabama, Birmingham, says the idea of stimulants as cognitive enhancers didn’t tally with her experience of patients who were diagnosed incorrectly. “If they didn’t have ADHD, the stimulants generally didn’t help them cognitively,” she says.

To investigate further, Fargason’s team set up a trial in 32 people between the ages of 19 and 30, none of whom had ADHD. Each participant took a batch of cognitive tests four times. On two of these occasions they were given 10 milligrams of Adderall, while they were given a placebo the other times. With each treatment, they were once told they were getting medication, and once told they were getting a placebo.

© Copyright New Scientist Ltd.

Keyword: ADHD; Drug Abuse
Link ID: 23858 - Posted: 07.21.2017

By PERRI KLASS, M.D.

We want to believe we’re raising our kids to think for themselves, and not to do dumb or unhealthy things just because the cool kids are doing them. But research shows that when it comes to smoking, children are heavily influenced by some of the folks they consider the coolest of the cool: actors in movies.

“There’s a dose-response relationship: The more smoking kids see onscreen, the more likely they are to smoke,” said Dr. Stanton Glantz, a professor and director of the University of California, San Francisco, Center for Tobacco Control Research and Education. He is one of the authors of a new study that found that popular movies are showing more tobacco use onscreen. “The evidence shows it’s the largest single stimulus” for smoking, he said; “it overpowers good parental role modeling, it’s more powerful than peer influence or even cigarette advertising.”

He said that epidemiological studies have shown that if you control for all the other risk factors for smoking (whether parents smoke, attitudes toward risk taking, socioeconomic status, and so on), younger adolescents who are more heavily exposed to smoking on film are two to three times as likely to start smoking, compared with the kids who are more lightly exposed. Those whose parents smoke are more likely to smoke, he said, but exposure to smoking in movies can overcome the benefit of having nonsmoking parents. In one study, the children of nonsmoking parents with heavy exposure to movie smoking were as likely to smoke as the children of smoking parents with heavy movie exposure.

To Dr. Glantz, and the other people who study this topic, that makes smoking in movies an “environmental toxin,” a factor endangering children.

© 2017 The New York Times Company

Keyword: Drug Abuse; Attention
Link ID: 23844 - Posted: 07.18.2017

Tim Adams

Henry Marsh made the decision to become a neurosurgeon after he had witnessed his three-month-old son survive the complex removal of a brain tumour. For two decades he was the senior consultant in the Atkinson Morley wing at St George’s hospital in London, one of the country’s largest specialist brain surgery units. He pioneered techniques in operating on the brain under local anaesthetic and was the subject of the BBC documentary Your Life in Their Hands. His first book, Do No Harm: Stories of Life, Death, and Brain Surgery, was published in 2014 to great acclaim, and became a bestseller across the world.

Marsh retired from full-time work at St George’s in 2015, though he continues with long-standing surgical roles at hospitals in Ukraine and Nepal. He is also an avid carpenter. Earlier this year he published a second volume of memoir, Admissions: A Life in Brain Surgery, in which he looks back on his career as he takes up a “retirement project” of renovating a decrepit lock-keeper’s cottage near where he grew up in Oxfordshire. He lives with his second wife, the social anthropologist and author Kate Fox. They have homes in Oxford, and in south London, which is where the following conversation took place.

Have you officially retired now?

Well, I still do one day a week for the NHS, though apparently they want a “business case” for it, so I’m not getting paid at present.

Yes, well, people talk about the mind-matter problem – it’s not a problem for me: mind is matter. That’s not being reductionist. It is actually elevating matter. We don’t even begin to understand how electrochemistry and nerve cells generate thought and feeling. We have not the first idea. The relation of neurosurgery to neuroscience is a bit like the relationship between plumbing and quantum mechanics.

Keyword: Consciousness
Link ID: 23842 - Posted: 07.17.2017

By BENEDICT CAREY

Keith Conners, whose work with hyperactive children established the first standards for diagnosing and treating what is now known as attention deficit hyperactivity disorder, or A.D.H.D. — and who late in life expressed misgivings about how loosely applied that label had become — died on July 5 in Durham, N.C. He was 84. His wife, Carolyn, said the cause was heart failure.

The field of child psychiatry was itself still young when Dr. Conners joined the faculty of the Johns Hopkins University School of Medicine in the early 1960s as a clinical psychologist. Children with emotional and behavioral problems often got a variety of diagnoses, depending on the clinic, and often ended up being given strong tranquilizers as treatment. Working with Dr. Leon Eisenberg, a prominent child psychiatrist, Dr. Conners focused on a group of youngsters who were chronically restless, hyperactive and sometimes aggressive. Doctors had recognized this type — “hyperkinesis,” it was called, or “minimal brain dysfunction” — but Dr. Conners combined existing descriptions and, using statistical analysis, focused on the core symptoms.

The 39-item questionnaire he devised, called the Conners Rating Scale, quickly became the worldwide standard for assessing the severity of such problems and measuring improvement. It was later abbreviated to 10 items, giving child psychiatry a scientific foothold and anticipating by more than a decade the kind of checklists that would come to define all psychiatric diagnosis.

He used his scale to study the effects of stimulant drugs on hyperactive children. Doctors had known since the 1930s that amphetamines could, paradoxically, calm such youngsters; a Rhode Island doctor, Charles Bradley, had published a well-known report detailing striking improvements in attention and academic performance among many children at a children’s inpatient home he ran near Providence. But it was a series of rigorous studies by Dr. Conners, in the 1960s and ’70s, that established stimulants — namely Dexedrine and Ritalin — as the standard treatments.

© 2017 The New York Times Company

Keyword: ADHD
Link ID: 23833 - Posted: 07.14.2017

Deborah Orr

Most people know about SSRIs, the antidepressant drugs that stop the brain from re-absorbing too much of the serotonin we produce, to regulate mood, anxiety and happiness. And a lot of people know about these drugs first hand, for the simple reason that they have used them. Last year, according to NHS Digital, no fewer than 64.7m antidepressant prescriptions were given in England alone. In a decade, the number of prescriptions has doubled.

On Tuesday I joined the throng, and popped my first Citalopram. It was quite a thing – not least because, like an idiot, I dropped my pill about 90 minutes before curtain up for the Royal Shakespeare Company’s production of The Tempest at the Barbican. That’s right. This isn’t just mental illness: this is metropolitan-elite mental illness. It was a pretty overwhelming theatrical experience.

The first indication that something was up came as I approached my local tube station. I noticed that I was in a state of extreme dissociation, walking along looking as though I was entirely present in the world yet feeling completely detached from it. I had drifted into total mental autopilot. Luckily, I was able to recognise my fugue. It’s a symptom of my condition, which, as I’ve written before, is complex post-traumatic stress disorder. The drug-induced dissociation was more intense than I’m used to when it’s happening naturally. I use the word advisedly. Much of what is thought of as illness is actually an extreme and sensible protective reaction to unbearable interventions from outside the self.

© 2017 Guardian News and Media Limited

Keyword: Depression; Attention
Link ID: 23818 - Posted: 07.09.2017

Hannah Devlin

A Catholic priest, a rabbi and a Buddhist walk into a bar and order some magic mushrooms. It may sound like the first line of a bad joke, but this scenario is playing out in one of the first scientific investigations into the effects of psychedelic drugs on religious experience – albeit in a laboratory rather than a bar.

Scientists at Johns Hopkins University in Baltimore have enlisted two dozen religious leaders from a wide range of denominations to participate in a study in which they will be given two powerful doses of psilocybin, the active ingredient in magic mushrooms. Dr William Richards, a psychologist at Johns Hopkins who is involved in the work, said: “With psilocybin these profound mystical experiences are quite common. It seemed like a no-brainer that they might be of interest, if not valuable, to clergy.”

The experiment, which is currently under way, aims to assess whether a transcendental experience makes the leaders more effective and confident in their work and how it alters their religious thinking. Despite most organised religions frowning on the use of illicit substances, Catholic, Orthodox and Presbyterian priests, a Zen Buddhist and several rabbis were recruited. The team has yet to persuade a Muslim imam or Hindu priest to take part, but “just about all the other bases are covered,” according to Richards.

After preliminary screening, including medical and psychological tests, the participants have been given two powerful doses of psilocybin in two sessions, one month apart.

© 2017 Guardian News and Media Limited

Keyword: Drug Abuse; Attention
Link ID: 23814 - Posted: 07.09.2017

By Anil Ananthaswamy

To understand human consciousness, we need to know why it exists in the first place. New experimental evidence suggests it may have evolved to help us learn and adapt to changing circumstances far more rapidly and effectively. We used to think consciousness was a uniquely human trait, but neuroscientists now believe we share it with many other animals, including mammals, birds and octopuses. While plants and arguably some animals like jellyfish seem able to respond to the world around them without any conscious awareness, many other animals consciously experience and perceive their environment.

In the 19th century, Thomas Henry Huxley and others argued that such consciousness is an “epiphenomenon” – a side effect of the workings of the brain that has no causal influence, the way a steam whistle has no effect on the way a steam engine works. More recently, neuroscientists have suggested that consciousness enables us to integrate information from different senses or keep such information active for long enough in the brain that we can experience the sight and sound of a car passing by, for example, as one unified perception, even though sound and light travel at different speeds.

© Copyright New Scientist Ltd.

Keyword: Consciousness; Learning & Memory
Link ID: 23785 - Posted: 06.28.2017

By THERESE HUSTON

“Does being over 40 make you feel like half the man you used to be?” Ads like that have led to a surge in the number of men seeking to boost their testosterone. The Food and Drug Administration reports that prescriptions for testosterone supplements have risen to 2.3 million from 1.3 million in just four years. There is such a condition as “low-T,” or hypogonadism, which can cause fatigue and diminished sex drive, and it becomes more common as men age. But according to a study published in JAMA Internal Medicine, half of the men taking prescription testosterone don’t have a deficiency. Many are just tired and want a lift.

But they may not be doing themselves any favors. It turns out that the supplement isn’t entirely harmless: Neuroscientists are uncovering evidence suggesting that when men take testosterone, they make more impulsive — and often faulty — decisions. Researchers have shown for years that men tend to be more confident about their intelligence and judgments than women, believing that solutions they’ve generated are better than they actually are. This hubris could be tied to testosterone levels, and new research by Gideon Nave, a cognitive neuroscientist at the University of Pennsylvania, along with Amos Nadler at Western University in Ontario, reveals that high testosterone can make it harder to see the flaws in one’s reasoning.

How might heightened testosterone lead to overconfidence? One possible explanation lies in the orbitofrontal cortex, a region just behind the eyes that’s essential for self-evaluation, decision making and impulse control. The neuroscientists Pranjal Mehta at the University of Oregon and Jennifer Beer at the University of Texas, Austin, have found that people with higher levels of testosterone have less activity in their orbitofrontal cortex. Studies show that when that part of the brain is less active, people tend to be overconfident in their reasoning abilities. It’s as though the orbitofrontal cortex is your internal editor, speaking up when there’s a potential problem with your work. Boost your testosterone and your editor goes reassuringly (but misleadingly) silent.

© 2017 The New York Times Company

Keyword: Hormones & Behavior; Attention
Link ID: 23776 - Posted: 06.26.2017

Staring down a packed room at the Hyatt Regency Hotel in downtown San Francisco this March, Randy Gallistel gripped a wooden podium, cleared his throat, and presented the neuroscientists sprawled before him with a conundrum. “If the brain computed the way people think it computes,” he said, “it would boil in a minute.” All that information would overheat our CPUs.

Humans have been trying to understand the mind for millennia. And metaphors from technology—like cortical CPUs—are one of the ways that we do it. Maybe it’s comforting to frame a mystery in the familiar. In ancient Greece, the brain was a hydraulics system, pumping the humors; in the 18th century, philosophers drew inspiration from the mechanical clock. Early neuroscientists from the 20th century described neurons as electric wires or phone lines, passing signals like Morse code. And now, of course, the favored metaphor is the computer, with its hardware and software standing in for the biological brain and the processes of the mind.

In this technology-ridden world, it’s easy to assume that the seat of human intelligence is similar to our increasingly smart devices. But the reliance on the computer as a metaphor for the brain might be getting in the way of advancing brain research. As Gallistel continued his presentation to the Cognitive Neuroscience Society, he described the problem with the computer metaphor. If memory works the way most neuroscientists think it does—by altering the strength of connections between neurons—storing all that information would be way too energy-intensive, especially if memories are encoded as Shannon information: high-fidelity signals encoded in binary.
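“Shannon information” here is just a measure of how many bits a signal carries, given the probabilities of its possible states. A minimal illustration follows; the synapse example is hypothetical, included only to make the unit concrete:

```python
import math

def shannon_bits(probs):
    """Entropy in bits: H = -sum(p * log2(p)) over the outcome probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A hypothetical synapse that is "strong" or "weak" with equal
# probability stores exactly one bit.
print(shannon_bits([0.5, 0.5]))      # 1.0

# 26 equally likely symbols (e.g. letters) carry log2(26) bits apiece.
print(shannon_bits([1 / 26] * 26))   # ~4.7
```

Gallistel’s worry, on this framing, is the energy cost of writing and maintaining that many high-fidelity bits in continually active synaptic hardware.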

Keyword: Learning & Memory; Consciousness
Link ID: 23764 - Posted: 06.23.2017

Kerin Higa

After surgery to treat her epilepsy severed the connection between the two halves of her brain, Karen's left hand took on a mind of its own, acting against her will to undress or even to slap her. Amazing, to be sure. But what may be even more amazing is that most people who have split-brain surgery don't notice anything different at all. But there's more to the story than that.

In the 1960s, a young neuroscientist named Michael Gazzaniga began a series of experiments with split-brain patients that would change our understanding of the human brain forever. Working in the lab of Roger Sperry, who later won a Nobel Prize for his work, Gazzaniga discovered that the two halves of the brain experience the world quite differently. When Gazzaniga and his colleagues flashed a picture in a patient's right visual field, the information was processed in the left side of the brain, and the split-brain patient could easily describe the scene verbally. But when a picture was flashed in the left visual field, which is processed by the right side of the brain, the patient would report seeing nothing. If allowed to respond nonverbally, however, the right brain could adeptly point at or draw what it had seen. So the right brain knew what it was seeing; it just couldn't talk about it. These experiments showed for the first time that each brain hemisphere has specialized tasks.

In this third episode of Invisibilia, hosts Alix Spiegel and Hanna Rosin talk to several people who are trying to change their other self, including a man who confronts his own biases and a woman who has a rare condition that causes one of her hands to take on a personality of its own.

© 2017 npr

Keyword: Consciousness; Laterality
Link ID: 23749 - Posted: 06.17.2017

by Helen Thompson

Paper wasps have a knack for recognizing faces, and a new study adds to our understanding of what that means in a wasp’s brain. Most wasps of a given species look the same, but some species of paper wasp (Polistes sp.) display varied colors and markings. Recognizing these patterns is at the core of the wasps’ social interactions. One species, Polistes fuscatus, is especially good at detecting differences in faces — even better than it is at detecting other patterns.

To zero in on the roots of this ability, biologist Ali Berens of Georgia Tech and her colleagues set up recognition exercises of faces and basic patterns for P. fuscatus wasps and P. metricus wasps — a species that doesn’t naturally recognize faces but can be trained to do so in the lab. After the training, scientists extracted DNA from the wasps’ brains and looked at which genes were active.

The researchers found 237 genes that were at play only in P. fuscatus during facial recognition tests. A few of the genes have been linked to honeybee visual learning, and some correspond to brain signaling with the neurotransmitters serotonin and tachykinin. In the brain, picking up on faces goes beyond basic pattern learning, the researchers conclude June 14 in the Journal of Experimental Biology. It’s possible that some of the same genes also play a broader role in how organisms such as humans and sheep tell one face from another.

© Society for Science & the Public 2000 - 2017

Keyword: Attention
Link ID: 23742 - Posted: 06.15.2017

Maria Temming

Fascination with faces is nature, not nurture, suggests a new study of third-trimester fetuses. Scientists have long known that babies like looking at faces more than other objects. But research published online June 8 in Current Biology offers evidence that this preference develops before birth. In the first-ever study of prenatal visual perception, fetuses were more likely to move their heads to track facelike configurations of light projected into the womb than nonfacelike shapes.

Past research has shown that newborns pay special attention to faces, even if a “face” is stripped down to its bare essentials — for instance, a triangle of three dots: two up top for eyes, one below for a mouth or nose. This preoccupation with faces is considered crucial to social development. “The basic tendency to pick out a face as being different from other things in your environment, and then to actually look at it, is the first step to learning who the important people are in your world,” says Scott Johnson, a developmental psychologist at UCLA who was not involved in the study.

Using a 4-D ultrasound, the researchers watched how 34-week-old fetuses reacted to seeing facelike triangles compared with seeing triangles with one dot above and two below. They projected triangles of red light in both configurations through a mother’s abdomen into the fetus’s peripheral vision. Then, they slid the light across the mom’s belly, away from the fetus’s line of sight, to see if it would turn its head to continue looking at the image.

© Society for Science & the Public 2000 - 2017

Keyword: Development of the Brain; Attention
Link ID: 23726 - Posted: 06.09.2017