Links for Keyword: Attention



Links 141 - 160 of 703

By Ben Guarino The next time a friend tells you that you look sick, hear the person out. We are better than chance at detecting illness in others simply by looking at their faces, according to new research led by a Swedish psychologist. “We can detect subtle cues related to the skin, eyes and mouth,” said John Axelsson of the Karolinska Institute, who co-wrote the study published Tuesday in the journal Proceedings of the Royal Society B. “And we judge people as sick by those cues.” Other species have more finely tuned disease radars, relying primarily on the sense of smell. And previous research, Axelsson noted, has shown that animals can sniff sickness in other animals. (A Canadian hospital enlisted the help of an English springer spaniel trained to smell bacterial spores that infect patients.) Yet while there is some evidence that an unhealthy person gives off odors that another individual can identify as sickness, the face is our primary source of “social information for communication,” Axelsson said. He and his colleagues, a team that included neuroscientists and psychologists in Germany and Sweden, injected eight men and eight women with a molecule found in bacterial membranes. Like animals — from insects to mammals — people react very strongly to this substance, lipopolysaccharide. “People did not really become sick from the bacteria,” Axelsson said, but their bodies did not know the bacteria weren't actually attacking. Their immune systems kicked into action, complete with feelings of sickness. The subjects, all white, received about $430 for their trouble. © 1996-2018 The Washington Post

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 11: Emotions, Aggression, and Stress
Link ID: 24483 - Posted: 01.03.2018

Just 10 minutes of aerobic exercise can improve executive function by priming parts of the brain used to laser focus on the task at hand, according to a new study. This paper, “Executive-Related Oculomotor Control Is Improved Following a 10-minute Single-Bout of Aerobic Exercise: Evidence from the Antisaccade Task,” was published in the January 2018 issue of Neuropsychologia. This research was conducted by Matthew Heath, who is a kinesiology professor and supervisor in the Graduate Program in Neuroscience at the University of Western Ontario, along with UWO master’s student Ashna Samani. For this study, Samani and Heath asked a cohort of healthy young adults to either sit quietly and read magazines or perform 10 minutes of moderate-to-vigorous physical activity (MVPA) on a stationary bicycle. (MVPA aerobic intensity is hard enough that you might break a sweat but easy enough that you can carry on a conversation.) Immediately after the 10-minute reading task or time spent doing aerobic exercise, the researchers used eye-tracking equipment to gauge antisaccades, which is a way to measure varying degrees of executive control. As the authors explain in the study abstract, “Antisaccades are an executive task requiring a goal-directed eye movement (i.e., a saccade) mirror-symmetrical to a visual stimulus. The hands- and language-free nature of antisaccades coupled with the temporal precision of eye-tracking technology make it an ideal tool for identifying executive performance changes.” © 1991-2018 Sussex Publishers, LLC

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 5: The Sensorimotor System
Link ID: 24476 - Posted: 01.02.2018

By Bret Stetka Every day our brains grapple with various last-minute decisions. We adjust our gait to avoid a patch of ice; we exit to hit the rest stop; we switch to our backhand before thwacking a tennis ball. Scientists have long accepted that our ability to abruptly stop or modify a planned behavior is controlled via a single region within the brain’s prefrontal cortex, an area involved in planning and other higher mental functions. By studying other parts of the brain in both humans and monkeys, however, a team from Johns Hopkins University has now concluded that last-minute decision-making is a lot more complicated than previously known, involving complex neural coordination among multiple brain areas. The revelations may help scientists unravel certain aspects of addictive behaviors and understand why accidents like falls grow increasingly common as we age, according to the Johns Hopkins team. The findings, published Thursday in Neuron, reveal reneging on an intended behavior involves coordinated cross talk between several brain regions. As a result, changing our minds even mere milliseconds after making a decision is often too late to alter a movement or behavior. Using functional magnetic resonance imaging—a technique that monitors brain activity in real time—the Johns Hopkins group found reversing a decision requires ultrafast communication between two specific zones within the prefrontal cortex and another nearby structure called the frontal eye field, an area involved in controlling eye movements and visual awareness. © 2017 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 24403 - Posted: 12.08.2017

By Wendy Jones In Jane Austen’s Sense and Sensibility, Elinor Dashwood is talking to a new acquaintance, Lucy Steele. Based on their previous encounters, Elinor doesn’t think much of Lucy’s character. But Lucy seems determined to befriend Elinor and to make her a confidante. Elinor discovers Lucy’s true motives when the latter reveals that she is secretly engaged to Edward Ferrars, the man Elinor loves. Elinor is speechless: “Her astonishment at what she heard was at first too great for words.” Elinor isn’t the only one to experience this kind of shutdown and its accompanying frustration. When we’re angry, or upset, or fearful—in the grip of any strong emotion—most of us find it difficult to think clearly. This has to do with the inverse relationship between our sympathetic and parasympathetic nervous systems, which manage (respectively) the degree to which we’re excited or calm. Neuroscientist Stephen Porges has suggested that the thermostat for adjusting sympathetic and parasympathetic input can be found within these systems themselves. He has highlighted the operations involved from a “polyvagal perspective,” which considers our neurophysiological functioning in the context of safety, whether our environments are threatening or benign. I explore these and other neurosocial phenomena through the lens of the immensely popular novels of Jane Austen in my new book, Jane on the Brain: Exploring the Science of Social Intelligence. © 1986-2017 The Scientist

Related chapters from BN: Chapter 15: Emotions, Aggression, and Stress; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 11: Emotions, Aggression, and Stress; Chapter 14: Attention and Higher Cognition
Link ID: 24402 - Posted: 12.08.2017

Mariah Quintanilla Emma Watson, Jake Gyllenhaal, journalist Fiona Bruce and Barack Obama all walk into a sheep pen. No, this isn’t the beginning of a baaa-d joke. By training sheep using pictures of these celebrities, researchers from the University of Cambridge discovered that the animals are able to recognize familiar faces from 2-D images. Given a choice, the sheep picked the familiar celebrity’s face over an unfamiliar face the majority of the time, the researchers report November 8 in Royal Society Open Science. Even when a celeb’s face was slightly tilted rather than face-on, the sheep still picked the image more often than not. That means the sheep were not just memorizing images, demonstrating for the first time that sheep have advanced face-recognition capabilities similar to those of humans and other primates, say neurobiologist Jennifer Morton and her colleagues. Sheep have been known to pick out pictures of individuals in their flock, and even familiar handlers (SN: 10/6/12, p. 20). But it’s been unclear whether the skill was real recognition or simple memorization. Sheep now join other animals, including horses, dogs, rhesus macaques and mockingbirds, that are able to distinguish between individuals of other species. Over a series of four training sessions, the sheep’s ability to choose a familiar face, represented by one of the four celebrities, over a completely unfamiliar face improved. |© Society for Science & the Public 2000 - 2017.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 24306 - Posted: 11.08.2017

James Gorman Dogs have evolved to be friendly and tolerant of humans and one another, which might suggest they would be good at cooperative tasks. Wolves are known to cooperate in hunting and even in raising one another’s pups, but they can seem pretty intolerant of one another when they are snapping and growling around a kill. So researchers at the Wolf Science Center at the University of Vienna decided to compare the performance of wolves and dogs on a classic behavioral test. To get a food treat, two animals have to pull ropes attached to different ends of a tray. The trick is that they have to pull both ropes at the same time. Chimps, parrots, rooks and elephants have all succeeded at the task. When Sarah Marshall-Pescini, Friederike Range and colleagues put wolves and dogs to the test, wolves did very well and dogs very poorly. In recordings of the experiments, the pairs of wolves look like experts, while the dogs seem, well, adorable and confused. The researchers reported their findings in the Proceedings of the National Academy of Sciences. With no training, five of seven wolf pairs succeeded in mastering the task at least once. Only one of eight dog pairs did. © 2017 The New York Times Company

Related chapters from BN: Chapter 6: Evolution of the Brain and Behavior; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 11: Emotions, Aggression, and Stress
Link ID: 24304 - Posted: 11.08.2017

Molecular method reveals neuronal basis of brain states – NIH-funded animal study. NIMH-funded scientists revealed the types of neurons supporting alertness, using a molecular method called MultiMAP in transparent larval zebrafish. Multiple types of neurons communicate by secreting the same major chemical messengers: serotonin, dopamine and noradrenalin, and acetylcholine. Using a molecular method likely to become widely adopted by the field, researchers supported by the National Institutes of Health have discovered brain circuitry essential for alertness, or vigilance – and for brain states more generally. Strikingly, the same cell types and circuits are engaged during alertness in zebrafish and mice, species whose evolutionary forebears parted ways hundreds of millions of years ago. This suggests that the human brain is likely similarly wired for this state critical to survival. “Vigilance gone awry marks states such as mania and those seen in post-traumatic stress disorder and depression,” explained Joshua Gordon, M.D., Ph.D., director of the NIH’s National Institute of Mental Health (NIMH), which along with the National Institute on Drug Abuse, co-funded the study. “Gaining familiarity with the molecular players in a behavior – as this new tool promises – may someday lead to clinical interventions targeting dysfunctional brain states.” For the first time, MultiMAP makes it possible to see which neurons are activated in a behaving animal during a particular brain state – and subsequently molecularly analyze just those neurons to identify the subtypes and circuits involved.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 24282 - Posted: 11.03.2017

By Helen Thomson Do you find it difficult to spot a face in the crowd? Now we know why: people with face blindness seem to have a missing “hub” of brain connections. The discovery could be used to diagnose children with the condition, and teach them new ways to identify faces. People with prosopagnosia, which often runs in families, cannot easily tell faces apart. This can have a significant impact on people’s lives. People with the condition rely heavily on voice recognition, clothes, hairstyle and gait to identify people, but can still fail to recognise family and friends. It can lead to social anxiety and depression, and can often go undiagnosed for many years. Face processing isn’t a function of a single brain region, but involves the coordinated activity of several regions. To investigate what might be causing the problem, Galia Avidan at Ben-Gurion University of the Negev, Israel, and her colleagues scanned the brains of 10 adults who have reported life-long problems with face processing. They also scanned 10 adults without the condition. During the scan, participants were shown sets of images of emotional, neutral, famous and unfamiliar faces. During the task they were asked to press a button when two consecutive images were identical. Some of the images also included buildings, which people with face blindness do not have any trouble identifying – these acted as a control. © Copyright New Scientist Ltd.

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 24281 - Posted: 11.03.2017

By GRETCHEN REYNOLDS Do brains trump brawn? A remarkable new study of how the human body prioritizes its inner workings found that if you intensely think at the same time as you intensely exercise, your performance in both thinking and moving can worsen. But your muscles’ performance will decline much more than your brain’s will, the study found. The results raise interesting questions about the roles that our body’s wide-ranging abilities may have played in the evolution of humans and also whether a hard workout is the ideal time to be cogitating. Compared to almost all other animals, we humans have disproportionately large brains for our size. Our supersized cranial contents probably provided an advantage during our evolution as a species. Smart creatures presumably could have outwitted predators and outmaneuvered prey, keeping themselves fed, uneaten and winners in the biological sweepstakes to pass on their genes. But most other species eschewed developing similarly outsized brains during evolution, because large brains carry a hefty metabolic cost. Brains are extraordinarily hungry organs, requiring, ounce for ounce, more calories to sustain their operations than almost any other tissue, and these caloric demands rise when the brain is hard at work. Thinking demands considerable bodily fuel. In order to feed and maintain these large brains, early humans’ bodies had to make certain trade-offs, most evolutionary biologists agree. Our digestive systems shrank during evolution, for one thing, since food processing is also metabolically ravenous. But whether a similar trade-off occurred with our muscles has remained in doubt. Muscles potentially provided another route to survival during our species’ early days. With sufficient brawn, animals, including people, could physically overpower prey and sprint from danger. © 2017 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 5: The Sensorimotor System
Link ID: 24243 - Posted: 10.26.2017

By Jessica Hamzelou Ever realised you have driven yourself home without really paying attention? Brain scans have revealed that when your mind wanders, it switches into “autopilot” mode, enabling you to carry on doing tasks quickly, accurately and without conscious thought. Our autopilot mode seems to be run by a set of brain structures called the default mode network (DMN). It was discovered in the 1990s, when researchers noticed that people lying in brain scanners show patterns of brain activity even when they aren’t really doing anything. This research provided the first evidence that our brains are active even when we aren’t consciously putting our minds to work. But what does the DMN do? Several studies have found that it seems to be involved in assessing past events and planning for the future. Others suggest the network is involved in self-awareness – although this has been called into question by findings that rats and newborns appear to have a version of the DMN too. It is unlikely that rats are conscious of themselves in the same way that humans are, says Deniz Vatansever at the University of York, UK. Instead, the DMN must have a more basic function, common to all animals. Vatansever and his colleagues at the University of Cambridge wondered if the network might help us do things without paying much attention, such as tying our shoelaces, or driving along a familiar road. © Copyright New Scientist Ltd.

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 24240 - Posted: 10.25.2017

Nicola Davis When it comes to understanding how another person thinks and feels, it might be best to close your eyes and listen. A study by an American psychologist suggests that people are better able to pick up on the emotions of others when simply focusing on their voice, compared with both watching and listening to them, or just watching them. “Humans are actually remarkably good at using many of their senses for conveying emotions, but emotion research historically is focused almost exclusively on the facial expressions,” said Michael Kraus, a social psychologist at Yale University and author of the study. While combining information from a person’s voice with their facial expressions and other cues might at first seem like a way to boost understanding of their thoughts and feelings, Kraus says pooling the senses divides attention. What’s more, he notes, facial expressions can mask a person’s true feelings – something that he says is harder to do with the voice – while language plays a key role in how people understand and label their emotions. The upshot, he says, is that what people say, and the way they say it, offers the clearest insights into the emotions of others. “Listening matters,” said Kraus. “Actually considering what people are saying and the ways in which they say it can, I believe, lead to improved understanding of others at work or in your personal relationships.” © 2017 Guardian News and Media Limited

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 24173 - Posted: 10.11.2017

By HEATHER MURPHY In a demonstration image accompanying the study, most people quickly spot a toothbrush on the front of a counter, but take longer — or even fail to find — the much bigger one behind it. The oversight has to do with scale. People have a tendency to miss objects when their size is inconsistent with their surroundings, according to a recent study in Current Biology. This is just the latest in a robust body of research that reveals how expectations dramatically affect our ability to notice what’s around us. The researchers were interested not only in what people saw — but also in how their performance compared with computers. Flesh-and-blood participants and a deep neural network, a computer system with advanced machine vision, were given one second to select an object, such as a parking meter, in a computer-rendered scene. The object was either absent, presented at scale or featured at four times scale. Humans missed giant objects about 13 percent more than normal-sized objects, the researchers found. Scale had no impact on machine performance. “We were surprised about how compelling of an effect it is,” said Miguel Eckstein, a psychologist at the University of California, Santa Barbara’s Vision and Image Understanding Laboratory and one of the authors. In particular, the first time a person examined a photo with a giant object, the object often seemed to be invisible. But it’s not a deficiency, he said: “This is a useful trick the brain does to rapidly process scenes and find what we are looking for.” © 2017 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 24161 - Posted: 10.07.2017

You may well be yawning just reading this - it's contagious. Now researchers have looked at what happens in our brains to trigger that response. A University of Nottingham team found it occurs in a part of the brain responsible for motor function. The primary motor cortex also plays a part in conditions such as Tourette's syndrome. So the scientists say understanding contagious yawning could also help in understanding those disorders. Contagious yawning is a common form of echophenomena - the automatic imitation of someone else's words or actions. Echophenomena are also seen in Tourette's, as well as in other conditions, including epilepsy and autism. To test what's happening in the brain during the phenomenon, scientists monitored 36 volunteers while they watched others yawning. In the study, published in the journal Current Biology, some were told it was fine to yawn while others were told to stifle the urge. The urge to yawn was down to how each person's primary motor cortex worked - its "excitability". And, using external transcranial magnetic stimulation (TMS), it was also possible to increase "excitability" in the motor cortex and therefore people's propensity for contagious yawns. Georgina Jackson, professor of cognitive neuropsychology who worked on the study, said the finding could have wider uses: "In Tourette's, if we could reduce the excitability we might reduce the tics, and that's what we are working on." Prof Stephen Jackson, who also worked on the research, added: "If we can understand how alterations in cortical excitability give rise to neural disorders we can potentially reverse them. We are looking for potential non-drug, personalised treatments, using TMS that might be effective in modulating imbalances in the brain networks." © 2017 BBC

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 24022 - Posted: 09.01.2017

By Helen Thomson Have you ever seen the Virgin Mary in your grilled cheese? Or a screaming face inside a bell pepper? Seeing faces in inanimate objects is a common phenomenon. Now it seems that we’re not alone in experiencing it – monkeys do too. Pareidolia is the scientific term for erroneously perceiving faces where none exist. Other examples include seeing “ghosts” in blurry photos and the man in the moon. To investigate whether pareidolia was a uniquely human experience, Jessica Taubert at the US National Institute of Mental Health in Maryland and her colleagues trained five rhesus macaques to stare at pairs of photos. Each photo showed either an inanimate object that prompts pareidolia in humans, an equivalent object that doesn’t, or the face of a monkey. We already knew that both people and monkeys will look longer at images of faces than other things. So the team presented each of the photos in every possible pairing – 1980 in all – and measured the time the monkeys spent looking at each. The monkeys did indeed seem to succumb to pareidolia – they spent more time looking at illusory faces than the non-illusory photos they were paired with. Interestingly, they also spent more time looking at the illusory faces than the monkey faces, perhaps because they spent longer studying these more unusual “faces”, or because they tend to dislike holding the gaze of another monkey. © Copyright New Scientist Ltd.

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 23997 - Posted: 08.25.2017

By Helen Thomson Our brains seem better at predictions than we are. A part of our brain becomes active when it knows something will be successfully crowdfunded, even if we consciously decide otherwise. If this finding stands up and works in other areas of life, neuroforecasting may lead to better voting polls or even predict changes in financial markets. To see if one can predict market behaviour by sampling a small number of people, Brian Knutson at Stanford University in California and his team scanned the brains of 30 people while they decided whether to fund 36 projects from the crowdfunding website Kickstarter. The projects were all recently posted proposals for documentary films. Each participant had their brain scanned while taking in the pictures and descriptions of each campaign, and they were then asked if they would want to fund the project. When the real Kickstarter campaigns ended a few weeks later, 18 of the projects had gained enough funding to go forward. Examining the participants’ brain scans, the team discovered that activity in a region called the nucleus accumbens had been different when they considered projects that later went on to be successful. The team trained an algorithm to recognise these differences in brain activity using scan data from 80 per cent of the projects, then tested the program on the remaining 20 per cent. Using neural activity alone, the algorithm was able to forecast which Kickstarter campaigns would be funded with 59.1 per cent accuracy – more than would be expected by chance. © Copyright New Scientist Ltd.
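The 80/20 evaluation scheme described above can be illustrated with a toy sketch. Everything here is made up for illustration: the "activation" values are simulated, and a simple threshold rule stands in for whatever algorithm the study actually trained.

```python
import random

random.seed(0)

# Hypothetical data: 36 "projects", each with a simulated mean
# nucleus-accumbens activation, drawn higher on average for the
# 18 campaigns that were later funded.
projects = [(random.gauss(1.0 if funded else 0.0, 1.0), funded)
            for funded in [True] * 18 + [False] * 18]
random.shuffle(projects)

# 80 per cent of projects for training, the rest held out for testing.
split = int(len(projects) * 0.8)
train, test = projects[:split], projects[split:]

def accuracy(thresh, data):
    # Predict "funded" when activation >= thresh; score the predictions.
    return sum((x >= thresh) == funded for x, funded in data) / len(data)

# "Training": pick the activation threshold that best separates funded
# from unfunded projects in the training set.
best = max((x for x, _ in train), key=lambda t: accuracy(t, train))

print(f"held-out accuracy: {accuracy(best, test):.1%}")
```

The point of the held-out 20 per cent is that the threshold never sees those projects during training, so the printed accuracy estimates how the rule generalises rather than how well it memorised the training data.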

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 23984 - Posted: 08.22.2017

By Abby Olena Our brains quickly characterize everything we see as familiar or new, and scientists have been investigating this connection between vision and cognition for years. Now, research in Japanese macaques (Macaca fuscata) reveals that the activation of neurons in a part of the primate brain called the perirhinal cortex can cause monkeys to recognize new objects as familiar and vice versa. The study was published today (August 17) in Science. “There are a lot of really exciting aspects to this paper,” says neuroscientist David Sheinberg of Brown University, who did not participate in the work. “This group continues to make advances that are helping us understand how we convert visual impressions into things we know.” Primate brains process visual information through several brain structures that make up the ventral visual stream. The last stop in this stream is the perirhinal cortex, part of the medial temporal lobe. Scientists know that this brain structure plays roles in visual memory and object discrimination. But one open question is whether the perirhinal cortex represents objects’ physical traits or whether it might also communicate information about nonphysical attributes, such as whether an object has been seen before. “In the primate, the perirhinal cortex is the link between the visual pathway and the limbic memory system,” coauthor and University of Tokyo neuroscientist Yasushi Miyashita writes in an email to The Scientist. “Therefore, the perirhinal cortex is one of the most likely candidates in the brain where visual information is transformed to subjective semantic values by referring to one’s own memory.” © 1986-2017 The Scientist

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 23979 - Posted: 08.19.2017

By Aylin Woodward Two newly identified brain areas in rhesus monkeys seem to help the animals recognise familiar faces. Primates, Homo sapiens included, must be able to differentiate between faces and recognise friend from foe because social hierarchies play a large role in daily life. But exactly how primate brains deal with faces is not completely clear. One idea is that the same parts of the brain are involved in recognising both familiar and unfamiliar faces, just with varying efficiency. But Sofia Landi and Winrich Freiwald at Rockefeller University in New York have now cast doubt on that thinking. Their work shows that distinct brain areas are responsible for recognising the primates you know. Many researchers have already shown that certain areas of the temporal and prefrontal cortex are involved in unfamiliar face perception in rhesus monkey brains. Using whole-brain fMRI scans of four monkeys, Landi and Freiwald have now identified two additional brain areas that play a role not only in unfamiliar face perception but also in recognising familiar faces. The two new areas are in the anterior temporal lobe – the part of our brains above and in front of our ears. One is in the perirhinal cortex and one is in the temporal pole. These regions lit up far more when the monkeys recognised a familiar face in a photograph, as opposed to when they were presented with images of a stranger. © Copyright New Scientist Ltd.

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 23949 - Posted: 08.11.2017

By Amanda Onion While driving and accelerating in his car, a patient in France suddenly had a bizarre sensation. He felt like he was outside his car, looking in at his physical self, which was still at the wheel. The patient was part of a new study that links problems of the inner ear with eerie "out-of-body" experiences. These experiences are curious, usually brief sensations in which a person's consciousness seems to exit the body and then view the body from the outside. The study analyzed 210 patients who had visited their doctors with so-called vestibular disorders. The vestibular system, which is made up of several structures in the inner ear, provides the body with a sense of balance and spatial orientation. Problems with this system can cause dizziness or a floating sensation, among other symptoms. Maya Elzière, an ear, nose and throat specialist at Hôpital Européen in Marseille, France, and co-author of the study, enlisted patients who had experienced a range of issues, from recurrent vertigo and tinnitus to infections in the ear. Among these patients, 14 percent reported out-of-body experiences, compared with only 5 percent of healthy people without vestibular disorders who said the same. "Out-of-body experiences were about three times more frequent" in patients with vestibular disorders, versus those without these disorders, said Christophe Lopez, lead author of the study and a neuroscientist at Aix-Marseille Université in France. © 2017 Scientific American

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 14: Attention and Higher Cognition
Link ID: 23922 - Posted: 08.07.2017

By Knvul Sheikh The brain has evolved to recognize and remember many different faces. We can instantly identify a friend's countenance among dozens in a crowded restaurant or on a busy street. And a brief glance tells us whether that person is excited or angry, happy or sad. Brain-imaging studies have revealed that several blueberry-size regions in the temporal lobe—the area under the temple—specialize in responding to faces. Neuroscientists call these areas “face patches.” But neither brain scans nor clinical studies of patients with implanted electrodes explained exactly how the cells in these patches work. Now, using a combination of brain imaging and single-neuron recording in macaques, biologist Doris Tsao and her colleagues at the California Institute of Technology appear to have finally cracked the neural code for primate face recognition. The researchers found the firing rate of each face patch cell corresponds to a separate facial feature. Like a set of dials, the cells can be fine-tuned to respond to bits of information, which they can then combine in various ways to create an image of every face the animal encounters. “This was mind-blowing,” Tsao says. “The values of each dial are so predictable that we can re-create the face that a monkey sees by simply tracking the electrical activity of its face cells.” Previous studies had hinted at the specificity of these brain areas for encoding faces. In the early 2000s, when Tsao was a postdoctoral researcher at Harvard Medical School, she and electrophysiologist Winrich Freiwald showed that neurons in a monkey's face patches would fire electrical signals every time the animal saw pictures of a face. But the same brain cells showed little or no response to other objects, such as images of vegetables, radios or nonfacial body parts. Other experiments indicated that neurons in these regions could also distinguish among individual faces, even if they were cartoons. © 2017 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 23913 - Posted: 08.03.2017

By Karl Gruber Are you good with faces? So is the Japanese rice fish – at least, it is if the faces are the right way up. Just like humans, the tiny fish has no problem recognising faces orientated the usual way, but, again like us, it struggles when they are inverted. The finding indicates that the fish may have developed a unique brain pathway for face recognition, just as humans have. We have no problem identifying most objects in our environment – say, a chair – no matter what way up they are. But faces are different. It is relatively easy for us to spot the differences between two faces, even if they are physically similar, if we see them in photographs the right way up. But if the images are upside down, telling them apart gets a bit tricky. “This is because we have a specific brain area for processing faces, and when the face is upside down, we process the image through object processing pathways, and not the face-processing pathways any more,” says Mu-Yun Wang at the University of Tokyo, Japan. Until now, this face-inversion effect was considered exclusive to mammals as it has only been observed in primates and sheep. Enter the Japanese rice fish, also known as the medaka (Oryzias latipes), a 3.5-centimetre-long shoaling fish commonly found in rice paddies, marshes, ponds and slow-moving streams in East Asia. These fish are very social, so identifying the right individuals to associate with is important. © Copyright New Scientist Ltd.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 23891 - Posted: 07.28.2017