Links for Keyword: Attention
Angus Chen We know we should put the cigarettes away or make use of that gym membership, but in the moment, we just don't do it. There is, however, a cluster of neurons in our brain critical for motivation. What if you could hack them to motivate yourself? These neurons are located in the middle of the brain, in a region called the ventral tegmental area (VTA). A paper published Thursday in the journal Neuron suggests that we can activate the region with a little bit of training. The researchers placed 73 people in an fMRI, a scanner that can detect which parts of the brain are most active, and focused on the area associated with motivation. When the researchers said "motivate yourself and make this part of your brain light up," people couldn't really do it. "They weren't that reliable when we said, 'Go! Get psyched. Turn on your VTA,' " says Dr. Alison Adcock, a psychiatrist at Duke and senior author on the paper. That changed when the participants were allowed to watch a neurofeedback meter that displayed activity in their ventral tegmental area: when activity ramped up, they saw the meter heat up while they were in the fMRI tube. "Your whole mind is allowed to speak to a specific part of your brain in a way you never imagined before. Then you get feedback that helps you discover how to turn that part of the brain up or down," says John Gabrieli, a neuroscientist at the Massachusetts Institute of Technology who was not involved with the work. © 2016 NPR
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Link ID: 21954 - Posted: 03.05.2016
Monya Baker Is psychology facing a ‘replication crisis’? Last year, a crowdsourced effort that was able to validate fewer than half of 98 published findings [1] rang alarm bells about the reliability of psychology papers. Now a team of psychologists has reassessed the study and says that it provides no evidence for a crisis. “Our analysis completely invalidates the pessimistic conclusions that many have drawn from this landmark study,” says Daniel Gilbert, a psychologist at Harvard University in Cambridge, Massachusetts, and a co-author of the reanalysis, published on 2 March in Science [2]. But a response [3] in the same issue of Science counters that the reanalysis itself depends on selective assumptions. And others say that psychology still urgently needs to improve its research practices. Statistical criticism In August 2015, a team of 270 researchers reported the largest-ever single-study audit of the scientific literature. Led by Brian Nosek, executive director of the Center for Open Science in Charlottesville, Virginia, the Reproducibility Project attempted to replicate studies in 100 psychology papers. (It ended up with 100 replication attempts for 98 papers because of problems assigning teams to two papers.) According to one of several measures of reproducibility, just 36% could be confirmed; by another statistical measure, 47% could [1]. Either way, the results looked worryingly feeble. Not so fast, says Gilbert. © 2016 Nature Publishing Group
By David Z. Hambrick We all make stupid mistakes from time to time. History is replete with examples. Legend has it that the Trojans accepted the Greeks’ “gift” of a huge wooden horse, which turned out to be hollow and filled with a crack team of Greek commandos. The Tower of Pisa started to lean even before construction was finished—and is not even the world’s farthest-leaning tower. NASA taped over the original recordings of the moon landing, and operatives for Richard Nixon’s re-election committee were caught breaking into a Watergate office, setting in motion the greatest political scandal in U.S. history. More recently, the French government spent $15 billion on a fleet of new trains, only to discover that they were too wide for some 1,300 station platforms. We readily recognize these incidents as stupid mistakes—epic blunders. On a more mundane level, we invest in get-rich-quick schemes, drive too fast, and make posts on social media that we later regret. But what, exactly, drives our perception of these actions as stupid mistakes, as opposed to bad luck? Their seeming mindlessness? The severity of the consequences? The responsibility of the people involved? Science can help us answer these questions. In a study just published in the journal Intelligence, Balazs Aczel and his colleagues used search terms such as “stupid thing to do” to compile a collection of stories describing stupid mistakes from sources such as The Huffington Post and TMZ. One story described a thief who broke into a house and stole a TV and later returned for the remote; another described burglars who intended to steal cell phones but instead stole GPS tracking devices that were turned on and gave police their exact location. The researchers then had a sample of university students rate each story on the responsibility of the people involved, the influence of the situation, the seriousness of the consequences, and other factors. © 2016 Scientific American
Alison Abbott. More than 50 years after a controversial psychologist shocked the world with studies that revealed people’s willingness to harm others when ordered to, a team of cognitive scientists has carried out an updated version of the iconic ‘Milgram experiments’. Their findings may offer some explanation for Stanley Milgram's uncomfortable revelations: when following commands, they say, people genuinely feel less responsibility for their actions — whether they are told to do something evil or benign. “If others can replicate this, then it is giving us a big message,” says neuroethicist Walter Sinnott-Armstrong of Duke University in Durham, North Carolina, who was not involved in the work. “It may be the beginning of an insight into why people can harm others if coerced: they don’t see it as their own action.” The study may feed into a long-running legal debate about the balance of personal responsibility between someone acting under instruction and their instructor, says Patrick Haggard, a cognitive neuroscientist at University College London, who led the work, published on 18 February in Current Biology [1]. Milgram’s original experiments were motivated by the trial of Nazi Adolf Eichmann, who famously argued that he was ‘just following orders’ when he sent Jews to their deaths. The new findings don’t legitimize harmful actions, Haggard emphasizes, but they do suggest that the ‘only obeying orders’ excuse betrays a deeper truth about how a person feels when acting under command. © 2016 Nature Publishing Group
Related chapters from BP7e: Chapter 15: Emotions, Aggression, and Stress; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 11: Emotions, Aggression, and Stress; Chapter 14: Attention and Consciousness
Link ID: 21915 - Posted: 02.19.2016
By Jordana Cepelewicz Seasonal variations play a major role in the animal kingdom—in reproduction, food availability, hibernation, even fur color. Whether this seasonality has such a significant influence on humans, however, is an open question. Its best-known association is with mood—that is, feeling down during the colder months and up in the summer—and, in extreme cases, seasonal depression, a phenomenon known as seasonal affective disorder (SAD). A new study published in this week’s Proceedings of the National Academy of Sciences seeks to delve deeper into how human biology has adapted not only to day/night cycles (circadian rhythms) but to yearly seasonal patterns as well. Scientists have previously found seasonal variation in the levels and concentrations of certain compounds associated with mood (including dopamine and serotonin), conception and even mortality. Now for the first time, using functional MRI, “it’s [been] conclusively shown that cognition and the brain’s means of cognition are seasonal,” says neuroscientist Gilles Vandewalle of the University of Liège in Belgium, the study’s lead researcher. These findings come at a time when some scientists are disputing the links between seasonality and mental health. Originally aiming to investigate the impact of sleep and sleep deprivation on brain function, Vandewalle and his fellow researchers placed 28 participants on a controlled sleep/wake schedule for three weeks before bringing them into the laboratory, where they stayed for 4.5 days. During this time they underwent a cycle of sleep deprivation and recovery in the absence of seasonal cues such as natural light, time information and social interaction. Vandewalle’s team repeated the entire procedure with the same subjects several times throughout the course of nearly a year and a half. © 2016 Scientific American
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 14: Biological Rhythms, Sleep, and Dreaming
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 10: Biological Rhythms and Sleep
Link ID: 21882 - Posted: 02.10.2016
By Virginia Morell Like fearful humans, horses raise the inner brow of their eyes when threatened or surprised. Altogether their faces can convey 17 emotions (ours express 27), and they readily recognize the expressions on their fellow equines. But can they read our facial cues? To find out, researchers tested 28 horses, including 21 geldings and seven mares, from stables in the United Kingdom. Each horse was led by its halter rope to a position in the stable, and then presented with a life-size color photograph of the face of a man. The man was either smiling or frowning angrily. The scientists recorded the animals’ reactions, and measured their heart rates. Other studies have shown that stressed horses’ heart rates fluctuate, and when the horses looked at the angry man, their hearts reached a maximum heart rate more quickly than when they viewed the smiling image. When shown the angry face, 20 of the horses also turned their heads so that they could look at it with their left eye—a response that suggests they understood the expression, the scientists report online today in Biology Letters, because the right hemisphere of the brain is specialized for processing negative emotions. Dogs, too, have this “left-gaze bias” when confronting angry faces. Also, like dogs, the horses showed no comparable bias (turning their heads to look with the right eye) when viewing the happy faces—perhaps because the animals don’t need to respond to nonthreatening cues. But an angry expression carries a warning—the person may be about to strike. The discovery that horses as well as dogs—the only two animals this has been tested in—can read our facial expressions spontaneously and without training suggests one of two things: Either these domesticated species devote a lot of time to learning our facial cues, or the ability is innate and more widespread in the animal kingdom than previously thought. © 2016 American Association for the Advancement of Science
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 21878 - Posted: 02.10.2016
By Anna K. Bobak, Sarah Bate For years scientists have studied the biological basis of human speed, and reported that the fastest athletes are short and muscular in build. However, these conclusions were challenged in 2008 when a new athlete, substantially taller than previous world-record holders, was identified as the fastest man in history. Usain Bolt presented the purest expression of human speed on the planet – and raised the possibility that scientists may need to entirely change the way they think about human biometrics. In the same vein, one might ask whether examinations of the brain at its height of efficiency will present new insights into its workings. Although researchers have historically examined people with a very high IQ (i.e. those with more generalised skills), it has become more and more clear that some individuals only perform extraordinarily well on specific cognitive tasks. Among the most interesting of these is facial identity recognition. In fact, the extraordinary skills of these so-called “super-recognisers” do not seem to correlate with IQ or memory for objects, yet they claim to recognise faces they have seen only briefly before, or that have undergone substantial changes in appearance. For instance, in a recent scientific report from our laboratory (unpublished), one super-recogniser described bumping into a girl from a children’s swimming class he coached as a teenager. He recognised her immediately, despite the fact that he’d not seen her for over ten years and she was now an adult. So how can these people change the way that scientists think about the human brain? For many years researchers have generally agreed that faces are “special.” © 2016 Scientific American
Timothy Egan This weekend, I’m going to the Mojave Desert, deep into an arid wilderness of a half-million acres, for some stargazing, bouldering and January sunshine on my public lands. I won’t be out of contact. I checked. If Sarah Palin says something stupid on Donald Trump’s behalf — scratch that. When Sarah Palin says something stupid on Donald Trump’s behalf, I’ll get her speaking-in-tongues buffoonery in real time, along with the rest of the nation. The old me would have despised the new me for admitting such a thing. I’ve tried to go on digital diets, fasting from my screens. I was a friend’s guest at a spa in Arizona once and had so much trouble being “mindful” that they nearly kicked me out. Actually, I just wanted to make sure I didn’t miss the Seahawks game, mindful of Seattle’s woeful offensive line. In the information blur of last year, you may have overlooked news of our incredibly shrinking attention span. A survey of Canadian media consumption by Microsoft concluded that the average attention span had fallen to eight seconds, down from 12 in the year 2000. We now have a shorter attention span than goldfish, the study found. Attention span was defined as “the amount of concentrated time on a task without becoming distracted.” I tried to read the entire 54-page report, but well, you know. Still, a quote from Satya Nadella, the chief executive officer of Microsoft, jumped out at me. “The true scarce commodity” of the near future, he said, will be “human attention.” Putting aside Microsoft’s self-interest in promoting quick-flash digital ads with what may be junk science, there seems little doubt that our devices have rewired our brains. We think in McNugget time. The trash flows, unfiltered, along with the relevant stuff, in an eternal stream. And the last hit of dopamine only accelerates the need for another one. © 2016 The New York Times Company
by Emily Reynolds We know more about what the brain does when it's active than we do when it's at rest. It makes sense -- much neuroscientific research has looked to understand particular (and active) processes. James Kozloski, a researcher at IBM, has investigated what the brain does when it's resting -- what he calls 'the Grand Loop'. "The brain consumes a great amount of energy doing nothing. It's a great mystery of neuroscience," Kozloski told PopSci. He argued that around 90 percent of the energy used by the brain remained "unaccounted for". He believes that the brain is constantly 'looping signals', retracing neural pathways over and over again. It's a "closed loop", according to Kozloski, meaning it isn't reliant on external inputs as much of the brain's activity is. Kozloski tested his theory by running his model through IBM's neural tissue simulator and found that it could potentially account for genetic disorders such as Huntington's disease. He argued that information created by one mutated gene could, through the 'Grand Loop', affect an entire neural pathway. So what happens when our brain is at work? And how does expending energy affect our neural processes? Much historic research into anxiety has found that people tend to exert more energy or force when they're being watched -- something that leads to slip-ups or mistakes under pressure.
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 21806 - Posted: 01.21.2016
A map for other people’s faces has been discovered in the brain. It could help explain why some of us are better at recognising faces than others. Every part of your body that you can move or feel is represented in the outer layer of your brain. These “maps”, found in the motor and sensory cortices (see diagram, below), tend to preserve the basic spatial layout of the body – neurons that represent our fingers are closer to neurons that represent our arms than our feet, for example. The same goes for other people’s faces, says Linda Henriksson at Aalto University in Helsinki, Finland. Her team scanned 12 people’s brains while they looked at hundreds of images of noses, eyes, mouths and other facial features and recorded which bits of the brain became active. This revealed a region in the occipital face area in which features that are next to each other on a real face are organised together in the brain’s representation of that face. The team have called this map the “faciotopy”. The occipital face area is a region of the brain known to be involved in general facial processing. “Facial recognition is so fundamental to human behaviour that it makes sense that there would be a specialised area of the brain that maps features of the face,” she says. © Copyright Reed Business Information Ltd.
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 5: The Sensorimotor System
Link ID: 21804 - Posted: 01.20.2016
Maggie Koerth-Baker In 1990, when James Danckert was 18, his older brother Paul crashed his car into a tree. He was pulled from the wreckage with multiple injuries, including head trauma. The recovery proved difficult. Paul had been a drummer, but even after a broken wrist had healed, drumming no longer made him happy. Over and over, Danckert remembers, Paul complained bitterly that he was just — bored. “There was no hint of apathy about it at all,” says Danckert. “It was deeply frustrating and unsatisfying for him to be deeply bored by things he used to love.” A few years later, when Danckert was training to become a clinical neuropsychologist, he found himself working with about 20 young men who had also suffered traumatic brain injury. Thinking of his brother, he asked them whether they, too, got bored more easily than they had before. “And every single one of them,” he says, “said yes.” Those experiences helped to launch Danckert on his current research path. Now a cognitive neuroscientist at the University of Waterloo in Canada, he is one of a small but growing number of investigators engaged in a serious scientific study of boredom. There is no universally accepted definition of boredom. But whatever it is, researchers argue, it is not simply another name for depression or apathy. It seems to be a specific mental state that people find unpleasant — a lack of stimulation that leaves them craving relief, with a host of behavioural, medical and social consequences. © 2016 Nature Publishing Group
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 15: Brain Asymmetry, Spatial Cognition, and Language
Link ID: 21784 - Posted: 01.13.2016
By Diana Kwon Pupils are a rich source of social information. Although changes in pupil size are automatic and uncontrollable, they can convey interest, arousal, helpful or harmful intentions, and a variety of emotions. According to a new study published in Psychological Science, we even synchronize our pupil size with others—and doing so influences social decisions. Mariska Kret, a psychologist now at the University of Amsterdam in the Netherlands, and her colleagues recruited 69 Dutch university students to take part in an investment game. Each participant decided whether to transfer zero or five euros to a virtual partner after viewing a video of their eyes for four seconds. The invested money is tripled, and the receiver chooses how much to give back to the donor—so subjects had to make quick decisions about how trustworthy each virtual partner seemed. Using an eye tracker, the investigators found that the participants' pupils tended to mimic the changes in the partners' pupils, whether they dilated, constricted or remained static. As expected, subjects were more likely to give more money to partners with dilating pupils, a well-established signal of nonthreatening intentions. The more a subject mirrored the dilating pupils of a partner, the more likely he or she was to invest—but only if they were of the same race. The Caucasian participants trusted Caucasian eyes more than Asian eyes—which suggests that group membership is important when interpreting these subtle signals. © 2015 Scientific American
Related chapters from BP7e: Chapter 15: Emotions, Aggression, and Stress; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 11: Emotions, Aggression, and Stress; Chapter 14: Attention and Consciousness
Link ID: 21735 - Posted: 12.30.2015
James Bond's villain in the latest 007 film, Spectre, could use a lesson in neuroanatomy, a Toronto neurosurgeon says. In a scene recorded in a Moroccan desert, Ernst Stavro Blofeld, played by Christoph Waltz, tortures Bond using restraints and a head clamp fused with a robotic drill. The goal is to inflict pain and erase 007's memory bank of faces. But Blofeld didn't have his brain anatomy down and could have likely killed Daniel Craig's character instead, Dr. Michael Cusimano of St. Michael's Hospital, says in a letter published in this week's issue of the journal Nature. Aiming to erase Bond's memory of faces, the villain correctly intends to drill into the lateral fusiform gyrus, an area of the brain responsible for recognizing faces, Cusimano said. But in practice, the drill was placed in the wrong area, aiming for the neck instead of the brain. "Whereas the drill should have been aimed just in front of 007's ear, it was directed below the mastoid process under and behind his left ear," Cusimano wrote. It likely would have triggered a stroke or massive hemorrhage, he said. In a draft of the letter, Cusimano said he was "spellbound" watching the film in a packed theatre, but his enjoyment was somewhat marred by the blunder. "I laughed," he recalled in an interview. "I think people around me kind of looked at me and were wondering why I was laughing because it's a pretty tense part of the movie." ©2015 CBC/Radio-Canada.
by Laura Sanders There’s only so much brainpower to go around, and when the eyes hog it all, the ears suffer. When challenged with a tough visual task, people are less likely to perceive a tone, scientists report in the Dec. 9 Journal of Neuroscience. The results help explain what parents of screen-obsessed teenagers already know. For the study, people heard a tone while searching for a letter on a computer screen. When the letter was easy to find, participants were pretty good at identifying a tone. But when the search got harder, people were less likely to report hearing the sound, a phenomenon called inattentional deafness. Neural responses to the tone were blunted when people worked on a hard visual task, but not when the visual task was easy, researchers found. By showing that a demanding visual job can siphon resources away from hearing, the results suggest that perceptual overload can jump between senses. © Society for Science & the Public 2000 - 2015
Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 7: Vision: From Eye to Brain
Link ID: 21682 - Posted: 12.09.2015
Scientists have come up with a questionnaire they say should help diagnose a condition called face blindness. Prosopagnosia, as doctors call it, affects around two in every 100 people in the UK and is the inability to recognise people by their faces alone. In its most extreme form, people cannot even recognise their family or friends. Milder forms, while still distressing, can be tricky to diagnose, which is why tests are needed. People with prosopagnosia often use non-facial cues to recognise others, such as their hairstyle, clothes, voice, or distinctive features. Some may be unaware they have the condition, instead believing they have a "bad memory for faces". But prosopagnosia is entirely unrelated to intelligence or broader memory ability. One [anonymous] person with prosopagnosia explains: "My biggest problem is seeing the difference between ordinary-looking people, especially faces with few specific traits. "I work at a hospital with an awful lot of employees and I often introduce myself to colleagues with whom I have worked several times before. I also often have problems recognising my next-door neighbour, even though we have been neighbours for eight years now. She often changes clothes, hairstyle and hair colour. When I strive to recognise people, I try to use technical clues like clothing, hairstyle, scars, glasses, their dialect and so on." Doctors can use computer-based tests to see if people can spot famous faces and memorise and recognise a set of unfamiliar faces. And now Drs Richard Cook and Punit Shah, of City University London and King's College London, have come up with a 20-item questionnaire to help measure the severity of someone's face blindness. © 2015 BBC
By Christian Jarrett Neuroscientists, for obvious reasons, are really interested in finding out what’s different about the brains of people with unpleasant personalities, such as narcissists, or unsavory habits, like porn addiction. Their hope is that by studying these people’s brains we might learn more about the causes of bad character, and ways to helpfully intervene. Now to the list of character flaws that've received the brain-scanner treatment we can apparently add sexism — a new Japanese study published in Scientific Reports claims to have found its neurological imprint. The researchers wanted to know whether there is something different about certain individuals’ brains that potentially predisposes them to sexist beliefs and attitudes (of course, as with so much neuroscience research like this, it’s very hard to disentangle whether any observed brain differences are the cause or consequence of the trait or behavior that’s being studied, a point I’ll come back to). More specifically, they were looking to see if people who publicly endorse gender inequality have brains that are anatomically different from people who believe in gender equality. In short, it seems the answer is yes. Neuroscientist Hikaru Takeuchi at Tohoku University and his colleagues have identified two brain areas where people who hold sexist attitudes have different levels of gray-matter density (basically, a measure of how many brain cells are packed into a given area), as compared with people who profess a belief in gender equality (their study doesn’t speak to any subconsciously held sexist beliefs). What’s more, these neural differences were correlated with psychological characteristics that could help explain some people’s sexist beliefs. © 2015, New York Media LLC.
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 21579 - Posted: 10.29.2015
Dyscalculia is like dyslexia — but for those who have trouble with math instead of reading. But not enough people know about it, according to a neuroscientist. "There is a lack of awareness among teachers and educators," said Daniel Ansari, professor and Canada Research Chair in Developmental Cognitive Neuroscience at the University of Western Ontario. Individuals with dyscalculia have trouble with simple calculations. "If I ask you what is 1 + 3, you don't need to calculate. Four will pop in to your head, it is stored in your long-term memory," he said. But those with dyscalculia will have to use their hands to count. Scientists have known about dyscalculia since the 1940s, but little research has been done on it, even though it is probably just as common as dyslexia, says Ansari. Currently, there is no existing universal form of testing for dyscalculia. But Ansari has come up with screening tests for children in kindergarten. He says it's important to diagnose dyscalculia early on, so individuals can learn to adapt and improve their skills before it's too late. "We don't just need math to be good in school but to function in society," said Ansari. He says research has shown poor math skills can lead to an increased chance of unemployment, imprisonment or mortgage default. ©2015 CBC/Radio-Canada.
In a study of mice, scientists discovered that a brain region called the thalamus may be critical for filtering out distractions. The study, published in Nature and partially funded by the National Institutes of Health, paves the way to understanding how defects in the thalamus might underlie symptoms seen in patients with autism, attention deficit hyperactivity disorder (ADHD), and schizophrenia. “We are constantly bombarded by information from our surroundings,” said James Gnadt, Ph.D., program director at the NIH’s National Institute of Neurological Disorders and Stroke (NINDS). “This study shows how the circuits of the brain might decide which sensations to pay attention to.” Thirty years ago Dr. Francis Crick proposed that the thalamus “shines a light” on regions of the cortex, which readies them for the task at hand, leaving the rest of the brain’s circuits to idle in darkness. “We typically use a very small percentage of incoming sensory stimuli to guide our behavior, but in many neurological disorders the brain is overloaded,” said Michael Halassa, M.D., Ph.D., the study’s senior author and an assistant professor at New York University’s Langone Medical Center. “It gets a lot of sensory input that is not well-controlled because this filtering function might be broken.” Neuroscientists have long believed that an area at the very front of the brain called the prefrontal cortex (PFC) selects what information to focus on, but how this happens remains unknown. One common theory is that neurons in the PFC do this by sending signals to cells in the sensory cortices located on the outer part of the brain. However, Dr. Halassa’s team discovered that PFC neurons may instead tune the sensitivity of a mouse brain to sights and sounds by sending signals to inhibitory thalamic reticular nucleus (TRN) cells located deep inside the brain.
Susan Gaidos CHICAGO — Teens like high-tech gadgets so much that they often use them all at once. While doing homework or playing video games, teens may listen to music or watch TV, all the while texting their friends. Some of these multitaskers think they are boosting their ability to attend to multiple activities, but in fact they are more likely to be impairing their ability to focus, psychologist Mona Moisala of the University of Helsinki reported October 18 at the annual meeting of the Society for Neuroscience. Moisala and colleagues tested 149 adolescents and young adults, ages 13 to 24, who regularly juggle multiple forms of media or play video games daily. Each participant had to focus attention on sentences (some logical, some illogical) under three conditions: without any distractions, while listening to distracting sounds, and while both listening to a sentence and reading another sentence. Using functional MRI to track brain activity, the researchers found that daily gaming had no effect on participants’ ability to focus. Those who juggle multiple forms of electronic media, however, had more trouble paying attention. Multitaskers performed worse overall, even when they weren’t being distracted. Brain images showed that the multitaskers also had a higher level of activity in the right prefrontal cortex, an area of the brain implicated in problem solving and in processing complex thoughts and emotions. “Participants with the highest reported frequency of multimedia use showed the highest levels of brain activation in this area,” Moisala said. “In addition, these adolescents did worse on the task.” © Society for Science & the Public 2000 - 2015
by Bethany Brookshire It’s happened to all of us at one time or another: You’re walking through a crowd, and suddenly a face seems incredibly familiar — so much so that you do a double-take. Who is that? How do you know them? You have no idea, but something about their face nags at you. You know you’ve seen it before. The reason you know that face is in part because of your perirhinal cortex. This is an area of the brain that helps us to determine familiarity, or whether we have seen an object before. A new study of brain cells in this area finds that firing these neurons at one frequency makes the brain treat novel images as old hat. But firing these same neurons at another frequency can make the old new again. “Novelty and familiarity are both really important,” says study coauthor Rebecca Burwell, a neuroscientist at Brown University in Providence, R.I. “They are important for learning and memory and decision making.” Finding a cache of food and knowing it is new could be useful for an animal’s future. So is recognizing a familiar place where the pickings were good in the past. But knowing that something is familiar is not quite the same thing as knowing what that thing is. “You’re in a crowd and you see a familiar face, and there’s a feeling,” Burwell explains. “You can’t identify them, you don’t know where you met them, but there’s a sense of familiarity.” It’s different from recalling where you met the person, or even who the person is. This is a sense at the base of memory. And while scientists knew the perirhinal cortex was involved in this sense of familiarity, how that feeling of new or old was coded in the brain wasn’t fully understood. © Society for Science & the Public 2000 - 2015