Links for Keyword: Attention



Links 1 - 20 of 464

By Anna K. Bobak, Sarah Bate For years scientists have studied the biological basis of human speed, and reported that the fastest athletes are short and muscular in build. However, these conclusions were challenged in 2008 when a new athlete, substantially taller than previous world-record holders, was identified as the fastest man in history. Usain Bolt presented the purest expression of human speed on the planet – and raised the possibility that scientists may need to entirely change the way they think about human biometrics. In the same vein, one might ask whether examining the brain at its height of efficiency will yield new insights into its workings. Although researchers have historically examined people with a very high IQ (i.e. those with more generalised skills), it has become increasingly clear that some individuals perform extraordinarily well only on specific cognitive tasks. Among the most interesting of these is facial identity recognition. The extraordinary skills of these so-called “super-recognisers” do not seem to correlate with IQ or memory for objects, yet they claim to recognise faces that they have seen only briefly, or that have undergone substantial changes in appearance. For instance, in a recent (unpublished) report from our laboratory, one super-recogniser described bumping into a girl from a children’s swimming class he had coached as a teenager. He recognised her immediately, despite the fact that he’d not seen her for over ten years and she was now an adult. So how can these people change the way that scientists think about the human brain? For many years researchers have generally agreed that faces are “special.” © 2016 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 21853 - Posted: 02.03.2016

Timothy Egan This weekend, I’m going to the Mojave Desert, deep into an arid wilderness of a half-million acres, for some stargazing, bouldering and January sunshine on my public lands. I won’t be out of contact. I checked. If Sarah Palin says something stupid on Donald Trump’s behalf — scratch that. When Sarah Palin says something stupid on Donald Trump’s behalf, I’ll get her speaking-in-tongues buffoonery in real time, along with the rest of the nation. The old me would have despised the new me for admitting such a thing. I’ve tried to go on digital diets, fasting from my screens. I was a friend’s guest at a spa in Arizona once and had so much trouble being “mindful” that they nearly kicked me out. Actually, I just wanted to make sure I didn’t miss the Seahawks game, mindful of Seattle’s woeful offensive line. In the information blur of last year, you may have overlooked news of our incredibly shrinking attention span. A survey of Canadian media consumption by Microsoft concluded that the average attention span had fallen to eight seconds, down from 12 in the year 2000. We now have a shorter attention span than goldfish, the study found. Attention span was defined as “the amount of concentrated time on a task without becoming distracted.” I tried to read the entire 54-page report, but well, you know. Still, a quote from Satya Nadella, the chief executive officer of Microsoft, jumped out at me. “The true scarce commodity” of the near future, he said, will be “human attention.” Putting aside Microsoft’s self-interest in promoting quick-flash digital ads with what may be junk science, there seems little doubt that our devices have rewired our brains. We think in McNugget time. The trash flows, unfiltered, along with the relevant stuff, in an eternal stream. And the last hit of dopamine only accelerates the need for another one. © 2016 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 21812 - Posted: 01.23.2016

by Emily Reynolds We know more about what the brain does when it's active than we do when it's at rest. It makes sense -- much neuroscientific research has looked to understand particular (and active) processes. James Kozloski, a researcher at IBM, has investigated what the brain does when it's resting -- what he calls 'the Grand Loop'. "The brain consumes a great amount of energy doing nothing. It's a great mystery of neuroscience," Kozloski told PopSci. He argued that around 90 percent of the energy used by the brain remained "unaccounted for". He believes that the brain is constantly 'looping signals', retracing neural pathways over and over again. It's a "closed loop", according to Kozloski, meaning it isn't reliant on external inputs the way much of the brain's activity is. Kozloski tested his theory by running his model through IBM's neural tissue simulator and found that it could potentially account for the effects of genetic disorders such as Huntington's disease. He argued that information created by one mutated gene could, through the 'Grand Loop', affect an entire neural pathway. So what happens when our brain is at work? And how does expending energy affect our neural processes? Much previous research into anxiety has found that people tend to exert more energy or force when they're being watched -- something that leads to slip-ups or mistakes under pressure.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 21806 - Posted: 01.21.2016

A map for other people’s faces has been discovered in the brain. It could help explain why some of us are better at recognising faces than others. Every part of your body that you can move or feel is represented in the outer layer of your brain. These “maps”, found in the motor and sensory cortices, tend to preserve the basic spatial layout of the body – neurons that represent our fingers are closer to neurons that represent our arms than our feet, for example. The same goes for other people’s faces, says Linda Henriksson at Aalto University in Helsinki, Finland. Her team scanned 12 people’s brains while they looked at hundreds of images of noses, eyes, mouths and other facial features and recorded which bits of the brain became active. This revealed a region in the occipital face area in which features that are next to each other on a real face are organised together in the brain’s representation of that face. The team have called this map the “faciotopy”. The occipital face area is a region of the brain known to be involved in general facial processing. “Facial recognition is so fundamental to human behaviour that it makes sense that there would be a specialised area of the brain that maps features of the face,” she says. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 5: The Sensorimotor System
Link ID: 21804 - Posted: 01.20.2016

Maggie Koerth-Baker In 1990, when James Danckert was 18, his older brother Paul crashed his car into a tree. He was pulled from the wreckage with multiple injuries, including head trauma. The recovery proved difficult. Paul had been a drummer, but even after a broken wrist had healed, drumming no longer made him happy. Over and over, Danckert remembers, Paul complained bitterly that he was just — bored. “There was no hint of apathy about it at all,” says Danckert. “It was deeply frustrating and unsatisfying for him to be deeply bored by things he used to love.” A few years later, when Danckert was training to become a clinical neuropsychologist, he found himself working with about 20 young men who had also suffered traumatic brain injury. Thinking of his brother, he asked them whether they, too, got bored more easily than they had before. “And every single one of them,” he says, “said yes.” Those experiences helped to launch Danckert on his current research path. Now a cognitive neuroscientist at the University of Waterloo in Canada, he is one of a small but growing number of investigators engaged in a serious scientific study of boredom. There is no universally accepted definition of boredom. But whatever it is, researchers argue, it is not simply another name for depression or apathy. It seems to be a specific mental state that people find unpleasant — a lack of stimulation that leaves them craving relief, with a host of behavioural, medical and social consequences. © 2016 Nature Publishing Group

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 15: Brain Asymmetry, Spatial Cognition, and Language
Link ID: 21784 - Posted: 01.13.2016

By Diana Kwon Pupils are a rich source of social information. Although changes in pupil size are automatic and uncontrollable, they can convey interest, arousal, helpful or harmful intentions, and a variety of emotions. According to a new study published in Psychological Science, we even synchronize our pupil size with others—and doing so influences social decisions. Mariska Kret, a psychologist now at the University of Amsterdam in the Netherlands, and her colleagues recruited 69 Dutch university students to take part in an investment game. Each participant decided whether to transfer zero or five euros to a virtual partner after viewing a video of their eyes for four seconds. The invested money is tripled, and the receiver chooses how much to give back to the donor—so subjects had to make quick decisions about how trustworthy each virtual partner seemed. Using an eye tracker, the investigators found that the participants' pupils tended to mimic the changes in the partners' pupils, whether they dilated, constricted or remained static. As expected, subjects were more likely to give more money to partners with dilating pupils, a well-established signal of nonthreatening intentions. The more a subject mirrored the dilating pupils of a partner, the more likely he or she was to invest—but only if they were of the same race. The Caucasian participants trusted Caucasian eyes more than Asian eyes—which suggests that group membership is important when interpreting these subtle signals. © 2015 Scientific American
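The investment game described above follows the standard "trust game" structure from behavioural economics. A minimal sketch of its payoffs, using the study's 5-euro stake and tripling rule (function and parameter names here are illustrative, not taken from the paper):

```python
def trust_game_payoffs(sent, returned_fraction, endowment=5, multiplier=3):
    """Compute payoffs in a one-shot trust (investment) game.

    The investor sends `sent` euros out of `endowment`; the transfer is
    multiplied by `multiplier` before reaching the partner, who returns
    a fraction of the enlarged pot to the investor.
    """
    pot = sent * multiplier
    returned = pot * returned_fraction
    investor = endowment - sent + returned
    partner = pot - returned
    return investor, partner

# If the participant invests all 5 euros and the partner returns half of
# the tripled pot, both parties end up with 7.5 euros:
print(trust_game_payoffs(5, 0.5))  # (7.5, 7.5)
```

The structure makes the snap judgement meaningful: investing pays off only if the partner seems trustworthy enough to return a share of the pot.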

Related chapters from BP7e: Chapter 15: Emotions, Aggression, and Stress; Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 11: Emotions, Aggression, and Stress; Chapter 14: Attention and Consciousness
Link ID: 21735 - Posted: 12.30.2015

James Bond's villain in the latest 007 film, Spectre, could use a lesson in neuroanatomy, a Toronto neurosurgeon says. In a scene recorded in a Moroccan desert, Ernst Stavro Blofeld, played by Christoph Waltz, tortures Bond using restraints and a head clamp fused with a robotic drill. The goal is to inflict pain and erase 007's memory bank of faces. But Blofeld didn't have his brain anatomy down and could likely have killed Daniel Craig's character instead, Dr. Michael Cusimano of St. Michael's Hospital says in a letter published in this week's issue of the journal Nature. Aiming to erase Bond's memory of faces, the villain correctly intends to drill into the lateral fusiform gyrus, an area of the brain responsible for recognizing faces, Cusimano said. But in practice, the drill was placed in the wrong area, aiming for the neck instead of the brain. "Whereas the drill should have been aimed just in front of 007's ear, it was directed below the mastoid process under and behind his left ear," Cusimano wrote. It likely would have triggered a stroke or massive hemorrhage, he said. In a draft of the letter, Cusimano said he was "spellbound" watching the film in a packed theatre, but his enjoyment was somewhat marred by the blunder. "I laughed," he recalled in an interview. "I think people around me kind of looked at me and were wondering why I was laughing because it's a pretty tense part of the movie." ©2015 CBC/Radio-Canada.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 21726 - Posted: 12.27.2015

by Laura Sanders There’s only so much brainpower to go around, and when the eyes hog it all, the ears suffer. When challenged with a tough visual task, people are less likely to perceive a tone, scientists report in the Dec. 9 Journal of Neuroscience. The results help explain what parents of screen-obsessed teenagers already know. For the study, people heard a tone while searching for a letter on a computer screen. When the letter was easy to find, participants were pretty good at identifying a tone. But when the search got harder, people were less likely to report hearing the sound, a phenomenon called inattentional deafness. Neural responses to the tone were blunted when people worked on a hard visual task, but not when the visual task was easy, researchers found. By showing that a demanding visual job can siphon resources away from hearing, the results suggest that perceptual overload can jump between senses. © Society for Science & the Public 2000 - 2015

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 10: Vision: From Eye to Brain
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 7: Vision: From Eye to Brain
Link ID: 21682 - Posted: 12.09.2015

Scientists have come up with a questionnaire they say should help diagnose a condition called face blindness. Prosopagnosia, as doctors call it, affects around two in every 100 people in the UK and is the inability to recognise people by their faces alone. In its most extreme form, people cannot even recognise their family or friends. Milder forms, while still distressing, can be tricky to diagnose, which is why tests are needed. People with prosopagnosia often use non-facial cues to recognise others, such as their hairstyle, clothes, voice, or distinctive features. Some may be unaware they have the condition, instead believing they have a "bad memory for faces". But prosopagnosia is entirely unrelated to intelligence or broader memory ability. One [anonymous] person with prosopagnosia explains: "My biggest problem is seeing the difference between ordinary-looking people, especially faces with few specific traits. "I work at a hospital with an awful lot of employees and I often introduce myself to colleagues with whom I have worked several times before. I also often have problems recognising my next-door neighbour, even though we have been neighbours for eight years now. She often changes clothes, hairstyle and hair colour. When I strive to recognise people, I try to use technical clues like clothing, hairstyle, scars, glasses, their dialect and so on." Doctors can use computer-based tests to see if people can spot famous faces and memorise and recognise a set of unfamiliar faces. And now Drs Richard Cook and Punit Shah, of City University London and King's College London, have come up with a 20-item questionnaire to help measure the severity of someone's face blindness. © 2015 BBC

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 21598 - Posted: 11.04.2015

By Christian Jarrett Neuroscientists, for obvious reasons, are really interested in finding out what’s different about the brains of people with unpleasant personalities, such as narcissists, or unsavory habits, like porn addiction. Their hope is that by studying these people’s brains we might learn more about the causes of bad character, and ways to helpfully intervene. Now to the list of character flaws that've received the brain-scanner treatment we can apparently add sexism — a new Japanese study published in Scientific Reports claims to have found its neurological imprint. The researchers wanted to know whether there is something different about certain individuals’ brains that potentially predisposes them to sexist beliefs and attitudes (of course, as with so much neuroscience research like this, it’s very hard to disentangle whether any observed brain differences are the cause or consequence of the trait or behavior that’s being studied, a point I’ll come back to). More specifically, they were looking to see if people who publicly endorse gender inequality have brains that are anatomically different from people who believe in gender equality. In short, it seems the answer is yes. Neuroscientist Hikaru Takeuchi at Tohoku University and his colleagues have identified two brain areas where people who hold sexist attitudes have different levels of gray-matter density (basically, a measure of how many brain cells are packed into a given area), as compared with people who profess a belief in gender equality (their study doesn’t speak to any subconsciously held sexist beliefs). What’s more, these neural differences were correlated with psychological characteristics that could help explain some people’s sexist beliefs. © 2015, New York Media LLC.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 21579 - Posted: 10.29.2015

Dyscalculia is like dyslexia — but for those who have trouble with math instead of reading. But not enough people know about it, according to a neuroscientist. "There is a lack of awareness among teachers and educators," said Daniel Ansari, professor and Canada Research Chair in Developmental Cognitive Neuroscience at the University of Western Ontario. Individuals with dyscalculia have trouble with simple calculations. "If I ask you what is 1 + 3, you don't need to calculate. Four will pop into your head, it is stored in your long-term memory," he said. But those with dyscalculia will have to use their hands to count. Scientists have known about dyscalculia since the 1940s, but little research has been done on it, even though it is probably just as common as dyslexia, says Ansari. Currently, there is no existing universal form of testing for dyscalculia. But Ansari has come up with screening tests for children in kindergarten. He says it's important to diagnose dyscalculia early on, so individuals can learn to adapt and improve their skills before it's too late. "We don't just need math to be good in school but to function in society," said Ansari. He says research has shown poor math skills can lead to an increased chance of unemployment, imprisonment or mortgage default. ©2015 CBC/Radio-Canada.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 21564 - Posted: 10.26.2015

In a study of mice, scientists discovered that a brain region called the thalamus may be critical for filtering out distractions. The study, published in Nature and partially funded by the National Institutes of Health, paves the way to understanding how defects in the thalamus might underlie symptoms seen in patients with autism, attention deficit hyperactivity disorder (ADHD), and schizophrenia. “We are constantly bombarded by information from our surroundings,” said James Gnadt, Ph.D., program director at the NIH’s National Institute of Neurological Disorders and Stroke (NINDS). “This study shows how the circuits of the brain might decide which sensations to pay attention to.” Thirty years ago Dr. Francis Crick proposed that the thalamus “shines a light” on regions of the cortex, which readies them for the task at hand, leaving the rest of the brain’s circuits to idle in darkness. “We typically use a very small percentage of incoming sensory stimuli to guide our behavior, but in many neurological disorders the brain is overloaded,” said Michael Halassa, M.D., Ph.D., the study’s senior author and an assistant professor at New York University’s Langone Medical Center. “It gets a lot of sensory input that is not well-controlled because this filtering function might be broken.” Neuroscientists have long believed that an area at the very front of the brain called the prefrontal cortex (PFC) selects what information to focus on, but how this happens remains unknown. One common theory is that neurons in the PFC do this by sending signals to cells in the sensory cortices located on the outer part of the brain. However, Dr. Halassa’s team discovered that PFC neurons may instead tune the sensitivity of a mouse brain to sights and sounds by sending signals to inhibitory thalamic reticular nucleus (TRN) cells located deep inside the brain.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 21545 - Posted: 10.22.2015

Susan Gaidos CHICAGO — Teens like high-tech gadgets so much that they often use them all at once. While doing homework or playing video games, teens may listen to music or watch TV, all the while texting their friends. Some of these multitaskers think they are boosting their ability to attend to multiple activities, but in fact they are more likely impairing their ability to focus, psychologist Mona Moisala of the University of Helsinki reported October 18 at the annual meeting of the Society for Neuroscience. Moisala and colleagues tested 149 adolescents and young adults, ages 13 to 24, who regularly juggle multiple forms of media or play video games daily. Each participant had to focus attention on sentences (some logical, some illogical) under three conditions: without any distractions, while listening to distracting sounds, and while both listening to a sentence and reading another sentence. Using functional MRI to track brain activity, the researchers found that daily gaming had no effect on participants’ ability to focus. Those who juggle multiple forms of electronic media, however, had more trouble paying attention. Multitaskers performed worse overall, even when they weren’t being distracted. Brain images showed that the multitaskers also had a higher level of activity in the right prefrontal cortex, an area of the brain implicated in problem solving and in processing complex thoughts and emotions. “Participants with the highest reported frequency of multimedia use showed the highest levels of brain activation in this area,” Moisala said. “In addition, these adolescents did worse on the task.” © Society for Science & the Public 2000 - 2015

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 21529 - Posted: 10.20.2015

by Bethany Brookshire It’s happened to all of us at one time or another: You’re walking through a crowd, and suddenly a face seems incredibly familiar — so much so that you do a double-take. Who is that? How do you know them? You have no idea, but something about their face nags at you. You know you’ve seen it before. The reason you know that face is in part because of your perirhinal cortex. This is an area of the brain that helps us to determine familiarity, or whether we have seen an object before. A new study of brain cells in this area finds that firing these neurons at one frequency makes the brain treat novel images as old hat. But firing these same neurons at another frequency can make the old new again. “Novelty and familiarity are both really important,” says study coauthor Rebecca Burwell, a neuroscientist at Brown University in Providence, R.I. “They are important for learning and memory and decision making.” Finding a cache of food and knowing it is new could be useful for an animal’s future. So is recognizing a familiar place where the pickings were good in the past. But knowing that something is familiar is not quite the same thing as knowing what that thing is. “You’re in a crowd and you see a familiar face, and there’s a feeling,” Burwell explains. “You can’t identify them, you don’t know where you met them, but there’s a sense of familiarity.” It’s different from recalling where you met the person, or even who the person is. This is a sense at the base of memory. And while scientists knew the perirhinal cortex was involved in this sense of familiarity, how that feeling of new or old was coded in the brain wasn’t fully understood. © Society for Science & the Public 2000 - 2015

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 21511 - Posted: 10.14.2015

By Erika Hayasaki For 40 years, Joel Dreyer was a respected psychiatrist who oversaw a clinic for troubled children, belonged to an exclusive country club, and doted on his four daughters and nine grandchildren. Then, suddenly, he became a major drug dealer. Why? In the 1980s, psychiatrist Joel Dreyer was a fixture on Detroit’s WXYZ Channel 7. His commercials promoting his treatment center, InnerVisions, which he named after the Stevie Wonder album, sometimes ran up to five times a day. In one ad, Dreyer blocks a bartender from serving a mug of beer to a patron and says, “Don’t let your marriage or your job suffer from alcohol or drugs.” In another, Dreyer, in a navy pinstriped suit with a white pocket square, looks into the camera, his expression concerned and sympathetic. “Don’t you want to talk to someone who will listen?” he asks. “Someone who won’t pass judgment? Someone who cares? Come talk to me.” InnerVisions, which was based in Southfield, a suburb northwest of Detroit, had a staff of 80 physicians, psychologists, and therapists and took up two floors of a high-rise. It had made Dreyer not only a public figure but also wealthy. He maintained a side career as an expert witness. Attorneys called on him because he was smart, charming, and persuasive. Dreyer mostly testified for the defense, and with each high-profile case, his celebrity grew. Between the clinic, trial work, and his private practice, he was earning as much as $450,000 a year. Dreyer loved to be the center of attention. He would sometimes ride to work on a motorcycle in a bejeweled Elvis outfit to entertain his colleagues.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 21478 - Posted: 10.06.2015

Archy de Berker and Sven Bestmann A great deal of excitement has been generated in recent weeks by a review paper examining the literature on the drug modafinil, which concluded that “modafinil may well deserve the title of the first well-validated pharmaceutical ‘nootropic’ [cognitive enhancing] agent”. Coverage in the Guardian, Telegraph, British Medical Journal, and the Independent all called attention to the work, with a press release from Oxford University trumpeting “Review of ‘smart drug’ shows modafinil does enhance cognition”. The paper in question is a well-written summary of the recent literature (although it probably underestimates side effects, as pointed out in the British Medical Journal). A deeper problem is that reviews do not “show” anything. Reviews can be educational and informative, but that’s not the same as using all of the available data to test whether something works or not. Two different scientists can write reviews on the same topic and come to completely different conclusions. You can think of reviews as a watercolour painting of current knowledge. We sometimes forget that this is a far cry from a technical drawing, each element measured, quantified, and bearing a strict resemblance to reality. Scientists, and the public, trying to figure out what works face a tricky problem: there will often be many papers on a given topic, offering a variety of sometimes conflicting conclusions. Fortunately, we have a well-developed toolkit for assessing the state of the current literature and drawing conclusions from it. This procedure is called meta-analysis; it combines the available sources of data (e.g., published studies), and is extensively used to assess the efficacy of medical interventions. Initiatives such as the Cochrane Collaboration use meta-analyses to synthesize available evidence into a consensus on what works and what doesn’t. © 2015 Guardian News and Media Limited
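To make the contrast with a narrative review concrete, the core of the simplest kind of meta-analysis (fixed-effect, inverse-variance weighting) fits in a few lines. The effect sizes below are hypothetical, not drawn from the modafinil literature:

```python
import math

def inverse_variance_meta(effects, std_errors):
    """Pool per-study effect sizes into one estimate.

    Each study is weighted by its precision (1 / SE^2), so large,
    precise studies count for more than small, noisy ones. This is a
    minimal sketch of the general technique, not any specific analysis.
    """
    weights = [1 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return pooled, pooled_se

# Three hypothetical studies with conflicting effect sizes; the pooled
# estimate leans toward the most precise study (smallest standard error).
pooled, se = inverse_variance_meta([0.30, 0.10, 0.50], [0.10, 0.20, 0.25])
print(round(pooled, 3), round(se, 3))
```

This is why a meta-analysis can settle a question that individual papers, or a narrative review of them, cannot: it forces all of the evidence into a single weighted estimate rather than a qualitative impression.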

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 14: Biological Rhythms, Sleep, and Dreaming
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 10: Biological Rhythms and Sleep
Link ID: 21476 - Posted: 10.05.2015

By Kelli Whitlock Burton They say beauty is in the eye of the beholder. But whether the beholder’s opinion is a product of one's genes or one's environment has long been a question for scientists. Although some research suggests that a preference for certain physical traits, such as height or muscular build, may be encoded in our genes, a new study finds it’s our individual life experiences that lead us to find one face more attractive than another. To get some closure on the nature versus nurture debate in human aesthetics, researchers asked 547 pairs of identical twins and 214 pairs of same-gender fraternal twins to view 200 faces and rate them on a scale of one to seven, with one being the least attractive and seven the most attractive. A group of 660 nontwins then completed the same survey. If genes were more involved in facial preference, identical twins would have had similar ratings; if the influence of a familial environment carried more weight, fraternal twins would have also answered similarly. However, most twins’ scores were quite different from one another, suggesting that something else was at play. The researchers suspect that it’s an individual’s life experiences that guide our opinions of attractiveness. The findings, reported today in Current Biology, build on earlier work by the same team that shows the ability to recognize faces is largely a genetic trait. The research is ongoing, and you can participate, too. Just complete the facial preference survey through the researchers’ website at: www.TestMyBrain.org. © 2015 American Association for the Advancement of Science.
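The logic of comparing identical and fraternal twins can be illustrated with Falconer's classic formula, the standard back-of-the-envelope heritability estimate (the article does not name it, and the correlations below are hypothetical):

```python
def falconer_h2(r_mz, r_dz):
    """Rough heritability estimate from twin correlations.

    Falconer's formula: h2 = 2 * (r_MZ - r_DZ). Identical (MZ) twins
    share ~100% of their genes and fraternal (DZ) twins ~50%, so doubling
    the gap between their trait correlations approximates the share of
    variation attributable to genes.
    """
    return 2 * (r_mz - r_dz)

# Hypothetical ratings data: if identical twins' attractiveness ratings
# correlate at 0.5 and fraternal twins' at 0.4, the implied heritability
# is low (~0.2), consistent with preferences being mostly shaped by
# individual experience rather than genes.
print(falconer_h2(0.5, 0.4))
```

When identical twins agree little more than fraternal twins do, as in the study described above, the estimate comes out near zero and the remaining variation points to environment and experience.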

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 12: Sex: Evolutionary, Hormonal, and Neural Bases
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 8: Hormones and Sex
Link ID: 21467 - Posted: 10.03.2015

Are you good at picking someone out of a crowd? Most of us are better at recognising faces than distinguishing between other similar objects, so it’s long been suspected there’s something mysterious about the way the brain processes a face. Now further evidence has emerged that this is a special, highly evolved skill. A study of twins suggests there are genes influencing face recognition abilities that are distinct from the ones affecting intelligence – so it’s not that people who are good with faces just have a better memory, for instance. “The idea is that telling friend from foe was so important to survival that there was very strong pressure to improve that trait,” says Nicholas Shakeshaft of King’s College London. Previous studies using brain scanning have suggested there is a part of the brain dedicated to recognising faces, called the fusiform face area. But others have suggested this region may in fact just be used for discriminating between any familiar objects. Wondering if genetics could shed any light, Shakeshaft’s team tested more than 900 sets of UK twins – including both identical and non-identical pairs – on their face recognition skills. The ability turned out to be highly heritable, with identical twins having more similar abilities than fraternal ones. The same went for intelligence, which had earlier been tested as part of a long-running study. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 21461 - Posted: 09.30.2015

Mo Costandi In an infamous set of experiments performed in the 1960s, psychologist Walter Mischel sat pre-school kids at a table, one by one, and placed a sweet treat – a small marshmallow, a biscuit, or a pretzel – in front of them. Each of the young participants was told that they would be left alone in the room, and that if they could resist the temptation to eat the sweet on the table in front of them, they would be rewarded with more sweets when the experimenter returned. The so-called Marshmallow Test was designed to test self-control and delayed gratification. Mischel and his colleagues tracked some of the children as they grew up, and then claimed that those who managed to hold out for longer in the original experiment performed better at school, and went on to become more successful in life, than those who couldn’t resist the temptation to eat the treat before the researcher returned to the room. The ability to exercise willpower and inhibit impulsive behaviours is considered to be a core feature of the brain’s executive functions, a set of neural processes – including attention, reasoning, and working memory – which regulate our behaviour and thoughts, and enable us to adapt them according to the changing demands of the task at hand. Executive function is a rather vague term, and we still don’t know much about its underlying brain mechanisms, or about how the different components of this control system are related to one another. New research shows that self-control and memory share, and compete with each other for, the same brain mechanisms, such that exercising willpower saps these common resources and impairs our ability to encode memories. © 2015 Guardian News and Media Limited

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 21386 - Posted: 09.08.2015

By LISA FELDMAN BARRETT Boston — IS psychology in the midst of a research crisis? An initiative called the Reproducibility Project at the University of Virginia recently reran 100 psychology experiments and found that over 60 percent of them failed to replicate — that is, their findings did not hold up the second time around. The results, published last week in Science, have generated alarm (and in some cases, confirmed suspicions) that the field of psychology is in poor shape. But the failure to replicate is not a cause for alarm; in fact, it is a normal part of how science works. Suppose you have two well-designed, carefully run studies, A and B, that investigate the same phenomenon. They perform what appear to be identical experiments, and yet they reach opposite conclusions. Study A produces the predicted phenomenon, whereas Study B does not. We have a failure to replicate. Does this mean that the phenomenon in question is necessarily illusory? Absolutely not. If the studies were well designed and executed, it is more likely that the phenomenon from Study A is true only under certain conditions. The scientist’s job now is to figure out what those conditions are, in order to form new and better hypotheses to test. A number of years ago, for example, scientists conducted an experiment on fruit flies that appeared to identify the gene responsible for curly wings. The results looked solid in the tidy confines of the lab, but out in the messy reality of nature, where temperatures and humidity varied widely, the gene turned out not to reliably have this effect. In a simplistic sense, the experiment “failed to replicate.” But in a grander sense, as the evolutionary biologist Richard Lewontin has noted, “failures” like this helped teach biologists that a single gene produces different characteristics and behaviors, depending on the context. © 2015 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 21369 - Posted: 09.01.2015