Links for Keyword: Attention



Links 1 - 20 of 612

Bruce Bower
Monkeys can keep strings of information in order by using a simple kind of logical thought. Rhesus macaque monkeys learned the order of items in a list with repeated exposure to pairs of items plucked from the list, say psychologist Greg Jensen of Columbia University and colleagues. The animals drew basic logical conclusions about pairs of listed items, akin to assuming that if A comes before B and B comes before C, then A comes before C, the scientists conclude July 30 in Science Advances. Importantly, rewards given to monkeys didn’t provide reliable guidance to the animals about whether they had correctly ordered pairs of items. Monkeys instead worked out the approximate order of images in the list, and used that knowledge to make choices in experiments about which of two images from the list followed the other, Jensen’s group says. Previous studies have suggested that a variety of animals, including monkeys, apes, pigeons, rats and crows, can discern the order of a list of items (SN: 7/5/08, p. 13). But debate persists about whether nonhuman creatures do so only with the prodding of rewards for correct responses or, at least sometimes, by consulting internal knowledge acquired about particular lists. Jensen’s group designed experimental sessions in which four monkeys completed as many as 600 trials to determine the order of seven images in a list. Images included a hot air balloon, an ear of corn and a zebra. Monkeys couldn’t rely on rewards to guide their choices. In some sessions, animals usually received a larger reward for correctly identifying which of two images came later in the list and a smaller reward for an incorrect response. In other sessions, incorrect responses usually yielded a larger reward than correct responses. Rewards consisted of larger or smaller gulps of water delivered through tubes to the moderately thirsty primates. © Society for Science & the Public 2000 - 2019

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 26475 - Posted: 08.01.2019

By Jocelyn Kaiser
U.S. scientists who challenged a new rule that would require them to register their basic studies of the human brain and behavior in a federal database of clinical trials have won another reprieve. The National Institutes of Health (NIH) in Bethesda, Maryland, says it now understands why some of that kind of research won’t easily fit the format of ClinicalTrials.gov, and the agency has delayed the reporting requirements for another 2 years. The controversy dates back to 2017, when behavioral and cognitive researchers realized that new requirements for registering and reporting results from NIH-funded clinical studies would cover even basic studies of human subjects, experiments that did not test drugs or other potential treatments. The scientists protested that including such studies would confuse the public and create burdensome, unnecessary paperwork. A year ago, NIH announced it would delay the requirement until September and seek further input. The responses prompted NIH staff to examine published papers from scientists conducting basic research. They agreed it would be hard to fit some of these studies into the rigid informational format used by ClinicalTrials.gov—for example, because the authors didn’t specify the outcome they expected before the study began, or they reported results for individuals and not the whole group. In other cases, the authors did several preliminary studies to help them design their experiment. © 2019 American Association for the Advancement of Science

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 26450 - Posted: 07.25.2019

Maria Temming
A new analysis of brain scans may explain why hyperrealistic androids and animated characters can be creepy. By measuring people’s neural activity as they viewed pictures of humans and robots, researchers identified a region of the brain that seems to underlie the “uncanny valley” effect — the unsettling sensation sometimes caused by robots or animations that look almost, but not quite, human (SN Online: 11/22/13). Better understanding the neural circuitry that causes this feeling may help designers create less unnerving androids. In research described online July 1 in the Journal of Neuroscience, neuroscientist Fabian Grabenhorst and colleagues took functional MRI scans of 21 volunteers during two activities. In each activity, participants viewed pictures of humans, humanoid robots of varying realism and — to simulate the appearance of hyperrealistic robots — “artificial humans,” pictures of people whose features were slightly distorted through plastic surgery and photo editing. In the first activity, participants rated each picture on likability and how humanlike the figures appeared. Next, participants chose between pairs of these pictures, based on which subject they would rather receive a gift from. In line with the uncanny valley effect, participants generally rated more humanlike candidates as more likable, but this trend broke down for artificial humans — the most humanlike of the nonhuman options. A similar uncanny valley trend emerged in participants’ judgments about which figures were more trustworthy gift-givers. © Society for Science & the Public 2000 - 2019.

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 26387 - Posted: 07.04.2019

By Nathan Dunne
I would stare at my hands and think, “I’m not me.” No matter where I was, in the middle of a busy street or at my dining table at home, the condition would be the same. It was like looking at my hands through a plate of glass. Although I could feel the skin on my palms, it did not feel like my own. Half of myself would move through the day while the other half watched. I was split in two. Nothing I did would relieve the condition. I went to see an ophthalmologist, convinced I had cataracts. The verdict was near-perfect vision. I tried taking time off work, talking with family and writing notes about how my life had become a simulation. Each morning I would stare at the mirror in an attempt to recognize myself, but the distance between my body and this new, outer eye only grew larger. I began to believe I was becoming psychotic and would soon be in a psychiatric ward. I was a 28-year-old, working as a copywriter while pursuing a PhD in art history, and I felt my life was nearing its end. One evening in April 2008, as I contemplated another helpless night trapped beyond my body, full-blown panic set in. I picked up the phone, ready to call emergency services, when suddenly music began to play from downstairs. It was a nauseating pop song that my neighbor played incessantly, but something about the melody gave me pause. The next day I began a series of frustrating doctors’ visits. First with my physician, then a neurologist, gastroenterologist and chiropractor. I said that I had never taken drugs or drunk alcohol excessively. While I was fatigued from my doctoral study, I didn’t think this qualified me for the split in the self that had occurred. © 1996-2019 The Washington Post

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 26372 - Posted: 07.01.2019

By Bret Stetka
The hippocampus is a small curl of brain, which nests beneath each temple. It plays a crucial role in memory formation, taking our experiences and interactions and setting them in the proverbial stone by creating new connections among neurons. A report published on June 27 in Science reveals how the hippocampus learns and hard-wires certain experiences into memory. The authors show that following a particular behavior, the hippocampus replays that behavior repeatedly until it is internalized. They also report on how the hippocampus tracks our brain’s decision-making centers to remember our past choices. Previous research has shown that the rodent hippocampus replays or revisits past experiences during sleep or periods of rest. While a rat navigates a maze, for example, so-called place cells are activated and help the animal track its position. Following their journey through the maze, those same cells are reactivated in the exact same pattern. What previously happened is mentally replayed again. The authors of the new study were curious whether this phenomenon applies only to previous encounters with a particular location, or whether hippocampal replay also applies to memory more generally, including mental and nonspatial memories. It turns out it does. In the study, 33 participants were presented with a series of images containing both a face and a house. They had to judge the age of either one or the other. If, during the second trial, the age of the selected option remained the same, the judged category also did not change in the subsequent trial. If the ages differed, the judged category flipped to the other option in the next round. © 2019 Scientific American

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 26367 - Posted: 06.28.2019

By Susana Martinez-Conde and Stephen L. Macknik
The man and the woman sat down, facing each other in the dimly illuminated room. This was the first time the two young people had met, though they were about to become intensely familiar with each other—in an unusual sort of way. The researcher informed them that the purpose of the study was to understand “the perception of the face of another person.” The two participants were to gaze at each other’s eyes for 10 minutes straight, while maintaining a neutral facial expression, and pay attention to their partner’s face. After giving these instructions, the researcher stepped back and sat on one side of the room, away from the participants’ lines of sight. The two volunteers settled in their seats and locked eyes—feeling a little awkward at first, but suppressing uncomfortable smiles to comply with the scientist’s directions. Ten minutes had seemed like a long stretch to look deeply into the eyes of a stranger, but time started to lose its meaning after a while. Sometimes, the young couple felt as if they were looking at things from outside their own bodies. Other times, it seemed as if each moment contained a lifetime. Throughout their close encounter, each member of the duo experienced their partner’s face as ever-changing. Human features became animal traits, transmogrifying into grotesqueries. There were eyeless faces, and faces with too many eyes. The semblances of dead relatives materialized. Monstrosities abounded. The bizarre perceptual phenomena that the pair witnessed were manifestations of the “strange face illusion,” first described by the psychologist Giovanni Caputo of the University of Urbino, Italy. Caputo’s original study, published in 2010, reported a new type of illusion, experienced by people looking at themselves in the mirror in low light conditions. © 2019 Scientific American

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 7: Vision: From Eye to Brain
Link ID: 26230 - Posted: 05.14.2019

By Stephen L. Macknik and Susana Martinez-Conde
We were very sad to learn that Johnny Thompson (aka The Great Tomsoni) passed away on March 9, 2019, at the age of 84. We first met Johnny in 2007, when he spoke at the ‘Magic of Consciousness’ Symposium that we organized at the annual meeting of the Association for the Scientific Study of Consciousness, in Las Vegas. Johnny Thompson, along with Mac King, Teller, Apollo Robbins, and James Randi, talked to an academic audience of neuroscientists, psychologists and philosophers about his impressions of the psychologically puzzling aspects of magic, and helped jumpstart ‘neuromagic’ as a field of scientific enquiry. Johnny Thompson and his co-presenters inspired us, among many other investigators, to conduct research into the neuroscientific bases of magic. Dozens of papers by labs around the world have been published in the intervening decade as a result. Johnny himself co-authored an academic review with us, on the intersection of magic and neuroscience, published in Nature Reviews Neuroscience in 2008. Our later book, Sleights of Mind: What the Neuroscience of Magic Reveals About Our Everyday Deceptions, drew significantly from our extensive conversations with Johnny and his keen insights. Thompson was regarded as a deeply knowledgeable magician's magician and magic theorist. He was generous and kind with his wisdom and was widely recognized for having served as consultant to numerous world-renowned magic acts. Though his contributions to the neuroscience of magic are less well known than his magic artistry, they have led to significant advances in the science of attention and misdirection, too. Among the magic aphorisms we have heard over the years, one of our favorites is Johnny’s assertion that “when the audience laughs, time stops,” allowing the magician, at that precise moment, to get away with magical murder. © 2019 Scientific American

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 26119 - Posted: 04.08.2019

By Kevin P. Madore, Ph.D., and Anthony D. Wagner, Ph.D.
As you go about your day, you may barely notice that you are frequently multitasking. It may be driving to work while listening to a radio program or talking to a loved one on the phone (putting yourself and others at risk), or perusing Facebook while texting a friend, or switching back and forth between a high-level project like compiling a report and a routine chore like scheduling an appointment. Multitasking means trying to perform two or more tasks concurrently, which typically leads to repeatedly switching between tasks (i.e., task switching) or leaving one task unfinished in order to do another. The scientific study of multitasking over the past few decades has revealed important principles about the operations, and processing limitations, of our minds and brains. One critical finding to emerge is that we inflate our perceived ability to multitask: there is little correlation with our actual ability. In fact, multitasking is almost always a misnomer, as the human mind and brain lack the architecture to perform two or more tasks simultaneously. By architecture, we mean the cognitive and neural building blocks and systems that give rise to mental functioning. We have a hard time multitasking because of the ways that our building blocks of attention and executive control inherently work. As a result, when we attempt to multitask, we are usually switching between one task and another. The human brain has evolved to single-task. Together with studies of patients who have suffered focal neural injuries, functional neuroimaging studies indicate that key brain systems involved in executive control and sustained attention determine our ability to multitask. These include the frontoparietal control network, dorsal attention network, and ventral attention network. © 2019 The Dana Foundation

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 26117 - Posted: 04.06.2019

Nicola Davis
“Acting is the least mysterious of all crafts,” Marlon Brando once said. But for scientists, working out what is going on in an actor’s head has always been something of a puzzle. Now, researchers have said thespians show different patterns of brain activity depending on whether they are in character or not. Dr Steven Brown, the first author of the research from McMaster University in Canada, said: “It looks like when you are acting, you are suppressing yourself; almost like the character is possessing you.” Writing in the journal Royal Society Open Science, Brown and colleagues report how 15 method actors, mainly theatre students, were trained to take on a Shakespeare role – either Romeo or Juliet – in a theatre workshop, and were asked various questions, to which they responded in character. They were then invited into the laboratory, where their brains were scanned in a series of experiments. Once inside the MRI scanner, the actors were asked to think about their response to a number of fresh conundrums that flashed up on screen, and which might well have occurred to the star-crossed lovers, such as: would they gatecrash a party? And would they tell their parents that they had fallen in love? Each actor was asked to respond to different questions, based on four different premises assigned in a random order. In one, they were asked for their own perspective; in another, they were asked to say how they thought a particular close friend would react, while in a third, they were asked to respond as though they were either Romeo or Juliet. © 2019 Guardian News & Media Limited

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 26029 - Posted: 03.13.2019

By Max Evans, BBC News
A stranger once waved at Boo James on a bus. She did not think any more of it - until it later emerged it was her mother. She has a relatively rare condition called face blindness, which means she cannot recognise the faces of her family, friends, or even herself. Scientists have now launched a study they hope could help train people like Boo to recognise people better. Boo said for many years she thought she was "from another planet". "It is immensely stressful and very emotionally upsetting to sit and dwell upon so I try not to do that," she said. "It's very hard work. It can be physically and emotionally exhausting to spend a day out in public constantly wondering whether you should have spoken to someone." For most of her life, she didn't know she had the condition - also known as prosopagnosia - and blamed herself for the "social awkwardness" caused when she failed to recognise people. "I had to try and find a way to explain that. I really couldn't very well, except to think that I was just the one to blame for not being bothered to remember who people were. "[Like it was] some sort of laziness: I didn't want to know them, obviously I wasn't interested enough to remember them, so that was some kind of deficiency, perhaps, in me." But the penny dropped in her early 40s when she saw a news item about the condition on television. "I then knew that the only reason I wasn't recognising that person was because my brain physically wasn't able to do it," she said. "I could immediately engage more self-understanding and forgive myself and try to approach things from a different angle." [Image caption: Boo has developed techniques to try to help her cope, including remembering what people wear] She said her childhood was punctuated by "traumatic experiences" with fellow children, childminders and teachers she could not recognise. © 2019 BBC

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 26011 - Posted: 03.06.2019

Bruce Bower
WASHINGTON — Beliefs among some university professors that intelligence is fixed, rather than capable of growth, contribute to a racial achievement gap in STEM courses, a new study suggests. Those professors may subtly communicate stereotypes about blacks, Hispanics and Native Americans allegedly being less intelligent than Asians and whites, say psychologist Elizabeth Canning of Indiana University in Bloomington and her colleagues. In turn, black, Hispanic and Native American undergraduates may respond by becoming less academically motivated and more anxious about their studies, leading to lower grades. Even small dips in STEM grades — especially for students near pass/fail cutoffs — can accumulate across the 15 or more science, technology, engineering and math classes needed to become a physician or an engineer, Canning says. That could jeopardize access to financial aid and acceptance to graduate programs. “Our work suggests that academic benefits could accrue over time if all students, and particularly underrepresented minority students, took STEM classes with faculty who endorse a growth mind-set,” Canning says. Underrepresented minority students’ reactions to professors with fixed or flexible beliefs about intelligence have yet to be studied. But over a two-year period, the disparity in grade point averages separating Asian and white STEM students from black, Hispanic and Native American peers was nearly twice as large in courses taught by professors who regarded intelligence as set in stone, versus malleable, Canning’s team reports online February 15 in Science Advances. © Society for Science & the Public 2000 - 2019.

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 25970 - Posted: 02.18.2019

By Alex Fox
If math is the language of the universe, bees may have just uttered their first words. New research suggests these busybodies of the insect world are capable of addition and subtraction—using colors in the place of plus and minus symbols. In the animal kingdom, the ability to count—or at least distinguish between differing quantities—isn’t unusual: It has been seen in frogs, spiders, and even fish. But solving equations using symbols is rare air, so far only achieved by famously brainy animals such as chimpanzees and African grey parrots. Enter the honey bee (Apis mellifera). Building on prior research that says the social insects can count to four and understand the concept of zero, researchers wanted to test the limits of what their tiny brains can do. Scientists trained 14 bees to link the colors blue and yellow to addition and subtraction, respectively. They placed the bees at the entrance of a Y-shaped maze, where they were shown several shapes in either yellow or blue. If the shapes were blue, bees got a reward if they went to the end of the maze with one more blue shape (the other end had one less blue shape); if the shapes were yellow, they got a reward if they went to the end of the maze with one less yellow shape. © 2018 American Association for the Advancement of Science

Related chapters from BN8e: Chapter 6: Evolution of the Brain and Behavior; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 25938 - Posted: 02.08.2019

By Benedict Carey
The world’s most common digital habit is not easy to break, even in a fit of moral outrage over the privacy risks and political divisions Facebook has created, or amid concerns about how the habit might affect emotional health. Although four in 10 Facebook users say they have taken long breaks from it, the digital platform keeps growing. A recent study found that the average user would have to be paid $1,000 to $2,000 to be pried away for a year. So what happens if you actually do quit? A new study, the most comprehensive to date, offers a preview. Expect the consequences to be fairly immediate: More in-person time with friends and family. Less political knowledge, but also less partisan fever. A small bump in one’s daily moods and life satisfaction. And, for the average Facebook user, an extra hour a day of downtime. The study, by researchers at Stanford University and New York University, helps clarify the ceaseless debate over Facebook’s influence on the behavior, thinking and politics of its active monthly users, who number some 2.3 billion worldwide. The study was posted recently on the Social Science Research Network, an open access site. “For me, Facebook is one of those compulsive things,” said Aaron Kelly, 23, a college student in Madison, Wis. “It’s really useful, but I always felt like I was wasting time on it, distracting myself from study, using it whenever I got bored.” Mr. Kelly, who estimated that he spent about an hour a day on the platform, took part in the study “because it was kind of nice to have an excuse to deactivate and see what happened,” he said. Well before news broke that Facebook had shared users’ data without consent, scientists and habitual users debated how the platform had changed the experience of daily life. A cadre of psychologists has argued for years that the use of Facebook and other social media is linked to mental distress, especially in adolescents. Others have likened habitual Facebook use to a mental disorder, comparing it to drug addiction and even publishing magnetic-resonance images of what Facebook addiction “looks like in the brain.” © 2019 The New York Times Company

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 16: Psychopathology: Biological Basis of Behavior Disorders
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 12: Psychopathology: The Biology of Behavioral Disorders
Link ID: 25919 - Posted: 01.31.2019

By Scott Barry Kaufman
What is going on in our brains when we are creating? How does our brain look different when we are engaging in art vs. science? How do the brains of genius creators differ from those of the rest of us? What are some of the limitations of studying the creative brain? The neuroscience of creativity is booming. There is now a society (with an annual conference), an edited volume, a handbook, and now an entire textbook on the topic. Bringing together the latest research from a number of scientists, Anna Abraham wrote a wonderful resource that covers some of the most hot-button topics in the field. She was gracious enough to do a Q & A with me. Enjoy!
SBK: How’d you get interested in the neuroscience of creativity?
AA: I have always been curious about creativity. At the most fundamental level I think I simply wanted to get my head around the mystery of this marvelous ability that each of us possesses. In particular, I hoped to find out what makes some people more creative than others. When I saw an opportunity to pursue a PhD in Neuroscience in the early 2000s in any topic of my choice, I went all in - it was an exciting and promising approach that had until then only been limitedly used to explore the creative mind.
SBK: What is creativity? Does the field have a unified, agreed-upon definition of creativity that you are satisfied with?
AA: There is a surprising level of unanimity in the field when it comes to a boilerplate definition. Most experts agree that two elements are central to creativity. First and foremost, it reflects our capacity to generate ideas that are original, unusual or novel in some way. The second element is that these ideas also need to be satisfying, appropriate or suited to the context in question. I am reasonably satisfied with this definition but not in how it guides scientific enquiry.

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 25839 - Posted: 01.05.2019

By Christina Karns
’Tis the season when the conversation shifts to what you’re thankful for. Gathered with family and friends around a holiday feast, for instance, people may recount some of the biggies — such as their health or their children — or smaller things that enhance everyday life — such as happening upon a great movie while channel surfing or enjoying a favorite seasonal food. Psychology researchers recognize that taking time to be thankful has benefits for well-being. Gratitude not only goes along with more optimism, less anxiety and depression, and greater goal attainment, but also is associated with fewer symptoms of illness and other physical benefits. In recent years, researchers have been making connections between the internal experience of gratitude and the external practice of altruism. How does being thankful about things in your own life relate to any selfless concern you may have about the well-being of others? As a neuroscientist, I’m particularly interested in the brain regions and connections that support gratitude and altruism. I’ve been exploring how changes in one might lead to changes in the other. To study the relationship between gratitude and altruism in the brain, my colleagues and I first asked volunteers questions meant to tease out how frequently they feel thankful and the degree to which they tend to care about the well-being of others. Then we used statistics to determine the extent to which someone’s gratitude could predict their altruism. As others have found, the more grateful people in this group tended to be more altruistic. © 1996-2018 The Washington Post

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 25823 - Posted: 12.26.2018

Phil Jaekl
Driving a car is a complex task for a brain to coordinate. A driver may drink a cup of coffee and have a conversation with a passenger, all while safely piloting a vehicle through traffic. But all of this activity requires attention—that is, concentrating on the tasks and sources of information that matter and blocking out those that don’t. How the brain manages that orchestration is a long-standing scientific mystery. One prominent view, based on findings from human behavioral studies, is that the brain guides us through a world chock-full of sensory inputs by focusing a metaphorical spotlight on what it deems important, while filtering out relatively trivial details. Unlike some other, functionally well-defined aspects of cognition, this attentional spotlight has eluded scientific understanding. Its neural substrates have been particularly difficult to pin down to specific activities and locations in the brain—although several studies have implicated the frontoparietal network, which spans the frontal and parietal lobes of the brain. Meanwhile, attention studies involving visual tasks that require continuous focus—detecting a small object flashing on a cluttered computer screen, for example—have shown that task performance varies over short time intervals, with episodes of peak performance and of poor performance alternating on millisecond timescales. Such research suggests that the attentional spotlight might not be as constant as once thought. Yet, until now, researchers have not been able to directly connect these changes in performance to fluctuations in brain activity. © 1986 - 2018 The Scientist

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 25800 - Posted: 12.20.2018

By Kuheli Dutt
When neuroscientist Ben Barres delivered his first seminar, an audience member praised him, commenting that Ben’s work was much better than that of his sister, Barbara Barres. The irony? Ben Barres (now deceased), a transgender scientist, was Barbara Barres before he transitioned to male. When New York Times columnist Brent Staples was a graduate student in Chicago’s Hyde Park, he found that white people on the street perceived him, an African American, as a threat to their safety. They were visibly tense around him, clutched their purses and sometimes even crossed the street to avoid him. But when he started whistling tunes from classical music, people suddenly weren’t afraid of him anymore—they relaxed and some even smiled at him. Implicit bias runs far deeper than we realize. A riddle used at implicit bias trainings goes like this: A father and his son are in a terrible car crash. The father dies at the scene. His son, in critical condition, is rushed to the hospital; he’s in the operating room, about to go under the knife. The surgeon says, “I can’t operate on this boy—he’s my son!” The audience is then asked how that’s possible. Responses include several scenarios: two gay fathers; one biological and one adopted father; one father and one priest (religious father); all of which are possible. However, an obvious answer that most people miss: the surgeon is the boy’s mother. Whether we like it or not, we are conditioned to associate surgeons with being male. © 2018 Scientific American

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 25799 - Posted: 12.20.2018

Alison Abbott
Doris Tsao launched her career deciphering faces — but for a few weeks in September, she struggled to control the expression on her own. Tsao had just won a MacArthur Foundation ‘genius’ award, an honour that comes with more than half a million dollars to use however the recipient wants. But she was sworn to secrecy — even when the foundation sent a film crew to her laboratory at the California Institute of Technology (Caltech) in Pasadena. Thrilled and embarrassed at the same time, she had to invent an explanation, all while keeping her face in check. It was her work on faces that won Tsao awards and acclaim. Last year, she cracked the code that the brain uses to recognize faces from a multitude of minuscule differences in shapes, distances between features, tones and textures. The simplicity of the coding surprised and impressed the neuroscience community. “Her work has been transformative,” says Tom Mrsic-Flogel, director of the Sainsbury Wellcome Centre for Neural Circuits and Behaviour at University College London. But Tsao doesn’t want to be remembered just as the scientist who discovered the face code. It is a means to an end, she says, a good tool for approaching the question that really interests her: how does the brain build up a complete, coherent model of the world by filling in gaps in perception? “This idea has an elegant mathematical formulation,” she says, but it has been notoriously hard to put to the test. Tsao now has an idea of how to begin. © 2018 Springer Nature Publishing AG

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 25773 - Posted: 12.11.2018

By Aaron E. Carroll
Even before the recent news that a group of researchers managed to get several ridiculous fake studies published in reputable academic journals, people have been aware of problems with peer review. Throwing out the system — which deems whether research is robust and worth being published — would do more harm than good. But it makes sense to be aware of peer review’s potential weaknesses. Reviewers may be overworked and underprepared. Although they’re experts in the subject they are reading about, they get no specific training to do peer review, and are rarely paid for it. With 2.5 million peer-reviewed papers published annually worldwide — and more that are reviewed but never published — it can be hard to find enough people to review all the work. There is evidence that reviewers are not always consistent. A 2010 paper describes a study in which two researchers selected 12 articles already accepted by highly regarded journals, swapped the real names and academic affiliations for false ones, and resubmitted the identical material to the same journals that had already accepted them in the previous 18 to 32 months. Only 8 percent of editors or reviewers noticed the duplication, and three papers were detected and pulled. Of the nine papers that continued through the review process, eight were turned down, with 89 percent of reviewers recommending rejection. Peer review may be inhibiting innovation. It takes significant reviewer agreement to have a paper accepted. One potential downside is that important research bucking a trend or overturning accepted wisdom may face challenges surviving peer review. In 2015, a study published in P.N.A.S. tracked more than 1,000 manuscripts submitted to three prestigious medical journals. Of the 808 that were published at some point, the 2 percent that were most frequently cited had been rejected by the journals. An even bigger issue is that peer review may be biased. Reviewers can usually see the names of the authors and their institutions, and multiple studies have shown that reviewers preferentially accept or reject articles based on a number of demographic factors. In a study published in eLife last year, researchers created a database consisting of more than 9,000 editors, 43,000 reviewers and 126,000 authors whose work led to about 41,000 articles in 142 journals in a number of domains. They found that women made up only 26 percent of editors, 28 percent of reviewers and 37 percent of authors. Analyses showed that this was not because fewer women were available for each role. © 2018 The New York Times Company

Related chapters from BN8e: Chapter 1: Biological Psychology: Scope and Outlook
Related chapters from MM: Chapter 1: An Introduction to Brain and Behavior
Link ID: 25643 - Posted: 11.05.2018

Jon Hamilton
An ancient part of the brain long ignored by the scientific world appears to play a critical role in everything from language and emotions to daily planning. It's the cerebellum, which is found in fish and lizards as well as people. But in the human brain, this structure is wired to areas involved in higher-order thinking, a team led by researchers from Washington University in St. Louis reports Thursday in the journal Neuron. "We think that the cerebellum is acting as the brain's ultimate quality control unit," says Scott Marek, a postdoctoral research scholar and the study's first author. The finding adds to the growing evidence that the cerebellum "isn't only involved in sensory-motor function, it's involved in everything we do," says Dr. Jeremy Schmahmann, a neurology professor at Harvard and director of the ataxia unit at Massachusetts General Hospital. Schmahmann, who wasn't involved in the new study, has been arguing for decades that the cerebellum plays a key role in many aspects of human behavior, as well as mental disorders such as schizophrenia. But only a handful of scientists have explored functions of the cerebellum beyond motor control. "It's been woefully understudied," says Dr. Nico Dosenbach, a professor of neurology at Washington University whose lab conducted the study. Even now, many scientists think of the cerebellum as the part of the brain that lets you pass a roadside sobriety test. It helps you do things like walk in a straight line or stand on one leg or track a moving object — if you're not drunk. © 2018 npr

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 5: The Sensorimotor System
Link ID: 25624 - Posted: 10.27.2018