Chapter 18. Attention and Higher Cognition


By Ricki Rusting. Every morning, Avigael Wodinsky sets a timer to keep her 12-year-old son, Naftali, on track while he gets dressed for school. “Otherwise,” she says, “he’ll find 57 other things to do on the way to the bathroom.” Wodinsky says she knew something was different about Naftali from the time he was born, long before his autism diagnosis at 15 months. He lagged behind his twin sister in hitting developmental milestones, and he seemed distant. “When he was an infant and he was feeding, he wouldn’t cry if you took the bottle away from him,” she says. He often sat facing the corner, turning the pages of a picture book over and over again. Although he has above-average intelligence, he did not speak much until he was 4, and even then his speech was often ‘scripted’: He would repeat phrases and sentences he had heard on television. Naftali’s trouble with maintaining focus became apparent in preschool—and problematic in kindergarten. He would stare out the window or wander around the classroom. “He was doing everything except what he was supposed to be doing,” Wodinsky recalls. At first, his psychiatrist attributed these behaviors to his autism and recommended he drink coffee for its mild stimulant effect. The psychiatrist also suggested anxiety drugs. Neither treatment helped. A doctor then prescribed a series of drugs used for attention deficit hyperactivity disorder (ADHD), even though Naftali’s hyperactivity was still considered a part of his autism; those medications also failed or caused intolerable side effects. © 2018 Scientific American

Keyword: ADHD; Autism
Link ID: 24662 - Posted: 02.15.2018

By Dinsa Sachan When reporting a rape to police or testifying during a trial, it’s not uncommon for women to face a barrage of intrusive questions: What were you wearing at the time of the assault? Were you intoxicated? Why were you walking home alone at night? For decades, social psychologists have documented links between the ways society perceives women and their bodies — ones that often lead to this line of questioning — and attitudes towards gender violence. But only recently have neuroscientists begun to investigate what sexual objectification actually looks like in the brain. In a study published in the journal Cortex in December, European researchers explored the relationship between empathy — the ability to feel others’ emotions — and sexual objectification. Their findings, based on measuring brain activity in response to viewing a woman being left out of a social activity, suggest that people feel less empathy for women dressed in revealing clothing compared to those dressed more conservatively. To conduct the research, Giorgia Silani, a neuroscientist at the University of Vienna, Austria, along with her colleagues, asked 36 participants — both men and women — to participate in and watch videos of others playing a digital ball-tossing game. The videos featured a model who either wore long pants, a plain top, and light makeup, or a short dress, high heels, and heavy makeup. At different points in the videos, the model was included or excluded from the game. Copyright 2018 Undark

Keyword: Emotions; Brain imaging
Link ID: 24661 - Posted: 02.15.2018

Dean Burnett The internet is a weird place. Part of this is due to how things linger rather than disappear, as they tended to do with more “traditional” media. Nowadays, people’s jobs can (rightly or wrongly) be endangered for tweets they wrote years ago. The adage about “today’s news is tomorrow’s fish and chip papers” seems no longer to apply. This is particularly true when a headline or story from years ago can be found by a group or community on a social network that missed it previously, so they share it widely and it ends up in your feeds long after it’s been “forgotten”. It can be a bit confusing for those of us who grew up solely with televised news. It’s like watching the weekend football roundup when it’s suddenly interrupted by a report that the Berlin Wall has come down. Case in point: yesterday I saw several examples of a story from 2015 about how scientists have discovered that cheese triggers the same part of the brain as hard drugs. A lot of people seem to be sharing this again (even me, thinking it was new). You’d assume someone well-versed in neuroscience like myself would easily recognise an old story like this. So why didn’t I? Stories like this are hardly uncommon. You can barely go a month without some study or report describing something supposedly innocuous as having the same effect on the brain, or activating the same brain regions, as drugs of abuse, be it sugar, pornography, religion, sex, Facebook, music, or, apparently, cheese. Give it a week, something else will be cited as stimulating our brains just like the most powerful narcotics. Maybe walking on crunchy leaves or taking your bra off after a long day will be described as the equivalent of inhaling a bin-bag full of cocaine? © 2018 Guardian News and Media Limited

Keyword: Drug Abuse; Attention
Link ID: 24658 - Posted: 02.14.2018

By NEIL GENZLINGER Anne M. Treisman, whose insights into how we perceive the world around us provided some of the core theories for the field of cognitive psychology, died on Friday at her home in Manhattan. She was 82. Her daughter Deborah Treisman said the cause was a stroke after a long illness. Dr. Treisman considered a fundamental question: How does the brain make sense of the bombardment of input it is receiving and focus attention on a particular object or activity? What she came up with is called the feature integration theory of attention, detailed in a much-cited 1980 article written with Garry Gelade in the journal Cognitive Psychology, then refined and elaborated on in later work. “Perhaps Anne’s central insight in the field of visual attention was that she realized that you could see basic features like color, orientation and shape everywhere in the visual field, but that there was a problem in knowing how those colors, orientations, shapes, etc., were ‘bound’ together into objects,” Jeremy M. Wolfe, director of the Visual Attention Lab of Harvard Medical School and Brigham and Women’s Hospital, explained in an email. “Her seminal feature integration theory,” he continued, “proposed that selective attention to an object or location enabled the binding of those features and, thus, enabled object recognition. Much argument has followed, but her formulation of the problem has shaped the field for almost four decades.” Dr. Treisman did not merely theorize about how perception works; she tested her ideas with countless experiments in which subjects were asked, for instance, to pick a particular letter out of a visual field, or to identify black digits and colored letters flashing by. The work showed not only how we perceive, but also how we can sometimes misperceive. © 2018 The New York Times Company

Keyword: Attention; Vision
Link ID: 24657 - Posted: 02.14.2018

Nicola Davis While you might be tempted to wolf down a sandwich or gobble up your dinner, researchers say there may be advantages to taking your time over a meal. According to a study looking at type 2 diabetics, eating slowly could help prevent obesity, with researchers finding a link to both lower waist circumference and body mass index (BMI). “Interventions aimed at altering eating habits, such as education initiatives and programmes to reduce eating speed, may be useful in preventing obesity and reducing the risk of non-communicable diseases,” the authors write. The latest study is not the first to suggest that taking a sedate pace at the dinner table could be beneficial: various pieces of work have hinted that those who eat quickly are more likely to be overweight, have acid reflux and have metabolic syndrome. The latest study, published in the journal BMJ Open by researchers in Japan, looked at data collected through health checkups and claims from more than 59,700 individuals as part of health insurance plans, with data spanning from 2008 to mid-2013. As part of the health checkup, participants were asked seven questions about their lifestyle, including whether their eating speed was fast, normal or slow, whether they snacked after dinner three times or more a week, and whether they skipped breakfast three times or more a week. © 2018 Guardian News and Media Limited

Keyword: Obesity; Attention
Link ID: 24653 - Posted: 02.13.2018

By John Horgan I’ve been writing for decades about the mind-body problem, the deepest of all mysteries, and I’m trying to finish a book tentatively titled Mind-Body Problems. And yet only recently have I realized that few people outside philosophy and mind-related scientific fields are familiar with the phrase “mind-body problem.” I also realized that I knew nothing about the origins of the phrase. Google didn’t provide an immediate answer, so I reached out to David Chalmers, a prominent philosopher of mind. “Good question,” he said when I asked on Facebook who coined “mind-body problem.” He passed my query on to other scholars. I’ve culled the information below from responses of Chalmers, Galen Strawson, Eric Schliesser, Charles T. Wolfe, Godehard Bruntrup, Victor Caston and Paolo Pecere, to whom I am very grateful. A Google N-gram on “mind-body problem” shows the phrase spiking from 1910 to 1925, dipping for a couple of decades and then rising again in the 1950s. The earliest reference I can find on Google Books dates back to 1879, when the prominent American scholar Felix Adler lectured on atheism to the Ethical Culture Society. An excerpt: If then, consciousness, or mind, in something like its traditional sense, cannot successfully be explained away by the new epistemology, we must resolutely face the metaphysical question of the relation of the mind to the physical world in which it has its setting. The central and crucial part of this question is, of course, to be found in the mind-body problem… If we refuse to accept the pan-objective epistemology already considered which would do away with consciousness in the traditional sense, we must recognize that the relation of the mind to the body forms a real and inescapable problem… How can two things so different from each other as mind and body interact? To which, it seems to me, the sufficient answer is to be found in the rather obvious query, Why can they not? 
Are we so sure that unlike things cannot influence each other? The only way really to decide this question is to go to experience and see. [Bold added.] © 2018 Scientific American

Keyword: Consciousness
Link ID: 24648 - Posted: 02.12.2018

Menaka Wilhelm Karen Byrne's left hand sometimes operates on its own terms. It has unbuttoned shirts and stubbed out cigarettes, without her permission. Oh, and a few times, her own hand has slapped her across the face. This is a documented medical occurrence, not a premise for a Jim Carrey movie. The condition's name? Alien hand syndrome. Invisibilia featured Byrne and her alien hand last summer, and Giant Ant Studios recently created an otherworldly animation of Byrne's story. Byrne says she's gotten used to her left hand's new attitude, but alien hand syndrome is a pesky, strange condition. Imagine, as another patient has reported, sitting down to play the piano, only to have one hand levitate far above the piano keys as you try to practice. It's not that you've changed your mind; your goal is still to play a sonata. But that hand — still yours, and now also not yours — is obeying new directions, and you didn't come up with them consciously. For the pianist, Byrne and many other patients, alien hand symptoms appear rooted in disruption of communication through the corpus callosum. That's the set of fibers that connects the right and left sides of the brain. The pianist's corpus callosum showed missing connections on an MRI, and Byrne's left hand developed its disobedience after a surgeon severed her corpus callosum in an operation to treat epileptic seizures. © 2018 npr

Keyword: Laterality; Consciousness
Link ID: 24643 - Posted: 02.10.2018

By NATALIE ANGIER Every night during breeding season, the male túngara frog of Central America will stake out a performance patch in the local pond and spend unbroken hours broadcasting his splendor to the world. The mud-brown frog is barely the size of a shelled pecan, but his call is large and dynamic, a long downward sweep that sounds remarkably like a phaser weapon on “Star Trek,” followed by a brief, twangy, harmonically dense chuck. Unless, that is, a competing male starts calling nearby, in which case the first frog is likely to add two chucks to the tail of his sweep. And should his rival respond likewise, Male A will tack on three chucks. Back and forth they go, call and raise, until the frogs hit their respiratory limit at six to seven rapid-fire chucks. The acoustic one-upfrogship is energetically draining and risks attracting predators like bats. Yet the male frogs have no choice but to keep count of the competition, for the simple reason that female túngaras are doing the same: listening, counting and ultimately mating with the male of maximum chucks. Behind the frog’s surprisingly sophisticated number sense, scientists have found, are specialized cells located in the amphibian midbrain that tally up sound signals and the intervals between them. “The neurons are counting the number of appropriate pulses, and they’re highly selective,” said Gary Rose, a biologist at the University of Utah. If the timing between pulses is off by just a fraction of a second, the neurons don’t fire and the counting process breaks down. “It’s game over,” Dr. Rose said. “Just as in human communication, an inappropriate comment can end the whole conversation.” © 2018 The New York Times Company
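The interval-selective counting Dr. Rose describes (neurons that tally pulses only when the inter-pulse timing is right, and that reset when it is off) can be sketched as a toy model. This is an illustrative sketch, not the study's actual model; the function name, preferred interval, tolerance and firing threshold are all assumed values.

```python
def count_selective_pulses(pulse_times, interval=0.05, tolerance=0.005, threshold=3):
    """Toy interval-selective counter: consecutive pulses whose inter-pulse
    interval falls within `tolerance` of the preferred `interval` increment
    the count; a pulse with off-interval timing resets it ("game over").
    Returns True once the count reaches `threshold` (the neuron "fires")."""
    count = 1 if pulse_times else 0
    for prev, curr in zip(pulse_times, pulse_times[1:]):
        if abs((curr - prev) - interval) <= tolerance:
            count += 1
        else:
            count = 1  # timing was off: the tally starts over
        if count >= threshold:
            return True
    return False
```

A regular train of pulses at the preferred interval drives the counter to threshold, while a single mistimed pulse in the middle forces the count back to one, mirroring the all-or-nothing selectivity described in the article.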

Keyword: Attention; Evolution
Link ID: 24623 - Posted: 02.06.2018

By JOANNA KLEIN Plants don’t get enough credit. They move. You know this. Your houseplant salutes the sun each morning. At night, it returns to center. You probably don’t think much of it. This is simply what plants do: Get light. Photosynthesize. Make food. Live. But what about all the signs of plant intelligence that have been observed? Under poor soil conditions, the pea seems to be able to assess risk. The sensitive plant can make memories and learn to stop recoiling if you mess with it enough. The Venus fly trap appears to count when insects trigger its trap. And plants can communicate with one another and with caterpillars. Now, a study published recently in Annals of Botany has shown that plants can be frozen in place with a range of anesthetics, including the types that are used when you undergo surgery. Insights gleaned from the study may help doctors better understand the variety of anesthetics used in surgeries. But the research also highlights that plants are complex organisms, perhaps less different from animals than is often assumed. “Plants are not just robotic, stimulus-response devices,” said Frantisek Baluska, a plant cell biologist at the University of Bonn in Germany and co-author of the study. “They’re living organisms which have their own problems, maybe something like with humans feeling pain or joy.” “In order to navigate this complex life, they must have some compass.” © 2018 The New York Times Company

Keyword: Consciousness; Sleep
Link ID: 24611 - Posted: 02.03.2018

By Jocelyn Kaiser Scientists who conduct basic behavioral research are bracing for a policy kicking in this week that will impose new rules on their federally funded studies, many of which the National Institutes of Health (NIH) in Bethesda, Maryland, will now consider clinical trials. Although many researchers maintain that the policy makes no sense and will hinder their work, recent revisions by NIH officials have eased some fears. “There’s still a problem, but the problem is less dire than the original set of concerns that we had,” says cognitive psychologist Jeremy Wolfe of the Harvard University–affiliated Brigham and Women’s Hospital in Boston, who is also the immediate past president of the Federation of Associations in Behavioral & Brain Sciences (FABBS) in Washington, D.C. The changes, which take effect for proposals with due dates of 25 January or later, are part of a new clinical trials definition that NIH released in 2014 but only began implementing last year. That was when scientists who use tools such as MRI scans to explore how the normal brain works realized that their studies, which they never thought of as clinical trials because they don’t test drugs or other treatments, fell under the new definition. The change imposed several new requirements on researchers, such as submitting proposals in response to a formal funding opportunity for clinical trials and registering the studies in ClinicalTrials.gov, the federal trials database. © 2018 American Association for the Advancement of Science

Keyword: Vision
Link ID: 24556 - Posted: 01.24.2018

By ALAN BURDICK In his first year in office President Trump gave himself credit for numerous accomplishments that he had little or nothing to do with: the passage of the Republican tax bill; Walmart’s creating 10,000 jobs in the United States; the invention of the phrase “prime the pump”; and the fact that in his brief tenure, nobody died in a commercial aviation accident. (The last fatal crash on a domestic commercial airline in the United States was in 2009.) But one thing that Mr. Trump almost certainly managed to do, without effort or notice, is alter our perception of time. We’re all aware that our experience of time is fungible: Days fly by, conversations drag on, that weeklong vacation seems to last forever until suddenly it doesn’t. As long ago as 1890 the psychologist William James noted that our feelings of time “harmonize with different mental moods.” There now exists a large body of scientific literature demonstrating that emotions play a large part in generating these temporal flexions. For instance, when viewing faces on a computer monitor, lab subjects report that happy faces seem to last longer onscreen than nonexpressive ones, and angry faces seem to last longer still. Fear, alarm and stress are factors too. Forty-five seconds with a live spider seems to last far longer to people who are afraid of spiders. Watching three minutes of video clips of the Sept. 11 attacks feels longer than watching a three-minute clip from “The Wizard of Oz.” Now consider that Mr. Trump’s first year in office must rank as the most chaotic and tumultuous in modern presidential history. Virtually every week served up a new drama: the firing of the national security adviser Michael Flynn; the firing of the F.B.I. director James Comey; the appointment of Robert Mueller as special counsel; Mr. 
Trump’s announcement, via Twitter, banning transgender people from the military; his bungled phone call to the widow of a soldier killed in Niger; his support of the Senate candidacy of Roy Moore; his pardon of the former Arizona sheriff Joe Arpaio; his mockery of the television host Mika Brzezinski; his failure to immediately denounce the white supremacist marchers in Charlottesville, Va.; his rants about the peaceful protests of professional football players; his taunting of the North Korean leader Kim Jong-un with his bigger “nuclear button.” It has been a 12-month-long emotional roller coaster, even for Mr. Trump’s supporters. © 2018 The New York Times Company

Keyword: Attention
Link ID: 24551 - Posted: 01.22.2018

Ian Sample Science editor Donatella Versace finds it in the conflict of ideas, Jack White under pressure of deadlines. For William S Burroughs, an old Dadaist trick helped: cutting pages into pieces and rearranging the words. Every artist has their own way of generating original ideas, but what is happening inside the brain might not be so individual. In new research, scientists report signature patterns of neural activity that mark out those who are most creative. “We have identified a pattern of brain connectivity that varies across people, but is associated with the ability to come up with creative ideas,” said Roger Beaty, a psychologist at Harvard University. “It’s not like we can predict with perfect accuracy who’s going to be the next Einstein, but we can get a pretty good sense of how flexible a given person’s thinking is.” Creative thinking is one of the primary drivers of cultural and technological change, but the brain activity that underpins original thought has been hard to pin down. In an effort to shed light on the creative process, Beaty teamed up with colleagues in Austria and China to scan people’s brains as they came up with original ideas. The scientists asked the volunteers to perform a creative thinking task as they lay inside a brain scanner. While the machine recorded their white matter at work, the participants had 12 seconds to come up with the most imaginative use for an object that flashed up on a screen. Three independent scorers then rated their answers. © 2018 Guardian News and Media Limited

Keyword: Attention; Brain imaging
Link ID: 24531 - Posted: 01.16.2018

By PATRICK SHARKEY Over the past few years, the discussion of crime and violence in the United States has focused on police brutality, mass incarceration and the sharp rise in violence in cities like Baltimore, St. Louis and Chicago. This is entirely appropriate: Any spike in violence should garner attention, and redressing the injustices of our criminal justice system is a matter of moral urgency. But it is also worth reflecting on how much the level of violence has fallen in this country over the past 25 years and how widespread the benefits of that decline have been. From the 1970s through the early part of the 1990s, the murder rate in some cities in the United States rose to levels seen only in the most violent, war-torn nations of the developing world. In the years since, violent crime has decreased in almost every city, in many cases by more than 75 percent. For well-off urbanites, the decline of crime is most visible in sanitized, closely guarded city spaces where tourists and others can now comfortably wander about. But far more consequential have been the changes in low-income, highly segregated urban communities. Indeed, my research has shown that the most disadvantaged people have gained the most from the reduction in violent crime. Start with the lives saved. Though homicide is not a common cause of death for most of the United States population, for African-American men between the ages of 15 and 34 it is the leading cause, which means that any change in the homicide rate has a disproportionate impact on them. The sociologist Michael Friedson and I calculated what the life expectancy would be today for blacks and whites had the homicide rate never shifted from its level in 1991. We found that the national decline in the homicide rate since then has increased the life expectancy of black men by roughly nine months. 
That figure may not seem like much, but it is exceedingly rare for any change in society to generate such a degree of change in life expectancy. For example, researchers have estimated that if the obesity epidemic in the United States was eliminated, life expectancy would increase by a similar amount. The drop in homicides is probably the most important development in the health of black men in the past several decades. © 2018 The New York Times Company

Keyword: Aggression; Attention
Link ID: 24528 - Posted: 01.15.2018

By Adam Bear, Rebecca Fortgang and Michael Bronstein Have you ever felt as though you predicted exactly when the light was going to turn green or sensed that the doorbell was about to ring? Imagine the possibility that these moments of clairvoyance occur simply because of a glitch in your mind’s time logs. What happened first — your thought about the doorbell or its actual ringing? It may have felt as if the thought came first, but when two events (ringing of doorbell, thought about doorbell) occur close together, we can mistake their order. This leads to the sense that we accurately predicted the future when, in fact, all we did was notice the past. In a recent study published in the Proceedings of the National Academy of Sciences, we found that this tendency to mix up the timing of thoughts and events may be more than a simple mental hiccup. We supposed that if some people are prone to mixing up the order of their thoughts and perceptions in this way, they could develop a host of odd beliefs. Most obviously, they might come to believe they are clairvoyant or psychic — having abilities to predict such things as whether it is going to rain. Further, these individuals might confabulate — unconsciously make up — explanations for why they have these special abilities, inferring that they are particularly important (even godlike) or are tapping into magical forces that transcend the physical world. Such beliefs are hallmarks of psychosis, seen in mental illnesses such as schizophrenia and bipolar disorder, but they are not uncommon in less-extreme forms in the general population. Would even ordinary people who mistime their thoughts and perceptions be more likely to hold delusion-like ideas? © 1996-2018 The Washington Post
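The mechanism proposed here, a thought that actually follows an event but is sometimes misremembered as preceding it when the two occur close together, lends itself to a small Monte Carlo sketch. This is a hedged illustration of the idea, not the authors' model; the gap and timing-noise parameters are arbitrary assumptions.

```python
import random

def fraction_felt_prescient(n_trials=10_000, gap=0.05, timing_noise=0.1, seed=1):
    """Sketch of timestamp reversal: the thought really occurs `gap` seconds
    AFTER the event, but each perceived timestamp carries Gaussian noise.
    When the noisy thought-time lands before the noisy event-time, the order
    is misremembered and the thought feels like a prediction. Returns the
    fraction of trials on which that reversal happens."""
    rng = random.Random(seed)
    reversed_count = 0
    for _ in range(n_trials):
        event_felt = rng.gauss(0.0, timing_noise)          # perceived event time
        thought_felt = gap + rng.gauss(0.0, timing_noise)  # perceived thought time
        if thought_felt < event_felt:
            reversed_count += 1
    return reversed_count / n_trials
```

With a gap that is small relative to the timing noise, a sizable minority of trials come out reversed; widen the gap and the illusion of prescience all but disappears, which matches the intuition that only near-simultaneous thought-event pairs get confused.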

Keyword: Attention; Schizophrenia
Link ID: 24527 - Posted: 01.15.2018

James Gorman Humans, chimpanzees, elephants, magpies and bottle-nosed dolphins can recognize themselves in a mirror, according to scientific reports, although as any human past age 50 knows, that first glance in the morning may yield ambiguous results. Not to worry. Scientists are talking about species-wide abilities, not the fact that one’s father or mother makes unpredictable appearances in the looking glass. Mirror self-recognition, at least after noon, is often taken as a measure of a kind of intelligence and self-awareness, although not all scientists agree. And researchers have wondered not only about which species display this ability, but about when it emerges during early development. Children start showing signs of self-recognition at about 12 months at the earliest and chimpanzees at two years old. But dolphins, researchers reported Wednesday, start mugging for the mirror as early as seven months, earlier than humans. Diana Reiss, a psychologist at Hunter College, and Rachel Morrison, then a graduate student working with Reiss, studied two young dolphins over three years at the National Aquarium in Baltimore. Dr. Reiss first reported self-recognition in dolphins in 2001 with Lori Marino, now the head of The Kimmela Center for Animal Advocacy. She and Dr. Morrison, now an assistant professor in the psychology department at the University of North Carolina Pembroke, collaborated on the study and published their findings in the journal PLoS One. Dr. Reiss said the timing of the emergence of self-recognition is significant, because in human children the ability has been tied to other milestones of physical and social development. Since dolphins develop earlier than humans in those areas, the researchers predicted that dolphins should show self-awareness earlier. Seven months was when Bayley, a female, started showing self-directed behavior, like twirling and taking unusual poses. © 2018 The New York Times Company

Keyword: Consciousness; Evolution
Link ID: 24519 - Posted: 01.11.2018

By Joshua Rothman One day in the nineteen-eighties, a woman went to the hospital for cancer surgery. The procedure was a success, and all of the cancer was removed. In the weeks afterward, though, she felt that something was wrong. She went back to her surgeon, who reassured her that the cancer was gone; she consulted a psychiatrist, who gave her pills for depression. Nothing helped—she grew certain that she was going to die. She met her surgeon a second time. When he told her, once again, that everything was fine, she suddenly blurted out, “The black stuff—you didn’t get the black stuff!” The surgeon’s eyes widened. He remembered that, during the operation, he had idly complained to a colleague about the black mold in his bathroom, which he could not remove no matter what he did. The cancer had been in the woman’s abdomen, and during the operation she had been under general anesthesia; even so, it seemed that the surgeon’s words had lodged in her mind. As soon as she discovered what had happened, her anxiety dissipated. Henry Bennett, an American psychologist, tells this story to Kate Cole-Adams, an Australian journalist, in her book “Anesthesia: The Gift of Oblivion and the Mystery of Consciousness.” Cole-Adams hears many similar stories from other anesthesiologists and psychologists: apparently, people can hear things while under anesthesia, and can be affected by what they hear even if they can’t remember it. 
One woman suffers from terrible insomnia after her hysterectomy; later, while hypnotized, she recalls her anesthesiologist joking that she would “sleep the sleep of death.” Another patient becomes suicidal after a minor procedure; later, she remembers that, while she was on the table, her surgeon exclaimed, “She is fat, isn’t she?” In the nineteen-nineties, German scientists put headphones on thirty people undergoing heart surgery, then, during the operation, played them an abridged version of “Robinson Crusoe.” None of the patients recalled this happening, but afterward, when asked what came to mind when they heard the word “Friday,” many mentioned the story. In 1985, Bennett himself asked patients receiving gallbladder or spinal surgeries to wear headphones. A control group heard the sounds of the operating theatre; the others heard Bennett saying, “When I come to talk with you, you will pull on your ear.” When they met with him, those who’d heard the message touched their ears three times more often than those who hadn’t. © 2018 Condé Nast.

Keyword: Consciousness; Sleep
Link ID: 24493 - Posted: 01.05.2018

by Ben Guarino The next time a friend tells you that you look sick, hear the person out. We are better than chance at detecting illness in others simply by looking at their faces, according to new research led by a Swedish psychologist. “We can detect subtle cues related to the skin, eyes and mouth,” said John Axelsson of the Karolinska Institute, who co-wrote the study published Tuesday in the journal Proceedings of the Royal Society B. “And we judge people as sick by those cues.” Other species have more finely tuned disease radars, relying primarily on the sense of smell. And previous research, Axelsson noted, has shown that animals can sniff sickness in other animals. (A Canadian hospital enlisted the help of an English springer spaniel trained to smell bacterial spores that infect patients.) Yet while there is some evidence that an unhealthy person gives off odors that another individual can identify as sickness, the face is our primary source of “social information for communication,” Axelsson said. He and his colleagues, a team that included neuroscientists and psychologists in Germany and Sweden, injected eight men and eight women with a molecule found in bacterial membranes. Like animals — from insects to mammals — people react very strongly to this substance, lipopolysaccharide. “People did not really become sick from the bacteria,” Axelsson said, but their bodies did not know the bacteria weren't actually attacking. Their immune systems kicked into action, complete with feelings of sickness. The subjects, all white, received about $430 for their trouble. © 1996-2018 The Washington Post
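A claim that raters are "better than chance" at spotting sickness is typically backed by a significance test. As a minimal illustration (not the statistics actually used in the paper, and with hypothetical counts), a one-sided exact binomial test asks how likely a given number of correct sick/healthy judgments would be if raters were merely guessing:

```python
from math import comb

def binom_p_above_chance(hits, trials, p_chance=0.5):
    """One-sided exact binomial test: the probability of observing at least
    `hits` correct judgments out of `trials` if every judgment were a coin
    flip at `p_chance`. A small value suggests better-than-chance detection."""
    return sum(comb(trials, k) * p_chance**k * (1 - p_chance)**(trials - k)
               for k in range(hits, trials + 1))
```

For example, 62 correct calls out of 100 hypothetical face judgments yields p below 0.05, while 55 out of 100 does not, showing how far above 50% the accuracy must climb before "better than chance" is statistically defensible at that sample size.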

Keyword: Attention; Neuroimmunology
Link ID: 24483 - Posted: 01.03.2018

By James Hartzell A hundred dhoti-clad young men sat cross-legged on the floor in facing rows, chatting amongst themselves. At a sign from their teacher the hall went quiet. Then they began the recitation. Without pause or error, entirely from memory, one side of the room intoned one line of the text, then the other side of the room answered with the next line. Bass and baritone voices filled the hall with sonorous prosody, every word distinctly heard, their right arms moving together to mark pitch and accent. The effect was hypnotic, ancient sound reverberating through the room, saturating brain and body. After 20 minutes they halted, in unison. It was just a demonstration. The full recitation of one of India’s most ancient Sanskrit texts, the Shukla Yajurveda, takes six hours. I spent many years studying and translating Sanskrit, and became fascinated by its apparent impact on mind and memory. In India's ancient learning methods textual memorization is standard: traditional scholars, or pandits, master many different types of Sanskrit poetry and prose texts; and the tradition holds that exactly memorizing and reciting the ancient words and phrases, known as mantras, enhances both memory and thinking. I had also noticed that the more Sanskrit I studied and translated, the better my verbal memory seemed to become. Fellow students and teachers often remarked on my ability to exactly repeat lecturers’ own sentences when asking them questions in class. Other translators of Sanskrit told me of similar cognitive shifts. So I was curious: was there actually a language-specific “Sanskrit effect” as claimed by the tradition? © 2018 Scientific American

Keyword: Language; Attention
Link ID: 24479 - Posted: 01.03.2018

Just 10 minutes of aerobic exercise can improve executive function by priming parts of the brain used to focus intently on the task at hand, according to a new study. The paper, “Executive-Related Oculomotor Control Is Improved Following a 10-minute Single-Bout of Aerobic Exercise: Evidence from the Antisaccade Task,” was published in the January 2018 issue of Neuropsychologia. The research was conducted by Matthew Heath, a kinesiology professor and supervisor in the Graduate Program in Neuroscience at the University of Western Ontario, along with UWO master’s student Ashna Samani. For the study, Samani and Heath asked a cohort of healthy young adults to either sit quietly and read magazines or perform 10 minutes of moderate-to-vigorous physical activity (MVPA) on a stationary bicycle. (MVPA aerobic intensity is hard enough that you might break a sweat but easy enough that you can carry on a conversation.) Immediately after the 10-minute reading task or the bout of aerobic exercise, the researchers used eye-tracking equipment to gauge antisaccades, a measure of executive control. As the authors explain in the study abstract, “Antisaccades are an executive task requiring a goal-directed eye movement (i.e., a saccade) mirror-symmetrical to a visual stimulus. The hands- and language-free nature of antisaccades coupled with the temporal precision of eye-tracking technology make it an ideal tool for identifying executive performance changes.” © 1991-2018 Sussex Publishers, LLC

Keyword: Attention
Link ID: 24476 - Posted: 01.02.2018

By Daniel Barron Earlier this year, I wrote about my patient, Andrew, an engineer who developed a heroin habit. An unfortunate series of joint replacements had left Andrew with terrible pain and, when his medication ran out, he turned to heroin. Months after his surgeries—after his tissue and scars had healed—Andrew remained disabled by a deep, biting pain. I recall puzzling over his pain, how it had spread throughout his body and how previous clinical teams had prescribed progressively higher doses of opioids to tame it. Andrew had transitioned from acute pain (i.e., pain from his surgical wounds) to chronic pain (i.e., pain in the absence of an obvious cause), but it was unclear to me whether this reflected a drug tolerance or a different pain process. The difference between drug tolerance and chronic pain is a difficult concept to get hold of. In the hospital workroom one morning, I realized how confused I was by the topic and paged the hospital’s on-call pain specialist. Fortune smiled, and Donna-Ann M Thomas, Yale University’s Pain Medicine Division Chief, picked up the phone and patiently explained how tolerance and chronic pain are quite different. Andrew became tolerant to opioids when his body required progressively larger doses to have the same effect. Opioids activate the mu opioid receptor, which blocks pain signals in the spinal cord. To find a way around the opioid blockade, Andrew’s body had made more mu receptors to compensate for the drug, meaning more drug had to be present to stifle the pain signal, hence the escalating doses. © 2017 Scientific American

Keyword: Pain & Touch; Attention
Link ID: 24471 - Posted: 12.30.2017