Chapter 18. Attention and Higher Cognition



WHAT I LEFT OUT is a recurring feature in which book authors are invited to share anecdotes and narratives that, for whatever reason, did not make it into their final manuscripts. In this installment, Caroline Williams shares a story that was left out of “My Plastic Brain: One Woman’s Yearlong Journey to Discover if Science Can Improve Her Mind,” published by Prometheus Books.

By Caroline Williams I’m not the kind of girl who jumps into a strange man’s car and hopes for the best. Especially when a quick Google stalk reveals him to be recovering from an addiction to methamphetamine. But having been assured by someone I trust that he was “one of the good guys,” I accepted his offer of a ride to the airport and … hoped for the best. In hindsight, I’m glad I did. I was in Lawrence, Kansas, researching a book that I hoped would apply the latest science to make real, measurable, and lasting changes to my brain. I wanted to learn, among other things, how to concentrate better and to overcome my irrational anxieties about life. I was in Kansas to try to boost my powers of creativity. Some books make it sound so easy: Change the way you think, and hey presto, you can become a different person. After many months talking to scientists about brain change, it was this journey that prompted me to think more deeply about what that actually meant. Copyright 2018 Undark

Keyword: Learning & Memory; Depression
Link ID: 25349 - Posted: 08.18.2018

Mike Robinson To call gambling a “game of chance” evokes fun, random luck and a sense of collective engagement. These playful connotations may be part of why almost 80 percent of American adults gamble at some point in their lifetime. When I ask my psychology students why they think people gamble, the most frequent suggestions are for pleasure, money or the thrill. While these might be reasons why people gamble initially, psychologists don’t definitively know why, for some, gambling stops being an enjoyable diversion and becomes compulsive. What keeps people playing even when it stops being fun? Why stick with games people know are designed for them to lose? Are some people just more unlucky than the rest of us, or simply worse at calculating the odds? As an addiction researcher for the past 15 years, I look to the brain to understand the hooks that make gambling so compelling. I’ve found that many are intentionally hidden in how the games are designed. And these hooks work on casual casino-goers just as well as they do on problem gamblers.

Uncertainty as its own reward in the brain

One of the hallmarks of gambling is its uncertainty – whether it’s the size of a jackpot or the probability of winning at all. And reward uncertainty plays a crucial role in gambling’s attraction. Dopamine, the neurotransmitter the brain releases during enjoyable activities such as eating, sex and drugs, is also released during situations where the reward is uncertain. In fact, dopamine release increases particularly during the moments leading up to a potential reward. © 2010–2018, The Conversation US, Inc.

Keyword: Drug Abuse; Attention
Link ID: 25328 - Posted: 08.14.2018

By Matt Neal Sir John Eccles is an icon of Australian science, but an attempt in later life to mix religion and science made him an outsider in the scientific community as it won him fans in the Catholic Church. In 1963, along with British biophysicists Sir Alan Hodgkin and Sir Andrew Huxley, he won the Nobel Prize for Physiology or Medicine for their groundbreaking work on synapses and the electrical properties of neurons.

How Sir John revolutionised neuroscience: He shared the 1963 Nobel Prize for Physiology or Medicine with Alan Lloyd Hodgkin and Andrew Fielding Huxley "for their discoveries concerning the ionic mechanisms involved in excitation and inhibition in the peripheral and central portions of the nerve cell membrane." Eccles' work showed that the transmission of information and impulses between neurons in the brain was both electrical and chemical in nature. His experiments paved the way for treatments of nervous diseases as well as further research into the brain, heart and kidneys.

That same year, Eccles was named Australian of the Year. But, as University of Sydney Honorary Associate Professor John Carmody once wrote: "the nation appears to have forgotten [Eccles despite the fact] modern neuroscience is forever in his debt". Part of the reason for the decline in his regard could stem from his latter-career work, in which he controversially attempted to marry his scientific prowess with his religious beliefs, and went in search of the soul. © 2018 ABC

Keyword: Consciousness
Link ID: 25327 - Posted: 08.14.2018

Ann Robinson Imagine a neurological condition that affects one in 20 under-18s. It starts early, causes significant distress and pain to the child, damages families and limits the chances of leading a fulfilled life as an adult. One in 20 children are affected but only half of these will get a diagnosis and a fifth will receive treatment. If those stats related to a familiar and well-understood illness, such as asthma, there would be little debate about the need to improve intervention rates. But this is attention deficit hyperactivity disorder (ADHD), and the outcry is muted. If anything, we hear warnings that too many children are being labelled this way, and too many given prescriptions. In the United States, ADHD is diagnosed at more than twice the rate it is in Britain. The true prevalence is likely to be the same on both sides of the Atlantic. So what’s the story? Is the US too gung-ho, or is the UK dragging its heels? Are American doctors too quick to medicate children, or British doctors too slow? Emily Simonoff, co-author of a new meta-analysis in the journal the Lancet Psychiatry, says the problem in the UK is “predominantly about undermedication and underdiagnosis”. Her study examined a range of drug treatments compared to placebo, and it shows that methylphenidate (better known under the brand name Ritalin) works best for children and amphetamines for adults. © 2018 Guardian News and Media Limited

Keyword: ADHD; Drug Abuse
Link ID: 25321 - Posted: 08.13.2018

Sarah Boseley Health editor Ritalin and other drugs of the same class are the most effective and safest medications to prescribe for children with attention deficit hyperactivity disorder (ADHD), according to a major scientific review. The review of ADHD drugs shows that they work, and work well, in spite of concerns among the public and some doctors that children in the UK are being overmedicated. Ofsted’s chief inspector, Amanda Spielman, has likened the drugs to a “chemical cosh” and claimed they were being overprescribed, disguising bad behaviour among children that could be better dealt with. The authors of a major study in the Lancet Psychiatry journal say that methylphenidate, of which Ritalin is the best-known brand, is the most effective and best-tolerated treatment for children while amphetamines work best for adults. While the number of children on medication has risen as ADHD has become better understood, many do not get the treatment they need to cope in life and get through school, they said. The Guardian has revealed that getting help in the UK can take as long as two years. Emily Simonoff, a professor of child and adolescent psychiatry at King’s College London, one of the authors, said the perception that children were overmedicated was not accurate. “Clinicians are very cautious about using medication in this country,” she said. “The problem in the UK is predominantly about undermedication and underdiagnosis.” © 2018 Guardian News and Media Limited

Keyword: ADHD
Link ID: 25305 - Posted: 08.08.2018

By Susan Schneider As you read this, it feels like something to be you. You are seeing these words on the page and hearing the world around you, for instance. And all these thoughts and sensations come together into your conscious “now.” Consciousness is this felt quality of experience. Without consciousness, there would be no enjoyment of a beautiful sunset. Nor would there be suffering. Experience, positive or negative, simply wouldn’t exist. At the heart of current theorizing about consciousness in philosophy is the hard problem of consciousness, a puzzle raised by the philosopher David Chalmers. (See his Scientific American article “The Puzzle of Conscious Experience.”) Cognitive science says that the brain is an information processing engine. The hard problem asks: but why does all this sophisticated information processing need to feel like anything, from the inside? Why do we have experience? One influential approach to the problem, endorsed by Chalmers himself, is panpsychism. Panpsychism holds that even the smallest layers of reality have experience. Fundamental particles have minute levels of consciousness, and in a watered-down sense, they are subjects of experience. When particles are in extremely sophisticated configurations, such as when they are in nervous systems, more sophisticated forms of consciousness arise. Panpsychism aims to locate the building blocks of reality in the most basic layer of reality identified by a completed physics. Indeed, panpsychists claim that it is a virtue of their theory that it meshes with fundamental physics, for experience is the underlying nature of the properties that physics identifies. © 2018 Scientific American

Keyword: Consciousness
Link ID: 25297 - Posted: 08.06.2018

By Anna Clemens In 2003 a 65-year-old man brought a strange problem to neurologist Adam Zeman, now at the University of Exeter in England. The patient, later dubbed “MX,” claimed he could not conjure images of friends, family members or recently visited places. All his life, MX, a retired surveyor, had loved reading novels and had routinely drifted off to sleep visualizing buildings, loved ones and recent events. But after undergoing a procedure to open arteries in his heart, during which he probably suffered a minor stroke, his mind’s eye went blind. He could see normally, but he could not form pictures in his mind. Zeman had never encountered anything like it and set out to learn more. He has since given the condition a name—aphantasia (phantasia means “imagination” in Greek). And he and others are exploring its neurological underpinnings. Zeman and his colleagues began their analysis by testing MX’s visual imagination in several ways. Compared with control subjects, MX scored poorly on questionnaires assessing the ability to produce visual imagery. Surprisingly, though, he was able to accomplish tasks that typically involve visualization. For example, when asked to say which is a lighter color of green—grass or pine trees—most people would decide by imagining both and comparing them. MX correctly said that pine trees are darker than grass, but he insisted he had used no visual imagery to make the decision. “I just know the answer,” he said. © 2018 Scientific American

Keyword: Attention
Link ID: 25279 - Posted: 08.01.2018

By Stephen T. Casper The case report is dead. At least, it seems all but so in the realm of evidence-based medicine. It is thus thoroughly refreshing to read Helen Thomson’s Unthinkable: An Extraordinary Journey Through the World’s Strangest Brains and Eric R. Kandel’s The Disordered Mind: What Unusual Brains Tell Us About Ourselves, two ambitious books that draw on clinical profiles to tell stories about our brains and minds. Thomson’s memoir aims to help us understand our brains through stories about exceptional others, who, she argues, may serve as proxies for ourselves. Kandel’s book argues from neuroscience research and individual illness experiences for a biologically informed account of mind and brain. Both authors are unapologetic in their focus on what might be dismissed as merely anecdotal. Each foregrounds neurological and psychiatric patient narratives and experiences and from these draws out larger philosophical and scientific lessons. By profiling and seeking meaning in individuals with curious neurological conditions, Thomson’s Unthinkable follows a well-worn literary path but revitalizes the genre with an original and subtle shift to the personal. Thomson’s technique, perfected by neurologist Oliver Sacks, was invented before the 19th century but most famously pioneered in the 20th century by such eminent neurologists as Morton Prince, Sigmund Freud, and Alexander Luria. Where those authors represented patients as medical mysteries or as object lessons in physiology and philosophy, Thomson finds a timelier focus that corresponds with the growing advocacy for, and social attention to, individual patients’ rights. Unlike her predecessors in the genre, Thomson enters her subjects’ lives—their restaurants, homes, families, communities, and online selves. © 2017 American Association for the Advancement of Science

Keyword: Attention
Link ID: 25278 - Posted: 08.01.2018

Allison Aubrey Was it hard to concentrate during that long meeting? Or, does the crossword seem a little tougher? You could be mildly dehydrated. A growing body of evidence finds that being just a little dehydrated is tied to a range of subtle effects — from mood changes to muddled thinking. "We find that when people are mildly dehydrated they really don't do as well on tasks that require complex processing or on tasks that require a lot of their attention," says Mindy Millard-Stafford, director of the Exercise Physiology Laboratory at Georgia Institute of Technology. She published an analysis of the evidence this month, based on 33 studies. How long does it take to become mildly dehydrated in the summer heat? Not long at all, studies show, especially when you exercise outdoors. "If I were hiking at moderate intensity for one hour, I could reach about 1.5 percent to 2 percent dehydration," says Doug Casa, a professor of kinesiology at the University of Connecticut, and CEO of the Korey Stringer Institute. For an average-size person, 2 percent dehydration equates to sweating out about a liter of water. "Most people don't realize how high their sweat rate is in the heat," Casa says. If you're going hard during a run, you can reach that level of dehydration in about 30 minutes. And, at this level of dehydration the feeling of thirst, for many of us, is only just beginning to kick in. "Most people can't perceive that they're 1.5 percent dehydrated," Casa says. © 2018 npr
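The arithmetic behind Casa's figures can be sketched in a few lines. This is a rough back-of-envelope model, assuming that "percent dehydration" means percent of body mass lost as water and that a kilogram of sweat is about a liter; both are common approximations, and the 50 kg "average-size person" is my illustrative choice, not a figure from the article.

```python
# Back-of-envelope: fluid loss implied by a given dehydration level.
# Assumes percent dehydration = percent of body mass lost as water,
# and 1 kg of sweat is roughly 1 liter.

def fluid_loss_liters(body_mass_kg: float, dehydration_pct: float) -> float:
    """Liters of water lost at a given percent-of-body-mass dehydration."""
    return body_mass_kg * dehydration_pct / 100.0

# A 50 kg person at 2% dehydration has lost about a liter of sweat,
# consistent with the "average-size person" figure quoted above.
print(fluid_loss_liters(50, 2.0))   # 1.0
print(fluid_loss_liters(70, 1.5))   # 1.05
```

At Casa's hiking scenario (1.5 to 2 percent in an hour), even a modest 70 kg hiker is losing on the order of a liter an hour.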

Keyword: Learning & Memory; Attention
Link ID: 25269 - Posted: 07.30.2018

By Perri Klass, M.D. Whenever I write about children getting medications for anxiety, for depression, or especially for attention deficit hyperactivity disorder, a certain number of readers respond with anger and suspicion, accusing me of being part of a conspiracy to medicate children for behaviors that are either part of the normal range of childhood or else the direct result of bad schools, bad environments or bad parenting. Others suggest that doctors who prescribe such medications are in the corrupt grip of the drug companies. And there are parents with stories of unexpected side effects and doctors who didn’t listen. (Of course, there are also parents who write to say that the right medication at the right moment really helped, or adults regretting that no one offered them something that might have helped back when they were struggling.) Putting children, especially young children, on psychotropic medications is scary for parents, sometimes scary for children and also, often, scary for the doctors who do the prescribing. As a pediatrician, I have often had occasion to be grateful to colleagues with more experience and training who could help a family figure out the right medication, dosing and follow-up. It is a big deal, and there are side effects to worry about and doctors should listen to families’ concerns. But when a child is suffering and struggling, families need help, and medications are often part of the discussion. And so, without presuming to judge what should be done for any specific child, I want to talk about the discussion that needs to take place around medicating a child in distress, and how the doctor and the family should monitor medications when they are prescribed. © 2018 The New York Times Company

Keyword: ADHD; Drug Abuse
Link ID: 25267 - Posted: 07.30.2018

By Darold A. Treffert Savant syndrome comes in different forms. In congenital savant syndrome the extraordinary savant ability surfaces in early childhood. In acquired savant syndrome astonishing new abilities, typically in music, art or mathematics, appear unexpectedly in ordinary persons after a head injury, stroke or other central nervous system (CNS) incident where no such abilities or interests were present pre-incident. But in sudden savant syndrome an ordinary person with no such prior interest or ability and no precipitating injury or other CNS incident has an unanticipated, spontaneous epiphanylike moment where the rules and intricacies of music, art or mathematics, for example, are experienced and revealed, producing almost instantaneous giftedness and ability in the affected area of skill sets. Because there is no underlying disability such as that which occurs in congenital or acquired savant syndromes, technically sudden savant syndrome would be better termed sudden genius. A 28-year-old gentleman from Israel, K. A., sent his description of his epiphany moment. He was in a mall where there was a piano. Whereas he could play simple popular songs from rote memory before, “suddenly at age 28 after what I can best describe as a ‘just getting it moment,’ it all seemed so simple. I suddenly was playing like a well-educated pianist.” His friends were astonished as he played and suddenly understood music in an entirely intricate way. “I suddenly realized what the major scale and minor scale were, what their chords were and where to put my fingers in order to play certain parts of the scale. I was instantly able to recognize harmonies of the scales in songs I knew as well as the ability to play melody by interval recognition.” He began to search the internet for information on music theory and to his amazement “most of what they had to teach I already knew, which baffled me as to how I could know something I had never studied.” © 2018 Scientific American

Keyword: Attention; Learning & Memory
Link ID: 25255 - Posted: 07.26.2018

By Jocelyn Kaiser Basic brain and behavioral researchers will get more than a year to comply with a new U.S. policy that will treat many of their studies as clinical trials. The announcement from the National Institutes of Health (NIH) appears to defuse, for now, a yearlong controversy over whether basic research on humans should follow the same rules as studies testing drugs. Although research groups had hoped NIH would drop its plans to tag basic studies with humans as trials, they say they’re relieved they get more time to prepare and give the agency input. “It’s a positive step forward,” says Paula Skedsvold, executive director of the Federation of Associations in Behavioral & Brain Sciences in Washington, D.C. At issue is a recently revised definition of a clinical trial along with a set of rules in effect since January that are meant to increase the rigor and transparency of NIH-funded clinical trials. About a year ago, basic scientists who study human cognition—for example, using brain imaging with healthy volunteers—were alarmed to realize many of these studies fit the new clinical trial definition. Researchers protested that many requirements, such as registering and reporting results in the federal database, made no sense for studies that weren’t testing a treatment and would confuse the public. NIH then issued a set of case studies explaining that only some basic studies would fall under the trials definition. But concerns remained about confusing criteria and burdensome new paperwork. © 2018 American Association for the Advancement of Science

Keyword: Attention; Learning & Memory
Link ID: 25248 - Posted: 07.25.2018

Rhitu Chatterjee Most teens today own a smartphone and go online every day, and about a quarter of them use the internet "almost constantly," according to a 2015 report by the Pew Research Center. Now a study published Tuesday in JAMA suggests that such frequent use of digital media by adolescents might increase their odds of developing symptoms of attention deficit hyperactivity disorder. "It's one of the first studies to look at modern digital media and ADHD risk," says psychologist Adam Leventhal, an associate professor of preventive medicine at the University of Southern California and an author of the study. When considered with previous research showing that greater social media use is associated with depression in teens, the new study suggests that "excessive digital media use doesn't seem to be great for [their] mental health," he adds. Previous research has shown that watching television or playing video games on a console put teenagers at a slightly higher risk of developing ADHD behaviors. But less is known about the impact of computers, tablets and smartphones. Because these tools have evolved very rapidly, there's been little research into the impact of these new technologies on us, says Jenny Radesky, a pediatrician at the University of Michigan, who wrote an editorial about the new study for JAMA. Each new platform reaches millions of people worldwide in a matter of days or weeks, she says. "Angry Birds reached 50 million users within 35 days. Pokémon Go reached the same number in 19 days." © 2018 npr

Keyword: ADHD
Link ID: 25220 - Posted: 07.18.2018

Allison Aubrey Can't cool off this summer? Heat waves can slow us down in ways we may not realize. New research suggests heat stress can muddle our thinking, making simple math a little harder to do. "There's evidence that our brains are susceptible to temperature abnormalities," says Joe Allen, co-director of the Center for Climate, Health and the Global Environment at Harvard University. And as the climate changes, temperatures spike and heat waves are more frequent. To learn more about how the heat influences young, healthy adults, Allen and his colleagues studied college students living in dorms during a summer heat wave in Boston. Half of the students lived in buildings with central AC, where the indoor air temperature averaged 71 degrees. The other half lived in dorms with no AC, where air temperatures averaged almost 80 degrees. "In the morning, when they woke up, we pushed tests out to their cellphones," explains Allen. The students took two tests a day for 12 consecutive days. One test, which included basic addition and subtraction, measured cognitive speed and memory. A second test assessed attention and processing speed. "We found that the students who were in the non-air-conditioned buildings actually had slower reaction times: 13 percent lower performance on basic arithmetic tests, and nearly a 10 percent reduction in the number of correct responses per minute," Allen explains. The results, published in PLOS Medicine, may come as a surprise. "I think it's a little bit akin to the frog in the boiling water," Allen says. There's a "slow, steady — largely imperceptible — rise in temperature, and you don't realize it's having an impact on you." © 2018 npr

Keyword: Attention
Link ID: 25211 - Posted: 07.16.2018

By Erica Goode Suppose that, seeking a fun evening out, you pay $175 for a ticket to a new Broadway musical. Seated in the balcony, you quickly realize that the acting is bad, the sets are ugly and no one, you suspect, will go home humming the melodies. Do you head out the door at the intermission, or stick it out for the duration? Studies of human decision-making suggest that most people will stay put, even though money spent in the past logically should have no bearing on the choice. This “sunk cost fallacy,” as economists call it, is one of many ways that humans allow emotions to affect their choices, sometimes to their own detriment. But the tendency to factor past investments into decision-making is apparently not limited to Homo sapiens. In a study published on Thursday in the journal Science, investigators at the University of Minnesota reported that mice and rats were just as likely as humans to be influenced by sunk costs. The more time they invested in waiting for a reward — in the case of the rodents, flavored pellets; in the case of the humans, entertaining videos — the less likely they were to quit the pursuit before the delay ended. “Whatever is going on in the humans is also going on in the nonhuman animals,” said A. David Redish, a professor of neuroscience at the University of Minnesota and an author of the study. This cross-species consistency, he and others said, suggested that in some decision-making situations, taking account of how much has already been invested might pay off. “Evolution by natural selection would not promote any behavior unless it had some — perhaps obscure — net overall benefit,” said Alex Kacelnik, a professor of behavioral ecology at Oxford, who praised the new study as “rigorous” in its methodology and “well designed.” © 2018 The New York Times Company

Keyword: Attention; Learning & Memory
Link ID: 25203 - Posted: 07.13.2018

Arran Frood The use of drugs by people hoping to boost mental performance is rising worldwide, finds the largest ever study of the trend. In a survey of tens of thousands of people, 14% reported using stimulants at least once in the preceding 12 months in 2017, up from 5% in 2015. The non-medical use of substances — often dubbed smart drugs — to increase memory or concentration is known as pharmacological cognitive enhancement (PCE), and it rose in all 15 nations included in the survey. The study looked at prescription medications such as Adderall and Ritalin — prescribed medically to treat attention deficit hyperactivity disorder (ADHD) — as well as the sleep-disorder medication modafinil and illegal stimulants such as cocaine. The work, published in the International Journal of Drug Policy in June, is based on the Global Drug Survey — an annual, anonymous online questionnaire about drug use worldwide. The survey had 79,640 respondents in 2015 and 29,758 in 2017. US respondents reported the highest rate of use: in 2017, nearly 30% said they had used drugs for PCE at least once in the preceding 12 months, up from 20% in 2015. But the largest increases were in Europe: use in France rose from 3% in 2015 to 16% in 2017; and from 5% to 23% in the United Kingdom (see ‘Quest for cognitive enhancement’). An informal reader survey by Nature in 2008 found that one in five respondents had used drugs to boost concentration or memory. The latest analysis is impressive in its size, says Barbara Sahakian, a neuroscientist at the University of Cambridge, UK, who was not involved in the work. There is an increasing ‘lifestyle use’ of cognitive-enhancing drugs by healthy people, which raises ethical concerns, she says. © 2018 Springer Nature Limited.

Keyword: Drug Abuse; Attention
Link ID: 25181 - Posted: 07.07.2018

By Michael Shermer In 1967 British biologist and Nobel laureate Sir Peter Medawar famously characterized science as, in book title form, The Art of the Soluble. “Good scientists study the most important problems they think they can solve. It is, after all, their professional business to solve problems, not merely to grapple with them,” he wrote. For millennia, the greatest minds of our species have grappled to gain purchase on the vertiginous ontological cliffs of three great mysteries—consciousness, free will and God—without ascending anywhere near the thin air of their peaks. Unlike other inscrutable problems, such as the structure of the atom, the molecular basis of replication and the causes of human violence, which have witnessed stunning advancements of enlightenment, these three seem to recede ever further away from understanding, even as we race ever faster to catch them in our scientific nets. Are these “hard” problems, as philosopher David Chalmers characterized consciousness, or are they truly insoluble “mysterian” problems, as philosopher Owen Flanagan designated them (inspired by the 1960s rock group Question Mark and the Mysterians)? The “old mysterians” were dualists who believed in nonmaterial properties, such as the soul, that cannot be explained by natural processes. The “new mysterians,” Flanagan says, contend that consciousness can never be explained because of the limitations of human cognition. I contend that not only consciousness but also free will and God are mysterian problems—not because we are not yet smart enough to solve them but because they can never be solved, not even in principle, relating to how the concepts are conceived in language. Call those of us in this camp the “final mysterians.” © 2018 Scientific American

Keyword: Consciousness
Link ID: 25167 - Posted: 07.02.2018

By Eric Allen Been “A new generation of scientists is not satisfied merely to watch and describe brain activity,” writes David Adam. “They want to interfere, to change and improve the brain — to neuroenhance it.” In his new book “The Genius Within: Unlocking Your Brain’s Potential” (Pegasus), Adam offers a many-sided investigation of neuroenhancement — a hodgepodge of technologies and drug treatments aimed at improving intelligence. A London-based science writer and editor, he previously wrote about obsessive-compulsive disorder, its history, and his own struggle with it in “The Man Who Couldn’t Stop” (2014). “We wonder at the stars, and then we start to work out how far away things are. And then we design a spacecraft that’s going to take us up there. I think that’s happened with neuroscience.” For this installment of the Undark Five, I talked with Adam about neuroenhancement — among other things, whether it’s fair to enhance some people’s cognitive abilities but not others’, why the subject of intelligence makes so many people uncomfortable, and whether “smart drugs” will one day make us all Einsteins. Here’s our conversation, edited for length and clarity. UNDARK — There’s been a shift within neuroscience from not just trying to understand how the brain works but to enhance it. How did that happen? Copyright 2018 Undark

Keyword: Attention
Link ID: 25164 - Posted: 07.02.2018

By Simon Makin The electrical oscillations we call brain waves have intrigued scientists and the public for more than a century. But their function—and even whether they have one, rather than just reflecting brain activity like an engine’s hum—is still debated. Many neuroscientists have assumed that if brain waves do anything, it is by oscillating in synchrony in different locations. Yet a growing body of research suggests many brain waves are actually “traveling waves” that physically move through the brain like waves on the sea. Now a new study from a team at Columbia University led by neuroscientist Joshua Jacobs suggests traveling waves are widespread in the human cortex—the seat of higher cognitive functions—and that they become more organized depending on how well the brain is performing a task. This shows the waves are relevant to behavior, bolstering previous research suggesting they are an important but overlooked brain mechanism that contributes to memory, perception, attention and even consciousness. Brain waves were first discovered using electroencephalogram (EEG) techniques, which involve placing electrodes on the scalp. Researchers have noted activity over a range of different frequencies, from delta (0.5 to 4 hertz) through to gamma (25 to 140 Hz) waves. The slowest occur during deep sleep, with increasing frequency associated with increasing levels of consciousness and concentration. Interpreting EEG data is difficult due to its poor ability to pinpoint the location of activity, and the fact that passage through the head blurs the signals. The new study, published earlier this month in Neuron, used a more recent technique called electrocorticography (ECoG). This involves placing electrode arrays directly on the brain’s surface, minimizing distortions and vastly improving spatial resolution. © 2018 Scientific American
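The frequency ranges quoted above (delta at 0.5 to 4 Hz through gamma at 25 to 140 Hz) can be captured in a small lookup table. A minimal sketch: the delta and gamma bounds follow the article, while the intermediate theta, alpha, and beta boundaries are my assumption based on common conventions, since the article does not list them and exact cutoffs vary between labs.

```python
# Sketch: classify an oscillation frequency into a conventional EEG band.
# Delta and gamma ranges follow the figures quoted in the text; the
# theta/alpha/beta boundaries are common conventions (they vary by lab).

EEG_BANDS = [
    ("delta", 0.5, 4.0),     # deep sleep
    ("theta", 4.0, 8.0),
    ("alpha", 8.0, 12.0),
    ("beta", 12.0, 25.0),
    ("gamma", 25.0, 140.0),  # high concentration
]

def band_of(freq_hz: float) -> str:
    """Return the band name whose half-open range [lo, hi) contains freq_hz."""
    for name, lo, hi in EEG_BANDS:
        if lo <= freq_hz < hi:
            return name
    return "outside conventional bands"

print(band_of(2))    # delta
print(band_of(40))   # gamma
```

The half-open ranges keep the bands non-overlapping, matching the article's point that frequency increases track increasing levels of consciousness and concentration.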

Keyword: Attention
Link ID: 25159 - Posted: 06.29.2018

David Levari Why do many problems in life seem to stubbornly stick around, no matter how hard people work to fix them? It turns out that a quirk in the way human brains process information means that when something becomes rare, we sometimes see it in more places than ever. Think of a “neighborhood watch” made up of volunteers who call the police when they see anything suspicious. Imagine a new volunteer who joins the watch to help lower crime in the area. When they first start volunteering, they raise the alarm when they see signs of serious crimes, like assault or burglary. Let’s assume these efforts help and, over time, assaults and burglaries become rarer in the neighborhood. What would the volunteer do next? One possibility is that they would relax and stop calling the police. After all, the serious crimes they used to worry about are a thing of the past. But you may share the intuition my research group had – that many volunteers in this situation wouldn’t relax just because crime went down. Instead, they’d start calling things “suspicious” that they would never have cared about back when crime was high, like jaywalking or loitering at night. You can probably think of many similar situations in which problems never seem to go away, because people keep changing how they define them. This is sometimes called “concept creep,” or “moving the goalposts,” and it can be a frustrating experience. How can you know if you’re making progress solving a problem, when you keep redefining what it means to solve it? My colleagues and I wanted to understand when this kind of behavior happens, why, and if it can be prevented. © 2010–2018, The Conversation US, Inc.

Keyword: Attention
Link ID: 25158 - Posted: 06.29.2018