Links for Keyword: Attention

Links 121 - 140 of 703

By Darold A. Treffert Savant syndrome comes in different forms. In congenital savant syndrome the extraordinary savant ability surfaces in early childhood. In acquired savant syndrome astonishing new abilities, typically in music, art or mathematics, appear unexpectedly in ordinary persons after a head injury, stroke or other central nervous system (CNS) incident where no such abilities or interests were present pre-incident. But in sudden savant syndrome an ordinary person with no such prior interest or ability and no precipitating injury or other CNS incident has an unanticipated, spontaneous epiphany-like moment where the rules and intricacies of music, art or mathematics, for example, are experienced and revealed, producing almost instantaneous giftedness and ability in the affected skill area. Because there is no underlying disability such as that which occurs in congenital or acquired savant syndromes, technically sudden savant syndrome would be better termed sudden genius. A 28-year-old gentleman from Israel, K. A., sent his description of his epiphany moment. He was in a mall where there was a piano. Whereas he could play simple popular songs from rote memory before, “suddenly at age 28 after what I can best describe as a ‘just getting it moment,’ it all seemed so simple. I suddenly was playing like a well-educated pianist.” His friends were astonished as he played and suddenly understood music in an entirely intricate way. “I suddenly realized what the major scale and minor scale were, what their chords were and where to put my fingers in order to play certain parts of the scale. I was instantly able to recognize harmonies of the scales in songs I knew as well as the ability to play melody by interval recognition.” He began to search the internet for information on music theory and to his amazement “most of what they had to teach I already knew, which baffled me as to how could I know something I had never studied.” © 2018 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 25255 - Posted: 07.26.2018

By Jocelyn Kaiser Basic brain and behavioral researchers will get more than a year to comply with a new U.S. policy that will treat many of their studies as clinical trials. The announcement from the National Institutes of Health (NIH) appears to defuse, for now, a yearlong controversy over whether basic research on humans should follow the same rules as studies testing drugs. Although research groups had hoped NIH would drop its plans to tag basic studies with humans as trials, they say they’re relieved they get more time to prepare and give the agency input. “It’s a positive step forward,” says Paula Skedsvold, executive director of the Federation of Associations in Behavioral & Brain Sciences in Washington, D.C. At issue is a recently revised definition of a clinical trial along with a set of rules in effect since January that are meant to increase the rigor and transparency of NIH-funded clinical trials. About a year ago, basic scientists who study human cognition—for example, using brain imaging with healthy volunteers—were alarmed to realize many of these studies fit the new clinical trial definition. Researchers protested that many requirements, such as registering and reporting results in the ClinicalTrials.gov federal database, made no sense for studies that weren’t testing a treatment and would confuse the public. NIH then issued a set of case studies explaining that only some basic studies would fall under the trials definition. But concerns remained about confusing criteria and burdensome new paperwork. © 2018 American Association for the Advancement of Science

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 25248 - Posted: 07.25.2018

Allison Aubrey Can't cool off this summer? Heat waves can slow us down in ways we may not realize. New research suggests heat stress can muddle our thinking, making simple math a little harder to do. "There's evidence that our brains are susceptible to temperature abnormalities," says Joe Allen, co-director of the Center for Climate, Health and the Global Environment at Harvard University. And as the climate changes, temperatures spike and heat waves are more frequent. To learn more about how the heat influences young, healthy adults, Allen and his colleagues studied college students living in dorms during a summer heat wave in Boston. Half of the students lived in buildings with central AC, where the indoor air temperature averaged 71 degrees. The other half lived in dorms with no AC, where air temperatures averaged almost 80 degrees. "In the morning, when they woke up, we pushed tests out to their cellphones," explains Allen. The students took two tests a day for 12 consecutive days. One test, which included basic addition and subtraction, measured cognitive speed and memory. A second test assessed attention and processing speed. "We found that the students who were in the non-air-conditioned buildings actually had slower reaction times: 13 percent lower performance on basic arithmetic tests, and nearly a 10 percent reduction in the number of correct responses per minute," Allen explains. The results, published in PLOS Medicine, may come as a surprise. "I think it's a little bit akin to the frog in the boiling water," Allen says. There's a "slow, steady — largely imperceptible — rise in temperature, and you don't realize it's having an impact on you." © 2018 npr

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 25211 - Posted: 07.16.2018

By Erica Goode Suppose that, seeking a fun evening out, you pay $175 for a ticket to a new Broadway musical. Seated in the balcony, you quickly realize that the acting is bad, the sets are ugly and no one, you suspect, will go home humming the melodies. Do you head out the door at the intermission, or stick it out for the duration? Studies of human decision-making suggest that most people will stay put, even though money spent in the past logically should have no bearing on the choice. This “sunk cost fallacy,” as economists call it, is one of many ways that humans allow emotions to affect their choices, sometimes to their own detriment. But the tendency to factor past investments into decision-making is apparently not limited to Homo sapiens. In a study published on Thursday in the journal Science, investigators at the University of Minnesota reported that mice and rats were just as likely as humans to be influenced by sunk costs. The more time they invested in waiting for a reward — in the case of the rodents, flavored pellets; in the case of the humans, entertaining videos — the less likely they were to quit the pursuit before the delay ended. “Whatever is going on in the humans is also going on in the nonhuman animals,” said A. David Redish, a professor of neuroscience at the University of Minnesota and an author of the study. This cross-species consistency, he and others said, suggested that in some decision-making situations, taking account of how much has already been invested might pay off. “Evolution by natural selection would not promote any behavior unless it had some — perhaps obscure — net overall benefit,” said Alex Kacelnik, a professor of behavioral ecology at Oxford, who praised the new study as “rigorous” in its methodology and “well designed.” © 2018 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 25203 - Posted: 07.13.2018

By Eric Allen Been “A new generation of scientists is not satisfied merely to watch and describe brain activity,” writes David Adam. “They want to interfere, to change and improve the brain — to neuroenhance it.” In his new book “The Genius Within: Unlocking Your Brain’s Potential” (Pegasus), Adam offers a many-sided investigation of neuroenhancement — a hodgepodge of technologies and drug treatments aimed at improving intelligence. A London-based science writer and editor, he previously wrote about obsessive-compulsive disorder, its history, and his own struggle with it in “The Man Who Couldn’t Stop” (2014). “We wonder at the stars, and then we start to work out how far away things are. And then we design a spacecraft that’s going to take us up there. I think that’s happened with neuroscience.” For this installment of the Undark Five, I talked with Adam about neuroenhancement — among other things, whether it’s fair to enhance some people’s cognitive abilities but not others’, why the subject of intelligence makes so many people uncomfortable, and whether “smart drugs” will one day make us all Einsteins. Here’s our conversation, edited for length and clarity. UNDARK — There’s been a shift within neuroscience from not just trying to understand how the brain works but to enhance it. How did that happen? Copyright 2018 Undark

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 3: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Link ID: 25164 - Posted: 07.02.2018

By Simon Makin The electrical oscillations we call brain waves have intrigued scientists and the public for more than a century. But their function—and even whether they have one, rather than just reflecting brain activity like an engine’s hum—is still debated. Many neuroscientists have assumed that if brain waves do anything, it is by oscillating in synchrony in different locations. Yet a growing body of research suggests many brain waves are actually “traveling waves” that physically move through the brain like waves on the sea. Now a new study from a team at Columbia University led by neuroscientist Joshua Jacobs suggests traveling waves are widespread in the human cortex—the seat of higher cognitive functions—and that they become more organized depending on how well the brain is performing a task. This shows the waves are relevant to behavior, bolstering previous research suggesting they are an important but overlooked brain mechanism that contributes to memory, perception, attention and even consciousness. Brain waves were first discovered using electroencephalogram (EEG) techniques, which involve placing electrodes on the scalp. Researchers have noted activity over a range of different frequencies, from delta (0.5 to 4 hertz) through to gamma (25 to 140 Hz) waves. The slowest occur during deep sleep, with increasing frequency associated with increasing levels of consciousness and concentration. Interpreting EEG data is difficult due to its poor ability to pinpoint the location of activity, and the fact that passage through the head blurs the signals. The new study, published earlier this month in Neuron, used a more recent technique called electrocorticography (ECoG). This involves placing electrode arrays directly on the brain’s surface, minimizing distortions and vastly improving spatial resolution. © 2018 Scientific American
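The frequency bands quoted above amount to a simple lookup table. As a minimal sketch, here is how that classification might look in Python, using the article's delta and gamma ranges and filling in commonly cited (but lab-dependent, and here assumed) cutoffs for theta, alpha and beta:

```python
# Conventional EEG frequency bands. Delta and gamma ranges are the ones
# quoted in the article; the theta/alpha/beta cutoffs are common
# conventions and vary between labs (an assumption of this sketch).
BANDS = [
    ("delta", 0.5, 4.0),     # deep sleep
    ("theta", 4.0, 8.0),     # drowsiness, dreaming
    ("alpha", 8.0, 13.0),    # relaxed wakefulness, meditation
    ("beta", 13.0, 25.0),    # active thought, conversation
    ("gamma", 25.0, 140.0),  # sharp insights, focused attention
]

def classify_band(freq_hz: float) -> str:
    """Return the name of the band containing freq_hz, or 'unknown'."""
    for name, low, high in BANDS:
        if low <= freq_hz < high:
            return name
    return "unknown"
```

For example, a 10 Hz oscillation falls in the alpha band, while a 40 Hz one is gamma.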

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 2: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Link ID: 25159 - Posted: 06.29.2018

David Levari Why do many problems in life seem to stubbornly stick around, no matter how hard people work to fix them? It turns out that a quirk in the way human brains process information means that when something becomes rare, we sometimes see it in more places than ever. Think of a “neighborhood watch” made up of volunteers who call the police when they see anything suspicious. Imagine a new volunteer who joins the watch to help lower crime in the area. When they first start volunteering, they raise the alarm when they see signs of serious crimes, like assault or burglary. Let’s assume these efforts help and, over time, assaults and burglaries become rarer in the neighborhood. What would the volunteer do next? One possibility is that they would relax and stop calling the police. After all, the serious crimes they used to worry about are a thing of the past. But you may share the intuition my research group had – that many volunteers in this situation wouldn’t relax just because crime went down. Instead, they’d start calling things “suspicious” that they would never have cared about back when crime was high, like jaywalking or loitering at night. You can probably think of many similar situations in which problems never seem to go away, because people keep changing how they define them. This is sometimes called “concept creep,” or “moving the goalposts,” and it can be a frustrating experience. How can you know if you’re making progress solving a problem, when you keep redefining what it means to solve it? My colleagues and I wanted to understand when this kind of behavior happens, why, and if it can be prevented. © 2010–2018, The Conversation US, Inc.
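One way to see how this kind of "concept creep" can arise mechanically: a judge who always flags the most extreme fraction of whatever they currently observe is using a moving, rank-based cutoff, so the label migrates downward as serious cases disappear. The sketch below is my illustration of that idea, not the study's model; the severity numbers are arbitrary:

```python
def suspicion_threshold(recent_severities, quantile=0.75):
    """A judge who flags roughly the top quarter of what they have
    recently seen uses a rank-based cutoff, not a fixed one."""
    ranked = sorted(recent_severities)
    return ranked[int(quantile * (len(ranked) - 1))]

# While serious crimes are common, only they clear the bar...
high_crime = [1, 2, 3, 8, 9, 10, 9, 8, 2, 1]
# ...but once they vanish, milder incidents take their place.
low_crime = [1, 2, 3, 3, 2, 1, 2, 3, 2, 1]
```

Under the high-crime history the threshold sits among the serious incidents; under the low-crime history, a minor incident of severity 3 now counts as "suspicious" even though nothing about it changed.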

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 25158 - Posted: 06.29.2018

By Frank Bures Even if there is no sonic weapon, or genitals are not truly shrinking, these conditions are all quite real to the sufferers, just as depression and anxiety are real. One afternoon in May of 2004, a third-grade boy at a local school in Fuhu reported feeling that his genitals were shrinking. He panicked, ran home, and his parents fetched the local healer — an 80-year-old woman who had seen this sort of thing before: In 1963, she said, around the time of the Great Leap Forward, an “evil wind” had blown through the village and many people were struck by this illness known as “suo-yang.” She treated the boy by traditional means and he recovered quickly. Two days later when the school principal learned of the incident, he gathered all 680 students in the school courtyard and, according to a report by Dr. Li Jie of the Guangzhou Psychiatric Hospital, “explained to the students in detail what had happened, and warned them to be cautious, and to take emergency measures if they experienced similar symptoms.” Within two days, 64 other boys were struck with suo-yang, which, in its epidemic form, is referred to in the scientific literature as a “mass psychogenic illness” or a “collective stress response.” The Fuhu case was a textbook example of how such an illness can spread through a group of people, and the headmaster did the worst possible thing by explaining the symptoms in detail and assuring students they were in danger. He all but caused an epidemic. Copyright 2018 Undark

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 11: Emotions, Aggression, and Stress
Link ID: 25123 - Posted: 06.22.2018

Leslie Henderson Anti-immigrant policies, race-related demonstrations, Title IX disputes, affirmative action court cases, same-sex marriage litigation. These issues are continually in the headlines. But even thoughtful articles on these subjects seem always to devolve to pitting warring factions against each other: black versus white, women versus men, gay versus straight. At the most fundamental level of biology, people recognize the innate advantage of defining differences in species. But even within species, is there something in our neural circuits that leads us to find comfort in those like us and unease with those who may differ? As in all animals, human brains balance two primordial systems. One includes a brain region called the amygdala that can generate fear and distrust of things that pose a danger – think predators, or being lost somewhere unknown. The other, a group of connected structures called the mesolimbic system, can give rise to pleasure and feelings of reward in response to things that make it more likely we’ll flourish and survive – think not only food, but also social pleasure, like trust. But how do these systems interact to influence how we form our concepts of community? Implicit association tests can uncover the strength of unconscious associations. Scientists have shown that many people harbor an implicit preference for their in-group – those like themselves – even when they show no outward or obvious signs of bias. For example, in studies whites perceive blacks as more violent and more apt to do harm, solely because they are black, and this unconscious bias is evident even toward black boys as young as five years old. © 2010–2018, The Conversation US, Inc.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 11: Emotions, Aggression, and Stress
Link ID: 25122 - Posted: 06.22.2018

Some human brains are nearly twice the size of others – but how might that matter? Researchers at the National Institute of Mental Health (NIMH) and their NIH grant-funded colleagues have discovered that these differences in size are related to the brain’s shape and the way it is organized. The bigger the brain, the more its additional area is accounted for by growth in thinking areas of the cortex, or outer mantle – at the expense of relatively slower growth in lower order emotional, sensory, and motor areas. This mirrors the pattern of brain changes seen in evolution and individual development – with higher-order areas showing greatest expansion. The researchers also found evidence linking the high-expanding regions to higher connectivity between neurons and higher energy consumption. “Just as different parts are required to scale up a garden shed to the size of a mansion, it seems that big primate brains have to be built to different proportions,” explained Armin Raznahan, M.D., Ph.D., of the NIMH Intramural Research Program (IRP). “An extra investment has to be made in the part that integrates information – but that’s not to say that it’s better to have a bigger brain. Our findings speak more to the different organizational needs of larger vs. smaller brains.” Raznahan, P.K. Reardon, Jakob Seidlitz, and colleagues at more than six collaborating research centers report on their study incorporating brain scan data from more than 3,000 people in Science. Reardon and Seidlitz are students in the NIH Oxford-Cambridge Scholars Program.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 11: Emotions, Aggression, and Stress
Link ID: 25045 - Posted: 06.01.2018

By Simon Makin Everyone has unwelcome thoughts from time to time. But such intrusions can signal serious psychiatric conditions—from “flashbacks” in post-traumatic stress disorder (PTSD) to obsessive negative thinking in depression to hallucinations in schizophrenia. “These are some of the most debilitating symptoms,” says neuroscientist Michael Anderson of the University of Cambridge. New research led by Anderson and neuroscientist Taylor Schmitz, now at McGill University, suggests these symptoms may all stem from a faulty brain mechanism responsible for blocking thoughts. Researchers studying this faculty usually focus on the prefrontal cortex (PFC), a control center that directs the activity of other brain regions. But Anderson and his colleagues noticed that conditions featuring intrusive thoughts—such as schizophrenia—often involve increased activity in the hippocampus, an important memory region. The severity of symptoms such as hallucinations also increases with this elevated activity. In the new study, Anderson and his team had healthy participants learn a series of word pairs. The subjects were presented with one word and had to either recall or suppress the associated one. When participants suppressed thoughts, brain scans detected increased activity in part of the PFC and reduced activity in the hippocampus. The findings, which were published last November in Nature Communications, are consistent with a brain circuit in which a “stop” command from the PFC suppresses hippocampus activity. © 2018 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 24864 - Posted: 04.13.2018

Jason Murugesu We all daydream, whether about marrying Rihanna, discovering a sudden ability to sing opera or never having to answer another email again. Yet it is only in the last few decades that the science behind daydreaming, or mind-wandering as it is termed in most academic literature, has transitioned from the realms of pseudoscience to the cutting edge of cognitive neuroscience. At its most basic, daydreaming is your mind wandering from the here and now. Traditionally, daydreaming was considered to be a single psychological state of mind. This, however, caused conflict in academic literature, and the resulting confusion is the reason why you might read that daydreaming is linked to happiness in one paper, but to depression in the next. Different types of mind-wandering have been conflated. Using neuroimaging techniques, a study conducted last year by the University of York found that different types of daydreams – for example, those which are fantastical, autobiographical, future-oriented or past-oriented – were built up of different neuronal activation patterns, and therefore could not be considered a single psychological construct. Nevertheless, if we consider all these types of mind-wandering together, you would be surprised by how much of our waking time we spend daydreaming. In 2008, Professor Matthew Killingsworth, then at Harvard University, used an app that contacted a large group of people at random points of the day to find out how often they were daydreaming. The app would ask its users what they were doing, and whether they were thinking about something else entirely. They found that 46.9 per cent of the time, the user was mind-wandering.

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 24852 - Posted: 04.11.2018

by Meeri Kim On a beautiful autumn afternoon in New York’s Central Park, Carol Berman had the horrifying realization that her husband of 40 years no longer recognized her as his wife. In his eyes, she wasn’t the real Carol but rather some strange woman pretending to be Carol — effectively, an impostor. They were out for a stroll when he started yelling at a woman with a similar hairdo farther up the street: “Carol! Carol, come here!” Shocked, his wife faced him head-on, looked deep into his eyes and reassured him that she was right here. But he refused to acknowledge her as the real Carol. Marty Berman had been a warmhearted, highly intelligent and hard-working patent lawyer for much of his life. But at 74, he began to show signs of dementia. Once proficient in math and engineering, he could no longer subtract simple numbers correctly. A man who had walked the whole of Manhattan couldn’t go a few blocks by himself anymore without getting lost. Perhaps the most painful part for Carol was when her husband’s delusion developed a year or two after his initial symptoms arose. Capgras syndrome is a psychological condition that prompts a person to believe that loved ones have been replaced by identical duplicates of themselves. As a clinical assistant professor of psychiatry at New York University, Carol had treated several Capgras patients. But witnessing the delusion in the person she loved the most, whom she was already losing to dementia, was agonizing. © 1996-2018 The Washington Post

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 24833 - Posted: 04.07.2018

Laura Sanders We can’t see it, but brains hum with electrical activity. Brain waves created by the coordinated firing of huge collections of nerve cells pinball around the brain. The waves can ricochet from the front of the brain to the back, or from deep structures all the way to the scalp and then back again. Called neuronal oscillations, these signals are known to accompany certain mental states. Quiet alpha waves ripple soothingly across the brains of meditating monks. Beta waves rise and fall during intense conversational turns. Fast gamma waves accompany sharp insights. Sluggish delta rhythms lull deep sleepers, while dreamers shift into slightly quicker theta rhythms. Researchers have long argued over whether these waves have purpose, and what those purposes might be. Some scientists see waves as inevitable but useless by-products of the signals that really matter — messages sent by individual nerve cells. Waves are simply a consequence of collective neural behavior, and nothing more, that view holds. But a growing body of evidence suggests just the opposite: Instead of by-products of important signals, brain waves are key to how the brain operates, routing information among far-flung brain regions that need to work together. MIT’s Earl Miller is among the neuroscientists amassing evidence that waves are an essential part of how the brain operates. Brain oscillations deftly route information in a way that allows the brain to choose which signals in the world to pay attention to and which to ignore, his recent studies suggest. © Society for Science & the Public 2000 - 2018

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 2: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Link ID: 24750 - Posted: 03.14.2018

By NEIL GENZLINGER Anne M. Treisman, whose insights into how we perceive the world around us provided some of the core theories for the field of cognitive psychology, died on Friday at her home in Manhattan. She was 82. Her daughter Deborah Treisman said the cause was a stroke after a long illness. Dr. Treisman considered a fundamental question: How does the brain make sense of the bombardment of input it is receiving and focus attention on a particular object or activity? What she came up with is called the feature integration theory of attention, detailed in a much-cited 1980 article written with Garry Gelade in the journal Cognitive Psychology, then refined and elaborated on in later work. “Perhaps Anne’s central insight in the field of visual attention was that she realized that you could see basic features like color, orientation and shape everywhere in the visual field, but that there was a problem in knowing how those colors, orientations, shapes, etc., were ‘bound’ together into objects,” Jeremy M. Wolfe, director of the Visual Attention Lab of Harvard Medical School and Brigham and Women’s Hospital, explained in an email. “Her seminal feature integration theory,” he continued, “proposed that selective attention to an object or location enabled the binding of those features and, thus, enabled object recognition. Much argument has followed, but her formulation of the problem has shaped the field for almost four decades.” Dr. Treisman did not merely theorize about how perception works; she tested her ideas with countless experiments in which subjects were asked, for instance, to pick a particular letter out of a visual field, or to identify black digits and colored letters flashing by. The work showed not only how we perceive, but also how we can sometimes misperceive. © 2018 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 7: Vision: From Eye to Brain
Link ID: 24657 - Posted: 02.14.2018

By NATALIE ANGIER Every night during breeding season, the male túngara frog of Central America will stake out a performance patch in the local pond and spend unbroken hours broadcasting his splendor to the world. The mud-brown frog is barely the size of a shelled pecan, but his call is large and dynamic, a long downward sweep that sounds remarkably like a phaser weapon on “Star Trek,” followed by a brief, twangy, harmonically dense chuck. Unless, that is, a competing male starts calling nearby, in which case the first frog is likely to add two chucks to the tail of his sweep. And should his rival respond likewise, Male A will tack on three chucks. Back and forth they go, call and raise, until the frogs hit their respiratory limit at six to seven rapid-fire chucks. The acoustic one-upfrogship is energetically draining and risks attracting predators like bats. Yet the male frogs have no choice but to keep count of the competition, for the simple reason that female túngaras are doing the same: listening, counting and ultimately mating with the male of maximum chucks. Behind the frog’s surprisingly sophisticated number sense, scientists have found, are specialized cells located in the amphibian midbrain that tally up sound signals and the intervals between them. “The neurons are counting the number of appropriate pulses, and they’re highly selective,” said Gary Rose, a biologist at the University of Utah. If the timing between pulses is off by just a fraction of a second, the neurons don’t fire and the counting process breaks down. “It’s game over,” Dr. Rose said. “Just as in human communication, an inappropriate comment can end the whole conversation.” © 2018 The New York Times Company
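The counting behavior Dr. Rose describes (tally pulses only while the gaps between them are appropriately timed, and reset the moment one interval is off) can be caricatured in a few lines. This is an illustrative toy, not the researchers' neural model; the function name and parameters are invented for the sketch:

```python
def count_pulses(pulse_times, expected_gap, tolerance):
    """Count the longest run of pulses whose inter-pulse intervals stay
    within `tolerance` of `expected_gap`. An ill-timed interval resets
    the tally, mimicking the 'game over' behavior described above."""
    count = 1 if pulse_times else 0
    best = count
    for prev, cur in zip(pulse_times, pulse_times[1:]):
        if abs((cur - prev) - expected_gap) <= tolerance:
            count += 1          # appropriately timed pulse: keep counting
        else:
            count = 1           # timing is off: the count starts over
        best = max(best, count)
    return best
```

Four evenly spaced chucks count as four; insert one badly timed gap and the tally collapses, just as a single inappropriate interval silences the counting neurons.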

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 24623 - Posted: 02.06.2018

By ALAN BURDICK In his first year in office President Trump gave himself credit for numerous accomplishments that he had little or nothing to do with: the passage of the Republican tax bill; Walmart’s creating 10,000 jobs in the United States; the invention of the phrase “prime the pump”; and the fact that in his brief tenure, nobody died in a commercial aviation accident. (The last fatal crash on a domestic commercial airline in the United States was in 2009.) But one thing that Mr. Trump almost certainly managed to do, without effort or notice, is alter our perception of time. We’re all aware that our experience of time is fungible: Days fly by, conversations drag on, that weeklong vacation seems to last forever until suddenly it doesn’t. As long ago as 1890 the psychologist William James noted that our feelings of time “harmonize with different mental moods.” There now exists a large body of scientific literature demonstrating that emotions play a large part in generating these temporal flexions. For instance, when viewing faces on a computer monitor, lab subjects report that happy faces seem to last longer onscreen than nonexpressive ones, and angry faces seem to last longer still. Fear, alarm and stress are factors too. Forty-five seconds with a live spider seems to last far longer to people who are afraid of spiders. Watching three minutes of video clips of the Sept. 11 attacks feels longer than watching a three-minute clip from “The Wizard of Oz.” Now consider that Mr. Trump’s first year in office must rank as the most chaotic and tumultuous in modern presidential history. Virtually every week served up a new drama: the firing of the national security adviser Michael Flynn; the firing of the F.B.I. director James Comey; the appointment of Robert Mueller as special counsel; Mr. Trump’s announcement, via Twitter, banning transgender people from the military; his bungled phone call to the widow of a soldier killed in Niger; his support of the Senate candidacy of Roy Moore; his pardon of the former Arizona sheriff Joe Arpaio; his mockery of the television host Mika Brzezinski; his failure to immediately denounce the white supremacist marchers in Charlottesville, Va.; his rants about the peaceful protests of professional football players; his taunting of the North Korean leader Kim Jong-un with his bigger “nuclear button.” It has been a 12-month-long emotional roller coaster, even for Mr. Trump’s supporters. © 2018 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 24551 - Posted: 01.22.2018

Ian Sample Science editor Donatella Versace finds it in the conflict of ideas, Jack White under pressure of deadlines. For William S Burroughs, an old Dadaist trick helped: cutting pages into pieces and rearranging the words. Every artist has their own way of generating original ideas, but what is happening inside the brain might not be so individual. In new research, scientists report signature patterns of neural activity that mark out those who are most creative. “We have identified a pattern of brain connectivity that varies across people, but is associated with the ability to come up with creative ideas,” said Roger Beaty, a psychologist at Harvard University. “It’s not like we can predict with perfect accuracy who’s going to be the next Einstein, but we can get a pretty good sense of how flexible a given person’s thinking is.” Creative thinking is one of the primary drivers of cultural and technological change, but the brain activity that underpins original thought has been hard to pin down. In an effort to shed light on the creative process, Beaty teamed up with colleagues in Austria and China to scan people’s brains as they came up with original ideas. The scientists asked the volunteers to perform a creative thinking task as they lay inside a brain scanner. While the machine recorded their white matter at work, the participants had 12 seconds to come up with the most imaginative use for an object that flashed up on a screen. Three independent scorers then rated their answers. © 2018 Guardian News and Media Limited

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 2: Functional Neuroanatomy: The Cells and Structure of the Nervous System
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 1: Cells and Structures: The Anatomy of the Nervous System
Link ID: 24531 - Posted: 01.16.2018

By Adam Bear, Rebecca Fortgang and Michael Bronstein Have you ever felt as though you predicted exactly when the light was going to turn green or sensed that the doorbell was about to ring? Imagine the possibility that these moments of clairvoyance occur simply because of a glitch in your mind’s time logs. What happened first — your thought about the doorbell or its actual ringing? It may have felt as if the thought came first, but when two events (ringing of doorbell, thought about doorbell) occur close together, we can mistake their order. This leads to the sense that we accurately predicted the future when, in fact, all we did was notice the past. In a recent study published in the Proceedings of the National Academy of Sciences, we found that this tendency to mix up the timing of thoughts and events may be more than a simple mental hiccup. We supposed that if some people are prone to mixing up the order of their thoughts and perceptions in this way, they could develop a host of odd beliefs. Most obviously, they might come to believe they are clairvoyant or psychic — having abilities to predict such things as whether it is going to rain. Further, these individuals might confabulate — unconsciously make up — explanations for why they have these special abilities, inferring that they are particularly important (even godlike) or are tapping into magical forces that transcend the physical world. Such beliefs are hallmarks of psychosis, seen in mental illnesses such as schizophrenia and bipolar disorder, but they are not uncommon in less-extreme forms in the general population. Would even ordinary people who mistime their thoughts and perceptions be more likely to hold ­delusion-like ideas? © 1996-2018 The Washington Post

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 16: Psychopathology: Biological Basis of Behavior Disorders
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 12: Psychopathology: The Biology of Behavioral Disorders
Link ID: 24527 - Posted: 01.15.2018