Links for Keyword: Attention



Links 1 - 20 of 589

By Kelly Servick PHILADELPHIA, PENNSYLVANIA—While artificial intelligence (AI) has been busy trouncing humans at Go and spawning eerily personable Alexas, some neuroscientists have harbored a different hope: that the types of algorithms driving those technologies can also yield some insight into the squishy, wet computers in our skulls. At the Conference on Cognitive Computational Neuroscience here this month, researchers presented new tools for comparing data from living brains with readouts from computational models known as deep neural networks. Such comparisons might offer up new hypotheses about how humans process sights and sounds, understand language, or navigate the world. “People have fantasized about that since the 1980s,” says Josh McDermott, a computational neuroscientist at the Massachusetts Institute of Technology (MIT) in Cambridge. Until recently, AI couldn’t come close to human performance on tasks such as recognizing sounds or classifying images. But deep neural networks, loosely inspired by the brain, have logged increasingly impressive performances, especially on visual tasks. That “brings the question back to mind,” says neuroscientist Chris Baker of the National Institute of Mental Health in Bethesda, Maryland. Deep neural networks work by passing information between computational “nodes” that are arranged in successive layers. The systems hone skills on huge sets of data; for networks that classify images, that usually means collections of labeled photos. Performance improves with feedback as the systems repeatedly adjust the strengths of the connections between nodes. © 2018 American Association for the Advancement of Science
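
For readers who want a concrete picture of the training loop described in the last two sentences (successive layers of nodes, labeled examples, and repeated adjustment of connection strengths from feedback), here is a minimal, hypothetical sketch in Python with NumPy. It is only a toy two-layer network fit to invented data, not one of the deep networks discussed at the conference, and every name and number in it is made up for illustration.

    # A toy sketch of the loop described above: pass data through layers of
    # "nodes," compare the output with labels, and repeatedly nudge the
    # connection strengths. Invented data; illustration only.
    import numpy as np

    rng = np.random.default_rng(0)

    # Invented labeled data: 200 points in 2-D, labeled by a simple rule.
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] ** 2 + X[:, 1] > 0.5).astype(float).reshape(-1, 1)

    # Connection strengths ("weights") between the layers of nodes.
    W1, b1 = rng.normal(scale=0.5, size=(2, 16)), np.zeros(16)
    W2, b2 = rng.normal(scale=0.5, size=(16, 1)), np.zeros(1)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    lr = 0.5  # how strongly each round of feedback adjusts the weights
    for step in range(2000):
        # Forward pass: information flows through the hidden layer to the output node.
        h = np.tanh(X @ W1 + b1)
        p = sigmoid(h @ W2 + b2)

        # Feedback: gradient of the classification error at the output.
        grad_out = (p - y) / len(X)

        # Backward pass: adjust each layer's connection strengths a little.
        dW2, db2 = h.T @ grad_out, grad_out.sum(axis=0)
        grad_h = (grad_out @ W2.T) * (1 - h ** 2)
        dW1, db1 = X.T @ grad_h, grad_h.sum(axis=0)

        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    print("training accuracy:", ((p > 0.5) == y).mean())

Run as written, this should print a high training accuracy, showing the same feedback-driven adjustment of connection strengths the article describes, just at a vastly smaller scale.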

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 7: Vision: From Eye to Brain
Link ID: 25466 - Posted: 09.18.2018

By: Richard Restak, M.D. Editor’s Note: Unthinkable’s author, a British neuroscientist, tracked down nine people with rare brain disorders to tell their stories. From the man who thinks he's a tiger to the doctor who feels the pain of others just by looking at them to a woman who hears music that’s not there, their experiences illustrate how the brain can shape our lives in unexpected and, in some cases, brilliant and alarming ways. Several years ago, science writer Helen Thomson, consultant to New Scientist and contributor to the Washington Post and Nature, decided to travel around the world to interview people with "the most extraordinary brains." In the process, as described in Unthinkable: An Extraordinary Journey Through the World's Strangest Brains (Ecco/Harper Collins 2018), Thomson discovered that "by putting their lives side-by-side, I was able to create a picture of how the brain functions in us all. Through their stories, I uncovered the mysterious manner in which the brain can shape our lives in unexpected—and, in some cases, brilliant and alarming ways." Thomson wasn't just learning about the most extraordinary brains in the world, but in the process was "uncovering the secrets of my own." During her journey Thomson encounters Bob, who can remember days from 40 years ago with as much clarity and detail as yesterday; Sharon, who has lost her navigational abilities and on occasion becomes lost in her own home; Tommy, who, after a ruptured aneurysm that damaged his left temporal lobe, underwent a total personality change; Sylvia, an otherwise normal retired school teacher who experiences near constant musical hallucinations; and Louise, who is afflicted with a permanent sense of detachment from herself and everyone around her. Beyond skillfully portraying each of these and other fascinating individuals, Thomson places them in historical and scientific context: when neuroscientists first encountered similar patients, along with past and current explanations of what has gone amiss in their brains. © 2018 The Dana Foundation

Related chapters from BN8e: Chapter 1: Biological Psychology: Scope and Outlook; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 1: An Introduction to Brain and Behavior; Chapter 14: Attention and Consciousness
Link ID: 25420 - Posted: 09.07.2018

By Bahar Gholipour Milena Canning can see steam rising from a coffee cup but not the cup. She can see her daughter’s ponytail swing from side to side, but she can’t see her daughter. Canning is blind, yet moving objects somehow find a way into her perception. Scientists studying her condition say it could reveal secrets about how humans process vision in general. Canning was 29 when a stroke destroyed her entire occipital lobe, the brain region housing the visual system. The event left her sightless, but one day she saw a flash of light from a metallic gift bag next to her. Her doctors told her she was hallucinating. Nevertheless, “I thought there must be something happening within my brain [allowing me to see],” she says. She went from doctor to doctor until she met Gordon Dutton, an ophthalmologist in Glasgow, Scotland. Dutton had encountered this mystery before—in a 1917 paper by neurologist George Riddoch describing brain-injured World War I soldiers. To help enhance Canning’s motion-based vision, Dutton prescribed her a rocking chair. Canning is one of a handful of people who have been diagnosed with the “Riddoch phenomenon,” the ability to perceive motion while blind to other visual stimuli. Jody Culham, a neuroscientist at Western University in Ontario, and her colleagues launched a 10-year investigation into Canning’s remarkable vision and published the results online in May in Neuropsychologia. The team confirmed that Canning was able to detect motion and its direction. She could see a hand moving toward her, but she could not tell a thumbs-up from a thumbs-down. She was also able to navigate around obstacles, reach and grasp, and catch a ball thrown at her. © 2018 Scientific American

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 7: Vision: From Eye to Brain
Link ID: 25409 - Posted: 09.01.2018

Megan Molteni It’s been more than a century since Spanish neuroanatomist Santiago Ramón y Cajal won the Nobel Prize for illustrating the way neurons allow you to walk, talk, think, and be. In the intervening hundred years, modern neuroscience hasn’t progressed that much in how it distinguishes one kind of neuron from another. Sure, the microscopes are better, but brain cells are still primarily defined by two labor-intensive characteristics: how they look and how they fire. Which is why neuroscientists around the world are rushing to adopt new, more nuanced ways to characterize neurons. Sequencing technologies, for one, can reveal how cells with the same exact DNA turn their genes on or off in unique ways—and these methods are beginning to reveal that the brain is a more diverse forest of bristling nodes and branching energies than even Ramón y Cajal could have imagined. On Monday, an international team of researchers introduced the world to a new kind of neuron, which, at this point, is believed to exist only in the human brain. The long nerve fibers known as axons of these densely bundled cells bulge in a way that reminded their discoverers of a rose without its petals—so much that they named them “rose hip cells.” Described in the latest issue of Nature Neuroscience, these new neurons might use their specialized shape to control the flow of information from one region of the brain to another. “They can really act as a sort of brake on the system,” says Ed Lein, an investigator at the Allen Institute for Brain Science—home to several ambitious brain mapping projects—and one of the lead authors on the study. Neurons come in two basic flavors: Excitatory cells send information to the cells next to them, while inhibitory cells slow down or stop excitatory cells from firing. Rose hip cells belong to this latter type, and based on their physiology, seem to be a particularly potent current-curber. © 2018 Condé Nast

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 25391 - Posted: 08.28.2018

By Anna Clemens In 2003 a 65-year-old man brought a strange problem to neurologist Adam Zeman, now at the University of Exeter in England. The patient, later dubbed “MX,” claimed he could not conjure images of friends, family members or recently visited places. All his life, MX, a retired surveyor, had loved reading novels and had routinely drifted off to sleep visualizing buildings, loved ones and recent events. But after undergoing a procedure to open arteries in his heart, during which he probably suffered a minor stroke, his mind’s eye went blind. He could see normally, but he could not form pictures in his mind. Zeman had never encountered anything like it and set out to learn more. He has since given the condition a name—aphantasia (phantasia means “imagination” in Greek). And he and others are exploring its neurological underpinnings. Zeman and his colleagues began their analysis by testing MX’s visual imagination in several ways. Compared with control subjects, MX scored poorly on questionnaires assessing the ability to produce visual imagery. Surprisingly, though, he was able to accomplish tasks that typically involve visualization. For example, when asked to say which is a lighter color of green—grass or pine trees—most people would decide by imagining both grass and tree and comparing them. MX correctly said that pine trees are darker than grass, but he insisted he had used no visual imagery to make the decision. “I just know the answer,” he said. © 2018 Scientific American

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 25279 - Posted: 08.01.2018

By Stephen T. Casper The case report is dead. At least, it seems all but so in the realm of evidence-based medicine. It is thus thoroughly refreshing to read Helen Thomson’s Unthinkable: An Extraordinary Journey Through the World’s Strangest Brains and Eric R. Kandel’s The Disordered Mind: What Unusual Brains Tell Us About Ourselves, two ambitious books that draw on clinical profiles to tell stories about our brains and minds. Thomson’s memoir aims to help us understand our brains through stories about exceptional others, who, she argues, may serve as proxies for ourselves. Kandel’s book argues from neuroscience research and individual illness experiences for a biologically informed account of mind and brain. Both authors are unapologetic in their focus on what might be dismissed as merely anecdotal. Each foregrounds neurological and psychiatric patient narratives and experiences and from these draws out larger philosophical and scientific lessons. By profiling and seeking meaning in individuals with curious neurological conditions, Thomson’s Unthinkable follows a well-worn literary path but revitalizes the genre with an original and subtle shift to the personal. Perfected by neurologist Oliver Sacks, Thomson’s technique was invented before the 19th century but most famously pioneered in the 20th century by such eminent neurologists as Morton Prince, Sigmund Freud, and Alexander Luria. Where those authors represented patients as medical mysteries or as object lessons in physiology and philosophy, Thomson finds a timelier focus that corresponds with the growing advocacy for, and social attention to, individual patients’ rights. Unlike her predecessors in the genre, Thomson enters her subjects’ lives—their restaurants, homes, families, communities, and online selves. © 2017 American Association for the Advancement of Science

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Lateralization
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 15: Brain Asymmetry, Spatial Cognition, and Language
Link ID: 25278 - Posted: 08.01.2018

By Darold A. Treffert Savant syndrome comes in different forms. In congenital savant syndrome the extraordinary savant ability surfaces in early childhood. In acquired savant syndrome astonishing new abilities, typically in music, art or mathematics, appear unexpectedly in ordinary persons after a head injury, stroke or other central nervous system (CNS) incident where no such abilities or interests were present pre-incident. But in sudden savant syndrome an ordinary person with no such prior interest or ability and no precipitating injury or other CNS incident has an unanticipated, spontaneous epiphanylike moment where the rules and intricacies of music, art or mathematics, for example, are experienced and revealed, producing almost instantaneous giftedness and ability in the affected area of skill sets. Because there is no underlying disability such as that which occurs in congenital or acquired savant syndromes, technically sudden savant syndrome would be better termed sudden genius. A 28-year-old gentleman from Israel, K. A., sent his description of his epiphany moment. He was in a mall where there was a piano. Whereas he could play simple popular songs from rote memory before, “suddenly at age 28 after what I can best describe as a ‘just getting it moment,’ it all seemed so simple. I suddenly was playing like a well-educated pianist.” His friends were astonished as he played and suddenly understood music in an entirely intricate way. “I suddenly realized what the major scale and minor scale were, what their chords were and where to put my fingers in order to play certain parts of the scale. I was instantly able to recognize harmonies of the scales in songs I knew as well as the ability to play melody by interval recognition.” He began to search the internet for information on music theory and to his amazement “most of what they had to teach I already knew, which baffled me as to how could I know something I had never studied." © 2018 Scientific American

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 25255 - Posted: 07.26.2018

By Jocelyn Kaiser Basic brain and behavioral researchers will get more than a year to comply with a new U.S. policy that will treat many of their studies as clinical trials. The announcement from the National Institutes of Health (NIH) appears to defuse, for now, a yearlong controversy over whether basic research on humans should follow the same rules as studies testing drugs. Although research groups had hoped NIH would drop its plans to tag basic studies with humans as trials, they say they’re relieved they get more time to prepare and give the agency input. “It’s a positive step forward,” says Paula Skedsvold, executive director of the Federation of Associations in Behavioral & Brain Sciences in Washington, D.C. At issue is a recently revised definition of a clinical trial along with a set of rules in effect since January that are meant to increase the rigor and transparency of NIH-funded clinical trials. About a year ago, basic scientists who study human cognition—for example, using brain imaging with healthy volunteers—were alarmed to realize many of these studies fit the new clinical trial definition. Researchers protested that many requirements, such as registering and reporting results in the ClinicalTrials.gov federal database, made no sense for studies that weren’t testing a treatment and would confuse the public. NIH then issued a set of case studies explaining that only some basic studies would fall under the trials definition. But concerns remained about confusing criteria and burdensome new paperwork. © 2018 American Association for the Advancement of Science

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 25248 - Posted: 07.25.2018

Allison Aubrey Can't cool off this summer? Heat waves can slow us down in ways we may not realize. New research suggests heat stress can muddle our thinking, making simple math a little harder to do. "There's evidence that our brains are susceptible to temperature abnormalities," says Joe Allen, co-director of the Center for Climate, Health and the Global Environment at Harvard University. And as the climate changes, temperatures spike and heat waves are more frequent. To learn more about how the heat influences young, healthy adults, Allen and his colleagues studied college students living in dorms during a summer heat wave in Boston. Half of the students lived in buildings with central AC, where the indoor air temperature averaged 71 degrees. The other half lived in dorms with no AC, where air temperatures averaged almost 80 degrees. "In the morning, when they woke up, we pushed tests out to their cellphones," explains Allen. The students took two tests a day for 12 consecutive days. One test, which included basic addition and subtraction, measured cognitive speed and memory. A second test assessed attention and processing speed. "We found that the students who were in the non-air-conditioned buildings actually had slower reaction times: 13 percent lower performance on basic arithmetic tests, and nearly a 10 percent reduction in the number of correct responses per minute," Allen explains. The results, published in PLOS Medicine, may come as a surprise. "I think it's a little bit akin to the frog in the boiling water," Allen says. There's a "slow, steady — largely imperceptible — rise in temperature, and you don't realize it's having an impact on you." © 2018 npr

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 25211 - Posted: 07.16.2018

By Erica Goode Suppose that, seeking a fun evening out, you pay $175 for a ticket to a new Broadway musical. Seated in the balcony, you quickly realize that the acting is bad, the sets are ugly and no one, you suspect, will go home humming the melodies. Do you head out the door at the intermission, or stick it out for the duration? Studies of human decision-making suggest that most people will stay put, even though money spent in the past logically should have no bearing on the choice. This “sunk cost fallacy,” as economists call it, is one of many ways that humans allow emotions to affect their choices, sometimes to their own detriment. But the tendency to factor past investments into decision-making is apparently not limited to Homo sapiens. In a study published on Thursday in the journal Science, investigators at the University of Minnesota reported that mice and rats were just as likely as humans to be influenced by sunk costs. The more time they invested in waiting for a reward — in the case of the rodents, flavored pellets; in the case of the humans, entertaining videos — the less likely they were to quit the pursuit before the delay ended. “Whatever is going on in the humans is also going on in the nonhuman animals,” said A. David Redish, a professor of neuroscience at the University of Minnesota and an author of the study. This cross-species consistency, he and others said, suggested that in some decision-making situations, taking account of how much has already been invested might pay off. “Evolution by natural selection would not promote any behavior unless it had some — perhaps obscure — net overall benefit,” said Alex Kacelnik, a professor of behavioral ecology at Oxford, who praised the new study as “rigorous” in its methodology and “well designed.” © 2018 The New York Times Company

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 25203 - Posted: 07.13.2018

By Eric Allen Been “A new generation of scientists is not satisfied merely to watch and describe brain activity,” writes David Adam. “They want to interfere, to change and improve the brain — to neuroenhance it.” In his new book “The Genius Within: Unlocking Your Brain’s Potential” (Pegasus), Adam offers a many-sided investigation of neuroenhancement — a hodgepodge of technologies and drug treatments aimed at improving intelligence. A London-based science writer and editor, he previously wrote about obsessive-compulsive disorder, its history, and his own struggle with it in “The Man Who Couldn’t Stop” (2014). “We wonder at the stars, and then we start to work out how far away things are. And then we design a spacecraft that’s going to take us up there. I think that’s happened with neuroscience.” For this installment of the Undark Five, I talked with Adam about neuroenhancement — among other things, whether it’s fair to enhance some people’s cognitive abilities but not others’, why the subject of intelligence makes so many people uncomfortable, and whether “smart drugs” will one day make us all Einsteins. Here’s our conversation, edited for length and clarity. UNDARK — There’s been a shift within neuroscience from not just trying to understand how the brain works but to enhance it. How did that happen? Copyright 2018 Undark

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Link ID: 25164 - Posted: 07.02.2018

By Simon Makin The electrical oscillations we call brain waves have intrigued scientists and the public for more than a century. But their function—and even whether they have one, rather than just reflecting brain activity like an engine’s hum—is still debated. Many neuroscientists have assumed that if brain waves do anything, it is by oscillating in synchrony in different locations. Yet a growing body of research suggests many brain waves are actually “traveling waves” that physically move through the brain like waves on the sea. Now a new study from a team at Columbia University led by neuroscientist Joshua Jacobs suggests traveling waves are widespread in the human cortex—the seat of higher cognitive functions—and that they become more organized depending on how well the brain is performing a task. This shows the waves are relevant to behavior, bolstering previous research suggesting they are an important but overlooked brain mechanism that contributes to memory, perception, attention and even consciousness. Brain waves were first discovered using electroencephalogram (EEG) techniques, which involve placing electrodes on the scalp. Researchers have noted activity over a range of different frequencies, from delta (0.5 to 4 hertz) through to gamma (25 to 140 Hz) waves. The slowest occur during deep sleep, with increasing frequency associated with increasing levels of consciousness and concentration. Interpreting EEG data is difficult due to its poor ability to pinpoint the location of activity, and the fact that passage through the head blurs the signals. The new study, published earlier this month in Neuron, used a more recent technique called electrocorticography (ECoG). This involves placing electrode arrays directly on the brain’s surface, minimizing distortions and vastly improving spatial resolution. © 2018 Scientific American

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Link ID: 25159 - Posted: 06.29.2018

David Levari Why do many problems in life seem to stubbornly stick around, no matter how hard people work to fix them? It turns out that a quirk in the way human brains process information means that when something becomes rare, we sometimes see it in more places than ever. Think of a “neighborhood watch” made up of volunteers who call the police when they see anything suspicious. Imagine a new volunteer who joins the watch to help lower crime in the area. When they first start volunteering, they raise the alarm when they see signs of serious crimes, like assault or burglary. Let’s assume these efforts help and, over time, assaults and burglaries become rarer in the neighborhood. What would the volunteer do next? One possibility is that they would relax and stop calling the police. After all, the serious crimes they used to worry about are a thing of the past. But you may share the intuition my research group had – that many volunteers in this situation wouldn’t relax just because crime went down. Instead, they’d start calling things “suspicious” that they would never have cared about back when crime was high, like jaywalking or loitering at night. You can probably think of many similar situations in which problems never seem to go away, because people keep changing how they define them. This is sometimes called “concept creep,” or “moving the goalposts,” and it can be a frustrating experience. How can you know if you’re making progress solving a problem, when you keep redefining what it means to solve it? My colleagues and I wanted to understand when this kind of behavior happens, why, and if it can be prevented. © 2010–2018, The Conversation US, Inc.

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 25158 - Posted: 06.29.2018

By Frank Bures Even if there is no sonic weapon, or genitals are not truly shrinking, these conditions are all quite real to the sufferers, just as depression and anxiety are real. One afternoon in May of 2004, a third-grade boy at a local school in Fuhu reported feeling that his genitals were shrinking. He panicked, ran home, and his parents fetched the local healer — an 80-year-old woman who had seen this sort of thing before: In 1963, she said, around the time of the Great Leap Forward, an “evil wind” had blown through the village and many people were struck by this illness known as “suo-yang.” She treated the boy by traditional means and he recovered quickly. Two days later, when the school principal learned of the incident, he gathered all 680 students in the school courtyard and, according to a report by Dr. Li Jie of the Guangzhou Psychiatric Hospital, “explained to the students in detail what had happened, and warned them to be cautious, and to take emergency measures if they experienced similar symptoms.” Within two days, 64 other boys were struck with suo-yang, which, in its epidemic form, is referred to in the scientific literature as a “mass psychogenic illness” or a “collective stress response.” The Fuhu case was a textbook example of how such an illness can spread through a group of people, and the headmaster did the worst possible thing by explaining the symptoms in detail and assuring students they were in danger. He all but caused the epidemic. Copyright 2018 Undark

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 25123 - Posted: 06.22.2018

Leslie Henderson Anti-immigrant policies, race-related demonstrations, Title IX disputes, affirmative action court cases, same-sex marriage litigation. These issues are continually in the headlines. But even thoughtful articles on these subjects seem always to devolve to pitting warring factions against each other: black versus white, women versus men, gay versus straight. At the most fundamental level of biology, people recognize the innate advantage of defining differences in species. But even within species, is there something in our neural circuits that leads us to find comfort in those like us and unease with those who may differ? As in all animals, human brains balance two primordial systems. One includes a brain region called the amygdala that can generate fear and distrust of things that pose a danger – think predators or being lost somewhere unknown. The other, a group of connected structures called the mesolimbic system, can give rise to pleasure and feelings of reward in response to things that make it more likely we’ll flourish and survive – think not only food, but also social pleasure, like trust. But how do these systems interact to influence how we form our concepts of community? Implicit association tests can uncover the strength of unconscious associations. Scientists have shown that many people harbor an implicit preference for their in-group – those like themselves – even when they show no outward or obvious signs of bias. For example, in studies whites perceive blacks as more violent and more apt to do harm, solely because they are black, and this unconscious bias is evident even toward black boys as young as five years old. © 2010–2018, The Conversation US, Inc.

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 25122 - Posted: 06.22.2018

Some human brains are nearly twice the size of others – but how might that matter? Researchers at the National Institute of Mental Health (NIMH) and their NIH grant-funded colleagues have discovered that these differences in size are related to the brain’s shape and the way it is organized. The bigger the brain, the more its additional area is accounted for by growth in thinking areas of the cortex, or outer mantle – at the expense of relatively slower growth in lower order emotional, sensory, and motor areas. This mirrors the pattern of brain changes seen in evolution and individual development – with higher-order areas showing greatest expansion. The researchers also found evidence linking the high-expanding regions to higher connectivity between neurons and higher energy consumption. “Just as different parts are required to scale-up a garden shed to the size of a mansion, it seems that big primate brains have to be built to different proportions,” explained Armin Raznahan, M.D., Ph.D., of the NIMH Intramural Research Program (IRP). “An extra investment has to be made in the part that integrates information – but that’s not to say that it’s better to have a bigger brain. Our findings speak more to the different organizational needs of larger vs. smaller brains.” Raznahan, P.K. Reardon, Jakob Seidlitz, and colleagues at more than six collaborating research centers report on their study incorporating brain scan data from more than 3,000 people in Science. Reardon and Seidlitz are students in the NIH Oxford-Cambridge Scholars Program.

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 25045 - Posted: 06.01.2018

By Simon Makin Everyone has unwelcome thoughts from time to time. But such intrusions can signal serious psychiatric conditions—from “flashbacks” in post-traumatic stress disorder (PTSD) to obsessive negative thinking in depression to hallucinations in schizophrenia. “These are some of the most debilitating symptoms,” says neuroscientist Michael Anderson of the University of Cambridge. New research led by Anderson and neuroscientist Taylor Schmitz, now at McGill University, suggests these symptoms may all stem from a faulty brain mechanism responsible for blocking thoughts. Researchers studying this faculty usually focus on the prefrontal cortex (PFC), a control center that directs the activity of other brain regions. But Anderson and his colleagues noticed that conditions featuring intrusive thoughts—such as schizophrenia—often involve increased activity in the hippocampus, an important memory region. The severity of symptoms such as hallucinations also increases with this elevated activity. In the new study, Anderson and his team had healthy participants learn a series of word pairs. The subjects were presented with one word and had to either recall or suppress the associated one. When participants suppressed thoughts, brain scans detected increased activity in part of the PFC and reduced activity in the hippocampus. The findings, which were published last November in Nature Communications, are consistent with a brain circuit in which a “stop” command from the PFC suppresses hippocampus activity. © 2018 Scientific American

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 24864 - Posted: 04.13.2018

Jason Murugesu We all daydream, whether about marrying Rihanna, discovering a sudden ability to sing opera or never having to answer another email again. Yet it is only in the last few decades that the science behind daydreaming, or mind-wandering as it is termed in most academic literature, has transitioned from the realms of pseudoscience to the cutting edge of cognitive neuroscience. At its most basic, daydreaming is your mind wandering from the here and now. Traditionally, daydreaming was considered to be a single psychological state of mind. This, however, caused conflict in academic literature, and the resulting confusion is the reason why you might read that daydreaming is linked to happiness in one paper, but to depression in the next. Different types of mind-wandering have been conflated. Using neuroimaging techniques, a study conducted last year by the University of York found that different types of daydreams – for example, those which are fantastical, autobiographical, future oriented or past oriented – were built up of different neuronal activation patterns, and as such could not be considered a single psychological construct. Nevertheless, if we consider all these types of mind-wandering together, you would be surprised by how much of our waking time we spend daydreaming. In 2008, Professor Matthew Killingsworth, then at Harvard University, used an app that contacted a large group of people at random points of the day to find out how often they were daydreaming. The app would ask its users what they were doing, and whether they were thinking about something else entirely. They found that 46.9 per cent of the time, the user was mind-wandering.

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 24852 - Posted: 04.11.2018

by Meeri Kim On a beautiful autumn afternoon in New York’s Central Park, Carol Berman had the horrifying realization that her husband of 40 years no longer recognized her as his wife. In his eyes, she wasn’t the real Carol but rather some strange woman pretending to be Carol — effectively, an impostor. They were out for a stroll when he started yelling at a woman with a similar hairdo farther up the street: “Carol! Carol, come here!” Shocked, his wife faced him head-on, looked deep into his eyes and reassured him that she was right here. But he refused to acknowledge her as the real Carol. Marty Berman had been a warmhearted, highly intelligent and hard-working patent lawyer for much of his life. But at 74, he began to show signs of dementia. Once proficient in math and engineering, he could no longer subtract simple numbers correctly. A man who had walked the whole of Manhattan couldn’t go a few blocks by himself anymore without getting lost. Perhaps the most painful part for Carol was when her husband’s delusion developed a year or two after his initial symptoms arose. Capgras syndrome is a psychological condition that prompts a person to believe that loved ones have been replaced by identical duplicates of themselves. As a clinical assistant professor of psychiatry at New York University, Carol had treated several Capgras patients. But witnessing the delusion in the person she loved the most, whom she was already losing to dementia, was agonizing. © 1996-2018 The Washington Post

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 24833 - Posted: 04.07.2018

By Simon Makin Everyone has unwelcome thoughts from time to time. But such intrusions can signal serious psychiatric conditions—from “flashbacks” in post-traumatic stress disorder (PTSD) to obsessive negative thinking in depression to hallucinations in schizophrenia. “These are some of the most debilitating symptoms,” says neuroscientist Michael Anderson of the University of Cambridge. New research led by Anderson and neuroscientist Taylor Schmitz, now at McGill University, suggests these symptoms may all stem from a faulty brain mechanism responsible for blocking thoughts. Researchers studying this faculty usually focus on the prefrontal cortex (PFC), a control center that directs the activity of other brain regions. But Anderson and his colleagues noticed that conditions featuring intrusive thoughts—such as schizophrenia—often involve increased activity in the hippocampus, an important memory region. The severity of symptoms such as hallucinations also increases with this elevated activity. In the new study, Anderson and his team had healthy participants learn a series of word pairs. The subjects were presented with one word and had to either recall or suppress the associated one. When participants suppressed thoughts, brain scans detected increased activity in part of the PFC and reduced activity in the hippocampus. The findings, which were published last November in Nature Communications, are consistent with a brain circuit in which a “stop” command from the PFC suppresses hippocampus activity. © 2018 Scientific American

Related chapters from BN8e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 24775 - Posted: 03.21.2018