Links for Keyword: Attention

Links 101 - 120 of 699

By Scott Barry Kaufman

What is going on in our brains when we are creating? How does our brain look different when we are engaging in art vs. science? How do the brains of genius creators differ from those of the rest of us? What are some of the limitations of studying the creative brain? The neuroscience of creativity is booming. There is now a society (with an annual conference), an edited volume, a handbook, and an entire textbook on the topic. Bringing together the latest research from a number of scientists, Anna Abraham wrote a wonderful resource that covers some of the most hot-button topics in the field. She was gracious enough to do a Q&A with me. Enjoy!

SBK: How’d you get interested in the neuroscience of creativity?

AA: I have always been curious about creativity. At the most fundamental level I think I simply wanted to get my head around the mystery of this marvelous ability that each of us possesses. In particular, I hoped to find out what makes some people more creative than others. When I saw an opportunity to pursue a PhD in neuroscience in the early 2000s on any topic of my choice, I went all in - it was an exciting and promising approach that had until then seen only limited use in exploring the creative mind.

SBK: What is creativity? Does the field have a unified, agreed-upon definition of creativity that you are satisfied with?

AA: There is a surprising level of unanimity in the field when it comes to a boilerplate definition. Most experts agree that two elements are central to creativity. First and foremost, it reflects our capacity to generate ideas that are original, unusual or novel in some way. The second element is that these ideas also need to be satisfying, appropriate or suited to the context in question. I am reasonably satisfied with this definition, but not with how it guides scientific enquiry.

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 25839 - Posted: 01.05.2019

By Christina Karns ’Tis the season when the conversation shifts to what you’re thankful for. Gathered with family and friends around a holiday feast, for instance, people may recount some of the biggies — such as their health or their children — or smaller things that enhance everyday life — such as happening upon a great movie while channel surfing or enjoying a favorite seasonal food. Psychology researchers recognize that taking time to be thankful has benefits for well-being. Gratitude not only goes along with more optimism, less anxiety and depression, and greater goal attainment, but also is associated with fewer symptoms of illness and other physical benefits. In recent years, researchers have been making connections between the internal experience of gratitude and the external practice of altruism. How does being thankful about things in your own life relate to any selfless concern you may have about the well-being of others? As a neuroscientist, I’m particularly interested in the brain regions and connections that support gratitude and altruism. I’ve been exploring how changes in one might lead to changes in the other. To study the relationship between gratitude and altruism in the brain, my colleagues and I first asked volunteers questions meant to tease out how frequently they feel thankful and the degree to which they tend to care about the well-being of others. Then we used statistics to determine the extent to which someone’s gratitude could predict their altruism. As others have found, the more grateful people in this group tended to be more altruistic. © 1996-2018 The Washington Post

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 25823 - Posted: 12.26.2018

Phil Jaekl Driving a car is a complex task for a brain to coordinate. A driver may drink a cup of coffee and have a conversation with a passenger, all while safely piloting a vehicle through traffic. But all of this activity requires attention—that is, concentrating on the tasks and sources of information that matter and blocking out those that don’t. How the brain manages that orchestration is a long-standing scientific mystery. One prominent view, based on findings from human behavioral studies, is that the brain guides us through a world chock-full of sensory inputs by focusing a metaphorical spotlight on what it deems important, while filtering out relatively trivial details. Unlike some other, functionally well-defined aspects of cognition, this attentional spotlight has eluded scientific understanding. Its neural substrates have been particularly difficult to pin down to specific activities and locations in the brain—although several studies have implicated the frontoparietal network, which spans the frontal and parietal lobes of the brain. Meanwhile, attention studies involving visual tasks that require continuous focus—detecting a small object flashing on a cluttered computer screen, for example—have shown that task performance varies over short time intervals, with episodes of peak performance and of poor performance alternating on millisecond timescales. Such research suggests that the attentional spotlight might not be as constant as once thought. Yet, until now, researchers have not been able to directly connect these changes in performance to fluctuations in brain activity. © 1986 - 2018 The Scientist

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 25800 - Posted: 12.20.2018

By Kuheli Dutt When neuroscientist Ben Barres delivered his first seminar, an audience member praised him, commenting that Ben’s work was much better than that of his sister, Barbara Barres. The irony? Ben Barres (now deceased), a transgender scientist, was Barbara Barres before he transitioned to male. When New York Times columnist Brent Staples was a graduate student in Chicago’s Hyde Park, he found that white people on the street perceived him, an African American, as a threat to their safety. They were visibly tense around him, clutched their purses and sometimes even crossed the street to avoid him. But when he started whistling tunes from classical music, people suddenly weren’t afraid of him anymore—they relaxed and some even smiled at him. Implicit bias runs far deeper than we realize. A riddle used at implicit bias trainings goes like this: A father and his son are in a terrible car crash. The father dies at the scene. His son, in critical condition, is rushed to the hospital; he’s in the operating room, about to go under the knife. The surgeon says, “I can’t operate on this boy—he’s my son!” The audience is then asked how that’s possible. Responses include several scenarios: two gay fathers; one biological and one adopted father; one father and one priest (religious father); all of which are possible. However, an obvious answer that most people miss: the surgeon is the boy’s mother. Whether we like it or not, we are conditioned to associate surgeons with being male. © 2018 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 25799 - Posted: 12.20.2018

Alison Abbott Doris Tsao launched her career deciphering faces — but for a few weeks in September, she struggled to control the expression on her own. Tsao had just won a MacArthur Foundation ‘genius’ award, an honour that comes with more than half a million dollars to use however the recipient wants. But she was sworn to secrecy — even when the foundation sent a film crew to her laboratory at the California Institute of Technology (Caltech) in Pasadena. Thrilled and embarrassed at the same time, she had to invent an explanation, all while keeping her face in check. It was her work on faces that won Tsao awards and acclaim. Last year, she cracked the code that the brain uses to recognize faces from a multitude of minuscule differences in shapes, distances between features, tones and textures. The simplicity of the coding surprised and impressed the neuroscience community. “Her work has been transformative,” says Tom Mrsic-Flogel, director of the Sainsbury Wellcome Centre for Neural Circuits and Behaviour at University College London. But Tsao doesn’t want to be remembered just as the scientist who discovered the face code. It is a means to an end, she says, a good tool for approaching the question that really interests her: how does the brain build up a complete, coherent model of the world by filling in gaps in perception? “This idea has an elegant mathematical formulation,” she says, but it has been notoriously hard to put to the test. Tsao now has an idea of how to begin. © 2018 Springer Nature Publishing AG

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 25773 - Posted: 12.11.2018

By Aaron E. Carroll Even before the recent news that a group of researchers managed to get several ridiculous fake studies published in reputable academic journals, people have been aware of problems with peer review. Throwing out the system — which deems whether research is robust and worth being published — would do more harm than good. But it makes sense to be aware of peer review’s potential weaknesses. Reviewers may be overworked and underprepared. Although they’re experts in the subject they are reading about, they get no specific training to do peer review, and are rarely paid for it. With 2.5 million peer-reviewed papers published annually worldwide — and more that are reviewed but never published — it can be hard to find enough people to review all the work. There is evidence that reviewers are not always consistent. A 2010 paper describes a study in which two researchers selected 12 articles already accepted by highly regarded journals, swapped the real names and academic affiliations for false ones, and resubmitted the identical material to the same journals that had already accepted them in the previous 18 to 32 months. Only 8 percent of editors or reviewers noticed the duplication, and three papers were detected and pulled. Of the nine papers that continued through the review process, eight were turned down, with 89 percent of reviewers recommending rejection. Peer review may be inhibiting innovation. It takes significant reviewer agreement to have a paper accepted. One potential downside is that important research bucking a trend or overturning accepted wisdom may face challenges surviving peer review. In 2015, a study published in P.N.A.S. tracked more than 1,000 manuscripts submitted to three prestigious medical journals. Of the 808 that were published at some point, the 2 percent that were most frequently cited had been rejected by the journals. An even bigger issue is that peer review may be biased. Reviewers can usually see the names of the authors and their institutions, and multiple studies have shown that reviewers preferentially accept or reject articles based on a number of demographic factors. In a study published in eLife last year, researchers created a database consisting of more than 9,000 editors, 43,000 reviewers and 126,000 authors whose work led to about 41,000 articles in 142 journals in a number of domains. They found that women made up only 26 percent of editors, 28 percent of reviewers and 37 percent of authors. Analyses showed that this was not because fewer women were available for each role. © 2018 The New York Times Company

Related chapters from BN: Chapter 1: Introduction: Scope and Outlook
Related chapters from MM:Chapter 20:
Link ID: 25643 - Posted: 11.05.2018

Jon Hamilton An ancient part of the brain long ignored by the scientific world appears to play a critical role in everything from language and emotions to daily planning. It's the cerebellum, which is found in fish and lizards as well as people. But in the human brain, this structure is wired to areas involved in higher-order thinking, a team led by researchers from Washington University in St. Louis reports Thursday in the journal Neuron. "We think that the cerebellum is acting as the brain's ultimate quality control unit," says Scott Marek, a postdoctoral research scholar and the study's first author. The finding adds to the growing evidence that the cerebellum "isn't only involved in sensory-motor function, it's involved in everything we do," says Dr. Jeremy Schmahmann, a neurology professor at Harvard and director of the ataxia unit at Massachusetts General Hospital. Schmahmann, who wasn't involved in the new study, has been arguing for decades that the cerebellum plays a key role in many aspects of human behavior, as well as mental disorders such as schizophrenia. But only a handful of scientists have explored functions of the cerebellum beyond motor control. "It's been woefully understudied," says Dr. Nico Dosenbach, a professor of neurology at Washington University whose lab conducted the study. Even now, many scientists think of the cerebellum as the part of the brain that lets you pass a roadside sobriety test. It helps you do things like walk in a straight line or stand on one leg or track a moving object — if you're not drunk. © 2018 npr

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 5: The Sensorimotor System
Link ID: 25624 - Posted: 10.27.2018

By Frankie Schembri Humans are awful at estimating a person’s age based on their face alone. This can lead not only to uncomfortable social situations, but also to critical errors in criminal investigations and enforcing age-based restrictions on such things as alcohol and gambling. New research shows people are usually off by about 8 years, and their estimate might be shaped by the last face they saw. To conduct the study, researchers collected 3968 pictures of consenting participants from the Australian Passport Office—31 men and 31 women at each age from 7 through 70. Then, they showed 81 people photographs of a man and woman at each age in a random sequence, and asked them to guess their ages. The faces above are computer-generated averages of more than 100 pictures from the study of people aged 19 to 22, 50 to 53, and 63 to 66. Volunteers consistently guessed that young faces were several years older than they actually were and that older faces were several years younger than they actually were, the team reports today in Royal Society Open Science. The results also showed that people’s estimates were affected by the previous face they had viewed—if they had just seen a young face, they usually lowballed the next face’s age, and vice versa. © 2018 American Association for the Advancement of Science

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 25581 - Posted: 10.17.2018

By Frankie Schembri Think of all the faces you know. As you flick through your mental Rolodex, your friends, family, and co-workers probably come first—along with celebrities—followed by the faces of the nameless strangers you encounter during your daily routine. But how many faces can the human Rolodex store? To ballpark the size of the average person’s “facial vocabulary,” researchers gave 25 people 1 hour to list as many faces from their personal lives as possible, and then another hour to do the same with famous faces, like those of actors, politicians, and musicians. If the participants couldn’t remember a person’s name, but could imagine their face, they used a descriptive phrase like “the high school janitor,” or “the actress from Friends with the haircut.” People came up with lots of faces during the first minutes of the test, but the rate of remembrance dropped over the course of the hour. By graphing this relationship and extrapolating it to when most people would run out of faces, the researchers estimated the number of faces an average person can recall from memory. To figure out how many additional faces people recognized but were unable to recall without prompting, researchers showed the participants photographs of 3441 celebrities, including Barack Obama and Tom Cruise. To qualify as “knowing” a face, the participants had to recognize two different photos of each person. © 2018 American Association for the Advancement of Science
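One detail worth unpacking is the extrapolation step: recall slows over the hour, so the researchers graphed the cumulative count of faces produced and projected where it would level off. As a purely illustrative sketch of that general idea (the saturating functional form, the parameters, and the fake recall data below are all invented for the example and are not the study's), one could fit a curve with a ceiling and read off its asymptote:

```python
# Illustrative only: made-up data and a hypothetical saturating form, just to
# show the idea of extrapolating a slowing recall curve to its ceiling.
import numpy as np
from scipy.optimize import curve_fit

minutes = np.arange(1, 61, dtype=float)            # a one-hour recall session
rng = np.random.default_rng(1)
# Fake cumulative counts of faces recalled, slowing down over the hour.
recalled = 400 * (1 - np.exp(-minutes / 25)) + rng.normal(0, 5, minutes.size)

def saturating(t, n_max, tau):
    """Exponential approach to a ceiling of n_max recallable faces."""
    return n_max * (1 - np.exp(-t / tau))

(n_max_hat, tau_hat), _ = curve_fit(saturating, minutes, recalled, p0=(500, 20))
print(f"Estimated ceiling: roughly {n_max_hat:.0f} faces")
```

The estimate of the ceiling, not the count reached within the hour, is what stands in for the size of a person's recallable "facial vocabulary" in a sketch like this.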

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 25552 - Posted: 10.10.2018

By Emily Underwood The ornately folded outer layer of the human brain, the cerebral cortex, has long received nearly all the credit for our ability to perform complex cognitive tasks such as composing a sonata, imagining the plot of a novel or reflecting on our own thoughts. One explanation for how we got these abilities is that the cortex rapidly expanded relative to body size as primates evolved — the human cortex has 10 times the surface area of a monkey’s cortex, for example, and 1,000 times that of a mouse. But the cortex is not the only brain region that has gotten bigger and more complex throughout evolution. Nestled beneath the cortex, a pair of egg-shaped structures called the thalamus has also grown, and its wiring became much more intricate as mammals diverged from reptiles. The thalamus — from the Greek thalamos, or inner chamber — transmits 98 percent of sensory information to the cortex, including vision, taste, touch and balance; the only sense that doesn’t pass through this brain region is smell. The thalamus also conducts motor signals and relays information from the brain stem to the cortex, coordinating shifts in consciousness such as waking up and falling asleep. Scientists have known for decades that the thalamus faithfully transmits information about the visual world from the retina to the cortex, leading to the impression that it is largely a messenger of sensory information rather than a center of complex cognition itself. But that limited, passive view of the thalamus is outdated, maintains Michael Halassa, a neuroscientist at the Massachusetts Institute of Technology who recently coauthored (with Ralf D. Wimmer and Rajeev V. Rikhye) an article in the Annual Review of Neuroscience exploring the thalamus’s role. © 2018 Annual Reviews, Inc

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 2: Functional Neuroanatomy: The Cells and Structure of the Nervous System
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 1: Cells and Structures: The Anatomy of the Nervous System
Link ID: 25542 - Posted: 10.08.2018

By Kelly Servick PHILADELPHIA, PENNSYLVANIA—While artificial intelligence (AI) has been busy trouncing humans at Go and spawning eerily personable Alexas, some neuroscientists have harbored a different hope: that the types of algorithms driving those technologies can also yield some insight into the squishy, wet computers in our skulls. At the Conference on Cognitive Computational Neuroscience here this month, researchers presented new tools for comparing data from living brains with readouts from computational models known as deep neural networks. Such comparisons might offer up new hypotheses about how humans process sights and sounds, understand language, or navigate the world. “People have fantasized about that since the 1980s,” says Josh McDermott, a computational neuroscientist at the Massachusetts Institute of Technology (MIT) in Cambridge. Until recently, AI couldn’t come close to human performance on tasks such as recognizing sounds or classifying images. But deep neural networks, loosely inspired by the brain, have logged increasingly impressive performances, especially on visual tasks. That “brings the question back to mind,” says neuroscientist Chris Baker of the National Institute of Mental Health in Bethesda, Maryland. Deep neural networks work by passing information between computational “nodes” that are arranged in successive layers. The systems hone skills on huge sets of data; for networks that classify images, that usually means collections of labeled photos. Performance improves with feedback as the systems repeatedly adjust the strengths of the connections between nodes. © 2018 American Association for the Advancement of Science
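The paragraph above describes the two essentials of a deep neural network: activity flowing through successive layers of nodes, and connection strengths adjusted repeatedly from feedback on labeled data. As a purely illustrative sketch (not code from the study, the conference, or any of the labs mentioned), the toy NumPy example below trains a tiny two-layer network on a made-up XOR classification problem; every size, seed, and learning setting is an arbitrary choice for demonstration.

```python
# Illustrative sketch: a minimal two-layer "deep" network trained by feedback.
import numpy as np

rng = np.random.default_rng(0)

# Made-up labeled data: a tiny XOR-style classification problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Connection strengths (weights) between layers: input -> hidden -> output.
W1 = rng.normal(scale=0.5, size=(2, 8))
W2 = rng.normal(scale=0.5, size=(8, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

learning_rate = 1.0
for step in range(10000):
    # Forward pass: each layer passes its activity on to the next layer.
    hidden = sigmoid(X @ W1)
    output = sigmoid(hidden @ W2)

    # Feedback: the error between prediction and label drives weight updates
    # (gradient descent on a squared-error loss).
    error = output - y
    grad_output = error * output * (1 - output)
    grad_hidden = (grad_output @ W2.T) * hidden * (1 - hidden)

    W2 -= learning_rate * hidden.T @ grad_output
    W1 -= learning_rate * X.T @ grad_hidden

# After training, predictions should approach the labels [0, 1, 1, 0].
print(np.round(sigmoid(sigmoid(X @ W1) @ W2).ravel(), 2))
```

The image-classifying networks that researchers compare against brain recordings are vastly larger and trained on millions of labeled photos, but the ingredients are the same: layered nodes, forward flow of activity, and feedback-driven adjustment of connection strengths.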

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 7: Vision: From Eye to Brain
Link ID: 25466 - Posted: 09.18.2018

By: Richard Restak, M.D. Editor’s Note: Unthinkable’s author, a British neuroscientist, tracked down nine people with rare brain disorders to tell their stories. From the man who thinks he's a tiger to the doctor who feels the pain of others just by looking at them to a woman who hears music that’s not there, their experiences illustrate how the brain can shape our lives in unexpected and, in some cases, brilliant and alarming ways. Several years ago, science writer Helen Thomson, consultant to New Scientist and contributor to the Washington Post and Nature, decided to travel around the world to interview people with "the most extraordinary brains." In the process, as described in Unthinkable: An Extraordinary Journey Through the World's Strangest Brains (Ecco/Harper Collins 2018), Thomson discovered that "by putting their lives side-by-side, I was able to create a picture of how the brain functions in us all. Through their stories, I uncovered the mysterious manner in which the brain can shape our lives in unexpected—and, in some cases, brilliant and alarming ways." Thomson wasn't just learning about the most extraordinary brains in the world, but in the process was "uncovering the secrets of my own." During her journey Thomson encounters Bob, who can remember days from 40 years ago with as much clarity and detail as yesterday; Sharon, who has lost her navigational abilities and on occasion becomes lost in her own home; Tommy, who, after a ruptured aneurysm that damaged his left temporal lobe, underwent a total personality change; Sylvia, an otherwise normal retired school teacher who experiences near-constant musical hallucinations; and Louise, who is afflicted with a permanent sense of detachment from herself and everyone around her. Beyond skillfully portraying each of these and other fascinating individuals, Thomson places them in historical and scientific context: when neuroscientists first encountered similar patients, along with past and current explanations of what has gone amiss in their brains. © 2018 The Dana Foundation

Related chapters from BN: Chapter 1: Introduction: Scope and Outlook; Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 20: ; Chapter 14: Attention and Higher Cognition
Link ID: 25420 - Posted: 09.07.2018

By Bahar Gholipour Milena Canning can see steam rising from a coffee cup but not the cup. She can see her daughter’s ponytail swing from side to side, but she can’t see her daughter. Canning is blind, yet moving objects somehow find a way into her perception. Scientists studying her condition say it could reveal secrets about how humans process vision in general. Canning was 29 when a stroke destroyed her entire occipital lobe, the brain region housing the visual system. The event left her sightless, but one day she saw a flash of light from a metallic gift bag next to her. Her doctors told her she was hallucinating. Nevertheless, “I thought there must be something happening within my brain [allowing me to see],” she says. She went from doctor to doctor until she met Gordon Dutton, an ophthalmologist in Glasgow, Scotland. Dutton had encountered this mystery before—in a 1917 paper by neurologist George Riddoch describing brain-injured World War I soldiers. To help enhance Canning’s motion-based vision, Dutton prescribed her a rocking chair. Canning is one of a handful of people who have been diagnosed with the “Riddoch phenomenon,” the ability to perceive motion while blind to other visual stimuli. Jody Culham, a neuroscientist at Western University in Ontario, and her colleagues launched a 10-year investigation into Canning’s remarkable vision and published the results online in May in Neuropsychologia. The team confirmed that Canning was able to detect motion and its direction. She could see a hand moving toward her, but she could not tell a thumbs-up from a thumbs-down. She was also able to navigate around obstacles, reach and grasp, and catch a ball thrown at her. © 2018 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 7: Vision: From Eye to Brain
Link ID: 25409 - Posted: 09.01.2018

Megan Molteni It’s been more than a century since Spanish neuroanatomist Santiago Ramón y Cajal won the Nobel Prize for illustrating the way neurons allow you to walk, talk, think, and be. In the intervening hundred years, modern neuroscience hasn’t progressed that much in how it distinguishes one kind of neuron from another. Sure, the microscopes are better, but brain cells are still primarily defined by two labor-intensive characteristics: how they look and how they fire. Which is why neuroscientists around the world are rushing to adopt new, more nuanced ways to characterize neurons. Sequencing technologies, for one, can reveal how cells with the same exact DNA turn their genes on or off in unique ways—and these methods are beginning to reveal that the brain is a more diverse forest of bristling nodes and branching energies than even Ramón y Cajal could have imagined. On Monday, an international team of researchers introduced the world to a new kind of neuron, which, at this point, is believed to exist only in the human brain. The long nerve fibers known as axons of these densely bundled cells bulge in a way that reminded their discoverers of a rose without its petals—so much that they named them “rose hip cells.” Described in the latest issue of Nature Neuroscience, these new neurons might use their specialized shape to control the flow of information from one region of the brain to another. “They can really act as a sort of brake on the system,” says Ed Lein, an investigator at the Allen Institute for Brain Science—home to several ambitious brain mapping projects—and one of the lead authors on the study. Neurons come in two basic flavors: Excitatory cells send information to the cells next to them, while inhibitory cells slow down or stop excitatory cells from firing. Rose hip cells belong to this latter type, and based on their physiology, seem to be a particularly potent current-curber. © 2018 Condé Nast

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 2: Functional Neuroanatomy: The Cells and Structure of the Nervous System
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 1: Cells and Structures: The Anatomy of the Nervous System
Link ID: 25391 - Posted: 08.28.2018

By Anna Clemens In 2003 a 65-year-old man brought a strange problem to neurologist Adam Zeman, now at the University of Exeter in England. The patient, later dubbed “MX,” claimed he could not conjure images of friends, family members or recently visited places. All his life, MX, a retired surveyor, had loved reading novels and had routinely drifted off to sleep visualizing buildings, loved ones and recent events. But after undergoing a procedure to open arteries in his heart, during which he probably suffered a minor stroke, his mind’s eye went blind. He could see normally, but he could not form pictures in his mind. Zeman had never encountered anything like it and set out to learn more. He has since given the condition a name—aphantasia (phantasia means “imagination” in Greek). And he and others are exploring its neurological underpinnings. Zeman and his colleagues began their analysis by testing MX’s visual imagination in several ways. Compared with control subjects, MX scored poorly on questionnaires assessing the ability to produce visual imagery. Surprisingly, though, he was able to accomplish tasks that typically involve visualization. For example, when asked to say which is a lighter color of green—grass or pine trees—most people would decide by imagining both grass and tree and comparing them. MX correctly said that pine trees are darker than grass, but he insisted he had used no visual imagery to make the decision. “I just know the answer,” he said. © 2018 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 25279 - Posted: 08.01.2018

By Stephen T. Casper The case report is dead. At least, it seems all but so in the realm of evidence-based medicine. It is thus thoroughly refreshing to read Helen Thomson’s Unthinkable: An Extraordinary Journey Through the World’s Strangest Brains and Eric R. Kandel’s The Disordered Mind: What Unusual Brains Tell Us About Ourselves, two ambitious books that draw on clinical profiles to tell stories about our brains and minds. Thomson’s memoir aims to help us understand our brains through stories about exceptional others, who, she argues, may serve as proxies for ourselves. Kandel’s book argues from neuroscience research and individual illness experiences for a biologically informed account of mind and brain. Both authors are unapologetic in their focus on what might be dismissed as merely anecdotal. Each foregrounds neurological and psychiatric patient narratives and experiences and from these draws out larger philosophical and scientific lessons. By profiling and seeking meaning in individuals with curious neurological conditions, Thomson’s Unthinkable follows a well-worn literary path but revitalizes the genre with an original and subtle shift to the personal. Perfected by neurologist Oliver Sacks, Thomson’s technique was invented before the 19th century but most famously pioneered in the 20th century by such eminent neurologists as Morton Prince, Sigmund Freud, and Alexander Luria. Where those authors represented patients as medical mysteries or as object lessons in physiology and philosophy, Thomson finds a timelier focus that corresponds with the growing advocacy for, and social attention to, individual patients’ rights. Unlike her predecessors in the genre, Thomson enters her subjects’ lives—their restaurants, homes, families, communities, and online selves. © 2017 American Association for the Advancement of Science

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Lateralization
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 15: Language and Lateralization
Link ID: 25278 - Posted: 08.01.2018

By Darold A. Treffert Savant syndrome comes in different forms. In congenital savant syndrome the extraordinary savant ability surfaces in early childhood. In acquired savant syndrome astonishing new abilities, typically in music, art or mathematics, appear unexpectedly in ordinary persons after a head injury, stroke or other central nervous system (CNS) incident where no such abilities or interests were present pre-incident. But in sudden savant syndrome an ordinary person with no such prior interest or ability and no precipitating injury or other CNS incident has an unanticipated, spontaneous epiphanylike moment where the rules and intricacies of music, art or mathematics, for example, are experienced and revealed, producing almost instantaneous giftedness and ability in the affected area of skill sets. Because there is no underlying disability such as that which occurs in congenital or acquired savant syndromes, technically sudden savant syndrome would be better termed sudden genius. A 28-year-old gentleman from Israel, K. A., sent his description of his epiphany moment. He was in a mall where there was a piano. Whereas he could play simple popular songs from rote memory before, “suddenly at age 28 after what I can best describe as a ‘just getting it moment,’ it all seemed so simple. I suddenly was playing like a well-educated pianist.” His friends were astonished as he played and suddenly understood music in an entirely intricate way. “I suddenly realized what the major scale and minor scale were, what their chords were and where to put my fingers in order to play certain parts of the scale. I was instantly able to recognize harmonies of the scales in songs I knew as well as the ability to play melody by interval recognition.” He began to search the internet for information on music theory and to his amazement “most of what they had to teach I already knew, which baffled me as to how could I know something I had never studied.” © 2018 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 25255 - Posted: 07.26.2018

By Jocelyn Kaiser Basic brain and behavioral researchers will get more than a year to comply with a new U.S. policy that will treat many of their studies as clinical trials. The announcement from the National Institutes of Health (NIH) appears to defuse, for now, a yearlong controversy over whether basic research on humans should follow the same rules as studies testing drugs. Although research groups had hoped NIH would drop its plans to tag basic studies with humans as trials, they say they’re relieved they get more time to prepare and give the agency input. “It’s a positive step forward,” says Paula Skedsvold, executive director of the Federation of Associations in Behavioral & Brain Sciences in Washington, D.C. At issue is a recently revised definition of a clinical trial along with a set of rules in effect since January that are meant to increase the rigor and transparency of NIH-funded clinical trials. About a year ago, basic scientists who study human cognition—for example, using brain imaging with healthy volunteers—were alarmed to realize many of these studies fit the new clinical trial definition. Researchers protested that many requirements, such as registering and reporting results in the ClinicalTrials.gov federal database, made no sense for studies that weren’t testing a treatment and would confuse the public. NIH then issued a set of case studies explaining that only some basic studies would fall under the trials definition. But concerns remained about confusing criteria and burdensome new paperwork. © 2018 American Association for the Advancement of Science

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 25248 - Posted: 07.25.2018

Allison Aubrey Can't cool off this summer? Heat waves can slow us down in ways we may not realize. New research suggests heat stress can muddle our thinking, making simple math a little harder to do. "There's evidence that our brains are susceptible to temperature abnormalities," says Joe Allen, co-director of the Center for Climate, Health and the Global Environment at Harvard University. And as the climate changes, temperatures spike and heat waves are more frequent. To learn more about how the heat influences young, healthy adults, Allen and his colleagues studied college students living in dorms during a summer heat wave in Boston. Half of the students lived in buildings with central AC, where the indoor air temperature averaged 71 degrees. The other half lived in dorms with no AC, where air temperatures averaged almost 80 degrees. "In the morning, when they woke up, we pushed tests out to their cellphones," explains Allen. The students took two tests a day for 12 consecutive days. One test, which included basic addition and subtraction, measured cognitive speed and memory. A second test assessed attention and processing speed. "We found that the students who were in the non-air-conditioned buildings actually had slower reaction times: 13 percent lower performance on basic arithmetic tests, and nearly a 10 percent reduction in the number of correct responses per minute," Allen explains. The results, published in PLOS Medicine, may come as a surprise. "I think it's a little bit akin to the frog in the boiling water," Allen says. There's a "slow, steady — largely imperceptible — rise in temperature, and you don't realize it's having an impact on you." © 2018 npr

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 25211 - Posted: 07.16.2018

By Erica Goode Suppose that, seeking a fun evening out, you pay $175 for a ticket to a new Broadway musical. Seated in the balcony, you quickly realize that the acting is bad, the sets are ugly and no one, you suspect, will go home humming the melodies. Do you head out the door at the intermission, or stick it out for the duration? Studies of human decision-making suggest that most people will stay put, even though money spent in the past logically should have no bearing on the choice. This “sunk cost fallacy,” as economists call it, is one of many ways that humans allow emotions to affect their choices, sometimes to their own detriment. But the tendency to factor past investments into decision-making is apparently not limited to Homo sapiens. In a study published on Thursday in the journal Science, investigators at the University of Minnesota reported that mice and rats were just as likely as humans to be influenced by sunk costs. The more time they invested in waiting for a reward — in the case of the rodents, flavored pellets; in the case of the humans, entertaining videos — the less likely they were to quit the pursuit before the delay ended. “Whatever is going on in the humans is also going on in the nonhuman animals,” said A. David Redish, a professor of neuroscience at the University of Minnesota and an author of the study. This cross-species consistency, he and others said, suggested that in some decision-making situations, taking account of how much has already been invested might pay off. “Evolution by natural selection would not promote any behavior unless it had some — perhaps obscure — net overall benefit,” said Alex Kacelnik, a professor of behavioral ecology at Oxford, who praised the new study as “rigorous” in its methodology and “well designed.” © 2018 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 25203 - Posted: 07.13.2018