Links for Keyword: Attention



Links 101 - 120 of 703

By Max Evans BBC News A stranger once waved at Boo James on a bus. She did not think any more of it - until it later emerged it was her mother. She has a relatively rare condition called face blindness, which means she cannot recognise the faces of her family, friends, or even herself. Scientists have now launched a study they hope could help train people like Boo to recognise people better. Boo said for many years she thought she was "from another planet". "It is immensely stressful and very emotionally upsetting to sit and dwell upon so I try not to do that," she said. "It's very hard work. It can be physically and emotionally exhausting to spend a day out in public constantly wondering whether you should have spoken to someone." For most of her life, she didn't know she had the condition - also known as prosopagnosia - and blamed herself for the "social awkwardness" caused when she failed to recognise people. "I had to try and find a way to explain that. I really couldn't very well, except to think that I was just the one to blame for not being bothered to remember who people were. "[Like it was] some sort of laziness: I didn't want to know them, obviously I wasn't interested enough to remember them, so that was some kind of deficiency, perhaps, in me." But the penny dropped in her early 40s when she saw a news item about the condition on television. "I then knew that the only reason I wasn't recognising that person was because my brain physically wasn't able to do it," she said. "I could immediately engage more self-understanding and forgive myself and try to approach things from a different angle." Boo has developed techniques to try to help her cope, including remembering what people wear. She said her childhood was punctuated by "traumatic experiences" with fellow children, childminders and teachers she could not recognise. © 2019 BBC

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 26011 - Posted: 03.06.2019

Bruce Bower WASHINGTON — Beliefs among some university professors that intelligence is fixed, rather than capable of growth, contribute to a racial achievement gap in STEM courses, a new study suggests. Those professors may subtly communicate stereotypes about blacks, Hispanics and Native Americans allegedly being less intelligent than Asians and whites, say psychologist Elizabeth Canning of Indiana University in Bloomington and her colleagues. In turn, black, Hispanic and Native American undergraduates may respond by becoming less academically motivated and more anxious about their studies, leading to lower grades. Even small dips in STEM grades — especially for students near pass/fail cutoffs — can accumulate across the 15 or more science, technology, engineering and math classes needed to become a physician or an engineer, Canning says. That could jeopardize access to financial aid and acceptance to graduate programs. “Our work suggests that academic benefits could accrue over time if all students, and particularly underrepresented minority students, took STEM classes with faculty who endorse a growth mind-set,” Canning says. Underrepresented minority students’ reactions to professors with fixed or flexible beliefs about intelligence have yet to be studied. But over a two-year period, the disparity in grade point averages separating Asian and white STEM students from black, Hispanic and Native American peers was nearly twice as large in courses taught by professors who regarded intelligence as set in stone, versus malleable, Canning’s team reports online February 15 in Science Advances. © Society for Science & the Public 2000 - 2019.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 25970 - Posted: 02.18.2019

By Alex Fox If math is the language of the universe, bees may have just uttered their first words. New research suggests these busybodies of the insect world are capable of addition and subtraction—using colors in the place of plus and minus symbols. In the animal kingdom, the ability to count—or at least distinguish between differing quantities—isn’t unusual: It has been seen in frogs, spiders, and even fish. But solving equations using symbols is rare air, so far only achieved by famously brainy animals such as chimpanzees and African grey parrots. Enter the honey bee (Apis mellifera). Building on prior research that says the social insects can count to four and understand the concept of zero, researchers wanted to test the limits of what their tiny brains can do. Scientists trained 14 bees to link the colors blue and yellow to addition and subtraction, respectively. They placed the bees at the entrance of a Y-shaped maze, where they were shown several shapes in either yellow or blue. If the shapes were blue, bees got a reward if they went to the end of the maze with one more blue shape (the other end had one less blue shape); if the shapes were yellow, they got a reward if they went to the end of the maze with one less yellow shape. © 2018 American Association for the Advancement of Science

Related chapters from BN: Chapter 6: Evolution of the Brain and Behavior; Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 25938 - Posted: 02.08.2019

By Benedict Carey The world’s most common digital habit is not easy to break, even in a fit of moral outrage over the privacy risks and political divisions Facebook has created, or amid concerns about how the habit might affect emotional health. Although four in 10 Facebook users say they have taken long breaks from it, the digital platform keeps growing. A recent study found that the average user would have to be paid $1,000 to $2,000 to be pried away for a year. So what happens if you actually do quit? A new study, the most comprehensive to date, offers a preview. Expect the consequences to be fairly immediate: More in-person time with friends and family. Less political knowledge, but also less partisan fever. A small bump in one’s daily moods and life satisfaction. And, for the average Facebook user, an extra hour a day of downtime. The study, by researchers at Stanford University and New York University, helps clarify the ceaseless debate over Facebook’s influence on the behavior, thinking and politics of its active monthly users, who number some 2.3 billion worldwide. The study was posted recently on the Social Science Research Network, an open access site. “For me, Facebook is one of those compulsive things,” said Aaron Kelly, 23, a college student in Madison, Wis. “It’s really useful, but I always felt like I was wasting time on it, distracting myself from study, using it whenever I got bored.” Mr. Kelly, who estimated that he spent about an hour a day on the platform, took part in the study “because it was kind of nice to have an excuse to deactivate and see what happened,” he said. Well before news broke that Facebook had shared users’ data without consent, scientists and habitual users debated how the platform had changed the experience of daily life. A cadre of psychologists has argued for years that the use of Facebook and other social media is linked to mental distress, especially in adolescents. Others have likened habitual Facebook use to a mental disorder, comparing it to drug addiction and even publishing magnetic-resonance images of what Facebook addiction “looks like in the brain.” © 2019 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 16: Psychopathology: Biological Basis of Behavior Disorders
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 12: Psychopathology: The Biology of Behavioral Disorders
Link ID: 25919 - Posted: 01.31.2019

By Scott Barry Kaufman What is going on in our brains when we are creating? How does our brain look different when we are engaging in art vs. science? How does the brain of genius creators differ from the rest of us? What are some of the limitations of studying the creative brain? The neuroscience of creativity is booming. There is now a society (with an annual conference), an edited volume, a handbook, and now an entire textbook on the topic. Bringing the latest research together from a number of scientists, Anna Abraham wrote a wonderful resource that covers some of the most hot-button topics in the field. She was gracious enough to do a Q & A with me. Enjoy! SBK: How’d you get interested in the neuroscience of creativity? AA: I have always been curious about creativity. At the most fundamental level I think I simply wanted to get my head around the mystery of this marvelous ability that each of us possesses. In particular, I hoped to find out what makes some people more creative than others. When I saw an opportunity to pursue a PhD in Neuroscience in the early 2000s in any topic of my choice, I went all in - it was an exciting and promising approach that had until then been used only to a limited extent to explore the creative mind. SBK: What is creativity? Does the field have a unified, agreed-upon definition of creativity that you are satisfied with? AA: There is a surprising level of unanimity in the field when it comes to a boilerplate definition. Most experts agree that two elements are central to creativity. First and foremost, it reflects our capacity to generate ideas that are original, unusual or novel in some way. The second element is that these ideas also need to be satisfying, appropriate or suited to the context in question. I am reasonably satisfied with this definition but not in how it guides scientific enquiry.

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 25839 - Posted: 01.05.2019

By Christina Karns ’Tis the season when the conversation shifts to what you’re thankful for. Gathered with family and friends around a holiday feast, for instance, people may recount some of the biggies — such as their health or their children — or smaller things that enhance everyday life — such as happening upon a great movie while channel surfing or enjoying a favorite seasonal food. Psychology researchers recognize that taking time to be thankful has benefits for well-being. Gratitude not only goes along with more optimism, less anxiety and depression, and greater goal attainment, but also is associated with fewer symptoms of illness and other physical benefits. In recent years, researchers have been making connections between the internal experience of gratitude and the external practice of altruism. How does being thankful about things in your own life relate to any selfless concern you may have about the well-being of others? As a neuroscientist, I’m particularly interested in the brain regions and connections that support gratitude and altruism. I’ve been exploring how changes in one might lead to changes in the other. To study the relationship between gratitude and altruism in the brain, my colleagues and I first asked volunteers questions meant to tease out how frequently they feel thankful and the degree to which they tend to care about the well-being of others. Then we used statistics to determine the extent to which someone’s gratitude could predict their altruism. As others have found, the more grateful people in this group tended to be more altruistic. © 1996-2018 The Washington Post

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 25823 - Posted: 12.26.2018

Phil Jaekl Driving a car is a complex task for a brain to coordinate. A driver may drink a cup of coffee and have a conversation with a passenger, all while safely piloting a vehicle through traffic. But all of this activity requires attention—that is, concentrating on the tasks and sources of information that matter and blocking out those that don’t. How the brain manages that orchestration is a long-standing scientific mystery. One prominent view, based on findings from human behavioral studies, is that the brain guides us through a world chock-full of sensory inputs by focusing a metaphorical spotlight on what it deems important, while filtering out relatively trivial details. Unlike some other, functionally well-defined aspects of cognition, this attentional spotlight has eluded scientific understanding. Its neural substrates have been particularly difficult to pin down to specific activities and locations in the brain—although several studies have implicated the frontoparietal network, which spans the frontal and parietal lobes of the brain. Meanwhile, attention studies involving visual tasks that require continuous focus—detecting a small object flashing on a cluttered computer screen, for example—have shown that task performance varies over short time intervals, with episodes of peak performance and of poor performance alternating on millisecond timescales. Such research suggests that the attentional spotlight might not be as constant as once thought. Yet, until now, researchers have not been able to directly connect these changes in performance to fluctuations in brain activity. © 1986 - 2018 The Scientist

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 25800 - Posted: 12.20.2018

By Kuheli Dutt When neuroscientist Ben Barres delivered his first seminar, an audience member praised him, commenting that Ben’s work was much better than that of his sister, Barbara Barres. The irony? Ben Barres (now deceased), a transgender scientist, was Barbara Barres before he transitioned to male. When New York Times columnist Brent Staples was a graduate student in Chicago’s Hyde Park, he found that white people on the street perceived him, an African American, as a threat to their safety. They were visibly tense around him, clutched their purses and sometimes even crossed the street to avoid him. But when he started whistling tunes from classical music, people suddenly weren’t afraid of him anymore—they relaxed and some even smiled at him. Implicit bias runs far deeper than we realize. A riddle used at implicit bias trainings goes like this: A father and his son are in a terrible car crash. The father dies at the scene. His son, in critical condition, is rushed to the hospital; he’s in the operating room, about to go under the knife. The surgeon says, “I can’t operate on this boy—he’s my son!” The audience is then asked how that’s possible. Responses include several scenarios: two gay fathers; one biological and one adopted father; one father and one priest (religious father); all of which are possible. However, an obvious answer that most people miss: the surgeon is the boy’s mother. Whether we like it or not, we are conditioned to associate surgeons with being male. © 2018 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 25799 - Posted: 12.20.2018

Alison Abbott Doris Tsao launched her career deciphering faces — but for a few weeks in September, she struggled to control the expression on her own. Tsao had just won a MacArthur Foundation ‘genius’ award, an honour that comes with more than half a million dollars to use however the recipient wants. But she was sworn to secrecy — even when the foundation sent a film crew to her laboratory at the California Institute of Technology (Caltech) in Pasadena. Thrilled and embarrassed at the same time, she had to invent an explanation, all while keeping her face in check. It was her work on faces that won Tsao awards and acclaim. Last year, she cracked the code that the brain uses to recognize faces from a multitude of minuscule differences in shapes, distances between features, tones and textures. The simplicity of the coding surprised and impressed the neuroscience community. “Her work has been transformative,” says Tom Mrsic-Flogel, director of the Sainsbury Wellcome Centre for Neural Circuits and Behaviour at University College London. But Tsao doesn’t want to be remembered just as the scientist who discovered the face code. It is a means to an end, she says, a good tool for approaching the question that really interests her: how does the brain build up a complete, coherent model of the world by filling in gaps in perception? “This idea has an elegant mathematical formulation,” she says, but it has been notoriously hard to put to the test. Tsao now has an idea of how to begin. © 2018 Springer Nature Publishing AG

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 25773 - Posted: 12.11.2018

By Aaron E. Carroll Even before the recent news that a group of researchers managed to get several ridiculous fake studies published in reputable academic journals, people have been aware of problems with peer review. Throwing out the system — which deems whether research is robust and worth being published — would do more harm than good. But it makes sense to be aware of peer review’s potential weaknesses. Reviewers may be overworked and underprepared. Although they’re experts in the subject they are reading about, they get no specific training to do peer review, and are rarely paid for it. With 2.5 million peer-reviewed papers published annually worldwide — and more that are reviewed but never published — it can be hard to find enough people to review all the work. There is evidence that reviewers are not always consistent. A 2010 paper describes a study in which two researchers selected 12 articles already accepted by highly regarded journals, swapped the real names and academic affiliations for false ones, and resubmitted the identical material to the same journals that had already accepted them in the previous 18 to 32 months. Only 8 percent of editors or reviewers noticed the duplication, and three papers were detected and pulled. Of the nine papers that continued through the review process, eight were turned down, with 89 percent of reviewers recommending rejection. Peer review may be inhibiting innovation. It takes significant reviewer agreement to have a paper accepted. One potential downside is that important research bucking a trend or overturning accepted wisdom may face challenges surviving peer review. In 2015, a study published in P.N.A.S. tracked more than 1,000 manuscripts submitted to three prestigious medical journals. Of the 808 that were published at some point, the 2 percent that were most frequently cited had been rejected by the journals. An even bigger issue is that peer review may be biased. Reviewers can usually see the names of the authors and their institutions, and multiple studies have shown that reviewers preferentially accept or reject articles based on a number of demographic factors. In a study published in eLife last year, researchers created a database consisting of more than 9,000 editors, 43,000 reviewers and 126,000 authors whose work led to about 41,000 articles in 142 journals in a number of domains. They found that women made up only 26 percent of editors, 28 percent of reviewers and 37 percent of authors. Analyses showed that this was not because fewer women were available for each role. © 2018 The New York Times Company

Related chapters from BN: Chapter 1: Introduction: Scope and Outlook
Related chapters from MM:Chapter 20:
Link ID: 25643 - Posted: 11.05.2018

Jon Hamilton An ancient part of the brain long ignored by the scientific world appears to play a critical role in everything from language and emotions to daily planning. It's the cerebellum, which is found in fish and lizards as well as people. But in the human brain, this structure is wired to areas involved in higher-order thinking, a team led by researchers from Washington University in St. Louis reports Thursday in the journal Neuron. "We think that the cerebellum is acting as the brain's ultimate quality control unit," says Scott Marek, a postdoctoral research scholar and the study's first author. The finding adds to the growing evidence that the cerebellum "isn't only involved in sensory-motor function, it's involved in everything we do," says Dr. Jeremy Schmahmann, a neurology professor at Harvard and director of the ataxia unit at Massachusetts General Hospital. Schmahmann, who wasn't involved in the new study, has been arguing for decades that the cerebellum plays a key role in many aspects of human behavior, as well as mental disorders such as schizophrenia. But only a handful of scientists have explored functions of the cerebellum beyond motor control. "It's been woefully understudied," says Dr. Nico Dosenbach, a professor of neurology at Washington University whose lab conducted the study. Even now, many scientists think of the cerebellum as the part of the brain that lets you pass a roadside sobriety test. It helps you do things like walk in a straight line or stand on one leg or track a moving object — if you're not drunk. © 2018 npr

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 5: The Sensorimotor System
Link ID: 25624 - Posted: 10.27.2018

By Frankie Schembri Humans are awful at estimating a person’s age based on their face alone. This can lead not only to uncomfortable social situations, but also to critical errors in criminal investigations and enforcing age-based restrictions on such things as alcohol and gambling. New research shows people are usually off by about 8 years, and their estimate might be shaped by the last face they saw. To conduct the study, researchers collected 3968 pictures of consenting participants from the Australian Passport Office—31 men and 31 women at each age from 7 through 70. Then, they showed 81 people photographs of a man and woman at each age in a random sequence, and asked them to guess their ages. The faces shown in the original article are computer-generated averages of more than 100 pictures from the study of people aged 19 to 22, 50 to 53, and 63 to 66. Volunteers consistently guessed that young faces were several years older than they actually were and that older faces were several years younger than they actually were, the team reports today in Royal Society Open Science. The results also showed that people’s estimates were affected by the previous face they had viewed—if they had just seen a young face, they usually lowballed the next face’s age, and vice versa. © 2018 American Association for the Advancement of Science

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 25581 - Posted: 10.17.2018

By Frankie Schembri Think of all the faces you know. As you flick through your mental Rolodex, your friends, family, and co-workers probably come first—along with celebrities—followed by the faces of the nameless strangers you encounter during your daily routine. But how many faces can the human Rolodex store? To ballpark the size of the average person’s “facial vocabulary,” researchers gave 25 people 1 hour to list as many faces from their personal lives as possible, and then another hour to do the same with famous faces, like those of actors, politicians, and musicians. If the participants couldn’t remember a person’s name, but could imagine their face, they used a descriptive phrase like “the high school janitor,” or “the actress from Friends with the haircut.” People came up with lots of faces during the first minutes of the test, but the rate of remembrance dropped over the course of the hour. By graphing this relationship and extrapolating it to when most people would run out of faces, the researchers estimated the number of faces an average person can recall from memory. To figure out how many additional faces people recognized but were unable to recall without prompting, researchers showed the participants photographs of 3441 celebrities, including Barack Obama and Tom Cruise. To qualify as “knowing” a face, the participants had to recognize two different photos of each person. © 2018 American Association for the Advancement of Science

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 25552 - Posted: 10.10.2018

By Emily Underwood The ornately folded outer layer of the human brain, the cerebral cortex, has long received nearly all the credit for our ability to perform complex cognitive tasks such as composing a sonata, imagining the plot of a novel or reflecting on our own thoughts. One explanation for how we got these abilities is that the cortex rapidly expanded relative to body size as primates evolved — the human cortex has 10 times the surface area of a monkey’s cortex, for example, and 1,000 times that of a mouse. But the cortex is not the only brain region that has gotten bigger and more complex throughout evolution. Nestled beneath the cortex, a pair of egg-shaped structures called the thalamus has also grown, and its wiring became much more intricate as mammals diverged from reptiles. The thalamus — from the Greek thalamos, or inner chamber — transmits 98 percent of sensory information to the cortex, including vision, taste, touch and balance; the only sense that doesn’t pass through this brain region is smell. The thalamus also conducts motor signals and relays information from the brain stem to the cortex, coordinating shifts in consciousness such as waking up and falling asleep. Scientists have known for decades that the thalamus faithfully transmits information about the visual world from the retina to the cortex, leading to the impression that it is largely a messenger of sensory information rather than a center of complex cognition itself. But that limited, passive view of the thalamus is outdated, maintains Michael Halassa, a neuroscientist at the Massachusetts Institute of Technology who recently coauthored (with Ralf D. Wimmer and Rajeev V. Rikhye) an article in the Annual Review of Neuroscience exploring the thalamus’s role. © 2018 Annual Reviews, Inc

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 2: Functional Neuroanatomy: The Cells and Structure of the Nervous System
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 1: Cells and Structures: The Anatomy of the Nervous System
Link ID: 25542 - Posted: 10.08.2018

By Kelly Servick PHILADELPHIA, PENNSYLVANIA—While artificial intelligence (AI) has been busy trouncing humans at Go and spawning eerily personable Alexas, some neuroscientists have harbored a different hope: that the types of algorithms driving those technologies can also yield some insight into the squishy, wet computers in our skulls. At the Conference on Cognitive Computational Neuroscience here this month, researchers presented new tools for comparing data from living brains with readouts from computational models known as deep neural networks. Such comparisons might offer up new hypotheses about how humans process sights and sounds, understand language, or navigate the world. “People have fantasized about that since the 1980s,” says Josh McDermott, a computational neuroscientist at the Massachusetts Institute of Technology (MIT) in Cambridge. Until recently, AI couldn’t come close to human performance on tasks such as recognizing sounds or classifying images. But deep neural networks, loosely inspired by the brain, have logged increasingly impressive performances, especially on visual tasks. That “brings the question back to mind,” says neuroscientist Chris Baker of the National Institute of Mental Health in Bethesda, Maryland. Deep neural networks work by passing information between computational “nodes” that are arranged in successive layers. The systems hone skills on huge sets of data; for networks that classify images, that usually means collections of labeled photos. Performance improves with feedback as the systems repeatedly adjust the strengths of the connections between nodes. © 2018 American Association for the Advancement of Science
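The article above describes, in prose, how deep neural networks operate: information passes through successive layers of nodes, and the strengths of the connections between nodes are repeatedly adjusted from feedback on labeled examples. As a purely illustrative aid — not code from the article, the conference, or any of the labs mentioned — the minimal Python sketch below trains a tiny two-layer network on made-up data; every name, number, and dataset in it is an assumption chosen for clarity.

# A minimal sketch (illustrative only) of the idea described above:
# a tiny two-layer network whose connection strengths (weights)
# are repeatedly adjusted from feedback on labeled examples.
import numpy as np

rng = np.random.default_rng(0)

# Toy "labeled" data: 100 examples with 4 features, 2 classes (made up).
X = rng.normal(size=(100, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

# Two layers of connection weights, initialized randomly.
W1 = rng.normal(scale=0.5, size=(4, 8))
W2 = rng.normal(scale=0.5, size=(8, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(2000):
    # Forward pass: information flows through successive layers of nodes.
    h = sigmoid(X @ W1)          # hidden-layer activity
    p = sigmoid(h @ W2)          # network's prediction

    # Feedback: compare predictions with labels and adjust the weights,
    # layer by layer (gradient of the mean squared error).
    err = p - y
    grad_W2 = h.T @ (err * p * (1 - p)) / len(X)
    grad_h = (err * p * (1 - p)) @ W2.T
    grad_W1 = X.T @ (grad_h * h * (1 - h)) / len(X)
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1

accuracy = ((p > 0.5) == y).mean()
print(f"training accuracy after weight adjustment: {accuracy:.2f}")

Real image-classifying networks run the same adjust-weights-from-feedback loop over millions of labeled photos and many more layers; the toy version only illustrates the cycle the article describes.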

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 7: Vision: From Eye to Brain
Link ID: 25466 - Posted: 09.18.2018

By: Richard Restak, M.D. Editor’s Note: Unthinkable’s author, a British neuroscientist, tracked down nine people with rare brain disorders to tell their stories. From the man who thinks he's a tiger to the doctor who feels the pain of others just by looking at them to a woman who hears music that’s not there, their experiences illustrate how the brain can shape our lives in unexpected and, in some cases, brilliant and alarming ways. Several years ago, science writer Helen Thomson, consultant to New Scientist and contributor to the Washington Post and Nature, decided to travel around the world to interview people with "the most extraordinary brains." In the process, as described in Unthinkable: An Extraordinary Journey Through the World's Strangest Brains (Ecco/Harper Collins 2018), Thomson discovered that "by putting their lives side-by-side, I was able to create a picture of how the brain functions in us all. Through their stories, I uncovered the mysterious manner in which the brain can shape our lives in unexpected—and, in some cases, brilliant and alarming ways." Thomson wasn't just learning about the most extraordinary brains in the world, but in the process was "uncovering the secrets of my own." During her journey Thomson encounters Bob, who can remember days from 40 years ago with as much clarity and detail as yesterday; Sharon, who has lost her navigational abilities and on occasion becomes lost in her own home; Tommy, who, after a ruptured aneurysm that damaged his left temporal lobe, underwent a total personality change; Sylvia, an otherwise normal retired school teacher who experiences near-constant musical hallucinations; and Louise, who is afflicted with a permanent sense of detachment from herself and everyone around her. Beyond skillfully portraying each of these and other fascinating individuals, Thomson places them in historical and scientific context: when neuroscientists first encountered similar patients, along with past and current explanations of what has gone amiss in their brains. © 2018 The Dana Foundation

Related chapters from BN: Chapter 1: Introduction: Scope and Outlook; Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 20: ; Chapter 14: Attention and Higher Cognition
Link ID: 25420 - Posted: 09.07.2018

By Bahar Gholipour Milena Canning can see steam rising from a coffee cup but not the cup. She can see her daughter’s ponytail swing from side to side, but she can’t see her daughter. Canning is blind, yet moving objects somehow find a way into her perception. Scientists studying her condition say it could reveal secrets about how humans process vision in general. Canning was 29 when a stroke destroyed her entire occipital lobe, the brain region housing the visual system. The event left her sightless, but one day she saw a flash of light from a metallic gift bag next to her. Her doctors told her she was hallucinating. Nevertheless, “I thought there must be something happening within my brain [allowing me to see],” she says. She went from doctor to doctor until she met Gordon Dutton, an ophthalmologist in Glasgow, Scotland. Dutton had encountered this mystery before—in a 1917 paper by neurologist George Riddoch describing brain-injured World War I soldiers. To help enhance Canning’s motion-based vision, Dutton prescribed her a rocking chair. Canning is one of a handful of people who have been diagnosed with the “Riddoch phenomenon,” the ability to perceive motion while blind to other visual stimuli. Jody Culham, a neuroscientist at Western University in Ontario, and her colleagues launched a 10-year investigation into Canning’s remarkable vision and published the results online in May in Neuropsychologia. The team confirmed that Canning was able to detect motion and its direction. She could see a hand moving toward her, but she could not tell a thumbs-up from a thumbs-down. She was also able to navigate around obstacles, reach and grasp, and catch a ball thrown at her. © 2018 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 7: Vision: From Eye to Brain
Link ID: 25409 - Posted: 09.01.2018

Megan Molteni It’s been more than a century since Spanish neuroanatomist Santiago Ramón y Cajal won the Nobel Prize for illustrating the way neurons allow you to walk, talk, think, and be. In the intervening hundred years, modern neuroscience hasn’t progressed that much in how it distinguishes one kind of neuron from another. Sure, the microscopes are better, but brain cells are still primarily defined by two labor-intensive characteristics: how they look and how they fire. Which is why neuroscientists around the world are rushing to adopt new, more nuanced ways to characterize neurons. Sequencing technologies, for one, can reveal how cells with the same exact DNA turn their genes on or off in unique ways—and these methods are beginning to reveal that the brain is a more diverse forest of bristling nodes and branching energies than even Ramón y Cajal could have imagined. On Monday, an international team of researchers introduced the world to a new kind of neuron, which, at this point, is believed to exist only in the human brain. The long nerve fibers known as axons of these densely bundled cells bulge in a way that reminded their discoverers of a rose without its petals—so much that they named them “rose hip cells.” Described in the latest issue of Nature Neuroscience, these new neurons might use their specialized shape to control the flow of information from one region of the brain to another. “They can really act as a sort of brake on the system,” says Ed Lein, an investigator at the Allen Institute for Brain Science—home to several ambitious brain mapping projects—and one of the lead authors on the study. Neurons come in two basic flavors: Excitatory cells send information to the cells next to them, while inhibitory cells slow down or stop excitatory cells from firing. Rose hip cells belong to this latter type, and based on their physiology, seem to be particularly potent current-curbers. © 2018 Condé Nast

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 2: Functional Neuroanatomy: The Cells and Structure of the Nervous System
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 1: Cells and Structures: The Anatomy of the Nervous System
Link ID: 25391 - Posted: 08.28.2018

By Anna Clemens In 2003 a 65-year-old man brought a strange problem to neurologist Adam Zeman, now at the University of Exeter in England. The patient, later dubbed “MX,” claimed he could not conjure images of friends, family members or recently visited places. All his life, MX, a retired surveyor, had loved reading novels and had routinely drifted off to sleep visualizing buildings, loved ones and recent events. But after undergoing a procedure to open arteries in his heart, during which he probably suffered a minor stroke, his mind’s eye went blind. He could see normally, but he could not form pictures in his mind. Zeman had never encountered anything like it and set out to learn more. He has since given the condition a name—aphantasia (phantasia means “imagination” in Greek). And he and others are exploring its neurological underpinnings. Zeman and his colleagues began their analysis by testing MX’s visual imagination in several ways. Compared with control subjects, MX scored poorly on questionnaires assessing the ability to produce visual imagery. Surprisingly, though, he was able to accomplish tasks that typically involve visualization. For example, when asked to say which is a lighter color of green—grass or pine trees—most people would decide by imagining both grass and tree and comparing them. MX correctly said that pine trees are darker than grass, but he insisted he had used no visual imagery to make the decision. “I just know the answer,” he said. © 2018 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 25279 - Posted: 08.01.2018

By Stephen T. Casper The case report is dead. At least, it seems all but so in the realm of evidence-based medicine. It is thus thoroughly refreshing to read Helen Thomson’s Unthinkable: An Extraordinary Journey Through the World’s Strangest Brains and Eric R. Kandel’s The Disordered Mind: What Unusual Brains Tell Us About Ourselves, two ambitious books that draw on clinical profiles to tell stories about our brains and minds. Thomson’s memoir aims to help us understand our brains through stories about exceptional others, who, she argues, may serve as proxies for ourselves. Kandel’s book argues from neuroscience research and individual illness experiences for a biologically informed account of mind and brain. Both authors are unapologetic in their focus on what might be dismissed as merely anecdotal. Each foregrounds neurological and psychiatric patient narratives and experiences and from these draws out larger philosophical and scientific lessons. By profiling and seeking meaning in individuals with curious neurological conditions, Thomson’s Unthinkable follows a well-worn literary path but revitalizes the genre with an original and subtle shift to the personal. Perfected by neurologist Oliver Sacks, Thomson’s technique was invented before the 19th century but most famously pioneered in the 20th century by such eminent neurologists as Morton Prince, Sigmund Freud, and Alexander Luria. Where those authors represented patients as medical mysteries or as object lessons in physiology and philosophy, Thomson finds a timelier focus that corresponds with the growing advocacy for, and social attention to, individual patients’ rights. Unlike her predecessors in the genre, Thomson enters her subject’s lives—their restaurants, homes, families, communities, and online selves. © 2017 American Association for the Advancement of Science

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Lateralization
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 15: Language and Lateralization
Link ID: 25278 - Posted: 08.01.2018