Chapter 7. Vision: From Eye to Brain



A new study by investigators at Brigham and Women's Hospital, in collaboration with researchers at the Universities of York and Leeds in the UK and the MD Anderson Cancer Center in Texas, puts to the test anecdotes about experienced radiologists' ability to sense when a mammogram is abnormal. In a paper published August 29 in the Proceedings of the National Academy of Sciences, visual attention researchers showed radiologists mammograms for half a second and found that they could identify abnormal mammograms at better than chance levels. They further tested this ability through a series of experiments to explore what signal may alert radiologists to the presence of a possible abnormality, in the hopes of using these insights to improve breast cancer screening and early detection. "Radiologists can have 'hunches' after a first look at a mammogram. We found that these hunches are based on something real in the images. It's really striking that in the blink of an eye, an expert can pick up on something about that mammogram that indicates abnormality," said Jeremy Wolfe, PhD, senior author of the study and director of the Visual Attention Laboratory at BWH. "Not only that, but they can detect something abnormal in the other breast, the breast that does not contain a lesion." In the clinic, radiologists carefully evaluate mammograms and may use computer-automated systems to help screen the images. Although they would never assess an image in half a second in the clinic, the ability of experts to extract the "gist" of an image quickly suggests that there may be detectable signs of breast cancer that radiologists are rapidly picking up. Copyright 2016 ScienceDaily
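The phrase "better than chance levels" can be made concrete with a one-sided binomial test: given a set of half-second trials and a 50/50 guessing baseline, how likely is it that a pure guesser would score at least as well as the radiologists did? A minimal sketch; the trial counts and accuracy below are illustrative, not the study's actual numbers:

```python
from math import comb

def p_value_above_chance(correct, total, p_chance=0.5):
    """One-sided binomial test: probability of getting at least
    `correct` out of `total` right by guessing at rate `p_chance`."""
    return sum(comb(total, k) * p_chance**k * (1 - p_chance)**(total - k)
               for k in range(correct, total + 1))

# Illustrative numbers only: 60 correct calls out of 100 brief flashes
print(p_value_above_chance(60, 100))  # ~0.028: unlikely to be guessing
```

With small p-values like this, "above chance" stops being an anecdote and becomes a measurable effect, which is what the brief-presentation experiments establish.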

Keyword: Attention; Vision
Link ID: 22627 - Posted: 09.05.2016

Laura Sanders Despite its name, the newly identified GluMI cell (pronounced “gloomy”) is no downer. It’s a nerve cell, spied in a mouse retina, that looks like one type of cell but behaves like another. Like neighboring retina nerve cells that subdue, or deaden, activity of other nerve cells, GluMI cells have a single arm extending from their body. But unlike those cells, GluMI cells actually seem to ramp up activity of nearby cells in a way that could aid vision. GluMIs don’t seem to detect light firsthand, but they respond to it, Luca Della Santina of the University of Washington in Seattle and colleagues found. GluMIs are among a growing list of unexpected and mysterious cells found in the retinas of vertebrates, the researchers write August 8 in Current Biology. Citation: L. Della Santina et al. Glutamatergic monopolar interneurons provide a novel pathway of excitation in the mouse retina. Current Biology. Vol. 26, August 8, 2016. doi:10.1016/j.cub.2016.06.016. © Society for Science & the Public 2000 - 2016

Keyword: Vision
Link ID: 22610 - Posted: 08.30.2016

By Anna Azvolinsky Sets of neurons in the brain that behave together—firing synchronously in response to sensory or motor stimuli—are thought to be functionally and physiologically connected. These naturally occurring ensembles of neurons are one of the ways memories may be programmed in the brain. Now, in a paper published today (August 11) in Science, researchers at Columbia University and their colleagues show that it is possible to stimulate visual cortex neurons in living, awake mice and induce a new ensemble of neurons that behave as a group and maintain their concerted firing for several days. “This work takes the concept of correlated [neuronal] firing patterns in a new and important causal direction,” David Kleinfeld, a neurophysicist at the University of California, San Diego, who was not involved in the work, told The Scientist. “In a sense, [the researchers] created a memory for a visual feature that does not exist in the physical world as a proof of principle of how real visual memories are formed.” “Researchers have previously related optogenetic stimulation to behavior [in animals], but this study breaks new ground by investigating the dynamics of neural activity in relation to the ensemble to which these neurons belong,” said Sebastian Seung, a computational neuroscientist at the Princeton Neuroscience Institute in New Jersey who also was not involved in the study. Columbia’s Rafael Yuste and colleagues stimulated randomly selected sets of individual neurons in the visual cortices of living mice using two-photon stimulation while the animals ran on a treadmill. © 1986-2016 The Scientist

Keyword: Learning & Memory; Vision
Link ID: 22558 - Posted: 08.13.2016

Amy McDermott You’ve got to see it to be it. A heightened sense of red color vision arose in ancient reptiles before bright red skin, scales and feathers, a new study suggests. The finding bolsters evidence that dinosaurs probably saw red and perhaps displayed red color. The new finding, published in the Aug. 17 Proceedings of the Royal Society B, rests on the discovery that birds and turtles share a gene used both for red vision and red coloration. More bird and turtle species use the gene, called CYP2J19, for vision than for coloration, however, suggesting that its first job was in sight. “We have this single gene that has two very different functions,” says evolutionary biologist Nicholas Mundy of the University of Cambridge. Mundy’s team wondered which function came first: the red vision or the ornamentation. In evolution, what an animal can see is often linked with what others can display, says paleontologist Martin Sander of the University of Bonn in Germany, who did not work on the new study. “We’re always getting at color from these two sides,” he says, because the point of seeing a strong color is often reading visual signals. Scientists already knew that birds use CYP2J19 for vision and color. In bird eyes, the gene contains instructions for making bright red oil droplets that filter red light. Other forms of red color vision evolved earlier in other animals, but this form allows birds to see more shades of red than humans can. Elsewhere in the body, the same gene can code for pigments that stain feathers red. Turtles are the only other land vertebrates with bright red oil droplets in their eyes. But scientists weren’t sure if the same gene was responsible, Mundy says. |© Society for Science & the Public 2000 - 2016

Keyword: Vision; Evolution
Link ID: 22535 - Posted: 08.10.2016

Davide Castelvecchi People can detect flashes of light as feeble as a single photon, an experiment has demonstrated — a finding that seems to conclude a 70-year quest to test the limits of human vision. The study, published in Nature Communications on 19 July, “finally answers a long-standing question about whether humans can see single photons — they can!” says Paul Kwiat, a quantum optics researcher at the University of Illinois at Urbana–Champaign. The techniques used in the study also open up ways of testing how quantum properties — such as the ability of photons to be in two places at the same time — affect biology, he adds. “The most amazing thing is that it’s not like seeing light. It’s almost a feeling, at the threshold of imagination,” says Alipasha Vaziri, a physicist at the Rockefeller University in New York City, who led the work and tried out the experience himself. Experiments on cells from frogs have shown that sensitive light-detecting cells in vertebrate eyes, called rod cells, do fire in response to single photons. But, in part because the retina processes its information to reduce ‘noise’ from false alarms, researchers hadn’t been able to confirm whether the firing of one rod cell would trigger a signal that would be transmitted all the way to the brain. Nor was it clear whether people would be able to consciously sense such a signal if it did reach the brain. Experiments to test the limits of human vision have also had to wait for the arrival of quantum-optics technologies that can reliably produce one photon of light at a time. © 2016 Macmillan Publishers Limited
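Part of why this question took decades to settle is that photon arrivals are Poisson-distributed: even a dim flash of nominally fixed intensity delivers a random photon count, so many "single-photon" trials deliver nothing at all. A back-of-the-envelope sketch; the mean absorption rate below is an assumed illustrative value, not a figure from the paper:

```python
import math

def poisson_pmf(k, lam):
    """Probability that exactly k photons are absorbed when the
    mean number absorbed per flash is lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

def p_at_least_one(lam):
    """Probability the flash delivers at least one photon."""
    return 1.0 - math.exp(-lam)

# Illustrative: a source attenuated so that, on average, 0.5 photons
# per flash are absorbed by rod cells
lam = 0.5
print(poisson_pmf(0, lam))   # ~0.61: most flashes deliver no photon at all
print(p_at_least_one(lam))   # ~0.39: hence many repeated trials are needed
```

This is why quantum-optics sources that emit exactly one photon on demand, rather than attenuated classical light, were the enabling technology for the experiment.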

Keyword: Vision
Link ID: 22461 - Posted: 07.20.2016

Jon Hamilton Letting mice watch Orson Welles movies may help scientists explain human consciousness. At least that's one premise of the Allen Brain Observatory, which launched Wednesday and lets anyone with an Internet connection study a mouse brain as it responds to visual information. "Think of it as a telescope, but a telescope that is looking at the brain," says Christof Koch, chief scientific officer of the Allen Institute for Brain Science, which created the observatory. The hope is that thousands of scientists and would-be scientists will look through that telescope and help solve one of the great mysteries of human consciousness, Koch says. "You look out at the world and there's a picture in your head," he says. "You see faces, you see your wife, you see something on TV." But how does the brain create those images from the chaotic stream of visual information it receives? "That's the mystery," Koch says. There's no easy way to study a person's brain as it makes sense of visual information. So the observatory has been gathering huge amounts of data on mice, which have a visual system that is very similar to the one found in people. The data come from mice that run on a wheel as still images and movies appear on a screen in front of them. For the mice, it's a lot like watching TV on a treadmill at the gym. But these mice have been genetically altered in a way that allows a computer to monitor the activity of about 18,000 neurons as they respond to different images. "We can look at those neurons and from that decode literally what goes through the mind of the mouse," Koch says. Those neurons were pretty active when the mice watched the first few minutes of Orson Welles' film noir classic Touch of Evil. The film is good for mouse experiments because "It's black and white and it has nice contrasts and it has a long shot without having many interruptions," Koch says. © 2016 npr

Keyword: Vision; Consciousness
Link ID: 22438 - Posted: 07.14.2016

By Karen Weintraub Researchers at Stanford University have coaxed brain cells involved in vision to regrow and make functional connections—helping to upend the conventional dogma that mammalian brain cells, once damaged, can never be restored. The work was carried out in visually impaired mice but suggests that human maladies including glaucoma, Alzheimer’s disease and spinal cord injuries might be more repairable than has long been believed. Frogs, fish and chickens are known to regrow brain cells, and previous research has offered clues that it might be possible in mammals. The Stanford scientists say their new study confirms this and shows that, although fewer than 5 percent of the damaged retinal ganglion cells grew back, it was still enough to make a difference in the mice’s vision. “The brain is very good at coping with deprived inputs,” says Andrew Huberman, the Stanford neurobiologist who led the work. “The study also supports the idea that we may not need to regenerate every neuron in a system to get meaningful recovery.” Other researchers praised the study, published Monday in Nature Neuroscience. “I think it’s a significant step forward toward getting to the point where we really can regenerate optic nerves,” says Don Zack, a professor of ophthalmology at Johns Hopkins University who was not involved in the research. He calls it “one more indication that it may be possible to bring that ability back in humans.” © 2016 Scientific American

Keyword: Vision; Regeneration
Link ID: 22428 - Posted: 07.12.2016

By Shayla Love In 2005, astronaut John Phillips took a break from his work on the International Space Station and looked out the window at Earth. He was about halfway through a mission that had begun in April and would end in October. When he gazed down at the planet, the Earth was blurry. He couldn’t focus on it clearly. That was strange — his vision had always been 20/20. He wondered: Was his eyesight getting worse? “I’m not sure if I reported that to the ground,” he said. “I think I didn’t. I thought it would be something that would just go away, and fix itself when I got to Earth.” It didn’t go away. During Phillips’ post-flight physical, NASA found that his vision had gone from 20/20 to 20/100 in six months. Rigorous testing followed. Phillips got MRIs, retinal scans, neurological tests and a spinal tap. The tests showed that not only had his vision changed, but his eyes had changed as well. The backs of his eyes had gotten flatter, pushing his retinas forward. He had choroidal folds, which are like stretch marks. His optic nerves were inflamed. Phillips’ case became the first widely recognized one of a mysterious syndrome that affects 80 percent of astronauts on long-duration missions in space. The syndrome could interfere with plans for future crewed space missions, including any trips to Mars.

Keyword: Vision
Link ID: 22422 - Posted: 07.11.2016

By Patrick Monahan Animals like cuttlefish and octopuses can rapidly change color to blend into the background and dazzle prospective mates. But there’s only one problem: As far as we know, they can’t see in color. Unlike our eyes, the eyes of cephalopods—cuttlefish, octopuses, and their relatives—contain just one kind of color-sensitive protein, apparently restricting them to a black and white view of the world. But a new study shows how they might make do. By rapidly focusing their eyes at different depths, cephalopods could be taking advantage of a lensing property called “chromatic blur.” Each color of light has a different wavelength—and because lenses bend some wavelengths more than others, one color of light shining through a lens can be in focus while another is still blurry. So with the right kind of eye, a quick sweep of focus would let the viewer figure out the actual color of an object based on when it blurs. The off-center pupils of many cephalopods—including the w-shaped pupils of cuttlefish (above)—make this blurring effect more extreme, according to a study published this week in the Proceedings of the National Academy of Sciences. In that study, scientists built a computer model of an octopus eye and showed that—for an object at least one body length away—it could determine the object’s color just by changing focus. Because this is all still theoretical, the next step is testing whether live cephalopods actually see color this way—and whether any other “colorblind” animals might, too. © 2016 American Association for the Advancement of Science.
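The optics behind "chromatic blur" can be sketched with a thin lens whose refractive index varies with wavelength (Cauchy dispersion): shorter, bluer wavelengths bend more and come to focus closer to the lens, so sweeping the focal plane makes different colors sharpen at different moments. The constants below are illustrative glass-like values, not measured cephalopod optics:

```python
def refractive_index(wavelength_nm, A=1.522, B=4590.0):
    """Cauchy dispersion approximation: n = A + B / wavelength^2
    (wavelength in nm). A and B are illustrative constants."""
    return A + B / wavelength_nm**2

def focal_length_mm(wavelength_nm, radius_mm=10.0):
    """Thin symmetric biconvex lens: 1/f = 2(n - 1)/R."""
    n = refractive_index(wavelength_nm)
    return radius_mm / (2.0 * (n - 1.0))

# Blue light comes to focus closer than red; this focal offset is the
# signal a focus-sweeping eye could exploit to infer color
print(focal_length_mm(470))  # ~9.21 mm (blue)
print(focal_length_mm(650))  # ~9.38 mm (red)
```

Even a fraction of a millimeter of wavelength-dependent defocus is enough, in the model, for an eye with an off-center pupil to tell which color an object is by when it blurs.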

Keyword: Vision; Evolution
Link ID: 22402 - Posted: 07.07.2016

By Jessica Hamzelou Imagine if each of the words in this article had its own taste, or the music you’re listening to played out as a visual scene in your mind. For synaesthetes – a small proportion of people whose senses intertwine – this is the stuff of the everyday. “Most people describe it as a gift,” says Jamie Ward, a neuroscientist at the University of Sussex in the UK. Now, he and his colleagues have found a new form of synaesthesia – one that moves beyond written language to sign language. It is the first time the phenomenon has been observed. “People with synaesthesia experience the ordinary world in extraordinary ways,” says Ward. In theory, any two senses can overlap. Some synaesthetes connect textures with words, while others can taste them. More commonly, written letters seem to have corresponding colours. An individual synaesthete may always associate the letter A with the colour pink, for instance. This type of synaesthesia has been found across many written languages, prompting Ward’s team to wonder if it can also apply to sign language. They recruited 50 volunteers with the type of synaesthesia that means they experience colours with letters, around half of whom were fluent in sign language too. All the participants watched a video of sign language and were asked if it triggered any colours. © Copyright Reed Business Information Ltd.

Keyword: Vision
Link ID: 22390 - Posted: 07.02.2016

When you walk into a room, your eyes process your surroundings immediately: refrigerator, sink, table, chairs. "This is the kitchen," you realize. Your brain has taken data and come to a clear conclusion about the world around you, in an instant. But how does this actually happen? Elissa Aminoff, a research scientist in the Department of Psychology and the Center for the Neural Basis of Cognition at Carnegie Mellon University, shares her insights on what computer modeling can tell us about human vision and memory. What do you do? What interests me is how the brain and the mind understand our visual environment. The visual world is really rich with information, and it’s extremely complex. So we have to find ways to break visual data down. What specific parts of our [visual] world is the brain using to give us what we see? In order to answer that question, we’re collaborating with computer scientists and using computer vision algorithms. The goal is to compare these digital methods with the brain. Perhaps they can help us find out what types of data the brain is working with. Does that mean that our brains function like a computer? That’s something you hear a lot about these days. No, I wouldn’t say that. It’s that computers are giving us the closest thing that we have right now to an analogous mechanism. The brain is really, really complex. It deals with massive amounts of data. We need help in organizing these data and computers can do that. Right now, there are algorithms that can identify an object as a phone or as a mug, just like the brain. But are they doing the same thing? Probably not. © 2016 Scientific American,

Keyword: Vision
Link ID: 22379 - Posted: 06.30.2016

By Aviva Rutkin Machine minds are often described as black boxes, their decision-making processes all but inscrutable. But in the case of machine intelligence, researchers are cracking that black box open and peering inside. What they find is that humans and machines don’t pay attention to the same things when they look at pictures – not at all. Researchers at Facebook and Virginia Tech in Blacksburg got humans and machines to look at pictures and answer simple questions – a task that neural-network-based artificial intelligence can handle. But the researchers weren’t interested in the answers. They wanted to map human and AI attention, in order to shed a little light on the differences between us and them. “These attention maps are something we can measure in both humans and machines, which is pretty rare,” says Lawrence Zitnick at Facebook AI Research. Comparing the two could provide insight “into whether computers are looking in the right place”. First, Zitnick and his colleagues asked human workers on Amazon Mechanical Turk to answer simple questions about a set of pictures, such as “What is the man doing?” or “What number of cats are lying on the bed?” Each picture was blurred, and the worker would have to click around to sharpen it. A map of those clicks served as a guide to what part of the picture they were paying attention to. © Copyright Reed Business Information Ltd.
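Once the human clicks and the network's attention are both reduced to per-region weight maps, comparing "where humans look" with "where the machine looks" is a correlation between the two flattened maps. A minimal sketch using Pearson correlation; the 2x2 example maps are made up, and the study's actual maps and metric may differ:

```python
def pearson(x, y):
    """Pearson correlation between two equal-length weight lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Hypothetical 2x2 attention maps, flattened row by row
human_map   = [0.7, 0.1, 0.1, 0.1]  # humans sharpened the top-left region
machine_map = [0.2, 0.5, 0.2, 0.1]  # the model weighted a different region

print(pearson(human_map, machine_map))  # low/negative: attention diverges
```

A score near +1 would mean the model attends where people do; values near zero or below are one quantitative way to say "not at all," as the researchers found.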

Keyword: Vision
Link ID: 22378 - Posted: 06.30.2016

Worldwide voting for the BEST ILLUSION OF THE YEAR will take place online from 4pm EST on June 29th to 4pm EST on June 30th. The winning illusions will receive a $3,000 award for 1st place, a $2,000 award for 2nd place, and a $1,000 award for 3rd place. Anybody with an internet connection (that means YOU!) can vote to pick the Top 3 Winners from the current Top 10 List! The Best Illusion of the Year Contest is a celebration of the ingenuity and creativity of the world’s premier illusion research community. Contestants from all around the world submitted novel illusions (unpublished, or published no earlier than 2015), and an international panel of judges rated them and narrowed them to the TOP TEN.

Keyword: Vision
Link ID: 22375 - Posted: 06.29.2016

By Vinicius Donisete Goulart The “new world” monkeys of South and Central America range from large muriquis to tiny pygmy marmosets. Some are cute and furry, others bald and bright red, and one even has an extraordinary moustache. Yet, with the exception of owl and howler monkeys, the 130 or so remaining species have one thing in common: A good chunk of the females, and all of the males, are colorblind. This is quite different from “old world” primates, including us Homo sapiens, who are routinely able to see the world in what we humans imagine as full color. In evolutionary terms, colorblindness sounds like a disadvantage, one which should really have been eliminated by natural selection long ago. So how can we explain a continent of colorblind monkeys? I have long wondered what makes primates in the region colorblind and visually diverse, and how evolutionary forces are acting to maintain this variation. We don’t yet know exactly what kept these seemingly disadvantaged monkeys alive and flourishing—but what is becoming clear is that colorblindness is an adaptation, not a defect. The first thing to understand is that what we humans consider “color” is only a small portion of the spectrum. Our “trichromatic” vision is superior to that of most mammals, who typically share the “dichromatic” vision of new world monkeys and colorblind humans, yet fish, amphibians, reptiles, birds, and even insects are able to see a wider range, even into the UV spectrum. There is a whole world of color out there that humans and our primate cousins are unaware of.

Keyword: Vision; Evolution
Link ID: 22345 - Posted: 06.22.2016

By Sarah Kaplan Some 250 million years ago, when dinosaurs roamed the Earth and early mammals were little more than tiny, fuzzy creatures that scurried around attempting to evade notice, our ancestors evolved a nifty trick. They started to become active at night. They developed sensitive whiskers and an acute sense of hearing. Their circadian rhythms shifted to let them sleep during the day. Most importantly, the composition of their eyes changed — instead of color-sensing cone photoreceptor cells, they gained thousands of light-sensitive rod cells, which allowed them to navigate a landscape lit only by the moon and stars. Mammals may no longer have to hide from the dinosaurs, but we bear the indelible marks of our scrappy, nocturnal past. Unlike every other vertebrate on land and sea, we still have rod-dominated eyes — human retinas, for example, are 95 percent rods, even though we're no longer active at night. "How did that happen? What is the mechanism that made mammals become so different?" asked Anand Swaroop, chief of the Neurobiology Neurodegeneration and Repair Laboratory at the National Eye Institute. He provides some answers to those questions in a study published in the journal Developmental Cell Monday. The findings are interesting from an evolutionary standpoint, he said, but they're also the keys to a medical mystery. If Swaroop and his colleagues can understand how our eyes evolved, perhaps they can fix some of the problems that evolved with them.

Keyword: Vision; Evolution
Link ID: 22342 - Posted: 06.21.2016

By Stephen L. Macknik Every few decades there’s a major new neuroscience discovery that changes everything. I’m not talking about your garden variety discovery. Those happen frequently (this is the golden age of neuroscience after all). But no, what I’m talking about are the holy-moly, scales-falling-from-your-eyes, time-to-rewrite-the-textbooks, game-changing discoveries. Well, one was reported this past month—simultaneously by two separate labs—and it redefines the primary organizational principle of the visual system in the cortex of the brain. This may sound technical, but it concerns how we see light and dark, and the perception of contrast. Since all sensation functions at the pleasure of contrast, these new discoveries impact neuroscience and psychology as a whole. I’ll explain below. The old way of thinking about how the wiring of the visual cortex was organized orbited around the concept of visual-edge orientation. David Hubel (my old mentor) and Torsten Wiesel (my current fellow Brooklynite)—who shared the Nobel Prize in Physiology or Medicine in 1981—arguably made the first major breakthrough concerning how information was organized in the cortex versus earlier stages of visual processing. Before their discovery, the retina (and the whole visual system) was thought to be a kind of neural camera that communicated its image into the brain. The optic nerves connect the eyes’ retinas to the thalamus at the center of the brain—and then the thalamus connects to the visual cortex at the back of the brain through a neural information superhighway called the optic radiations. Scientists knew, even way back then, that neurons responding to a given point of the visual scene lie physically next to the neurons that see the neighboring piece of the visual scene. The discovery of this so-called retinotopic map in the primary visual cortex (by Talbot and Marshall) was of course important, but because it matched the retinotopic mapping of the retina and thalamus, it didn’t constitute a new way of thinking. It wasn’t a game-changing discovery. © 2016 Scientific American

Keyword: Vision
Link ID: 22301 - Posted: 06.09.2016

By Jordana Cepelewicz Colors exist on a seamless spectrum, yet we assign hues to discrete categories such as “red” and “orange.” Past studies have found that a person's native language can influence the way colors are categorized and even perceived. In Russian, for example, light blue and dark blue are named as different colors, and studies find that Russian speakers can more readily distinguish between the shades. Yet scientists have wondered about the extent of such verbal influence. Are color categories purely a construct of language, or is there a physiological basis for the distinction between green and blue? A new study in infants suggests that even before acquiring language, our brain already sorts colors into the familiar groups. A team of researchers in Japan tracked neural activity in 12 prelinguistic infants as they looked at a series of geometric figures. When the shapes' color switched between green and blue, activity increased in the occipitotemporal region of the brain, an area known to process visual stimuli. When the color changed within a category, such as between two shades of green, brain activity remained steady. The team found the same pattern in six adult participants. The infants used both brain hemispheres to process color changes. Language areas are usually in the left hemisphere, so the finding provides further evidence that color categorization is not entirely dependent on language. At some point as a child grows, language must start playing a role—just ask a Russian whether a cloudless sky is the same color as the deep sea. The researchers hope to study that developmental process next. “Our results imply that the categorical color distinctions arise before the development of linguistic abilities,” says Jiale Yang, a psychologist at Chuo University and lead author of the study, published in February in PNAS. “But maybe they are later shaped by language learning.” © 2016 Scientific American

Keyword: Vision; Development of the Brain
Link ID: 22291 - Posted: 06.07.2016

By Jane E. Brody Joanne Reitano is a professor of history at LaGuardia Community College in Long Island City, Queens. She writes wonderful books about the history of the city and state, and has recently been spending many hours — sometimes all day — at her computer to revise her first book, “The Restless City.” But while sitting in front of the screen, she told me, “I developed burning in my eyes that made it very difficult to work.” After resting her eyes for a while, the discomfort abates, but it quickly returns when she goes back to the computer. “If I was playing computer games, I’d turn off the computer, but I need it to work,” the frustrated professor said. Dr. Reitano has a condition called computer vision syndrome. She is hardly alone. It can affect anyone who spends three or more hours a day in front of computer monitors, and the population at risk is potentially huge. Worldwide, up to 70 million workers are at risk for computer vision syndrome, and those numbers are only likely to grow. In a report about the condition written by eye care specialists in Nigeria and Botswana and published in Medical Practice and Reviews, the authors detail an expanding list of professionals at risk — accountants, architects, bankers, engineers, flight controllers, graphic artists, journalists, academicians, secretaries and students — all of whom “cannot work without the help of computer.” And that’s not counting the millions of children and adolescents who spend many hours a day playing computer games. Studies have indicated 70 percent to 90 percent of people who use computers extensively, whether for work or play, have one or more symptoms of computer vision syndrome. The effects of prolonged computer use are not just vision-related. Complaints include neurological symptoms like chronic headaches and musculoskeletal problems like neck and back pain. © 2016 The New York Times Company

Keyword: Vision
Link ID: 22262 - Posted: 05.30.2016

Bradley George All sorts of health information is now a few taps away on your smartphone, from how many steps you take to how well you sleep at night. But what if you could use your phone and a computer to test your vision? A company is doing just that — and eye care professionals are upset. Some states have even banned it. A Chicago-based company called Opternative offers the test. The site asks some questions about your eyes and overall health; it also wants to know your shoe size to make sure you're the right distance from your computer monitor. You keep your smartphone in your hand and use the Web browser to answer questions about what you see on the computer screen. Like a traditional eye test, there are shapes, lines and letters. It takes about 30 minutes. "We're trying to identify how bad your vision is, so we're kind of testing your vision to failure, is the way I would describe it," says Aaron Dallek, CEO of Opternative. Dallek co-founded the company with an optometrist, who was searching for ways to offer eye exams online. "Me being a lifetime glasses and contact wearer, I was like 'Where do we start?' So, that was about 3 1/2 years ago, and we've been working on it ever since," Dallek says. © 2016 npr

Keyword: Vision
Link ID: 22250 - Posted: 05.26.2016

By Jessica Hamzelou People who experience migraines that are made worse by light might be better off seeing the world in green. While white, blue, red and amber light all increase migraine pain, low-intensity green light seems to reduce it. The team behind the finding hope that specially developed sunglasses that screen out all wavelengths of light except green could help migraineurs. Many people experience sensitivity to light during a migraine. Photophobia, as it is known, can leave migraineurs resorting to sunglasses in well-lit rooms, or seeking the comfort of darkness. The reaction is thought to be due to the brain’s wiring. In a brain region called the thalamus, neurons that transmit sensory information from our retinas cross over with other neurons that signal pain. As a result, during migraine, light can worsen pain and pain can cause visual disturbance, says Rami Burstein at Harvard University. But not all colours of light have the same effect. Six years ago, Burstein and his colleagues studied migraine in sufferers who are blind, either due to the loss of an eye or retina, or because of retinal damage. They found that people who had some remaining retinal cells had worse migraines when they were in brightly lit environments, and that blue light seemed to have the strongest impact. The finding caused a flurry of excitement, and the promotion of sunglasses that filter out blue light. © Copyright Reed Business Information Ltd.

Keyword: Pain & Touch; Vision
Link ID: 22237 - Posted: 05.23.2016