Chapter 10. Vision: From Eye to Brain
By Puneet Kollipara Blind fish that spend their lives in dark, underwater caves have lost a huge chunk of their ability to hear, scientists report in the March 27 Biology Letters. Two of the fish species studied could not hear high-pitched sounds. “I was really surprised,” says study coauthor Daphne Soares of the University of Maryland, College Park. “I expected them to hear much better than the surface fishes.” Cave-dwelling fish can lose their vision and even their eyes over many generations. And without light, eyesight can lose its importance in fish survival. Only two previous studies have explored what happens to hearing after fish lose their vision; both found no differences in hearing between cave fish and those that experience daylight. Soares and her colleagues collected fish of two blind cave-dwelling species, Typhlichthys subterraneus and Amblyopsis spelaea, from lakes in Kentucky. Specimens of a surface-dwelling species, Forbesichthys agassizii, which is closely related to the cave dwellers, came from a lake in south-central Tennessee. Back in the lab, the researchers tested fish hearing by seeing whether sounds across a range of pitches could stimulate nerve activity in the fishes’ brains. The researchers also measured the density of sound-detecting hair cells in the creatures’ ears. © Society for Science & the Public 2000 - 2013
By Brian Palmer, As a columnist who tries to explain scientific and other puzzles, I get asked a lot of strange questions. Here’s one that has been bugging me for some time: Are there visually impaired animals? Are there nearsighted deer that could use glasses or farsighted elephants that could benefit from an enormous set of contacts? How about astigmatic alligators? It seems like an animal question, but, at its core, it’s motivated by an astute comparison with humans. We’re undeniably visual creatures, yet many of us have trouble seeing well. According to some estimates, up to 42 percent of Americans are myopic, or nearsighted. Isn’t this a failure of natural selection? Shouldn’t our blurry-sighted ancestors have starved to death or been consumed by predators because of their visual handicaps? Does nature allow other animals to have such poor vision? These questions turn out to be surprisingly complicated. Let’s start out with the non-human animals and work back to our own visual shortcomings. Ophthalmologists can’t ask lions to read an eye chart or put glasses on a whale. Instead, they shine a light into the animal’s eye to see how it refracts and focuses on the retina. And with a trainable animal, such as a hawk or a horse, researchers can teach it to respond to a visual cue, then determine how well the animal picks up the cue when it is far away, very close or somehow obscured. © 1996-2013 The Washington Post
Philip Ball No one with even a passing interest in scientific trends will have failed to notice that the brain is the next big thing. It has been said for at least a decade, but now it’s getting serious — with, for example, the recent award by the European Commission of €500 million (US$646 million) to the Human Brain Project to build a new “infrastructure for future neuroscience” and a $1-billion initiative endorsed by President Obama. Having failed to ‘find ourselves’ in our genome, we’re starting a search in the grey matter. It’s a reasonable objective, but only if we have a clear idea of what we hope and expect to find. Some neuroscientists have grand visions, such as that adduced by Semir Zeki of University College London: “It is only by understanding the neural laws that dictate human activity in all spheres — in law, morality, religion and even economics and politics, no less than in art — that we can ever hope to achieve a more proper understanding of the nature of man.” Zeki heads the UCL Institute of Neuroesthetics. This is one of many fields that attaches ‘neuro’ to some human trait with the implication that the techniques of neuroscience, such as functional magnetic resonance imaging, will explain it. We have neurotheology, neuroethics, neurocriminology and so on. Meanwhile, in popular media, a rash of books and articles proclaim (in a profoundly ugly trope) that “this is your brain on drugs/music/religion/sport”. It seems unlikely that studies of the brain will ever be able to wholly explain how we respond to art. © 2013 Nature Publishing Group
At 7 months of age, children who are later diagnosed with autism take a split second longer to shift their gaze during a task measuring eye movements and visual attention than do typically developing infants of the same age, according to researchers supported by the National Institutes of Health. The difference between the groups’ test results was 25 to 50 milliseconds on average, the researchers found, too brief to be detected in social interactions with an infant. However, they showed that this measurable delay could be accounted for by differences in the structure and organization of actively developing neurological circuits of a child’s brain.

[Image: scan of the brain structure known as the splenium of the corpus callosum. Caption: When they were infants, children who were later diagnosed with autism took longer to shift their gaze during a measure of eye movements than did infants who were not diagnosed with autism. The researchers believe that brain circuits involved with the splenium of the corpus callosum (shown in this scan) may account for the differences in gaze shifting between the two groups. Image courtesy of Jason Wolff, Ph.D., University of North Carolina at Chapel Hill.]

Efficiently shifting attention early in infancy is thought to be important for later social and cognitive development. Split-second delays, the researchers suggested, could be a precursor to such well-known symptoms of autism as difficulty making eye contact or following a parent’s pointing finger, problems that generally emerge after a child turns 1. Typically, autism spectrum disorder (ASD) is not diagnosed until after 3 or 4 years of age. The study appears in the American Journal of Psychiatry.
by Lizzie Wade Hundreds of millions of years ago, the Earth's seas teemed with trilobites, hard-shelled critters that resembled spiny aquatic cockroaches. Because their exoskeletons lent themselves to fossilization, scientists know a lot about what the outside of their bodies looked like. Their inner workings, however, have remained mysterious. Now, a new study has revealed the structure of the trilobite eye, bringing researchers one step closer to understanding the evolution of vision. Like today's insects and crustaceans, trilobites had compound eyes, with many different lenses focusing light onto clusters of sensory cells lying below them. The resulting image was put together a lot like a picture on your computer screen, with each lens producing one "pixel" of the whole. Because the lenses themselves were made of the mineral calcite, they often fossilized along with the rest of the trilobite's tough exoskeleton. The sensory cells underneath the lenses, however, were ephemeral, and scientists had always assumed that they had decayed without a trace. So imagine Brigitte Schoenemann's surprise when she spotted fossilized versions of these delicate sensory cells while x-raying a long dead trilobite with a computed tomography (CT) scanner. "I expected that we would see [something] in the lens of trilobites, but then suddenly we saw structures of cells below the lens," recalls Schoenemann, a physiologist at the University of Bonn and the University of Cologne, both in Germany. Inspired, she applied to take more fossils to the European Synchrotron Radiation Facility in Grenoble, France, where she could use a particle accelerator's high energy x-rays to peer deeper into the trilobites' eyes. Now, she says, she's created images of the extinct animal's entire visual system, down to the level of fossilized individual cells. © 2010 American Association for the Advancement of Science
by Michael Marshall Neanderthals may have had bigger eyes than modern humans, but while this helped them see better, it may have meant that they did not have brainpower to spare for complex social lives. If true, this may have been a disadvantage when the ice age reduced access to food, as they would not have had the skills to procure help from beyond their normal social group, speculates Robin Dunbar at the University of Oxford. Neanderthals' brains were roughly the same size as modern humans', but may have been organised differently. To find out, a team led by Dunbar studied the skulls of 13 Neanderthals and 32 anatomically modern humans. The Neanderthals had larger eye sockets. There are no Neanderthal brains to examine, but primates with larger eyes tend to have larger visual systems in their brains, suggesting Neanderthals did too. Their large bodies would also have required extra brainpower to manage. Together, their larger eyes and bodies would have left them with less grey matter to dedicate to other tasks. Neanderthals may have evolved enhanced visual systems to help them see in the gloom of the northern hemisphere, Dunbar says. "It makes them better at detecting things in grim, grey conditions." As a by-product of larger eyes, they may not have been able to expand their frontal lobes – a brain area vital for social interaction – as much as modern humans. As a result, Dunbar estimates they could only maintain a social group size of around 115 individuals, rather than the 150 that we manage. © Copyright Reed Business Information Ltd.
By Tina Hesman Saey If someone shouts “look behind you,” tadpoles in Michael Levin’s laboratory may be ready. The tadpoles can see out of eyes growing from their tails, even though the organs aren’t directly wired to the animals’ brains, Levin and Douglas Blackiston, both of Tufts University in Medford, Mass., report online February 27 in the Journal of Experimental Biology. Levin and Blackiston’s findings may help scientists better understand how the brain and body communicate, including in humans, and could be important for regenerative medicine or designing prosthetic devices to replace missing body parts, says Günther Zupanc, a neuroscientist at Northeastern University in Boston. Researchers have transplanted frog eyes to other body parts for decades, but until now, no one had shown that those oddly placed eyes (called “ectopic” eyes) actually worked. Ectopic eyes on tadpoles’ tails allow the animals to distinguish blue light from red light, the Tufts team found. Levin wanted to know whether the brain is hardwired to get visual information only from eyes in the head, or whether the brain could use data coming from elsewhere. To find out, he and Blackiston started with African clawed frog tadpoles (Xenopus laevis) and removed the normal eyes. They then transplanted cells that would grow into eyes onto the animals’ tails. The experiment seemed like a natural to test how well the brain can adapt, Levin says. “There’s no way the tadpole’s brain is expecting an eye on its tail.” Expected or not, some of the tadpoles managed to detect red and blue light from their tail eyes. The researchers placed tadpoles with transplanted eyes in chambers in which half of the chamber was illuminated in blue light and the other half in red light. A mild electric shock zapped the tadpole when it was in one half of the dish so that the animal learned to associate the color with the shock. 
The researchers periodically switched the colors in the chamber so that the tadpoles didn’t learn that staying still would save them. © Society for Science & the Public 2000 - 2013
By Maria Konnikova Georg Tobias Ludwig Sachs was born on April 22, 1786, in the mountain village of St. Ruprecht, Kärnthen, or Carinthia – the south of present-day Austria. From the first, he was notably different from his parents and siblings: he was an albino. (His youngest sister, eleven years his junior, would be one as well.) We don’t know if this physical distinction had any negative impact on the young Georg—but it certainly piqued his curiosity. He proceeded to embark on the scientific study of albinism at the universities in Tübingen, Altdorf, and Erlangen, and at the last of these, produced his 1812 doctoral dissertation. It was about albinism: “A Natural History of Two Albinos, the Author and His Sister.” Today, though, Sachs is remembered not for his thoughts on the nature of the albino, but rather those on another curious condition that was far less noticeable—but received a chapter of its very own in his thesis all the same: synesthesia. Georg Sachs just so happens to be the first known synesthete in the medical or psychological literature. Synesthesia means, literally, a cross-mingling of the senses, when two or more senses talk to each other in a way that is not usually associated with either sense on its own. For instance, you see color when you listen to a song on the radio. Taste shapes as you take a bite of your spaghetti. Frown at the 3 on that piece of paper because it’s giving you attitude—it seems irritable. Smile at the woman you just met because her name comes with a beautiful orange glow. The variations are many, but in every scenario, there is a sensory cross-talk that reaches to a neural level. As in, if I were to put you in a scanner while you took that bite or listened to that musical composition, the relevant areas of the brain would light up: your brain would actually be experiencing color, shape, or whatever you say you’re experiencing as if you were exposed to that very stimulus. 
It’s a condition that affects, by the most recent estimates, roughly 4% of the population. © 2013 Scientific American
Canadian researchers have found out how to restore normal vision to kittens with a lazy eye without using an eye patch. The cure was relatively simple — putting the kittens in complete darkness for 10 days. Once the kittens were returned to daylight, they regained normal vision in the lazy eye within a week, reported researchers at Dalhousie University in Halifax in the journal Current Biology this month. Lazy eye is a condition where the brain effectively turns off one eye. It affects about four per cent of the human population, and the most common treatment is to fix the vision problem (for example, by using glasses) and then patch the good eye, forcing the person to use their bad eye. Kevin Duffy, a neuroscientist who co-authored the new study, told CBC's Quirks & Quarks that the condition is typically the result of a vision problem such as a cataract, a misalignment of the eyes, or poor focus in one eye, which then causes the brain to develop abnormally. "If the eye is providing abnormal vision, then the circuits that connect to that eye are going to develop abnormally," he said. The brain "becomes effectively disconnected." © CBC 2013
By Susan Milius Slight electric fields that form around flowers may lure pollinators much as floral colors and fragrances do. In lab setups, bumblebees learned to distinguish fake flowers by their electrical fields, says sensory biologist Daniel Robert at the University of Bristol in England. Combining an electrical charge with a color helped the bees learn faster, Robert and his colleagues report online February 21 in Science. Plants, a bit like lightning rods, tend to conduct electrical charges to the ground, Robert says. And bees pick up a positive charge from the atmosphere’s invisible rain of charged particles. “Anything flying through the air, whether it’s a baseball, 767 jumbo jet, or a bee, acquires a strong positive electrostatic charge due to interaction with air molecules,” says Stephen Buchmann of the University of Arizona in Tucson. Robert and his colleagues checked whether bees could choose flowers based solely on the electric fields the plants produce. Purple metal disks (encased in plastic so as not to shock bees) stood in for flowers. Half of them, wired for 30 volts, held sips of sugar water. The unwired ones offered a bitter quinine solution that bees don’t like. Bombus terrestris bumblebees learned to choose sweet, wired disks more than 80 percent of the time. When researchers unplugged the wired disks, the bees bumbled, scoring sugar only by chance. © Society for Science & the Public 2000 - 2013
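The benchmark in this experiment is chance: an untrained bee choosing disks at random should score sugar only about half the time, which is exactly what happened once the disks were unplugged. As a rough illustration of why a hit rate above 80 percent is convincing, the sketch below computes the probability of doing that well by luck alone. The trial count is a hypothetical stand-in, since the excerpt does not report one.

```python
import math

def binom_tail(n, k, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): the chance of k or more correct picks."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical session of 50 visits; the bees chose the sweet, wired disks
# more than 80 percent of the time.
n_visits = 50
n_correct = int(0.80 * n_visits)

p_value = binom_tail(n_visits, n_correct)
print(f"P(>= {n_correct}/{n_visits} correct by chance) = {p_value:.2e}")
```

Even with this modest assumed trial count, the tail probability is far below any conventional significance threshold, which is why "scoring sugar only by chance" after unplugging is the telling control.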
by Gisela Telis A stint in the dark may be just what the doctor ordered—at least if you have "lazy eye." Researchers report that kittens with the disorder, a visual impairment medically known as amblyopia that leads to poor sight or blindness in one eye, can completely recover their vision by simply spending 10 days in total darkness. "It's a remarkable study, with real potential to change how we think about recovery from amblyopia," says neuroscientist Frank Sengpiel of Cardiff University in the United Kingdom who was not involved in the work. Amblyopia affects about 4% of the human population. It's thought to start with an imbalance in vision early in life: If one eye doesn't see as well as the other—because, for example, of a cataract or astigmatism—the brain reroutes most of the connections needed for visual processing to the "good" eye. Doctors often treat the condition by patching the good eye and forcing the brain to rely on the other eye, but the treatment risks damaging vision in the good eye, and if it doesn't succeed or occur early enough in a child's visual development, the vision loss in the impaired eye can be permanent. Earlier studies with cats, whose complex visual systems are good stand-ins for human vision, showed that neurons in the brain's visual centers shrink when the brain decides to disconnect from the bad eye, but that they grow again when the cats are placed in darkness. So neuroscientists Kevin Duffy and Donald Mitchell of Dalhousie University in Halifax, Canada, set out to test darkness itself as a treatment. © 2010 American Association for the Advancement of Science
By PAM BELLUCK The device allows people with a certain type of blindness to detect crosswalks on the street, the presence of people or cars, and sometimes even large numbers or letters. The approval of the system marks a milestone in a new frontier in vision research, a field in which scientists are making strides with gene therapy, optogenetics, stem cells and other strategies. “This is just the beginning,” said Grace Shen, a director of the retinal diseases program at the National Eye Institute, which helped finance the artificial retina research and is supporting many other blindness therapy projects. “We have a lot of exciting things sitting in the wings.” The artificial retina is a sheet of electrodes implanted in the eye. The patient is also given glasses with an attached camera and a portable video processor. This system, called Argus II, allows visual signals to bypass the damaged portion of the retina and be transmitted to the brain. With the artificial retina or retinal prosthesis, a blind person cannot see in the conventional sense, but can identify outlines and boundaries of objects, especially when there is contrast between light and dark — fireworks against a night sky or black socks mixed with white ones. “Without the system, I wouldn’t be able to see anything at all, and if you were in front of me and you moved left and right, I’m not going to realize any of this,” said Elias Konstantopolous, 74, a retired electrician in Baltimore, one of about 50 Americans and Europeans who have been using the device in clinical trials. He said it helps him differentiate curbs from roads, and detect contours of objects and people. “When you have nothing, this is something. It’s a lot.” The F.D.A. approved Argus II, made by Second Sight Medical Products, to treat people with severe retinitis pigmentosa, in which photoreceptor cells, which take in light, deteriorate. © 2013 The New York Times Company
The latest bionic superhero is a rat: its brain hooked up to an infrared detector, it's become the first animal to be given a sixth sense. Developed by Miguel Nicolelis and colleagues at Duke University in Durham, North Carolina, the system connects a head-mounted sensor to a brain region that normally processes touch sensations from whiskers. As shown in this video, the rat's brain is tricked when infrared light is detected, giving it a new sense organ. "Instead of seeing, the rats learned how to touch the light," says Nicolelis. Even though the touch-processing brain area acquires a new role, the team found that it continues to process touch sensations from whiskers, somehow dividing its time between both types of signal. "The adult brain is a lot more plastic than we thought," says Nicolelis. The finding could lead to new brain prostheses that restore sight in humans with a damaged visual cortex. By bypassing the damaged part of the brain altogether, it might be possible to wire up a video camera to a part of the brain that processes touch, letting people "touch" what the camera sees. According to Nicolelis, it could also lead to superhero powers for humans. "It could be X-rays, radio waves, anything," he says. "Superman probably had a prosthetic device that nobody knew of." © Copyright Reed Business Information Ltd.
If optimists see the world through rose-colored lenses, some birds see it through ultraviolet ones. Avians have evolved ultraviolet vision quite a few times in history, a new study finds. Birds depend on their color vision for selecting mates, hunting or foraging for food, and spotting predators. Until recently, ultraviolet vision was thought to have arisen as a one-time development in birds. But a new DNA analysis of 40 bird species, reported Feb. 11 in the journal BMC Evolutionary Biology, shows the shift between violet (shorter wavelengths on the electromagnetic spectrum) and ultraviolet vision has occurred at least 14 times. "Birds see color in a different way from humans," study co-author Anders Ödeen, an animal ecologist at Uppsala University in Sweden, told LiveScience. Human eyes have three different color receptors, or cones, that are sensitive to light of different wavelengths and mix together to reveal all the colors we see. Birds, by contrast, have four cones, so "they see potentially more colors than humans do," Ödeen said. Birds themselves are split into two groups based on the color of light (wavelength) that their cones detect most acutely. Scientists define them as violet-sensitive or ultraviolet-sensitive, and the two groups don't overlap, according to Ödeen. Birds of each group would see the same objects as different hues. The specialization of color vision has its advantages. For instance, a bird with ultraviolet-sensitive vision might have spectacularly bright plumage in order to impress a female, but that same plumage might appear dull to predator birds that see only in the violet range. © 2013 Discovery Communications, LLC.
Steve Connor Scientists believe they may be able to discover why children who spend much of their time indoors rather than playing outside are more likely to develop short-sightedness following a breakthrough study into the genetics of myopia. More than two dozen genes have been linked to an increased risk of developing myopia, a finding that may finally allow researchers to understand why children today are more likely to become short-sighted than children in the past. Myopia now affects about one in three people in the West and up to 80 per cent of people in Asia. In some countries in the Far East as many as 90 per cent of children are short-sighted, compared to less than 20 per cent a couple of decades ago. Although short-sightedness tends to run in families and has a strong inherited component, the explosive increase in the condition over recent years has been linked with an increase in the time that children spend indoors either studying or playing computer games and watching TV, scientists believe. A study of more than 45,000 people from Europe and Asia has identified 24 new genes that appear to be involved in triggering the onset of myopia. It has also confirmed the role of two further genes that were already suspected of being involved with short-sightedness, the scientists said. © independent.co.uk
By Sam McNerney and Txchnologist Why do humans see colors? For years the leading hypothesis was that color vision evolved to help us spot nutritious fruits and vegetation in the forest. But in 2006, evolutionary neurobiologist Mark Changizi and colleagues proposed that color vision evolved to perceive oxygenation and hemoglobin variations in skin in order to detect social cues, emotions and the states of our friends or enemies. Just think about the reddening and whitening of the face called blushing and blanching. They elicit distinct physiological reactions that would be impossible to detect without color vision. A few years ago Changizi left Rensselaer Polytechnic Institute, where he was a professor, to co-found 2AI Labs with Dr. Tim Barber. Their Boise, Idaho-based research institute, funded via technology spin-offs coming out of their work, is aimed at solving foundational problems in cognitive science and artificial intelligence. The move allowed Changizi to continue to conduct academic work with more intellectual freedom and less of a reliance on grants. Last summer the team at 2AI developed three pairs of glasses called O2Amps based on Changizi’s color vision theory. By visually enhancing oxygenated blood and blood pooling, the lenses amplify the social cues that allow users to perceive emotions more clearly. The eyewear is being used for a number of innovative applications. The first is medical. The lenses enhance vasculature beneath skin, helping nurses identify veins; they also amplify trauma and bruising that might be invisible to the naked eye. Many hospitals are putting the O2Amps through trials and seeing positive results. The eyewear is also potentially useful for police and security officers (imagine if a TSA agent could more easily perceive nervousness) as well as poker players. © 2013 Scientific American
By Melissa Dahl, NBC News Scarlet fever plays the villain in some of the best children's books: It got "Little Women's" Beth March. It got the child in "The Velveteen Rabbit" (although the kid survives, so, really, the fever got the stuffed rabbit). And it robbed Mary Ingalls, sweet sister of "Little House" series author Laura Ingalls Wilder, of her sight. Or so we were told. But today, the journal Pediatrics asserts that it wasn't scarlet fever that caused Mary's blindness -- it was viral meningoencephalitis, an inflammatory disease that attacks the brain. This is the sort of thing that is extremely interesting if you are interested in this sort of thing. And we'd wager many people are: The "Little House" books have remained in print ever since the initial publication of "Little House in the Big Woods" in 1932, and they're still popular today, with three titles landing on the School Library Journal's 2012 list of best children's chapter books. Even if you never read the books, you probably remember the TV series, which aired from 1974 to 1983. Dr. Beth Tarini, assistant professor of pediatrics at the University of Michigan, and her co-authors make their claim after scouring epidemiological data on blindness and infectious disease around the time of Mary's illness, plus analyzing local newspapers and Laura's unpublished memoir, "Pioneer Girl." For Tarini, it's the culmination of a project she began in medical school 10 years ago, after a confusing conversation with a professor. © 2013 NBCNews.com
By Christof Koch Blindness is a private matter between a person and the eyes with which he or she was born. The sentiment expressed by the late Portuguese writer José Saramago in his famous novel Blindness may be appropriate for a person born unable to see. But what about the tens of millions of people worldwide who suffer from a variety of degenerative diseases that progressively rob them of their eyesight? The problem arises in the nerve cells that line the back of their eyes, their retinas. Fortunately, help is on the way to restore some of the lost vision using advanced neuroengineering. The hallmark of the two most common forms of adult-onset blindness in the West, age-related macular degeneration and retinitis pigmentosa, is that the photoreceptors responsible for converting the incoming rays of light into nervous energy gradually die off. Yet the roughly one million ganglion cells, whose output wires bundle up and leave the eyeball in the form of the optic nerve, remain intact. So visionary (pun intended) clinical ophthalmologists have paired up with technologists to bypass the defective parts of the retina by directly stimulating ganglion cells via advanced electronics. One of the most successful of such prosthetic devices, manufactured by a California company called Second Sight, uses a camera integrated into eyeglasses to convert images into electronic patterns. These patterns are sent to a small, 10- by six-pixel microelectrode array surgically positioned onto the retina. It stimulates neural processes that relay their information in the form of binary electrical pulses, so-called action potentials or spikes, to the brain proper. © 2013 Scientific American,
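The camera-to-electrode pipeline described here — an image reduced to a coarse grid of binary stimulation pulses — can be sketched in a few lines. This is an illustrative toy, not Second Sight's actual signal processing: the 10-by-6 grid size comes from the article, while the average-pooling step, the 0–255 grayscale range, and the threshold value are assumptions for the sketch.

```python
def frame_to_pulses(frame, rows=6, cols=10, threshold=128):
    """frame: 2-D list of grayscale pixels (0-255). Returns a rows x cols grid
    of 0/1 flags, one per electrode: 1 = stimulate, 0 = stay silent."""
    h, w = len(frame), len(frame[0])
    pattern = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # Pixels belonging to this electrode's patch of the image.
            ys = range(r * h // rows, (r + 1) * h // rows)
            xs = range(c * w // cols, (c + 1) * w // cols)
            mean = sum(frame[y][x] for y in ys for x in xs) / (len(ys) * len(xs))
            row.append(1 if mean >= threshold else 0)
        pattern.append(row)
    return pattern

# A bright vertical bar on a dark background (high contrast, like black socks
# mixed with white ones) maps cleanly onto a column of active electrodes.
frame = [[255 if 40 <= x < 60 else 0 for x in range(100)] for y in range(60)]
pattern = frame_to_pulses(frame)
print(pattern)
```

The toy makes the article's point about contrast concrete: only boundaries between light and dark survive the reduction to 60 binary pixels, which is why users report seeing outlines rather than conventional images.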
By Nathan Seppa Rogers Hornsby, one of the best hitters ever to swing a baseball bat, had a reputation for being standoffish. Teammates complained that he didn’t socialize, even balking at attending movies — prime entertainment during the 1920s. Sitting in a dark theater watching a bright screen made it difficult to hit a baseball, Hornsby used to say. Hard to argue with a guy who reportedly had terrific eyesight and who finished three seasons with a batting average better than .400. Hornsby might have been onto something that scientists are only now coming to embrace: Too much time spent indoors may contribute to nearsightedness, also called myopia. Nearsightedness has increased steadily in North America and Europe in recent decades, with one-third of adults in the United States now nearsighted. That figure alone is cause for concern. But the rise of myopia in East Asia is downright alarming. Recent studies of young men in Seoul and college students in Shanghai find that more than 95 percent are nearsighted. Increases also have shown up across other urban centers in the Far East. Studies first uncovered a link between myopia and limited outdoor time during childhood just a few years ago. At the time, many researchers were taken aback. The notion that child’s play might promote normal eye growth seemed almost magical. “Certainly, before five years ago, I don’t think anybody had taken much notice of how much time people spent outdoors,” says Jeremy Guggenheim, an optometrist who has researched myopia in Wales and is currently at Hong Kong Polytechnic University. He believes the findings offer a “new and exciting direction” for research. © Society for Science & the Public 2000 - 2013
By James Gallagher Health and science reporter, BBC News People who regularly take aspirin for many years, such as those with heart problems, are more likely to develop a form of blindness, researchers say. A study of 2,389 people, in the journal JAMA Internal Medicine, showed aspirin takers had twice the risk of "wet" age-related macular degeneration. The disease damages the "sweet spot" in the retina, obscuring details in the centre of a patient's field of vision. The researchers said there was not yet enough evidence to change aspirin use. Taking low doses of aspirin every day does reduce the risk of a stroke or heart attack in patients with cardiovascular disease. There are even suggestions it could prevent cancer. One in 10 people in the study, conducted at the University of Sydney, were taking aspirin at least once a week. On average the participants were in their mid-60s. Eye tests were performed after five, 10 and 15 years. By the end of the study, the researchers showed that 9.3% of patients taking aspirin developed wet age-related macular degeneration (AMD) compared with 3.7% of patients who did not take aspirin. Their report said: "The increased risk of [wet] AMD was detected only after 10 or 15 years, suggesting that cumulative dosing is important." BBC © 2013
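A quick sanity check on the raw percentages reported here: 9.3% versus 3.7% is an unadjusted relative risk of roughly 2.5, in the same ballpark as the "twice the risk" headline figure, which presumably reflects adjustment for age and other confounders.

```python
def relative_risk(p_exposed, p_unexposed):
    """Ratio of outcome rates in the exposed vs. unexposed groups."""
    return p_exposed / p_unexposed

# Wet AMD rates from the study: aspirin takers vs. non-takers.
rr = relative_risk(0.093, 0.037)
print(f"Unadjusted relative risk: {rr:.2f}")
```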