Chapter 7. Vision: From Eye to Brain
By Andy Coghlan Seeing is definitely believing when it comes to stem cell therapy. A blind man has recovered enough sight to ride his horse. A woman who could see no letters at all on a standard eye test chart can now read the letters on the top four lines. Others have recovered the ability to see colour. All have had injections of specialised retinal cells in their eyes to replace ones lost through age or disease. A trial in 18 people with degenerative eye conditions is being hailed as the most promising yet for a treatment based on human embryonic stem cells. "We've been hearing about their potential for more than a decade, but the results have always been in mice and rats, and no one has shown they're safe or effective in humans long term," says Robert Lanza of Advanced Cell Technology in Marlborough, Massachusetts, the company that carried out the stem cell intervention. "Now, we've shown both that they're safe and that there's a real chance these cells can help people." Ten years ago, the team at Advanced Cell Technology announced that it had successfully converted human embryonic stem cells into retinal pigment epithelial cells. These cells help keep the eyes' light-detecting rods and cones healthy; when they deteriorate, blindness can occur, as happens in age-related macular degeneration and Stargardt's macular dystrophy. In a bid to reverse this, Lanza's team injected retinal pigment epithelial cells into one eye of each of the 18 participants; half had age-related macular degeneration and half had Stargardt's. A year later, 10 people's eyes had improved, and the eyes of the others had stabilised. Untreated eyes had continued to deteriorate. © Copyright Reed Business Information Ltd.
David Cyranoski A Japanese patient with a debilitating eye disease is about to become the first person to be treated with induced pluripotent stem cells, which have generated enthusiastic expectations and earned their inventor a Nobel Prize. A health-ministry committee has vetted researchers' safety tests and cleared the team to begin the experimental procedure. Masayo Takahashi, an ophthalmologist at the RIKEN Center for Developmental Biology (CDB) in Kobe, has been using induced pluripotent stem (iPS) cells to prepare a treatment for age-related macular degeneration. Unlike embryonic stem cells, iPS cells are produced from adult cells, so they can be genetically tailored to each recipient. They are capable of becoming any cell type in the body, and have the potential to treat a wide range of diseases. The CDB trial will be the first opportunity for the technology to prove its clinical value. In age-related macular degeneration, extra blood vessels form in the eye, destabilizing a supportive base layer of the retina known as the retinal pigment epithelium. This results in the loss of the light-sensitive photoreceptors that are anchored in the epithelium, and often leads to blindness. Takahashi took skin cells from people with the disease and converted them to iPS cells. She then coaxed these cells to become retinal pigment epithelium cells, and then to grow into thin sheets that can be transplanted to the damaged retina. Takahashi and her collaborators have shown in monkey studies that iPS cells generated from the recipients' own cells do not provoke an immune reaction that causes them to be rejected. There have been concerns that iPS cells could cause tumours, but Takahashi's team has found that to be unlikely in mice and monkeys. © 2014 Nature Publishing Group
Corie Lok Tami Morehouse's vision was not great as a child, but as a teenager she noticed it slipping even further. The words she was trying to read began disappearing into the page and eventually everything faded to a dull, grey haze. The culprit was a form of Leber's congenital amaurosis (LCA), a group of genetic disorders in which light-sensing cells in the retina die off, usually resulting in total blindness by the time people reach their thirties or forties. But Morehouse got a reprieve. In 2009, at the age of 44, the social worker from Ashtabula, Ohio, became the oldest participant in a ground-breaking clinical trial to test a gene therapy for LCA. Now, she says, she can see her children's eyes, and the colours of the sunset seem brighter than before. Morehouse calls these improvements life-changing, but they are minor compared with the changes in some of the younger trial participants. Corey Haas was eight years old when he was treated in 2008 — the youngest person to receive the therapy. He went from using a white cane to riding a bicycle and playing softball. Morehouse often wonders what she would be able to see now if she had been closer to Haas's age when she had the therapy. “I was born a little too soon,” she says. Visual impairment affects some 285 million people worldwide, about 39 million of whom are considered blind, according to a 2010 estimate from the World Health Organization. Roughly 80% of visual impairment is preventable or curable, including operable conditions such as cataracts that account for much of the blindness in the developing world. But retinal-degeneration disorders — including age-related macular degeneration, the leading cause of blindness in the developed world — have no cure. © 2014 Nature Publishing Group
By Meeri Kim The pervasive glow of electronic devices may be an impediment to a good night’s sleep. That’s particularly noticeable now, when families are adjusting to early wake-up times for school. Teenagers can find it especially hard to get started in the morning. As lamps switch off in teens’ bedrooms across America, the lights from their computer screens, smartphones and tablets often stay on throughout the night. These devices emit light of all colors, but it’s the blues in particular that pose a danger to sleep. Blue light is especially good at preventing the release of melatonin, a hormone associated with nighttime. In nocturnal animals, melatonin spurs activity; in daytime species such as humans, it signals that it’s time to sleep. Ordinarily, the pineal gland, a pea-size organ in the brain, begins to release melatonin a couple of hours before your regular bedtime. The hormone is no sleeping pill, but it does reduce alertness and make sleep more inviting. However, light — particularly of the blue variety — can keep the pineal gland from releasing melatonin, thus warding off sleepiness. You don’t have to be staring directly at a television or computer screen: if enough blue light hits the eye, the gland can stop releasing melatonin. So easing into bed with a tablet or a laptop makes it harder to take a long snooze, especially for sleep-deprived teenagers, who are more vulnerable to the effects of light than adults. During adolescence, the circadian rhythm shifts, and teens feel more awake later at night. Switching on a TV show or video game just before bedtime pushes sleepiness even later, even if they have to be up by 6 a.m. to get to school on time.
by Penny Sarchet It's a selfie that might save your sight. An implanted sensor could help people with glaucoma monitor the pressure in their eyes using a smartphone camera. The second biggest cause of blindness after cataracts, glaucoma occurs when fluid builds up in the eye. This raises the pressure, damaging the optic nerve. Accurate pressure readings are crucial for giving the right treatment, but one-off measurements during check-ups produce variable results and can be misleading. Yossi Mandel at Bar-Ilan University in Ramat Gan, Israel, and his colleagues have developed a pressure sensor which can be inserted into the eye during surgery to provide easy, regular monitoring from home. A few millimetres in length, the sensor can be embedded into the synthetic lenses used to replace the natural lenses of people with cataracts. It works like a miniature barometer, and contains a fluid column that rises with eye pressure. The level can be read at any time using a smartphone camera fitted with a special optical adapter. Software then analyses the image and calculates the reading. "Continuous monitoring is a clear unmet need in glaucoma," says Francesca Cordeiro, a glaucoma researcher at University College London. Mandel believes self-monitoring will lead to better treatment of glaucoma, and could enable people to skip unnecessary appointments when their eye pressures are on target. © Copyright Reed Business Information Ltd.
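The article's description of the sensor (a fluid column whose height rises with eye pressure, imaged by a smartphone) can be sketched in code. The function name, pixel scale, and calibration constants below are all invented for illustration; the article does not describe the real device's calibration, which would be set against a clinical tonometer.

```python
# Hypothetical sketch: converting the imaged height of the sensor's
# fluid column into an intraocular pressure (IOP) reading.
# All constants here are invented for illustration only.

def column_height_to_iop(height_px: float,
                         px_per_mm: float = 40.0,
                         mmhg_per_mm: float = 5.0,
                         baseline_mmhg: float = 10.0) -> float:
    """Map the fluid column's imaged height (pixels) to IOP in mmHg."""
    height_mm = height_px / px_per_mm          # pixels -> millimetres
    return baseline_mmhg + mmhg_per_mm * height_mm

# Example: a 120-pixel column at 40 px/mm is 3 mm of fluid,
# giving 10 + 5*3 = 25 mmHg of pressure.
print(column_height_to_iop(120.0))  # -> 25.0
```

In a real pipeline the height in pixels would come first from image analysis (locating the column's meniscus in the smartphone photo), which is the harder engineering problem the Bar-Ilan software solves.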
By ELEANOR LEW I was watching Diane Sawyer on the evening news, wondering how she manages year after year to look so young, when suddenly her face disappeared. Now you see. Now you don’t. One second. That’s all it took. A dense black inkblot shaped like a map of England and southern Norway suddenly blocked my view of Diane so that all I could see was her blond hair and shoulders. At first, I thought it was the television set. Changing channels didn’t bring her face back, nor did rubbing my eyes. “It’s permanent vision loss,” my ophthalmologist said. “Your optic nerve and retina buckled.” He drew a picture of the inside of my right eye, the affected one, and explained that my degenerative myopia, an inherited condition that is far less common than ordinary nearsightedness but still a leading cause of blindness worldwide, had caused my eyeball to elongate excessively. It looked like a house whose walls had been stretched so thin that the roof caved. The doctor didn’t say much else, didn’t make any recommendations for physical or occupational therapy, didn’t tell me to call him if I noticed any changes. I left his office shaken. “What if it happens in my other eye? What if…?” In the weeks that followed, I began to notice bizarre changes in my right eye. Frequent flashing lights, like a dying neon tube, sometimes flickering color or bright white light, so intense I swore I could hear them buzz. I observed my peripheral vision diminishing. England and Norway morphed into a large, bushy oak tree with a short and wide trunk. At a park, I came upon children playing. When I covered my good eye with my hand, I could see only a sliver of sky, and legs and shoes of children running in and out of the tree. I wrote off the psychedelic changes to the “buckling” and didn’t bother to call my ophthalmologist. But I was scared and needed help. © 2014 The New York Times Company
by Bethany Brookshire For most of us, where our birthday falls in the year doesn’t matter much in the grand scheme of things. A July baby doesn’t make more mistakes than a Christmas kid — at least, not because of their birthdays. But for neurons, birth date plays an important role in how these cells find their connections in the brain, a new study finds. Nerve cells that form early in development will make lots of connections — and lots of mistakes. Neurons formed later are much more precise in their targeting. The findings are an important clue to help scientists understand how the brain wires itself during development. And with more information on how the brain forms its network, scientists might begin to see what happens when that network is injured or malformed. Many, many brain cells are born as the brain develops. Each one has to reach out and make connections, sometimes to other cells around them and sometimes to other regions of the brain. To do this, these nerve cells send out axons, long, incredibly thin projections that reach out to other regions. How mammalian axons end up at their final destination in the growing brain remains a mystery. To find out how developing brains get wired up, Jessica Osterhout and colleagues at the University of California, San Diego, started in the eye. They looked at retinal ganglion cells, neurons that connect the brain and the eye. “It’s easy to access,” explains Andrew Huberman, a neuroscientist at UC San Diego and an author on the paper. “Your retina is basically part of the central nervous system that got squeezed into your eye during development.” Retinal ganglion cells all have the same function: to convey visual information from the eyes to the brain. But they are not all the same. © Society for Science & the Public 2000 - 2013
By Emily Underwood Old age may make us wiser, but it rarely makes us quicker. In addition to slowing down physically, most people lose points on intelligence tests as they enter their golden years. Now, new research suggests the loss of certain types of cognitive skills with age may stem from problems with basic sensory tasks, such as making quick judgments based on visual information. Although there’s no clear causal link between the two types of thinking yet, the new work could provide a simple, affordable way to track mental decline in senior citizens, scientists say. Since the 1970s, researchers who study intelligence have hypothesized that smartness, as measured on standard IQ tests, may hinge on the ability to quickly and efficiently sample sensory information from the environment, says Stuart Ritchie, a psychologist at the University of Edinburgh in the United Kingdom. Today it’s well known that people who score high on such tests do, indeed, tend to process such information more quickly than those who do poorly, but it’s not clear how these measures change with age, Ritchie says. Studying older people over time can be challenging given their uncertain health, but Ritchie and his colleagues had an unusual resource in the Lothian Birth Cohort, a group of people born in 1936 whose mental function has been periodically tested by the Scottish government since 1947—their first IQ test was at age 11. After recruiting more than 600 cohort members for their study, Ritchie and colleagues tracked their scores on a simple visual task three times over 10 years, repeating the test at the mean ages of 70, 73, and 76. © 2014 American Association for the Advancement of Science
By Ingrid Wickelgren One important function of your inner ear is stabilizing your vision when your head is turning. When your head turns one way, your vestibular system moves your eyes in the opposite direction so that what you are looking at remains stable. To see for yourself how your inner ears make this adjustment, called the vestibulo-ocular reflex, hold your thumb upright at arm’s length. Shake your head back and forth about twice per second while looking at your thumb. See that your thumb remains in focus. Now create the same relative motion by swinging your arm back and forth about five inches at the same speed. Notice that your thumb is blurry. To see an object clearly, the image must remain stationary on your retina. When your head turns, your vestibular system very rapidly moves your eyes in the opposite direction to create this stability. When the thumb moves, your visual system similarly directs the eyes to follow, but the movement is too slow to track a fast-moving object, causing blur. © 2014 Scientific American
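The reflex described above can be put in numbers. With a vestibulo-ocular reflex (VOR) gain of 1, the eye counter-rotates exactly as far as the head turns, so gaze (head angle plus eye angle) stays on the target; with a lower gain the image slips on the retina and blurs. This toy function and its numbers are illustrative, not a model from the article.

```python
# Toy model of the vestibulo-ocular reflex: the eye rotates opposite
# to the head, scaled by the VOR gain. The leftover gaze error is the
# "retinal slip" that causes blur when compensation is imperfect.

def gaze_error(head_angle_deg: float, vor_gain: float) -> float:
    """Gaze error (deg) for a target straight ahead at angle 0."""
    eye_angle = -vor_gain * head_angle_deg   # compensatory eye rotation
    return head_angle_deg + eye_angle        # residual retinal slip

print(gaze_error(10.0, vor_gain=1.0))   # -> 0.0  (perfect VOR, no blur)
print(gaze_error(10.0, vor_gain=0.5))   # -> 5.0  (weak VOR, image slips)
```

The thumb demonstration reverses the situation: when the target itself moves, smooth pursuit (a much slower visual feedback loop) must do the compensating, which is why the swung thumb blurs while the shaken head does not.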
By Phil Plait From the twisted mind of brusspup comes another brain-hurting illusion. This one is really, really convincing, so tell me: When you look at this video, you’re seeing a circle of eight dots rotating as it spins around inside a bigger circle, right? No, you’re not. As brusspup shows, each individual white dot is moving in a straight line! The trick here is twofold: the dots aren’t moving at constant velocity (you can see that in the video at the 0:44 mark), and, combined, their motion mimics what we’d see if a smaller circle were rolling around inside a big one. Try as I may, when I look at this video I can’t make my brain see the dots moving linearly; it looks like a circle rolling. If I focus on one of the dots I can see it moving back and forth along a line, but the others still look like the rim of a circle rolling around. For most illusions there’s a moment when your brain can see what’s going on and the illusion shatters, but not with this one. It’s maddening. When I was a kid, Spirograph was a very popular “game.” It wasn’t really a game, but a set of clear plastic disks with gear teeth around them (or rings with teeth on the inside). They had holes in them; you’d pin a ring down on a piece of paper, then take another disk, place it inside the ring, put your pencil tip in a hole, and roll the inner disk around inside the outer ring. The results were really lovely and graceful interlocking and overlapping curves. If you’re a lot younger than me and missed this craze, here’s a video that’ll help you picture it: © 2014 The Slate Group LLC.
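The geometry Plait describes is the classical "Tusi couple": each dot oscillates harmonically along its own straight line through the centre, yet at every instant all the dots lie on a circle of half the big circle's radius, centred where the rolling circle's centre would be. A short sketch (variable names are mine) checks this:

```python
# Tusi couple: dot k moves in simple harmonic motion along a fixed line
# through the centre at angle phi_k. Collectively the dots trace a
# circle of radius R/2 "rolling" inside the big circle of radius R.

import math

def dot_position(R, theta, phi_k):
    """Dot k at time theta: harmonic motion along its own line."""
    d = R * math.cos(theta - phi_k)            # signed distance from centre
    return (d * math.cos(phi_k), d * math.sin(phi_k))

R, n = 1.0, 8
theta = 0.9                                    # an arbitrary instant
# Centre of the apparent rolling circle at this instant:
centre = (R / 2 * math.cos(theta), R / 2 * math.sin(theta))
for k in range(n):
    x, y = dot_position(R, theta, math.pi * k / n)
    r = math.hypot(x - centre[0], y - centre[1])
    print(round(r, 6))                         # prints 0.5 (= R/2) each time
```

Each dot's distance from that moving centre is exactly R/2, which is why the brain, grouping the dots together, insists on seeing a rolling circle rather than eight independent straight-line oscillations.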
By Sid Perkins Forget the phrase “blind as a bat.” New experiments suggest that members of one species of these furry flyers—Myotis myotis, the greater mouse-eared bat—can do something no other mammal is known to do: They detect and use polarized light to calibrate their long-distance navigation. Previous research hinted that these bats reset their magnetic compass each night based on cues visible at sunset, but the particular cue or cues hadn’t been identified. In the new study, researchers placed bats in boxes in which the polarization of light could be controlled and shifted. After letting the bats experience sundown at a site near their typical roost, the team waited until after midnight (when polarized light was no longer visible in the sky), transported the animals to two sites between 20 and 25 kilometers from the roost, strapped radio tracking devices to them, and then released them. In general, bats whose polarization wasn’t shifted took off for home in the proper direction. But those that had seen polarization shifted 90° at sunset headed off in directions that, on average, pointed 90° away from the true bearing of home, the researchers report online today in Nature Communications. It’s not clear how the bats discern the polarized light, but it may be related to the type or alignment of light-detecting pigments in their retinas, the team suggests. The bats may have evolved to reset their navigation system using polarized light because that cue persists long after sunset and is available even when skies are cloudy. Besides these bats (and it’s not known whether other species of bat can do this, too), researchers have found that certain insects, birds, reptiles, and amphibians can navigate using polarized light. © 2014 American Association for the Advancement of Science
Associated Press Scientists at the Massachusetts Institute of Technology are developing an audio reading device to be worn on the index finger of people whose vision is impaired, giving them affordable and immediate access to printed words. The so-called FingerReader, a prototype produced by a 3-D printer, fits like a ring on the user's finger, equipped with a small camera that scans text. A synthesized voice reads words aloud, quickly translating books, restaurant menus and other needed materials for daily living, especially away from home or office. Reading is as easy as pointing the finger at text. Special software tracks the finger movement, identifies words and processes the information. The device has vibration motors that alert readers when they stray from the script, said Roy Shilkrot, who is developing the device at the MIT Media Lab. For Jerry Berrier, 62, who was born blind, the promise of the FingerReader is its portability and offer of real-time functionality at school, a doctor's office and restaurants. "When I go to the doctor's office, there may be forms that I want to read before I sign them," Berrier said. He said there are other optical character recognition devices on the market for those with vision impairments, but none that he knows of that will read in real time. Berrier manages training and evaluation for a federal program that distributes technology to low-income people in Massachusetts and Rhode Island who have lost their sight and hearing. He works from the Perkins School for the Blind in Watertown, Mass. Developing the gizmo has taken three years of software coding, experimenting with various designs and working on feedback from a test group of visually impaired people. Much work remains before it is ready for the market, Shilkrot said, including making it work on cell phones. © 2014 Hearst Communications, Inc.
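The scanning-and-feedback loop described above (track the fingertip, read aloud the word under it, vibrate when the finger strays from the line of text) can be caricatured in a few lines. The data layout, function name, and tolerance below are invented for illustration; they are not MIT's actual implementation, and the word boxes would in practice come from an OCR engine.

```python
# Hypothetical sketch of a FingerReader-style decision: given word
# bounding boxes for one printed line (from OCR) and a fingertip
# position (from camera tracking), pick the word to speak, or flag
# that the finger has strayed so the vibration motors should fire.

def word_at(finger, words, line_tolerance=15):
    """Return (word, stray_alert) for a fingertip position (x, y).

    words: list of (text, x_min, x_max, y_center), all in pixels.
    """
    x, y = finger
    for text, x0, x1, y_c in words:
        if x0 <= x <= x1 and abs(y - y_c) <= line_tolerance:
            return text, False          # on the line: speak this word
    return None, True                   # strayed: fire vibration motors

line = [("Reading", 0, 70, 100), ("is", 80, 95, 100), ("easy", 105, 150, 100)]
print(word_at((85, 102), line))   # -> ('is', False)
print(word_at((85, 140), line))   # -> (None, True)  finger drifted off the line
```

The hard parts the MIT team spent three years on sit around this core: robust fingertip tracking in video, real-time OCR of curved book pages, and speech synthesis fast enough to feel instantaneous.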
By NICHOLAS BAKALAR Can too much studying ruin your eyesight? Maybe. A German study has found that the more education a person has, the greater the likelihood that he will be nearsighted. The researchers did ophthalmological and physical examinations on 4,685 people ages 35 to 74. About 38 percent were nearsighted. But of those who graduated after 13 years in the three-tiered German secondary school system, about 60.3 percent were nearsighted, compared with 41.6 percent of those who graduated after 10 years, 27.2 percent of those who graduated after nine years and 26.9 percent of those who never graduated. The percentage of myopic people was also higher among university graduates than among graduates of vocational schools or those who had no professional training at all. The study was published online in Ophthalmology. The association remained after adjusting for age, gender and many known myopia-associated variations in DNA sequences. “The effect on myopia of the genetic variations is much less than the effect of education,” said the lead author, Dr. Alireza Mirshahi, an ophthalmologist at the University Medical Center in Mainz. “We used to think that myopia was predetermined by genetics. This is one proof that environmental factors have a much higher effect than we thought.” © 2014 The New York Times Company
Check out the winner of the 2014 Best Illusion of the Year Contest. Created by psychologists at the University of Nevada, Reno, this optical illusion starts with an image of a circle surrounded by other circles. As the video begins and the exterior circles grow and shrink, it looks like the center circle is changing size, too—but it isn’t. Dubbed “The Dynamic Ebbinghaus,” the trick is a spinoff of the original Ebbinghaus mirage created in the 1800s.
Hassan DuRant The colorful little guy pictured above puts the eyes of every other animal to shame. Whereas humans receive color information via three color receptors in our eyes, mantis shrimp (Neogonodactylus oerstedii) have 12. Six of these differentiate five discrete wavelengths of ultraviolet light, researchers report online today in Current Biology. The mantis shrimp's UV vision is made possible by specially tuned, UV-specific optical filters in its color-detecting cone cells. The optical filters are made of mycosporine-like amino acids (MAAs), a substance commonly found in the skin or exoskeleton of marine organisms. Often referred to as nature’s sunscreens, MAAs are usually employed to protect an organism from DNA-damaging UV rays; however, the mantis shrimp has incorporated them into powerful spectral tuning filters. Though the reason for the mantis shrimp’s complex visual perception is poorly understood, one possibility is that UV detection could help it spot otherwise difficult-to-see prey on coral reefs. Many organisms absorb UV light — these organisms would be easy to spot as black objects in a bright world. © 2014 American Association for the Advancement of Science
Simon Makin Running helps mice to recover from a type of blindness caused by sensory deprivation early in life, researchers report. The study, published on 26 June in eLife, also illuminates processes underlying the brain’s ability to rewire itself in response to experience — a phenomenon known as plasticity, which neuroscientists believe is the basis of learning. More than 50 years ago, neurophysiologists David Hubel and Torsten Wiesel cracked the 'code' used to send information from the eyes to the brain. They also showed that the visual cortex develops properly only if it receives input from both eyes early in life. If one eye is deprived of sight during this ‘critical period’, the result is amblyopia, or ‘lazy eye’, a state of near blindness. This can happen to someone born with a droopy eyelid, cataract or other defect not corrected in time. If the eye is opened in adulthood, recovery can be slow and incomplete. In 2010, neuroscientists Christopher Niell and Michael Stryker, both at the University of California, San Francisco (UCSF), showed that running more than doubled the response of mice's visual cortex neurons to visual stimulation (see 'Neuroscience: Through the eyes of a mouse'). Stryker says that it is probably more important, and taxing, to keep track of the environment when navigating it at speed, and that lower responsiveness at rest may have evolved to conserve energy in less-demanding situations. “It makes sense to put the visual system in a high-gain state when you’re moving through the environment, because vision tells you about far away things, whereas touch only tells you about things that are close,” he says. © 2014 Nature Publishing Group
by Sarah Zielinski Would you recognize a stop sign if it was a different shape, though still red and white? Probably, though there might be a bit of a delay. After all, your brain has long been trained to expect a red-and-white octagon to mean “stop.” The animal and plant world also uses colorful signals. And it would make sense if a species always used the same pattern to signal the same thing — like how we can identify western black widows by the distinctive red hourglass found on the adult spiders’ back. But that doesn’t always happen. Even with really important signals, such as the ones that tell a predator, “Don’t eat me — I’m poisonous.” Consider the dyeing dart frog (Dendrobates tinctorius), which is found in lowland forests of the Guianas and Brazil. The backs of the 5-centimeter-long frogs are covered with a yellow-and-black pattern, which warns of its poisonous nature. But that pattern isn’t the same from frog to frog. Some are decorated with an elongated pattern; others have more complex, sometimes interrupted patterns. The difference in patterns should make it harder for predators to recognize the warning signal. So why is there such variety? Because the patterns aren’t always viewed on a static frog, and the different ways that the frogs move affects how predators see the amphibians, according to a study published June 18 in Biology Letters. Bibiana Rojas of Deakin University in Geelong, Australia, and colleagues studied the frogs in a nature reserve in French Guiana from February to July 2011. They found 25 female and 14 male frogs, following each for two hours from about 2.5 meters away, where the frog wouldn’t notice a scientist. As a frog moved, a researcher would follow, recording how far it went and in what direction. Each frog was then photographed. © Society for Science & the Public 2000 - 2013.
By HELENE STAPINSKI A few months ago, my 10-year-old daughter, Paulina, was suffering from a bad headache right before bedtime. She went to lie down and I sat beside her, stroking her head. After a few minutes, she looked up at me and said, “Everything in the room looks really small.” And I suddenly remembered: When I was young, I too would “see things far away,” as I once described it to my mother — as if everything in the room were at the wrong end of a telescope. The episodes could last anywhere from a few minutes to an hour, but they eventually faded as I grew older. I asked Paulina if this was the first time she had experienced such a thing. She shook her head and said it happened every now and then. When I was a little girl, I told her, it would happen to me when I had a fever or was nervous. I told her not to worry and that it would go away on its own. Soon she fell asleep, and I ran straight to my computer. Within minutes, I discovered that there was an actual name for what turns out to be a very rare affliction — Alice in Wonderland Syndrome. Episodes usually include micropsia (objects appear small) or macropsia (objects appear large). Some sufferers perceive their own body parts to be larger or smaller. For me, and Paulina, furniture a few feet away seemed small enough to fit inside a dollhouse. Dr. John Todd, a British psychiatrist, gave the disorder its name in a 1955 paper, noting that the misperceptions resemble Lewis Carroll’s descriptions of what happened to Alice. It’s also known as Todd’s Syndrome. Alice in Wonderland Syndrome is not an optical problem or a hallucination. Instead, it is most likely caused by a change in a portion of the brain, likely the parietal lobe, that processes perceptions of the environment. Some specialists consider it a type of aura, a sensory warning preceding a migraine. And the doctors confirmed that it usually goes away by adulthood. © 2014 The New York Times Company
By Gary Stix James DiCarlo: We all have this intuitive feel for what object recognition is. It’s the ability to discriminate your face from other faces, a car from other cars, a dog from a camel, that ability we all intuitively feel. But making progress in understanding how our brains are able to accomplish that is a very challenging problem, and part of the reason is that it’s challenging to define what it is and isn’t. We take this problem for granted because it seems effortless to us. However, a computer vision person would tell you that this is an extremely challenging problem, because each object presents an essentially infinite number of images to your retina, so you essentially never see the same image of an object twice. SA: It seems like object recognition is actually one of the big problems both in neuroscience and in the computational science of machine learning? DiCarlo: That’s right, and not only in machine learning but also in psychology and cognitive science, because the objects that we see are the sources in the world of what we use to build higher cognition, things like memory and decision-making. Should I reach for this, should I avoid it? Our brains can’t do what you would call higher cognition without these foundational elements that we often take for granted. SA: Maybe you can talk about what’s actually happening in the brain during this process. DiCarlo: It’s been known for several decades that there’s a portion of the brain, the temporal lobe down the sides of our head, that, when lost or damaged in humans and non-human primates, leads to deficits of recognition. So we had clues that that’s where these algorithms for object recognition are living. But just saying that part of your brain solves the problem is not really specific. It’s still a very large piece of tissue.
Anatomy tells us that there’s a whole network of areas that exist there, and now the tools of neurophysiology and still more advanced tools allow us to go in and look more closely at the neural activity, especially in non-human primates. We can then begin to decipher the actual computations to the level that an engineer might, for instance, in order to emulate what’s going on in our heads. © 2014 Scientific American
By Adam Brimelow Health Correspondent, BBC News Researchers from Oxford University say they've made a breakthrough in developing smart glasses for people with severe sight loss. The glasses project enhanced images of nearby people and objects on to the lenses, providing a much clearer sense of surroundings. They have allowed some people to see their guide dogs for the first time. The Royal National Institute of Blind People says they could be "incredibly important". Lyn Oliver has a progressive eye disease which means she has very limited vision. Now 70, she was diagnosed with retinitis pigmentosa in her early twenties. She can spot movement but describes her sight as "smudged and splattered". Her guide dog Jess helps her find her way around - avoiding most obstacles and hazards - but can't convey other information about her surroundings. Lyn is one of nearly two million people in the UK with a sight problem which seriously affects their daily lives. Most, though, have at least some residual sight. Researchers at Oxford University have developed a way to enhance this using smart glasses fitted with a specially adapted 3D camera. [Image caption: dark spots across the retina (back of the eye) correspond with the extent of vision loss in retinitis pigmentosa.] The images are processed by computer and projected in real-time on to the lenses, so people and objects nearby become bright and clearly defined. Lyn Oliver has tried some of the early prototypes, but the latest model marks a key stage in the project, offering greater clarity and detail than ever before. Dr Stephen Hicks of the University of Oxford, who has led the project, says the glasses are now ready to be taken from the research setting and used in the home. BBC © 2014