Chapter 7. Vision: From Eye to Brain



By Kelly Servick Do old and damaged cells remember what it was like to be young? That’s the suggestion of a new study, in which scientists reprogrammed neurons in mouse eyes to make them more resistant to damage and able to regrow after injury—like the cells of younger mice. The study suggests that hallmarks of aging, and possibly the keys to reversing it, lie in the epigenome, the proteins and other compounds that decorate DNA and influence what genes are turned on or off. The idea that aging cells hold a memory of their young epigenome “is very provocative,” says Maximina Yun, a regenerative biologist at the Dresden University of Technology who was not involved in the work. The new study “supports that [idea], but by no means proves it,” she adds. If researchers can replicate these results in other animals and explain their mechanism, she says, the work could lead to treatments in humans for age-related disease in the eye and beyond. Epigenetic factors influence our metabolism, our susceptibility to various diseases, and even the way emotional trauma is passed through generations. Molecular biologist David Sinclair of Harvard Medical School, who has long been on the hunt for antiaging strategies, has also looked for signs of aging in the epigenome. “The big question was, is there a reset button?” he says. “Would cells know how to become younger and healthier?” In the new study, Sinclair and his collaborators aimed to rejuvenate cells by inserting genes that encode “reprogramming factors,” which regulate gene expression—the reading of DNA to make proteins. The team chose three of the four factors scientists have used for more than 10 years to turn adult cells into induced pluripotent stem cells, which resemble the cells of an early embryo. (Exposing animals to all four factors can cause tumors.) © 2020 American Association for the Advancement of Science.

Keyword: Vision
Link ID: 27608 - Posted: 12.05.2020

Researchers at the National Eye Institute (NEI) have decoded brain maps of human color perception. The findings, published today in Current Biology, open a window into how color processing is organized in the brain, and how the brain recognizes and groups colors in the environment. The study may have implications for the development of machine-brain interfaces for visual prosthetics. NEI is part of the National Institutes of Health. “This is one of the first studies to determine what color a person is seeing based on direct measurements of brain activity,” said Bevil Conway, Ph.D., chief of NEI’s Unit on Sensation, Cognition and Action, who led the study. “The approach lets us get at fundamental questions of how we perceive, categorize, and understand color.” The brain uses light signals detected by the retina’s cone photoreceptors as the building blocks for color perception. Three types of cone photoreceptors detect light over a range of wavelengths. The brain mixes and categorizes these signals to perceive color in a process that is not well understood. To examine this process, Isabelle Rosenthal, Katherine Hermann, and Shridhar Singh, post-baccalaureate fellows in Conway’s lab and co-first authors on the study, used magnetoencephalography or “MEG,” a 50-year-old technology that noninvasively records the tiny magnetic fields that accompany brain activity. The technique provides a direct measurement of brain cell activity using an array of sensors around the head. It reveals the millisecond-by-millisecond changes that happen in the brain to enable vision. The researchers recorded patterns of activity as volunteers viewed specially designed color images and reported the colors they saw.

Keyword: Vision; Brain imaging
Link ID: 27588 - Posted: 11.21.2020

By Jessica Wapner We are living through an inarguably challenging time. The U.S. has been facing its highest daily COVID-19 case counts yet. Uncertainty and division continue to dog the aftermath of the presidential election. And we are heading into a long, cold winter, when socializing outdoors will be less of an option. We are a nation and a world under stress. But Andrew Huberman, a neuroscientist at Stanford University who studies the visual system, sees matters a bit differently. Stress, he says, is not just about the content of what we are reading or the images we are seeing. It is about how our eyes and breathing change in response to the world and the cascades of events that follow. And both of these bodily processes also offer us easy and accessible releases from stress. Huberman’s assertions are based on both established and emerging science. He has spent the past 20 years unraveling the inner workings of the visual system. In 2018, for example, his lab reported its discovery of brain pathways connected with fear and paralysis that respond specifically to visual threats. And a small but growing body of research makes the case that altering our breathing can alter our brain. In 2017 Mark Krasnow of Stanford University, Jack Feldman of the University of California, Los Angeles, and their colleagues identified a tight link between neurons responsible for controlling breathing and the region of the brain responsible for arousal and panic. This growing understanding of how vision and breathing directly affect the brain—rather than the more nebulous categories of the mind and feelings—can come in handy as we continue to face mounting challenges around the globe, across the U.S. and in our own lives. Scientific American spoke with Huberman about how it all works. © 2020 Scientific American

Keyword: Stress; Vision
Link ID: 27584 - Posted: 11.18.2020

By Elizabeth Pennisi When Ian Ausprey outfitted dozens of birds with photosensor-containing backpacks, the University of Florida graduate student was hoping to learn how light affected their behavior. The unusual study, which tracked 15 species in Peru’s cloud forest, has now found that eye size can help predict where birds breed and feed—the bigger the eye, the smaller the prey or the darker the environment. The study also suggests birds with big eyes are especially at risk as humans convert forests into farmland. The study reveals a “fascinating new area of sensory biology,” says Richard Prum, an evolutionary biologist at Yale University who was not involved in the new work. It also shows the size of a bird’s eye says a lot about its owner, adds Matthew Walsh, an evolutionary ecologist at the University of Texas, Arlington, also not involved with the work. Light matters—not just for plants, but also for animals. Large eyes have long been associated with the need to see in dim conditions, but very little research has looked in depth at light’s impact on behavior. Recently, scientists have shown that the relative size of frogs’ eyes corresponds to where they live, hunt, and breed. And several studies published in the past 3 years suggest the eyes of killifish and water fleas vary in size depending on the presence of predators. With no predators, even slightly larger eyes offer a potential survival advantage. To find out how eye size might matter for birds, Ausprey and his adviser, Scott Robinson, an ecologist at the Florida Museum of Natural History, turned to the 240 species they had identified in one of Peru’s many cloud forests. The study area included a range of habitats—dense stands of trees, farms with fencerows, shrubby areas, and open ground. Because light can vary considerably by height—for example, in the tropics, the forest floor can have just 1% of the light at the tops of the trees—they included species living from the ground to the treetops. 
© 2020 American Association for the Advancement of Science.

Keyword: Vision; Evolution
Link ID: 27554 - Posted: 10.28.2020

By Lisa Sanders, M.D. The 61-year-old woman put on her reading glasses to try to decipher the tiny black squiggles on the back of the package of instant pudding. Was it two cups of milk? Or three? The glasses didn’t seem to help. The fuzzy, faded marks refused to become letters. The right side of her head throbbed — as it had for weeks. The constant aggravation of the headache made everything harder, and it certainly wasn’t helping her read this label. She rubbed her forehead, then brought her hand down to cover her right eye. The box disappeared into darkness. She could see only the upper-left corner of the instructions. Everything else was black. She quickly moved her hand to cover her left eye. The tiny letters sprang into focus. She moved back to the right: blackness. Over to the left: light and letters. That scared her. For the past few months, she’d had one of the worst headaches she had ever experienced in her lifetime of headaches. One that wouldn’t go away no matter how much ibuprofen she took. One that persisted through all the different medications she was given for her migraines. Was this terrible headache now affecting her vision? The neurologists she saw over the years always asked her about visual changes. She’d never had them, until now. “Should I take you to the hospital?” her husband asked anxiously when she told him about her nearly sightless left eye. “This could be serious.” She thought for a moment. No, tomorrow was Monday; her neurologist’s office would be open, and the doctor would see her right away. She was always reliable that way. The patient had bad headaches for most of her adult life. They were always on the right side. They were always throbbing. They could last for days, or weeks, or sometimes months. Loud noises were always bothersome. With really bad headaches, her eye would water and her nose would run, just on that side. Bending over was agony. 
For the past few weeks, her headache had been so severe that if she dropped something on the floor, she had to leave it there. When she bent down, the pounding was excruciating. © 2020 The New York Times Company

Keyword: Pain & Touch; Vision
Link ID: 27553 - Posted: 10.28.2020

By Macarena Carrizosa, Sophie Bushwick A new system called PiVR creates working artificial environments for small animals such as zebra fish larvae and fruit flies. Developers say the system’s affordability could help expand research into animal behavior. © 2020 Scientific American

Keyword: Development of the Brain; Vision
Link ID: 27505 - Posted: 10.07.2020

Neuroskeptic Why do particular brain areas tend to adopt particular roles? Is the brain "wired" by genetics to organize itself in a certain way, or does brain organization emerge from experience? One part of the brain has been the focus of a great deal of nature-vs-nurture debate. It's called the fusiform face area (FFA) and, as the name suggests, it seems to be most active during perception of faces. It's broadly accepted that the FFA responds most strongly to faces in most people, but there's controversy over why this is. Is the FFA somehow innately devoted to faces, or does its face specialization arise through experience? In the latest contribution to this debate, a new study argues that the FFA doesn't need any kind of visual experience to be face selective. The researchers, N. Apurva Ratan Murty et al., show that the FFA activates in response to touching faces, even in people who were born blind and have never seen a face. Murty et al. designed an experiment in which participants — 15 sighted and 15 congenitally blind people — could touch objects while their brain activity was recorded with fMRI. A 3D printer was used to create models of faces and other objects, and the participants could explore these with their hands, thanks to a rotating turntable. The key result was that touching the faces produced a similar pattern of activity in both the blind and sighted people, and this activity was also similar to when sighted people viewed faces visually. In a follow-up experiment with n=7 of the congenitally blind participants, Murty et al. found that the same face-selective areas in these individuals also responded to "face-related" sounds, such as laughing or chewing sounds, more than other sounds. (This replicates earlier work.) © 2020 Kalmbach Media Co.

Keyword: Attention; Vision
Link ID: 27459 - Posted: 09.07.2020

By Moises Velasquez-Manoff Jack Gallant never set out to create a mind-reading machine. His focus was more prosaic. A computational neuroscientist at the University of California, Berkeley, Dr. Gallant worked for years to improve our understanding of how brains encode information — what regions become active, for example, when a person sees a plane or an apple or a dog — and how that activity represents the object being viewed. By the late 2000s, scientists could determine what kind of thing a person might be looking at from the way the brain lit up — a human face, say, or a cat. But Dr. Gallant and his colleagues went further. They figured out how to use machine learning to decipher not just the class of thing, but which exact image a subject was viewing. (Which photo of a cat, out of three options, for instance.) One day, Dr. Gallant and his postdocs got to talking. In the same way that you can turn a speaker into a microphone by hooking it up backward, they wondered if they could reverse engineer the algorithm they’d developed so they could visualize, solely from brain activity, what a person was seeing. The first phase of the project was to train the AI. For hours, Dr. Gallant and his colleagues showed volunteers in fMRI machines movie clips. By matching patterns of brain activation prompted by the moving images, the AI built a model of how the volunteers’ visual cortex, which parses information from the eyes, worked. Then came the next phase: translation. As they showed the volunteers movie clips, they asked the model what, given everything it now knew about their brains, it thought they might be looking at. The experiment focused just on a subsection of the visual cortex. It didn’t capture what was happening elsewhere in the brain — how a person might feel about what she was seeing, for example, or what she might be fantasizing about as she watched. The endeavor was, in Dr. Gallant’s words, a primitive proof of concept. 
And yet the results, published in 2011, are remarkable. The reconstructed images move with a dreamlike fluidity. In their imperfection, they evoke expressionist art. (And a few reconstructed images seem downright wrong.) But where they succeed, they represent an astonishing achievement: a machine translating patterns of brain activity into a moving image understandable by other people — a machine that can read the brain. © 2020 The New York Times Company

Keyword: Vision; Brain imaging
Link ID: 27448 - Posted: 09.02.2020

Georgina Ferry The lightning flick of the tongue that secures a frog its next meal depends on a rapid response to a small black object moving through its field of view. During the 1950s the British neuroscientist Horace Barlow established that neurons in the frog’s retina were tuned to produce just such a response, not only detecting but also predicting the future position of a passing fly. This discovery raised the curtain on decades of research by Barlow and others, establishing that individual neurons of the billions that make up the visual system contribute to the efficient processing of movement, colour, position and orientation of objects in the visual world. Barlow, who has died aged 98, combined three approaches to the question of how the brain enables us to see. He looked at how people perceive, for example measuring the smallest and faintest spot of light they could reliably detect; he studied the responses of single neurons in the retina and brain to different visual stimuli; and he developed theories to account for the relationship between what neurons are doing and the corresponding visual experience. All his work started from the principle – apparently obvious but not often stated – that a deep, mathematical understanding of what is involved in the psychological process of seeing is an essential basis for exploring how the physiological elements of the visual system serve that end. In a vivid analogy, he wrote: “A wing would be a most mystifying structure if one did not know that birds flew.” He is best known for demanding answers to the question of how such a complex system could work most efficiently. He was influenced by early computer scientists, and was a pioneer in seeing visual signals as information to be processed. His concept of “efficient coding” predicted that of all the information presented to the eye, the brain would transmit the minimum necessary, wasting no energy on redundant signals. © 2020 Guardian News & Media Limited

Keyword: Vision
Link ID: 27433 - Posted: 08.26.2020

By Katherine J. Wu Dr. Arianne Pontes Oriá stands firm: She does not make animals cry for a living. Technically, only humans can cry, or weep in response to an emotional state, said Dr. Oriá, a veterinarian at the Federal University of Bahia in Brazil. For humans, crying is a way to physically manifest feelings, which are difficult to study and confirm in other creatures. But Dr. Oriá does collect animal tears — the liquid that keeps eyes clean and nourished. In vertebrates, or animals with backbones, tears are vital for vision, Dr. Oriá said. And yet, these captivating fluids have been paid little to no attention, except in a select few mammals. “A lot of vision, we’re not aware of until it’s a problem,” said Sebastian Echeverri, a biologist who studies animal vision but doesn’t work with Dr. Oriá’s team. “We notice when tears are missing.” That’s a bit of a shame, Dr. Oriá said. Because whether it hails from dogs, parrots or tortoises, the stuff that seeps out of animals’ eyes is simply “fascinating,” she said. As she and her colleagues have reported in a series of recent papers, including one published on Thursday in the journal Frontiers in Veterinary Science, tears can be great equalizers: Across several branches of the tree of life, vertebrates seem to swaddle their eyes with fluid in much the same way. But to help them cope with the challenges of various environments, evolution has tinkered with the tears of the world’s creatures in ways that scientists are only beginning to explore. Research like Dr. Oriá’s could offer a glimpse into the myriad paths that eyes have taken to maximize their health and the well-being of the organisms that use them. © 2020 The New York Times Company

Keyword: Vision
Link ID: 27418 - Posted: 08.15.2020

Children wearing multifocal contact lenses had slower progression of their myopia, according to results from a clinical trial funded by the National Eye Institute, part of the National Institutes of Health. The findings support an option for controlling the condition, also called nearsightedness, which increases the risk of cataracts, glaucoma and retinal detachment later in life. Investigators of the Bifocal Lenses In Nearsighted Kids (BLINK) Study published the results August 11 in JAMA. “It is especially good news to know that children as young as 7 achieved optimal visual acuity and got used to wearing multifocal lenses much the way they would a single vision contact lens. It’s not a problem to fit younger kids in contact lenses. It’s a safe practice,” said BLINK study chair, Jeffrey J. Walline, O.D., Ph.D., associate dean for research at the Ohio State University College of Optometry. Myopia occurs when a child’s developing eyes grow too long from front to back. Instead of focusing images on the retina—the light-sensitive tissue in the back of the eye—images of distant objects are focused at a point in front of the retina. As a result, people with myopia have good near vision but poor distance vision. Single vision prescription glasses and contact lenses are used to correct myopic vision but fail to treat the underlying problem. Multifocal contact lenses – typically used to improve near vision of people over the age of 40 years – correct myopic vision in children while simultaneously slowing myopia progression by slowing eye growth.

Keyword: Vision
Link ID: 27415 - Posted: 08.12.2020

By Veronique Greenwood Planarians have unusual talents, to say the least. If you slice one of the tiny flatworms in half, the halves will grow back, giving you two identical worms. Cut a flatworm’s head in two, and it will grow two heads. Cut an eye off a flatworm — it will grow back. Stick an eye on a flatworm that lacks eyes — it’ll take root. Pieces as small as one-279th of a flatworm will turn into new, whole flatworms, given the time. This process of regeneration has fascinated scientists for more than 200 years, prompting myriad zany, if somewhat macabre, experiments to understand how it is possible for a complex organism to rebuild itself from scratch, over and over and over again. In a paper published Friday in Science, researchers revealed a tantalizing glimpse into how the worms’ nervous systems manage this feat. Specialized cells, the scientists report, point the way for neurons stretching from newly grown eyes to the brain of the worm, helping them connect correctly. The research suggests that cellular guides hidden throughout the planarian body may make it possible for the worm’s newly grown neurons to retrace their steps. Gathering these and other insights from the study of flatworms may someday help scientists interested in helping humans regenerate injured neurons. María Lucila Scimone, a researcher at M.I.T.’s Whitehead Institute for Biomedical Research, first noticed these cells while studying Schmidtea mediterranea, a planarian common to bodies of freshwater in Southern Europe and North Africa. During another experiment, she noted that they were expressing a gene involved in regeneration. The team looked more closely and realized that some of the regeneration-related cells were positioned at key branching points in the network of nerves between the worms’ eyes and their brains. When the researchers transplanted an eye from one animal to another, the neurons growing from the new eye always grew toward these cells. 
When the nerve cells reached their target, they kept growing along the route that would take them to the brain. Removing those cells meant the neurons got lost and did not reach the brain. © 2020 The New York Times Company

Keyword: Development of the Brain; Regeneration
Link ID: 27340 - Posted: 07.01.2020

By Courtney Linder Perception is certainly not always reality. Some people might think an ambiguous image is a rabbit, for example, while others see it as a raven. But what if your brain just stopped recognizing numbers one day? That's precisely the basis for a recent Johns Hopkins University study about a man with a rare brain anomaly that prevents him from seeing certain numbers. Instead, the man told doctors, he sees squiggles that look like spaghetti. And it's not just a matter of perception for him—not an optical illusion, nor something a Rorschach test could psychoanalyze away. It's actually proof that our brains can process the world around us, and yet we could have no awareness of those sights. "We present neurophysiological evidence of complex cognitive processing in the absence of awareness, raising questions about the conditions necessary for visual awareness," the scientists note in a new paper published in the journal Proceedings of the National Academy of Sciences. RFS—the name researchers use to refer to the man in the study—has been diagnosed with a rare degenerative brain disease that has led to extensive atrophy in his cortex and basal ganglia. Atrophy is basically a loss of neurons and connective tissue, so you can think of it as the brain shrinking, in a sense. The cortex is the gray matter in your brain that controls things like attention, perception, awareness, and consciousness, while the basal ganglia are responsible for motor learning, executive functions, and emotional behaviors. ©2020 Hearst Magazine Media, Inc.

Keyword: Attention; Vision
Link ID: 27338 - Posted: 07.01.2020

Hemant Khanna In recent months, even as our attention has been focused on the coronavirus outbreak, there have been a slew of scientific breakthroughs in treating diseases that cause blindness. Researchers at U.S.-based Editas Medicine and Ireland-based Allergan have administered CRISPR for the first time to a person with a genetic disease. This landmark treatment uses the CRISPR approach to correct a specific mutation in a gene linked to childhood blindness. The mutation affects the functioning of the light-sensing compartment of the eye, called the retina, and leads to loss of the light-sensing cells. According to the World Health Organization, at least 2.2 billion people in the world have some form of visual impairment. In the United States, approximately 200,000 people suffer from inherited forms of retinal disease for which there is no cure. But things have started to change for good. We can now see light at the end of the tunnel. I am an ophthalmology and visual sciences researcher, and am particularly interested in these advances because my laboratory is focusing on designing new and improved gene therapy approaches to treat inherited forms of blindness. The eye as a testing ground for CRISPR Gene therapy involves inserting the correct copy of a gene into cells that have a mistake in the genetic sequence of that gene, recovering the normal function of the protein in the cell. The eye is an ideal organ for testing new therapeutic approaches, including CRISPR. That is because the eye is the most exposed part of our brain and thus is easily accessible. © 2010–2020, The Conversation US, Inc.

Keyword: Vision
Link ID: 27327 - Posted: 06.26.2020

By Brian Fix your gaze on the black dot on the left side of this image. But wait! Finish reading this paragraph first. As you gaze at the left dot, try to answer this question: In what direction is the object on the right moving? Is it drifting diagonally, or is it moving up and down? Remember, focus on the dot on the left. It appears as though the object on the right is moving diagonally, up to the right and then back down to the left. Right? Right?! Actually, it’s not. It’s moving up and down in a straight, vertical line. See for yourself. Trace it with your finger. This is a visual illusion. That alternating black-white patch inside the object suggests diagonal motion and confuses our senses. Like all misperceptions, it teaches us that our experience of reality is not perfect. But this particular illusion has recently reinforced scientists’ understanding of deeper, almost philosophical truths about the nature of our consciousness. “It’s really important to understand we’re not seeing reality,” says neuroscientist Patrick Cavanagh, a research professor at Dartmouth College and a senior fellow at Glendon College in Canada. “We’re seeing a story that’s being created for us.” Most of the time, the story our brains generate matches the real, physical world — but not always. Our brains also unconsciously bend our perception of reality to meet our desires or expectations. And they fill in gaps using our past experiences. All of this can bias us. Visual illusions present clear and interesting challenges for how we live: How do we know what’s real? And once we know the extent of our brain’s limits, how do we live with more humility — and think with greater care about our perceptions?

Keyword: Vision
Link ID: 27321 - Posted: 06.24.2020

By Elizabeth Pennisi Though not much bigger than a wooden match stick, snapping shrimp (Alpheus heterochaelis, pictured) are already famous for their loud, quick closing claws, the sound of which stuns their prey and rivals. Now, researchers have discovered these marine crustaceans have the eyesight to match this speed. In the new study, scientists stuck a thin conducting wire into the eye of a chilled, live shrimp and recorded electrical impulses from the eye in response to flickering light. The crustaceans refresh their view 160 times a second, the team reports today in Biology Letters. That’s one of the highest refresh rates of any animal on Earth. Pigeons come close, being able to sample their field of view 143 times per second, whereas humans top out at a relatively measly 60 times a second. Only some day-flying insects beat the snapping shrimp, the researchers report. As a result, what people—perhaps even Superman—and all other vertebrates see as a blur, the shrimp detects as discrete images moving across its field of vision. Until a few years ago, most researchers assumed snapping shrimp didn’t see very well because they have a hard hood called a carapace that extends over their eyes. Although the hood seems transparent, with some coloration, it wasn’t clear how well it transmitted light. But it appears to be no impediment to the shrimp detecting fast moving prey or even predators whipping by. This might be important because the shrimp tend to live in cloudy water, so they don’t have much notice when another critter is approaching them. © 2020 American Association for the Advancement of Science.

Keyword: Vision; Evolution
Link ID: 27318 - Posted: 06.24.2020

By Veronique Greenwood Hummingbirds were already impressive. They move like hurried insects, turn on aerial dimes and extract nectar from flowers with almost surgical precision. But they conceal another talent, too: seeing colors that human eyes can’t perceive. Ultraviolet light from the sun creates colors throughout the natural world that are never seen by people. But researchers working out of the Rocky Mountain Biological Laboratory reported on Monday in Proceedings of the National Academy of Sciences that untrained broad-tailed hummingbirds can use these colors to help them identify sources of food. Testing 19 pairings of colors, the team found that hummingbirds are picking up on multiple colors beyond those we can see. From the bird’s-eye view, numerous plants and feathers have these as well, suggesting that they live in a richer-hued world than we do, full of signs and messages that we never notice. Compared with the color vision of many other animals, that of humans leaves something to be desired. The perception of color relies on cone cells in the retina, each of which responds to different wavelengths of light. Humans have three kinds of cone cells, which, when light reflects off an apple, a leaf or a field of daffodils, send signals that are combined in the brain to generate the perception of red, green or yellow. Birds, however, have four types of cones, including one that is sensitive to ultraviolet light. (And they are far from the most generously endowed — mantis shrimp, for instance, have 16.) In lab experiments, birds readily pick up on UV light and UV yellow, a mixture of UV light and visible yellow wavelengths, says Mary Caswell Stoddard, a professor of evolutionary biology at Princeton University and an author of the new study. Likewise, researchers have long known that UV colors are widespread in the natural world, though we can’t see them. 
However, experiments to see whether wild birds would use UV colors in their daily lives had not yet been performed. © 2020 The New York Times Company

Keyword: Vision; Evolution
Link ID: 27313 - Posted: 06.22.2020

By Marina Wang The classic eye exam may be about to get an upgrade. Researchers have developed an online vision test—fueled by artificial intelligence (AI)—that produces much more accurate diagnoses than the sheet of capital letters we’ve been staring at since the 19th century. If perfected, the test could also help patients with eye diseases track their vision at home. “It’s an intriguing idea” that reveals just how antiquated the classic eye test is, says Laura Green, an ophthalmologist at the Krieger Eye Institute. Green was not involved with the work, but she studies ways to use technology to improve access to health care. The classic eye exam, known as the Snellen chart, has been around since 1862. The farther down the sheet a person can read, the better their vision. The test is quick and easy to administer, but it has problems, says Chris Piech, a computer scientist at Stanford University. Patients start to guess at letters when they become blurry, he says, which means they can get different scores each time they take the test. Piech is no stranger to the Snellen test. At age 10, doctors diagnosed him with chronic uveitis, an inflammatory eye disease. “I was sitting through all these tests and it was pretty obvious to me that it was terribly inaccurate,” he says. He wanted to find a way to remove human error from the Snellen exam, while improving its accuracy. © 2020 American Association for the Advancement of Science.

Keyword: Vision; Robotics
Link ID: 27281 - Posted: 06.04.2020

By Nicoletta Lanese Scientists sent patterns of electricity coursing across people’s brains, coaxing them to see letters that weren’t there. The experiment worked in both sighted people and blind participants who had lost their sight in adulthood, according to the study, published today (May 14) in the journal Cell. Although this technology remains in its early days, implanted devices could potentially be used in the future to stimulate the brain and somewhat restore people’s vision.

Known as visual prosthetics, the implants were placed on the visual cortex and then stimulated in a pattern to “trace” out shapes that the participants could then “see.” More advanced versions of these implants could work similarly to cochlear implants, which stimulate nerves of the inner ear with electrodes to help enhance the wearer’s hearing ability.

“An early iteration [of such a device] could provide detection of the contours of shapes encountered,” study authors neuroscientist Michael Beauchamp and neurosurgeon Dr. Daniel Yoshor, both at the Baylor College of Medicine, told Live Science in an email. (Yoshor will start a new position at the Perelman School of Medicine at the University of Pennsylvania this summer.) “The ability to detect the form of a family member or to allow more independent navigation would be a wonderful advance for many blind patients.”

The study authors crafted the letters by stimulating the brain with electrical currents, causing it to generate so-called phosphenes — tiny pinpricks of light that people sometimes perceive without any actual light entering their eyes. © 2020 Scientific American
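The "tracing" idea can be sketched in code (a hypothetical illustration only: the grid layout, electrode numbering, and timing are invented, not the study's implementation). A letter is represented as an ordered stroke path over positions where stimulation is expected to evoke a phosphene, and playing the sequence in order outlines the shape instead of flashing all the points at once:

```python
# Hypothetical sketch of tracing a letter as an ordered electrode
# stimulation schedule. GRID, Z_PATH, and the 50 ms interval are all
# assumptions made for illustration.
from typing import List, Tuple

# Assumed 3x3 electrode grid; maps a (col, row) position to a made-up
# electrode id.
GRID = {(col, row): row * 3 + col for row in range(3) for col in range(3)}

# Stroke path for the letter "Z": across the top, down the diagonal,
# across the bottom, as grid coordinates.
Z_PATH: List[Tuple[int, int]] = [
    (0, 2), (1, 2), (2, 2),  # top stroke
    (1, 1),                  # diagonal
    (0, 0), (1, 0), (2, 0),  # bottom stroke
]


def stimulation_sequence(path):
    """Convert a stroke path into ordered (electrode_id, onset_ms) pairs,
    one pulse every 50 ms."""
    return [(GRID[pos], i * 50) for i, pos in enumerate(path)]


for electrode, onset in stimulation_sequence(Z_PATH):
    print(f"t={onset:4d} ms: stimulate electrode {electrode}")
```

The key design point the sketch captures is that the information is in the *order* of stimulation: the same set of phosphene locations flashed simultaneously would read as a cloud of dots, while the timed sequence is perceived as a drawn contour.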

Keyword: Vision; Robotics
Link ID: 27250 - Posted: 05.16.2020

Abby Olena Instead of a traditional lymphatic system, the brain harbors a so-called glymphatic system, a network of tunnels surrounding arteries and veins through which fluid enters and waste products drain from the brain. In a study published March 25 in Science Translational Medicine, researchers show that the rodent eye also has a glymphatic system that takes out the trash through spaces surrounding the veins within the optic nerve. They also found that this system may be compromised in glaucoma and is capable of clearing amyloid-β, the buildup of which has been implicated in the development of Alzheimer’s disease, glaucoma, and age-related macular degeneration.

The work began in the group of Maiken Nedergaard, a neuroscientist with labs at both the University of Rochester Medical School and the University of Copenhagen, who described the glymphatic system of the brain in 2012. Xiaowei Wang, then a graduate student in Nedergaard’s group and now a postdoc at the University of California, San Francisco, was interested in the eye and spearheaded the search for an ocular glymphatic system. At that point, nobody had speculated that the optic nerve—in addition to transmitting electrical signals—is also a fluid transport highway, Nedergaard says.

As Wang’s project was getting underway, Nedergaard met Lu Chen, a neuroscientist at the University of California, Berkeley, at a meeting. Chen’s group had done previous research on ocular lymphatics that focused on the front of the eye. There, the majority of the aqueous humor—the fluid that fills the chamber between the cornea and the lens—drains from the eye to the surrounding vasculature through a circular lymph-like vessel called Schlemm’s canal. This helps regulate intraocular pressure. Chen tells The Scientist that she and Nedergaard decided to collaborate to connect the knowledge about the front of the eye with their questions about the back of the eye. © 1986–2020 The Scientist

Keyword: Brain imaging; Vision
Link ID: 27207 - Posted: 04.22.2020