Chapter 10. Vision: From Eye to Brain
by Paul Gabrielsen

An insect's compound eye is an engineering marvel: high resolution, wide field of view, and incredible sensitivity to motion, all in a compact package. Now, a new digital camera provides the best-ever imitation of a bug's vision, using new optical materials and techniques. This technology could someday give patrolling surveillance drones the same exquisite vision as a dragonfly on the hunt.

Human eyes and conventional cameras work in much the same way: light enters a single curved lens and resolves into an image on a retina or photosensitive chip. But a bug's eyes are covered with many individual lenses, each connected to light-detecting cells and an optic nerve. These units, called ommatidia, are essentially self-contained mini-eyes. Ants have a few hundred; praying mantises have tens of thousands. The semicircular eyes sometimes take up most of an insect's head.

While biologists continue to study compound eyes, materials scientists such as John Rogers try to mimic elements of their design. Many previous attempts to make compound eyes focused light from multiple lenses onto a flat chip, such as the charge-coupled device chips in digital cameras. While flat silicon chips have worked well for digital photography, in biology, "you never see that design," Rogers says. He thinks that a curved system of detectors better imitates biological eyes. In 2008, his lab created a camera designed like a mammalian eye, with a concave electronic "retina" at the back. The curved surface enabled a wider field of view without the distortion typical of a wide-angle camera lens. Rogers then turned his attention to the compound eye.

© 2010 American Association for the Advancement of Science.
Link ID: 18110 - Posted: 05.02.2013
By Michelle Roberts, Health editor, BBC News online

Canadian doctors say they have found an inventive way to treat lazy eye - playing the Tetris video game. The McGill University team discovered the popular tile-matching puzzle could train both eyes to work together. In a small study of 18 adults, published in Current Biology, it worked better than conventional patching of the good eye to make the weak one work harder. The researchers now want to test whether it would be a good way to treat children with the same condition. UK studies are already under way.

An estimated one in 50 children has lazy eye, known medically as amblyopia. It happens when the vision in one eye does not develop properly, and is often accompanied by a squint - where the eyes do not look in the same direction. Without treatment it can lead to a permanent loss of vision in the weak eye, which is why doctors try to intervene early. Normally, the treatment is to cover the strong eye with a patch so that the child is forced to use their lazy eye. The child must wear the patch for much of the day over many months, which can be frustrating and unpleasant.

BBC © 2013
By Breanna Draxler

When you lose something important—a child, your wallet, the keys—your brain kicks into overdrive to find the missing object. But that’s not just a matter of extra concentration. Researchers have found that in these intense search situations your brain actually rallies extra visual processing troops (and even some non-visual parts of the brain) to get the job done.

It has to do with the way your brain processes images in the first place. When you see objects, your brain sorts them into broad categories—about 1,000 of them. The various elements we perceive trigger a pattern of different categorical areas in our brains. For example, if you see a woman carrying an umbrella while walking her dog in the park, your brain might catalog it as “people,” “tools” and “animals.” But when you lose something, your brain reacts a little differently. It expands the category of the object you’re looking for to include related categories and turns down the perception of other, non-related categories, allowing you to focus more intently on the object of interest.

To see what this altered categorization looked like during a search, researchers at UC Berkeley used functional magnetic resonance imaging (fMRI) to record changes in five people’s brain activity as they looked for objects in movies. The objects they sought were categorized broadly, paralleling how our brains separate items into generalized groups like “vehicles” and “people.” During hour-long search sessions, the researchers found that regardless of whether the participants found the objects they were looking for, their brains cast a wider visual net than they would if they were watching passively.
By ERIC R. KANDEL

This month, President Obama unveiled a breathtakingly ambitious initiative to map the human brain, the ultimate goal of which is to understand the workings of the human mind in biological terms. Many of the insights that have brought us to this point arose from the merger over the past 50 years of cognitive psychology, the science of mind, and neuroscience, the science of the brain. The discipline that has emerged now seeks to understand the human mind as a set of functions carried out by the brain. This new approach to the science of mind not only promises to offer a deeper understanding of what makes us who we are, but also opens dialogues with other areas of study — conversations that may help make science part of our common cultural experience.

Consider what we can learn about the mind by examining how we view figurative art. In a recently published book, I tried to explore this question by focusing on portraiture, because we are now beginning to understand how our brains respond to the facial expressions and bodily postures of others. The portraiture that flourished in Vienna at the turn of the 20th century is a good place to start. Not only does this modernist school hold a prominent place in the history of art, it consists of just three major artists — Gustav Klimt, Oskar Kokoschka and Egon Schiele — which makes it easier to study in depth. As a group, these artists sought to depict the unconscious, instinctual strivings of the people in their portraits, but each painter developed a distinctive way of using facial expressions and hand and body gestures to communicate those mental processes.

© 2013 The New York Times Company
By James Gallagher, Health and science reporter, BBC News

Eye drops designed to lower cholesterol may be able to prevent one of the most common forms of blindness, according to US researchers. They showed how high cholesterol levels could affect the immune system and lead to macular degeneration. Tests on mice and humans, published in the journal Cell Metabolism, showed that immune cells became destructive when they were clogged with fats. Others cautioned that the research was still at an early stage.

The macula is the sweet spot in the eye which is responsible for fine detail. It is essential for reading, driving and recognising people's faces. Macular degeneration is more common in old age. It starts in a "dry" form in which the light-sensing cells in the eye become damaged, but can progress into the far more threatening "wet" version, when newly formed blood vessels can rapidly cause blindness.

Doctors at the Washington University School of Medicine investigated the role of macrophages, a part of the immune system, in the transition from the dry to the wet form of the disease. One of the researchers, Dr Rajendra Apte, said the role of macrophages changed and they triggered the production of new blood vessels. "Instead of being protective, they accelerate the disease, but we didn't understand why they switched to become the bad cells," he told the BBC. Normally the cells can "eat" fatty deposits and send them back into the blood. However, their research showed that older macrophages struggle. They could still eat the fats, but they could not expel them. So they became "bloated", causing inflammation which in turn led to the creation of new blood vessels.

BBC © 2013
Link ID: 17985 - Posted: 04.03.2013
By DOUGLAS QUENQUA

A new study suggests that primates’ ability to see in three colors may not have evolved as a result of daytime living, as has long been thought. The findings, published in the journal Proceedings of the Royal Society B, are based on a genetic examination of tarsiers, the nocturnal, saucer-eyed primates that long ago branched off from monkeys, apes and humans. By analyzing the genes that encode photopigments in the eyes of modern tarsiers, the researchers concluded that the last ancestor that all tarsiers had in common had highly acute three-color vision, much like that of modern-day primates. Such vision would normally indicate a daytime lifestyle. But fossils show that the tarsier ancestor was also nocturnal, strongly suggesting that the ability to see in three colors somehow predated the shift to daytime living.

The coexistence of the two normally incompatible traits suggests that primates were able to function during twilight or bright moonlight for a time before making the transition to a fully diurnal existence. “Today there is no mammal we know of that has trichromatic vision that lives during night,” said an author of the study, Nathaniel J. Dominy, associate professor of anthropology at Dartmouth. “And if there’s a pattern that exists today, the safest thing to do is assume the same pattern existed in the past.” “We think that tarsiers may have been active under relatively bright light conditions at dark times of the day,” he added. “Very bright moonlight is bright enough for your cones to operate.”

© 2013 The New York Times Company
By C. CLAIBORNE RAY

Q. Can cataracts grow back after they have been removed?

A. “Once a cataract is removed, it cannot grow back,” said Dr. Jessica B. Ciralsky, an ophthalmologist at NewYork-Presbyterian Hospital/Weill Cornell Medical Center. Blurred vision may develop after cataract surgery, mimicking the symptoms of the original cataract. This is not a recurrence of the cataract and is from a condition that is easily treated, said Dr. Ciralsky, who is a cornea and cataract specialist.

Cataracts, which affect about 22 million Americans over 40, are a clouding of the eye’s naturally clear crystalline lens. Besides blurred vision, the symptoms include glare and difficulty driving at night. In cataract surgery, the entire cataract is removed and an artificial lens is implanted in its place; the capsule that held the cataract is left intact to provide support for the new lens. After surgery, patients may develop a condition called posterior capsular opacification, which is often referred to as a secondary cataract. “This is a misnomer,” Dr. Ciralsky said. “The cataract has not actually grown back.” Instead, she explained, in about 20 percent of patients, the capsule that once supported the cataract has become cloudy, or opacified. A simple laser procedure done in the office can treat the problem effectively.

© 2013 The New York Times Company
Link ID: 17979 - Posted: 04.02.2013
By Puneet Kollipara

Blind fish that spend their lives in dark, underwater caves have lost a huge chunk of their ability to hear, scientists report in the March 27 Biology Letters. Two of the fish species studied could not hear high-pitched sounds. “I was really surprised,” says study coauthor Daphne Soares of the University of Maryland, College Park. “I expected them to hear much better than the surface fishes.”

Cave-dwelling fish can lose their vision and even their eyes over many generations. And without light, eyesight can lose its importance in fish survival. Only two previous studies have explored what happens to hearing after fish lose their vision; both found no differences in hearing between cave fish and those that experience daylight.

Soares and her colleagues collected fish of two blind cave-dwelling species, Typhlichthys subterraneus and Amblyopsis spelaea, from lakes in Kentucky. Specimens of a surface-dwelling species, Forbesichthys agassizii, which is closely related to the cave dwellers, came from a lake in south-central Tennessee. Back in the lab, the researchers tested fish hearing by seeing whether sounds across a range of pitches could stimulate nerve activity in the fishes’ brains. The researchers also measured the density of sound-detecting hair cells in the creatures’ ears.

© Society for Science & the Public 2000 - 2013
By Brian Palmer

As a columnist who tries to explain scientific and other puzzles, I get asked a lot of strange questions. Here’s one that has been bugging me for some time: Are there visually impaired animals? Are there nearsighted deer that could use glasses or farsighted elephants that could benefit from an enormous set of contacts? How about astigmatic alligators?

It seems like an animal question, but, at its core, it’s motivated by an astute comparison with humans. We’re undeniably visual creatures, yet many of us have trouble seeing well. According to some estimates, up to 42 percent of Americans are myopic, or nearsighted. Isn’t this a failure of natural selection? Shouldn’t our blurry-sighted ancestors have starved to death or been consumed by predators because of their visual handicaps? Does nature allow other animals to have such poor vision?

These questions turn out to be surprisingly complicated. Let’s start out with the non-human animals and work back to our own visual shortcomings. Ophthalmologists can’t ask lions to read an eye chart or put glasses on a whale. Instead, they shine a light into the animal’s eye to see how it refracts and focuses on the retina. And with a trainable animal, such as a hawk or a horse, researchers can teach it to respond to a visual cue, then determine how well the animal picks up the cue when it is far away, very close or somehow obscured.

© 1996-2013 The Washington Post
Link ID: 17942 - Posted: 03.25.2013
Philip Ball

No one with even a passing interest in scientific trends will have failed to notice that the brain is the next big thing. It has been said for at least a decade, but now it’s getting serious — with, for example, the recent award by the European Commission of €500 million (US$646 million) to the Human Brain Project to build a new “infrastructure for future neuroscience” and a $1-billion initiative endorsed by President Obama. Having failed to ‘find ourselves’ in our genome, we’re starting a search in the grey matter. It’s a reasonable objective, but only if we have a clear idea of what we hope and expect to find.

Some neuroscientists have grand visions, such as that adduced by Semir Zeki of University College London: “It is only by understanding the neural laws that dictate human activity in all spheres — in law, morality, religion and even economics and politics, no less than in art — that we can ever hope to achieve a more proper understanding of the nature of man.” Zeki heads the UCL Institute of Neuroesthetics. This is one of many fields that attaches ‘neuro’ to some human trait with the implication that the techniques of neuroscience, such as functional magnetic resonance imaging, will explain it. We have neurotheology, neuroethics, neurocriminology and so on. Meanwhile, in popular media, a rash of books and articles proclaim (in a profoundly ugly trope) that “this is your brain on drugs/music/religion/sport”. It seems unlikely that studies of the brain will ever be able to wholly explain how we respond to art.

© 2013 Nature Publishing Group
At 7 months of age, children who are later diagnosed with autism take a split second longer to shift their gaze during a task measuring eye movements and visual attention than do typically developing infants of the same age, according to researchers supported by the National Institutes of Health. The difference between the groups’ test results was 25 to 50 milliseconds on average, the researchers found, too brief to be detected in social interactions with an infant. However, they showed that this measurable delay could be accounted for by differences in the structure and organization of actively developing neurological circuits of a child’s brain.

[Figure: brain scan showing the splenium of the corpus callosum. Caption: When they were infants, children who were later diagnosed with autism took longer to shift their gaze during a measure of eye movements than did infants who were not diagnosed with autism. The researchers believe that brain circuits involving the splenium of the corpus callosum may account for the differences in gaze shifting between the two groups. Image courtesy of Jason Wolff, Ph.D., University of North Carolina at Chapel Hill.]

Efficiently shifting attention early in infancy is thought to be important for later social and cognitive development. Split-second delays, the researchers suggested, could be a precursor to such well known symptoms of autism as difficulty making eye contact or following a parent’s pointing finger, problems that generally emerge after a child turns 1. Typically, autism spectrum disorder (ASD) is not diagnosed until after 3 or 4 years of age. The study appears in the American Journal of Psychiatry.
by Lizzie Wade

Hundreds of millions of years ago, the Earth's seas teemed with trilobites, hard-shelled critters that resembled spiny aquatic cockroaches. Because their exoskeletons lent themselves to fossilization, scientists know a lot about what the outside of their bodies looked like. Their inner workings, however, have remained mysterious. Now, a new study has revealed the structure of the trilobite eye, bringing researchers one step closer to understanding the evolution of vision.

Like today's insects and crustaceans, trilobites had compound eyes, with many different lenses focusing light onto clusters of sensory cells lying below them. The resulting image was put together a lot like a picture on your computer screen, with each lens producing one "pixel" of the whole. Because the lenses themselves were made of the mineral calcite, they often fossilized along with the rest of the trilobite's tough exoskeleton. The sensory cells underneath the lenses, however, were ephemeral, and scientists had always assumed that they had decayed without a trace.

So imagine Brigitte Schoenemann's surprise when she spotted fossilized versions of these delicate sensory cells while x-raying a long dead trilobite with a computed tomography (CT) scanner. "I expected that we would see [something] in the lens of trilobites, but then suddenly we saw structures of cells below the lens," recalls Schoenemann, a physiologist at the University of Bonn and the University of Cologne, both in Germany. Inspired, she applied to take more fossils to the European Synchrotron Radiation Facility in Grenoble, France, where she could use a particle accelerator's high energy x-rays to peer deeper into the trilobites' eyes. Now, she says, she's created images of the extinct animal's entire visual system, down to the level of fossilized individual cells.

© 2010 American Association for the Advancement of Science
by Michael Marshall

Neanderthals may have had bigger eyes than modern humans, but while this helped them see better, it may have meant that they did not have brainpower to spare for complex social lives. If true, this may have been a disadvantage when the ice age reduced access to food, as they would not have had the skills to procure help from beyond their normal social group, speculates Robin Dunbar at the University of Oxford.

Neanderthals' brains were roughly the same size as modern humans', but may have been organised differently. To find out, a team led by Dunbar studied the skulls of 13 Neanderthals and 32 anatomically modern humans. The Neanderthals had larger eye sockets. There are no Neanderthal brains to examine, but primates with larger eyes tend to have larger visual systems in their brains, suggesting Neanderthals did too. Their large bodies would also have required extra brain power to manage. Together, their larger eyes and bodies would have left them with less grey matter to dedicate to other tasks.

Neanderthals may have evolved enhanced visual systems to help them see in the gloom of the northern hemisphere, Dunbar says. "It makes them better at detecting things in grim, grey conditions." As a by-product of larger eyes, they may not have been able to expand their frontal lobes – a brain area vital for social interaction – as much as modern humans. As a result, Dunbar estimates they could only maintain a social group size of around 115 individuals, rather than the 150 that we manage.

© Copyright Reed Business Information Ltd.
By Tina Hesman Saey

If someone shouts “look behind you,” tadpoles in Michael Levin’s laboratory may be ready. The tadpoles can see out of eyes growing from their tails, even though the organs aren’t directly wired to the animals’ brains, Levin and Douglas Blackiston, both of Tufts University in Medford, Mass., report online February 27 in the Journal of Experimental Biology. Levin and Blackiston’s findings may help scientists better understand how the brain and body communicate, including in humans, and could be important for regenerative medicine or designing prosthetic devices to replace missing body parts, says Günther Zupanc, a neuroscientist at Northeastern University in Boston.

Researchers have transplanted frog eyes to other body parts for decades, but until now, no one had shown that those oddly placed eyes (called “ectopic” eyes) actually worked. Ectopic eyes on tadpoles’ tails allow the animals to distinguish blue light from red light, the Tufts team found. Levin wanted to know whether the brain is hardwired to get visual information only from eyes in the head, or whether the brain could use data coming from elsewhere. To find out, he and Blackiston started with African clawed frog tadpoles (Xenopus laevis) and removed the normal eyes. They then transplanted cells that would grow into eyes onto the animals’ tails. The experiment seemed like a natural way to test how well the brain can adapt, Levin says. “There’s no way the tadpole’s brain is expecting an eye on its tail.”

Expected or not, some of the tadpoles managed to detect red and blue light from their tail eyes. The researchers placed tadpoles with transplanted eyes in chambers in which half of the chamber was illuminated in blue light and the other half in red light. A mild electric shock zapped the tadpole when it was in one half of the dish so that the animal learned to associate the color with the shock.
The researchers periodically switched the colors in the chamber so that the tadpoles didn’t learn that staying still would save them. © Society for Science & the Public 2000 - 2013
By Maria Konnikova

Georg Tobias Ludwig Sachs was born on April 22, 1786, in the mountain village of St. Ruprecht, Kärnthen, or Carinthia – the south of present-day Austria. From the first, he was notably different from his parents and siblings: he was an albino. (His youngest sister, eleven years his junior, would be one as well.) We don’t know if this physical distinction had any negative impact on the young Georg—but it certainly piqued his curiosity. He proceeded to embark on the scientific study of albinism at the universities in Tübingen, Altdorf, and Erlangen, and at the last of these, produced his 1812 doctoral dissertation. It was about albinism: “A Natural History of Two Albinos, the Author and His Sister.”

Today, though, Sachs is remembered not for his thoughts on the nature of the albino, but rather those on another curious condition that was far less noticeable—but received a chapter of its very own in his thesis all the same: synesthesia. Georg Sachs just so happens to be the first known synesthete in the medical or psychological literature.

Synesthesia means, literally, a cross-mingling of the senses, when two or more senses talk to each other in a way that is not usually associated with either sense on its own. For instance, you see color when you listen to a song on the radio. Taste shapes as you take a bite of your spaghetti. Frown at the 3 on that piece of paper because it’s giving you attitude—it seems irritable. Smile at the woman you just met because her name comes with a beautiful orange glow. The variations are many, but in every scenario, there is a sensory cross-talk that reaches to a neural level. As in, if I were to put you in a scanner while you took that bite or listened to that musical composition, the relevant areas of the brain would light up: your brain would actually be experiencing color, shape, or whatever you say you’re experiencing as if you were exposed to that very stimulus.
It’s a condition that affects, by the most recent estimates, roughly 4% of the population. © 2013 Scientific American
Link ID: 17854 - Posted: 02.27.2013
Canadian researchers have found out how to restore normal vision to kittens with a lazy eye without using an eye patch. The cure was relatively simple — putting the kittens in complete darkness for 10 days. Once the kittens were returned to daylight, they regained normal vision in the lazy eye within a week, reported researchers at Dalhousie University in Halifax in the journal Current Biology this month.

Lazy eye is a condition in which the brain effectively turns off one eye. It affects about four per cent of the human population, and the most common treatment is to fix the vision problem (for example, by using glasses) and then patch the good eye, forcing the person to use their bad eye. Kevin Duffy, a neuroscientist who co-authored the new study, told CBC's Quirks & Quarks that the condition is typically the result of a vision problem such as a cataract, a misalignment of the eyes, or poor focus in one eye, which then causes the brain to develop abnormally. "If the eye is providing abnormal vision, then the circuits that connect to that eye are going to develop abnormally," he said. The brain "becomes effectively disconnected."

© CBC 2013
By Susan Milius

Slight electric fields that form around flowers may lure pollinators much as floral colors and fragrances do. In lab setups, bumblebees learned to distinguish fake flowers by their electrical fields, says sensory biologist Daniel Robert at the University of Bristol in England. Combining an electrical charge with a color helped the bees learn faster, Robert and his colleagues report online February 21 in Science.

Plants, a bit like lightning rods, tend to conduct electrical charges to the ground, Robert says. And bees pick up a positive charge from the atmosphere’s invisible rain of charged particles. “Anything flying through the air, whether it’s a baseball, 767 jumbo jet, or a bee, acquires a strong positive electrostatic charge due to interaction with air molecules,” says Stephen Buchmann of the University of Arizona in Tucson.

Robert and his colleagues checked whether bees could choose flowers based solely on the electric fields the plants produce. Purple metal disks (encased in plastic so as not to shock bees) stood in for flowers. Half of them, wired for 30 volts, held sips of sugar water. The unwired ones offered a bitter quinine solution that bees don’t like. Bombus terrestris bumblebees learned to choose sweet, wired disks more than 80 percent of the time. When researchers unplugged the wired disks, the bees bumbled, scoring sugar only by chance.

© Society for Science & the Public 2000 - 2013
Link ID: 17832 - Posted: 02.23.2013
by Gisela Telis

A stint in the dark may be just what the doctor ordered—at least if you have "lazy eye." Researchers report that kittens with the disorder, a visual impairment medically known as amblyopia that leads to poor sight or blindness in one eye, can completely recover their vision by simply spending 10 days in total darkness. "It's a remarkable study, with real potential to change how we think about recovery from amblyopia," says neuroscientist Frank Sengpiel of Cardiff University in the United Kingdom, who was not involved in the work.

Amblyopia affects about 4% of the human population. It's thought to start with an imbalance in vision early in life: If one eye doesn't see as well as the other—because, for example, of a cataract or astigmatism—the brain reroutes most of the connections needed for visual processing to the "good" eye. Doctors often treat the condition by patching the good eye and forcing the brain to rely on the other eye, but the treatment risks damaging vision in the good eye, and if it doesn't succeed or occur early enough in a child's visual development, the vision loss in the impaired eye can be permanent.

Earlier studies with cats, whose complex visual systems are good stand-ins for human vision, showed that neurons in the brain's visual centers shrink when the brain decides to disconnect from the bad eye, but that they grow again when the cats are placed in darkness. So neuroscientists Kevin Duffy and Donald Mitchell of Dalhousie University in Halifax, Canada, set out to test darkness itself as a treatment.

© 2010 American Association for the Advancement of Science
By PAM BELLUCK

The device allows people with a certain type of blindness to detect crosswalks on the street, the presence of people or cars, and sometimes even large numbers or letters. The approval of the system marks a milestone in a new frontier in vision research, a field in which scientists are making strides with gene therapy, optogenetics, stem cells and other strategies. “This is just the beginning,” said Grace Shen, a director of the retinal diseases program at the National Eye Institute, which helped finance the artificial retina research and is supporting many other blindness therapy projects. “We have a lot of exciting things sitting in the wings.”

The artificial retina is a sheet of electrodes implanted in the eye. The patient is also given glasses with an attached camera and a portable video processor. This system, called Argus II, allows visual signals to bypass the damaged portion of the retina and be transmitted to the brain. With the artificial retina or retinal prosthesis, a blind person cannot see in the conventional sense, but can identify outlines and boundaries of objects, especially when there is contrast between light and dark — fireworks against a night sky or black socks mixed with white ones.

“Without the system, I wouldn’t be able to see anything at all, and if you were in front of me and you moved left and right, I’m not going to realize any of this,” said Elias Konstantopolous, 74, a retired electrician in Baltimore, one of about 50 Americans and Europeans who have been using the device in clinical trials. He said it helps him differentiate curbs from roads, and detect contours of objects and people. “When you have nothing, this is something. It’s a lot.” The F.D.A. approved Argus II, made by Second Sight Medical Products, to treat people with severe retinitis pigmentosa, in which photoreceptor cells, which take in light, deteriorate.

© 2013 The New York Times Company
The latest bionic superhero is a rat: its brain hooked up to an infrared detector, it's become the first animal to be given a sixth sense. Developed by Miguel Nicolelis and colleagues at Duke University in Durham, North Carolina, the system connects a head-mounted sensor to a brain region that normally processes touch sensations from whiskers. As shown in this video, the rat's brain is tricked when infrared light is detected, giving it a new sense organ. "Instead of seeing, the rats learned how to touch the light," says Nicolelis. Even though the touch-processing brain area acquires a new role, the team found that it continues to process touch sensations from whiskers, somehow dividing its time between both types of signal. "The adult brain is a lot more plastic than we thought," says Nicolelis. The finding could lead to new brain prostheses that restore sight in humans with a damaged visual cortex. By bypassing the damaged part of the brain altogether, it might be possible to wire up a video camera to a part of the brain that processes touch, letting people "touch" what the camera sees. According to Nicolelis, it could also lead to superhero powers for humans. "It could be X-rays, radio waves, anything," he says. "Superman probably had a prosthetic device that nobody knew of." © Copyright Reed Business Information Ltd.