Links for Keyword: Vision
By Jane E. Brody Joanne Reitano is a professor of history at LaGuardia Community College in Long Island City, Queens. She writes wonderful books about the history of the city and state, and has recently been spending many hours — sometimes all day — at her computer to revise her first book, “The Restless City.” But while sitting in front of the screen, she told me, “I developed burning in my eyes that made it very difficult to work.” After resting her eyes for a while, the discomfort abates, but it quickly returns when she goes back to the computer. “If I was playing computer games, I’d turn off the computer, but I need it to work,” the frustrated professor said. Dr. Reitano has a condition called computer vision syndrome. She is hardly alone. It can affect anyone who spends three or more hours a day in front of computer monitors, and the population at risk is potentially huge. Worldwide, up to 70 million workers are at risk for computer vision syndrome, and those numbers are only likely to grow. In a report about the condition written by eye care specialists in Nigeria and Botswana and published in Medical Practice and Reviews, the authors detail an expanding list of professionals at risk — accountants, architects, bankers, engineers, flight controllers, graphic artists, journalists, academicians, secretaries and students — all of whom “cannot work without the help of computer.” And that’s not counting the millions of children and adolescents who spend many hours a day playing computer games. Studies have indicated 70 percent to 90 percent of people who use computers extensively, whether for work or play, have one or more symptoms of computer vision syndrome. The effects of prolonged computer use are not just vision-related. Complaints include neurological symptoms like chronic headaches and musculoskeletal problems like neck and back pain. © 2016 The New York Times Company
Sara Reardon Every time something poked its foot, the mouse jumped in pain. Researchers at Circuit Therapeutics, a start-up company in Menlo Park, California, had made the animal hypersensitive to touch by tying off a nerve in its leg. But when they shone a yellow light on its foot while poking it, the mouse did not react. The treatment is one of several nearing clinical use that draw on optogenetics — a technique in which light is used to control genes and neuron firing. In March, RetroSense Therapeutics of Ann Arbor, Michigan, began the first clinical-safety trial of an optogenetic therapy to treat the vision disorder retinitis pigmentosa. Many scientists are waiting to see how the trial turns out before they decide how to move forward with their own research on a number of different applications. “I think it will embolden people if there’s good news,” says Robert Gereau, a pain researcher at Washington University in St Louis, Missouri. “It opens up a whole new range of possibilities for how to treat neurological diseases.” Retinitis pigmentosa destroys photoreceptors in the eye. RetroSense’s treatment seeks to compensate for this loss by conferring light sensitivity to retinal ganglion cells, which normally help to pass visual signals from photoreceptors to the brain. The therapy involves injecting patients who are blind or mostly blind with viruses carrying genes that encode light-sensitive proteins called opsins. The cells fire when stimulated with blue light, passing the visual information to the brain. Chief executive Sean Ainsworth says that the company has injected several individuals in the United States with the treatment, and plans to enroll a total of 15 blind patients in its trial. RetroSense will follow them for two years, but may release some preliminary data later this year. © 2016 Nature Publishing Group
Related chapters from BP7e: Chapter 10: Vision: From Eye to Brain; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Related chapters from MM: Chapter 7: Vision: From Eye to Brain; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Link ID: 22235 - Posted: 05.21.2016
By Roni Caryn Rabin Here’s another reason to eat your fruits and veggies: You may reduce your risk of vision loss from cataracts. Cataracts that cloud the lenses of the eye develop naturally with age, but a new study is one of the first to suggest that diet may play a greater role than genetics in their progression. Researchers had about 1,000 pairs of female twins in Britain fill out detailed food questionnaires that tracked their nutrient intake. Their mean age was just over 60. The study participants underwent digital imaging of the eye to measure the progression of cataracts. The researchers found that women who consumed diets rich in vitamin C and who ate about two servings of fruit and two servings of vegetables a day had a 20 percent lower risk of cataracts than those who ate a less nutrient-rich diet. Ten years later, the scientists followed up with 324 of the twin pairs, and found that those who had reported consuming more vitamin C in their diet — at least twice the recommended dietary allowance of 75 milligrams a day for women (the R.D.A. for adult men is 90 milligrams) — had a 33 percent lower risk of their cataracts progressing than those who got less vitamin C. The researchers concluded that genetic factors account for about 35 percent of the difference in cataract progression, while environmental factors like diet account for 65 percent. “We found no beneficial effect from supplements, only from the vitamin C in the diet,” said Dr. Christopher Hammond, a professor of ophthalmology at King’s College London and an author of the study, published in Ophthalmology. Foods high in vitamin C include oranges, cantaloupe, kiwi, broccoli and dark leafy greens. © 2016 The New York Times Company
Monya Baker A surgical technique to treat cataracts in children spurs stem cells to generate a new, clear lens. Discs made of multiple types of eye tissue have been grown from human stem cells — and that tissue has been used to restore sight in rabbits. The work, reported today in Nature, suggests that induced pluripotent stem (iPS) cells — stem cells generated from adult cells — could one day be harnessed to provide replacement corneal or lens tissue for human eyes. The discs also could be used to study how eye tissue and congenital eye diseases develop. “The potential of this technique is mind-boggling,” says Mark Daniell, head of corneal research at the Centre for Eye Research Australia in Melbourne, who was not involved in the research. “It’s almost like an eye in a dish.” A second, unrelated paper in Nature describes a surgical procedure that activates the body’s own stem cells to regenerate a clear, functioning lens in the eyes of babies born with cataracts. The two studies are “amazing, almost like science fiction”, Daniell says. In the first study, a team led by Kohji Nishida, an ophthalmologist at Osaka University Graduate School of Medicine in Japan, cultivated human iPS cells to produce discs that contained several types of eye tissue. © 2016 Nature Publishing Group
By Virginia Morell Butterflies may not have a human’s sharp vision, but their eyes beat us in other ways. Their visual fields are larger, they’re better at perceiving fast-moving objects, and they can distinguish ultraviolet and polarized light. Now, it turns out that one species of swallowtail butterfly from Australasia, the common bluebottle (Graphium sarpedon), known for its conspicuous blue-green markings, is even better equipped for such visual tasks. Each of their eyes, scientists report in Frontiers in Ecology and Evolution, contains at least 15 different types of photoreceptors, the light-detecting cells required for color vision. These are comparable to the rods and cones found in our eyes. To understand how the spectrally complex retinas of butterflies evolved, the researchers used physiological, anatomical, and molecular experiments to examine the eyes of 200 male bluebottles collected in Japan. (Only males were used because the scientists failed to catch a sufficient number of females.) They found that different colors stimulate each class of receptor. For instance, UV light stimulates one, while slightly different blue lights set off three others, and green lights trigger four more. Most insect species have only three classes of photoreceptors. Even humans have only three cones, yet we still see millions of colors. Butterflies need only four receptor classes for color vision, including spectra in the UV region. So why did this species evolve 11 more? The scientists suspect that some of the receptors must be tuned to perceive specific things of great ecological importance to these iridescent butterflies—such as sex. For instance, with eyes alert to the slightest variation in the blue-green spectrum, male bluebottles can spot and chase their rivals, even when they’re flying against a blue sky. © 2016 American Association for the Advancement of Science
Our eyes constantly send bits of information about the world around us to our brains, where the information is assembled into objects we recognize. Along the way, a series of neurons in the eye uses electrical and chemical signals to relay the information. In a study of mice, National Institutes of Health scientists showed how one type of neuron may do this to distinguish moving objects. The study suggests that the NMDA receptor, a protein normally associated with learning and memory, may help neurons in the eye and the brain relay that information. “The eye is a window onto the outside world and the inner workings of the brain,” said Jeffrey S. Diamond, Ph.D., senior scientist at the NIH’s National Institute of Neurological Disorders and Stroke (NINDS), and the senior author of the study published in Neuron. “Our results show how neurons in the eye and the brain may use NMDA receptors to help them detect motion in a complex visual world.” Vision begins when light enters the eye and hits the retina, which lines the back of the eyeball. Neurons in the retina convert light into nerve signals, which are then sent to the brain. Using retinas isolated from mice, Alon Poleg-Polsky, Ph.D., a postdoctoral fellow in Dr. Diamond’s lab, studied neurons called directionally selective retinal ganglion cells (DSGCs), which are known to fire and send signals to the brain in response to objects moving in specific directions across the eye. Electrical recordings showed that some of these cells fired when a bar of light passed across the retina from left to right, whereas others responded to light crossing in the opposite direction. Previous studies suggested these unique responses are controlled by incoming signals sent from neighboring cells at chemical communication points called synapses. In this study, Dr. Poleg-Polsky discovered that the activity of NMDA receptors at one set of synapses may regulate whether DSGCs send direction-sensitive information to the brain.
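The study above does not spell out the circuit computation, but the classic Reichardt correlator is a standard textbook model of how direction selectivity like a DSGC's can arise: a delayed copy of one receptor's signal is multiplied by its neighbor's current signal, so motion in the preferred direction lines up the two and motion the other way does not. A minimal sketch, with all names, the one-step delay, and the five-receptor stimulus purely illustrative rather than taken from the study:

```python
import numpy as np

def reichardt_response(stimulus, delay=1):
    """Sum a Reichardt-correlator output over a stimulus.
    stimulus: 2-D array, rows = time steps, columns = receptor positions.
    Positive total = net rightward motion; negative = leftward."""
    s = np.asarray(stimulus, dtype=float)
    delayed = np.roll(s, delay, axis=0)
    delayed[:delay] = 0.0  # nothing arrives before stimulus onset
    # subunit preferring rightward motion: delayed left receptor x current right neighbor
    rightward_subunit = delayed[:, :-1] * s[:, 1:]
    # mirror subunit preferring leftward motion
    leftward_subunit = s[:, :-1] * delayed[:, 1:]
    return np.sum(rightward_subunit - leftward_subunit)

# A bright spot sweeping rightward across five receptors over five time steps,
# and its mirror image sweeping leftward
rightward = np.eye(5)
leftward = np.fliplr(rightward)
```

Sweeping the spot rightward yields a positive response and the leftward sweep an equally negative one, which is the signature of a direction-selective unit.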
By C. CLAIBORNE RAY Q. What’s the No. 1 cause of blindness in seniors in the United States? A. “It sounds like a simple question, but there’s no perfect answer,” said Dr. Susan Vitale, a research epidemiologist at the National Eye Institute of the National Institutes of Health. “It depends on age, how blindness is measured and how statistics are collected.” For example, some studies have relied on the self-reported answer to the vague question: “Do you have vision problems?” The best available estimates, she said, come from a 2004 paper aggregating many other studies, some in the United States and some in other countries, updated by applying later census data. This paper and others have found striking differences by age and by racial and socioeconomic groups, Dr. Vitale said. In white people, she said, the major cause of blindness at older ages is usually age-related macular degeneration, progressive damage to the central portion of the retina. In older black people, the major causes are likely to be glaucoma or cataracts. In people of working age, from their 40s to their 60s, the major cause, regardless of race, is diabetic retinopathy, damage to the retina as a result of diabetes. Many studies have shown that white people are more likely to have age-related macular degeneration, Dr. Vitale said, but as for cataracts, for which blindness is preventable by surgery, there are questions about access to health care and whether those affected can get the needed surgery. It is not known why black people are at higher risk of glaucoma. There are also some gender differences, she said, with white women more likely than white men to become blind. Studies have not found the same difference by gender in black and Hispanic people. Because many of the causes of blindness at all ages are preventable, Dr. Vitale said, it is essential to have regular eye checkups, even if there are no obvious symptoms. © 2016 The New York Times Company
By Susana Martinez-Conde, Stephen L. Macknik In the forests of Australia and New Guinea lives a pigeon-sized creature that is not only a master builder but a clever illusionist, too. The great bowerbird (Chlamydera nuchalis)—a cousin of crows and jays—has an elaborate mating ritual that relies on the male's ability to conjure forced perspective. Throughout the year he painstakingly builds and maintains his bower: a 60-centimeter-long corridor made of twigs, leading to a courtyard decorated with gray and white pebbles, shells and bones. Some species also add flowers, fruits, feathers, bottle caps, acorns, abandoned toys—whatever colorful knickknacks they can find. The male takes great care to arrange the objects according to size so that the smallest pieces are closest to the bower's entrance and the largest items are farthest away. The elaborate structure is not a nest. Its sole purpose is to attract a female for mating. Once construction is complete, the male performs in the courtyard for a visiting female, who—poised like a critical American Idol judge—evaluates the routine from the middle of the corridor. He sings, dances and prances, tossing around a few select trinkets to impress his potential mate. Her viewpoint is very narrow, and so she perceives objects paving the courtyard as being uniform in size. This forced perspective makes the choice offerings appear grander and therefore all the more enticing. The offerings, and the male himself, appear larger than life because of an effect that visual scientists call the Ebbinghaus illusion, which causes an object to look bigger if it is surrounded by smaller objects. © 2016 Scientific American
Related chapters from BP7e: Chapter 10: Vision: From Eye to Brain; Chapter 12: Sex: Evolutionary, Hormonal, and Neural Bases
Related chapters from MM: Chapter 7: Vision: From Eye to Brain; Chapter 8: Hormones and Sex
Link ID: 21912 - Posted: 02.19.2016
Floaters, those small dots or cobweb-shaped patches that move or “float” through the field of vision, can be alarming. Though many are harmless, if you develop a new floater, “you need to be seen pretty quickly” by an eye doctor in order to rule out a retinal tear or detachment, said Dr. Rebecca Taylor, a spokeswoman for the American Academy of Ophthalmology. Floaters are caused by clumping of the vitreous humor, the gel-like fluid that fills the inside of the eye. Normally, the vitreous gel is anchored to the back of the eye. But as you age, it tends to thin out and may shrink and pull away from the inside surface of the eye, causing clumps or strands of connective tissue to become lodged in the jelly, much as “strands of thread fray when a button comes off on your coat,” Dr. Taylor said. The strands or clumps cast shadows on the retina, appearing as specks, dots, clouds or spider webs in your field of vision. Such changes may occur at younger ages, too, particularly if you are nearsighted or have had a head injury or eye surgery. There is no treatment for floaters, though they usually fade with time. But it’s still important to see a doctor if new floaters arise because the detaching vitreous gel can pull on the retina, causing it to tear, which can lead to retinal detachment, a serious condition. The pulling or tugging on the retina may be perceived as lightning-like flashes, “like a strobe light off to the side of your vision,” Dr. Taylor said. See an eye doctor within 24 to 48 hours if you have a new floater, experience a sudden “storm” of floaters, see a gray curtain or shadow move across your field of vision, or have a sudden decrease in vision. © 2016 The New York Times Company
By Susana Martinez-Conde Take a look at the red chips on the two Rubik cubes below. They are actually orange on the left and purple on the right, if you look at them in isolation. They only appear more or less equally red across the images because your brain is interpreting them as red chips lit by either yellow or blue light. This kind of misperception is an example of perceptual constancy, the mechanism that allows you to recognize an object as being the same in different environments, and under very diverse lighting conditions. Constancy illusions are adaptive: consider what would have happened if your ancestors thought a friend became a foe whenever a cloud hid the sun, or if they lost track of their belongings—and even their own children—every time they stepped out of the cave and into the sunlight. Why, they might have even eaten their own kids! You are here because the perceptual systems of your predecessors were resistant to annoying changes in the physical reality—as is your own (adult) perception. There are many indications that constancy effects must have helped us survive (and continue to do so). One such clue is that we are not born with perceptual constancy, but develop it many months after birth. So at first we see all differences, and then we learn to ignore certain types of differences so that we can recognize the same object as unchanging in many varied scenarios. When perceptual constancy arises, we lose the ability to detect multiple contradictions that are nevertheless highly noticeable to young babies. © 2016 Scientific American
Related chapters from BP7e: Chapter 10: Vision: From Eye to Brain; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 7: Vision: From Eye to Brain; Chapter 13: Memory, Learning, and Development
Link ID: 21858 - Posted: 02.04.2016
By Ana Swanson Earlier this year, the famous blue-and-black (or white-and-gold) dress captivated the Internet, serving as a reminder that color is truly in the eye of the beholder. The dress was also a lesson in the power of social media, the science of shifting colors, and the fun of optical illusions. Here we present a visual story from February 27 that rounded up some of the best-known optical illusions on the Web. The Internet erupted in an energetic debate yesterday about whether an ugly dress was blue and black or white and gold, with celebrities from Anna Kendrick (white) to Taylor Swift (black) weighing in. (For the record, I’m with Taylor – never a bad camp to be in.) It sounds inane, but the dress question was actually tricky: Some declared themselves firmly in the blue and black camp, only to have the dress appear white and gold when they looked back a few hours later. Wired had the best explanation of the science behind the dress’s shifting colors. When your brain tries to figure out what color something is, it essentially subtracts the lighting and background colors around it, or as the neuroscientist interviewed by Wired says, tries to “discount the chromatic bias of the daylight axis.” This is why you can identify an apple as red whether you see it at noon or at dusk. The dress is on some kind of perceptual boundary, with a pretty even mix of blue, red and green. (Frankly, it’s just a terrible, washed out photo.) So for those who see it as white, your eyes may be subtracting the wrong background and lighting.
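The "subtract the lighting" idea described above is roughly what simple computational white-balance methods do. A minimal sketch under the gray-world assumption (that a scene's average color is neutral, so any overall tint is attributed to the illuminant); the function name and the tiny synthetic scene are illustrative, not from any cited source:

```python
import numpy as np

def gray_world_balance(image):
    """Estimate the illuminant as each channel's mean (the "gray world"
    assumption) and divide it out, loosely mimicking how the visual
    system discounts the prevailing light.
    image: H x W x 3 array of floats in [0, 1]."""
    img = np.asarray(image, dtype=float)
    illuminant = img.mean(axis=(0, 1))            # per-channel tint estimate
    balanced = img * (illuminant.mean() / illuminant)
    return np.clip(balanced, 0.0, 1.0)

# A "white" surface photographed under bluish light looks blue-tinted...
bluish_scene = np.full((4, 4, 3), [0.5, 0.5, 0.8])
# ...but after discounting the estimated illuminant it reads as neutral gray
corrected = gray_world_balance(bluish_scene)
```

When the estimate of the illuminant is wrong, as with the dress photo's ambiguous lighting, the same discounting step pushes the perceived colors in the wrong direction.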
By John Bohannon It may sound like a bird-brained idea, but scientists have trained pigeons to spot cancer in images of biopsied tissue. Individually, the avian analysts can't quite match the accuracy of professional pathologists. But as a flock, they did as well as trained humans, according to a new study appearing this week in PLOS ONE. Cancer diagnosis often begins as a visual challenge: Does this lumpy spot in a mammogram image justify a biopsy? And do cells in biopsy slides look malignant or benign? Training doctors and medical technicians to tell the difference is expensive and time-consuming, and computers aren't yet up to the task. To see whether a different type of trainee could do better, a team led by Richard Levenson, a pathologist and technologist at the University of California, Davis, and Edward Wasserman, a psychologist at the University of Iowa, in Iowa City, turned to pigeons. In spite of their limited intellect, the bobble-headed birds have certain advantages. They have excellent visual systems, similar to, if not better than, a human's. They sense five different colors as opposed to our three, and they don’t “fill in” the gaps like we do when expected shapes are missing. However, training animals to do a sophisticated task is tricky. Animals can pick up on unintentional cues from their trainers and other humans that may help them correctly solve problems. For example, a famous 20th century horse named Clever Hans was purportedly able to do simple arithmetic, but was later shown to be observing the reactions of his human audience. And although animals can perform extremely well on tasks that are confined to limited circumstances, overtraining on one set of materials can lead to total inaccuracy when the same information is conveyed slightly differently. © 2015 American Association for the Advancement of Science
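The flock effect is easy to quantify if you model each bird as an independent classifier and pool them by majority vote, a simplification of however the study actually combined responses. The accuracy of an odd-sized flock then follows directly from the binomial distribution:

```python
from math import comb

def majority_vote_accuracy(p, n):
    """Probability that a majority of n independent classifiers,
    each correct with probability p, gives the right answer (n odd)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# One "pigeon" at 85% accuracy vs. a flock of five voting together
single = majority_vote_accuracy(0.85, 1)
flock = majority_vote_accuracy(0.85, 5)
```

With each bird correct 85 percent of the time, a five-bird majority is right about 97 percent of the time, which is the essence of how a flock can rival trained pathologists that individual birds cannot match.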
Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 7: Vision: From Eye to Brain
Link ID: 21652 - Posted: 11.21.2015
Susan Milius Certain species of the crawling lumps of mollusk called chitons polka-dot their armor-plated backs with hundreds of tiny black eyes. But mixing protection and vision can come at a price. The lenses are rocky nuggets formed mostly of aragonite, the same mineral that pearls and abalone shells are made of. New analyses of these eyes support previous evidence that they form rough images instead of just sensing overall lightness or darkness, says materials scientist Ling Li of Harvard University. Adding eyes to armor does introduce weak spots in the shell. Yet the positioning of the eyes and their growth habits show how chitons compensate for that, Li and his colleagues report in the November 20 Science. Li and coauthor Christine Ortiz of MIT have been studying such trade-offs in biological materials that serve multiple functions. Human designers often need substances that multitask, and the researchers have turned to evolution’s solutions in chitons and other organisms for inspiration. Biologists had known that dozens of chiton species sprinkle their armored plates with simple-seeming eye spots. (The armor has other sensory organs: pores even tinier than the eyes.) But in 2011, a research team showed that the eyes of the West Indian fuzzy chiton (Acanthopleura granulata) were much more remarkable than anyone had realized. Their unusual aragonite lens can detect the difference between a looming black circle and a generally gray field of vision. Researchers could tell because chitons clamped their shells defensively to the bottom when a scary circle appeared but not when an artificial sky turned overall shadowy. © Society for Science & the Public 2000 - 2015
Angus Chen If you peek into classrooms around the world, a bunch of bespectacled kids peek back at you. In some countries such as China, as much as 80 percent of children are nearsighted. As those kids grow up, their eyesight gets worse, requiring stronger and thicker eyeglasses. But a diluted daily dose of an ancient drug might slow that process. The drug is atropine, one of the toxins in deadly nightshade and jimsonweed. In the 19th and early 20th centuries, atropine was known as belladonna, and fancy Parisian ladies used it to dilate their pupils, since big pupils were considered alluring at the time. A few decades later, people started using atropine to treat amblyopia, or lazy eye, since it blurs the stronger eye's vision and forces the weaker eye to work harder. As early as the 1990s, doctors had some evidence that atropine can slow the progression of nearsightedness. In some countries, notably in Asia, a 1 percent solution of atropine eyedrops is commonly prescribed to children with myopia. It's not entirely clear how atropine works. Because people become nearsighted when their eyeballs get too elongated, it's generally thought that atropine must be interfering with that unwanted growth. But as Parisians discovered long ago, the drug can have some inconvenient side effects. © 2015 npr
A clinical trial funded by the National Institutes of Health has found that the drug ranibizumab (Lucentis) is highly effective in treating proliferative diabetic retinopathy. The trial, conducted by the Diabetic Retinopathy Clinical Research Network (DRCR.net) compared Lucentis with a type of laser therapy called panretinal or scatter photocoagulation, which has remained the gold standard for proliferative diabetic retinopathy since the mid-1970s. The findings demonstrate the first major therapy advance in nearly 40 years. “These latest results from the DRCR Network provide crucial evidence for a safe and effective alternative to laser therapy against proliferative diabetic retinopathy,” said Paul A. Sieving, M.D., Ph.D., director of NIH’s National Eye Institute (NEI), which funded the trial. The results were published online today in the Journal of the American Medical Association. Treating abnormal retinal blood vessels with laser therapy became the standard treatment for proliferative diabetic retinopathy after the NEI announced results of the Diabetic Retinopathy Study in 1976. Although laser therapy effectively preserves central vision, it can damage night and side vision; so, researchers have sought therapies that work as well or better than laser but without such side effects. A complication of diabetes, diabetic retinopathy can damage blood vessels in the light-sensitive retina in the back of the eye. As the disease worsens, blood vessels may swell, become distorted and lose their ability to function properly. Diabetic retinopathy becomes proliferative when lack of blood flow in the retina increases production of a substance called vascular endothelial growth factor, which can stimulate the growth of new, abnormal blood vessels.
By Kelli Whitlock Burton More than half of Americans over the age of 70 have cataracts, caused by clumps of proteins collecting in the eye lens. The only way to remove them is surgery, an unavailable or unaffordable option for many of the 20 million people worldwide who are blinded by the condition. Now, a new study in mice suggests eye drops made with a naturally occurring steroid could reverse cataracts by teasing apart the protein clumps. “This is a game changer in the treatment of cataracts,” says Roy Quinlan, a molecular biologist at Durham University in the United Kingdom who was not part of the study. “It takes decades for the cataracts to get to that point, so if you can reverse that by a few drops in the eye over a couple of weeks, that’s amazing.” The proteins that make up the human lens are among the oldest in the body, forming at about 4 weeks after fertilization. The majority are crystallins, a family of proteins that allow the eye to focus and keep the lens clear. Two of the most abundant crystallins, CRYAA and CRYAB, are produced in response to stress or injury. They act as chaperones, identifying and binding to damaged and misfolded proteins in the lens, preventing them from aggregating. But over the years, as damaged proteins accumulate in the lens, these chaperones become overwhelmed. The mutated proteins then clump together, blocking light and producing the tell-tale cloudiness of cataracts. © 2015 American Association for the Advancement of Science
By Christof Koch Artificial intelligence has been much in the news lately, driven by ever cheaper computer processing power that has become effectively a near universal commodity. The excitement swirls around mathematical abstractions called deep convolutional neural networks, or ConvNets. Applied to photographs and other images, the algorithms that implement ConvNets identify individuals from their faces, classify objects into one of 1,000 distinct categories (cheetah, husky, strawberry, catamaran, and so on)—and can describe whether they see “two pizzas sitting on top of a stove top oven” or “a red motorcycle parked on the side of the road.” All of this happens without human intervention. Researchers looking under the hood of these powerful algorithms are surprised, puzzled and entranced by the beauty of what they find. How do ConvNets work? Conceptually they are but one or two generations removed from the artificial neural networks developed by engineers and learning theorists in the 1980s and early 1990s. These, in turn, are abstracted from the circuits neuroscientists discovered in the visual system of laboratory animals. Already in the 1950s a few pioneers had found cells in the retinas of frogs that responded vigorously to small, dark spots moving on a stationary background, the famed “bug detectors.” Recording from the part of the brain's outer surface that receives visual information, the primary visual cortex, Torsten Wiesel and the late David H. Hubel, both then at Harvard University, found in the early 1960s a set of neurons they called “simple” cells. These neurons responded to a dark or a light bar of a particular orientation in a specific region of the visual field of the animal. © 2015 Scientific American
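The link from Hubel and Wiesel's "simple" cells to ConvNets is concrete: a ConvNet's first layer is a bank of small oriented filters slid (convolved) across the image, and each filter, like a simple cell, responds only when a bar of its preferred orientation falls inside its window. A minimal sketch with one hand-built vertical-bar filter; real networks learn their filters from data, and the loop-based convolution here is for clarity, not speed:

```python
import numpy as np

def convolve2d_valid(image, kernel):
    """Minimal 'valid'-mode 2-D convolution (technically cross-correlation,
    which is what ConvNet layers compute)."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# An oriented filter, like a simple cell preferring vertical bars:
# excitatory center column flanked by inhibitory columns
vertical_filter = np.array([[-1.0, 2.0, -1.0]] * 3)

# A vertical bar and a horizontal bar in a 5x5 image
vertical_bar = np.zeros((5, 5)); vertical_bar[:, 2] = 1.0
horizontal_bar = np.zeros((5, 5)); horizontal_bar[2, :] = 1.0

resp_v = convolve2d_valid(vertical_bar, vertical_filter).max()
resp_h = convolve2d_valid(horizontal_bar, vertical_filter).max()
```

The filter responds strongly to the vertical bar and not at all to the horizontal one, just as a simple cell fires for its preferred orientation and stays silent for the orthogonal one.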
By Jessica Schmerler Young brains are plastic, meaning their circuitry can be easily rewired to promote learning. By adulthood, however, the brain has lost much of its plasticity and can no longer readily recover lost function after, say, a stroke. Now scientists have successfully restored full youthful plasticity in adult mice by transplanting young neurons into their brain—curing their severe visual impairments in the process. In a groundbreaking study published in May in Neuron, a team of neuroscientists led by Sunil Gandhi of the University of California, Irvine, transplanted embryonic mouse stem cells into the brains of other mice. The cells were primed to become inhibitory neurons, which tamp down brain activity. Prior to this study, “it was widely doubted that the adult brain would allow these cells to disperse, integrate and reactivate plasticity,” says Melissa Davis, first author of the study. Scientists have been attempting such a feat for years, refining their methods along the way, and the Irvine team finally saw success: the cells were integrated in the brain and caused large-scale rewiring, restoring the high-level plasticity of early development. In visually impaired mice, the transplant allowed for the restoration of normal vision, as demonstrated by tests of visual nerve signals and a swimming maze test. The scientists have not yet tested the transplanting technique for other neurological disorders, but they believe the technique has potential for many conditions and injuries depending on how, exactly, the new neurons restore plasticity. It is not yet known whether the proliferation of the transplanted cells accounts for the restored plasticity or if the new cells trigger plasticity in existing neurons. If the latter, the treatment could spur the rewiring and healing of the brain following traumatic brain injury or stroke. © 2015 Scientific American
By Karen Weintraub

The short answer is: not yet, but treatments are getting better. Getting older is the chief risk factor for age-related macular degeneration, the leading cause of vision loss in the United States. Macular degeneration comes in two forms: dry and wet. The dry form is milder and usually has no symptoms, but it can progress into the wet form, which is characterized by the growth of abnormal blood vessels in the back of the eye, potentially causing blurriness or vision loss in the center of the visual field.

The best treatment for wet macular degeneration is prevention, said Dr. Rahul N. Khurana, a clinical spokesman for the American Academy of Ophthalmology and a retina specialist practicing in Mountain View, Calif. Not smoking, along with eating dark green vegetables and at least two servings of fish a week, may help reduce the risk of macular degeneration, he said. An annual eye exam can catch macular degeneration while it is still in the dry form, Dr. Khurana said, and vitamins can help keep it from progressing into the wet form, the main cause of vision loss. Dr. Joan W. Miller, chief of ophthalmology at Massachusetts Eye and Ear, said anyone with a family history of the disease should get a retina check at age 50. People should also get an eye exam if they notice problems such as trouble adjusting to the dark or needing more light to read.

The federally funded Age-Related Eye Disease Study, published in 2001 and updated in 2013, found that people at high risk for advanced age-related macular degeneration could cut that risk by about 25 percent by taking a supplement that included 500 milligrams of vitamin C, 400 I.U. of vitamin E, 10 milligrams of lutein, 2 milligrams of zeaxanthin, 80 milligrams of zinc and 2 milligrams of copper.

© 2015 The New York Times Company
By Hanae Armitage

CHICAGO, ILLINOIS—Aside from a few animals, such as pythons and vampire bats, that can sense infrared light, this band of the electromagnetic spectrum has been off-limits to most creatures. But now researchers have engineered rodents to see infrared light by implanting sensors in their visual cortex, a first-ever feat announced here yesterday at the annual meeting of the Society for Neuroscience.

Before they wired rats to see infrared light, Duke University neuroscientist Miguel Nicolelis and his postdoc Eric Thomson engineered them to feel it. In 2013, they surgically implanted a single infrared-detecting electrode into the somatosensory cortex, the area of the rat brain that processes touch. The other end of the sensor, outside the rat's head, surveyed the environment for infrared light. When it picked up infrared, the sensor sent electrical signals to the rats' brains that seemed to give them a physical sensation. At first, the rats would groom and rub their whiskers repeatedly whenever the light went on, but after a short while they stopped fidgeting. They even learned to associate infrared with a reward-based task in which they followed the light to a bowl of water.

In the new experiment, the team inserted three additional electrodes, spaced equally so that the rats could have 360 degrees of infrared perception. When the rats were primed to perform the same water-reward task, they learned it in just four days, compared with 40 days with the single implant. "Frankly, this was a surprise," Thomson says. "I thought it would be really confusing for [the rats] to have so much stimulation all over their brain, rather than [at] one location."

© 2015 American Association for the Advancement of Science.