Chapter 10. Vision: From Eye to Brain
By Stephen L. Macknik and Susana Martinez-Conde

To a neuroscientist, the trouble with cocktail parties is not that we do not love cocktails or parties (many neuroscientists do). Instead what we call “the cocktail party problem” is the mystery of how anyone can have a conversation at a cocktail party at all. Consider a typical scene: You have a dozen or more lubricated and temporarily uninhibited adults telling loud, improbable stories at increasing volumes. Interlocutors guffaw and slap backs. Given the decibel level, it is a minor neural miracle that any one of these revelers can hear and parse one word from any other. The alcohol does not help, but it is not the main source of difficulties. The cocktail party problem is that there is just too much going on at once: How can our brain filter out the noise to focus on the wanted information?

This problem is a central one for perceptual neuroscience—and not just during cocktail parties. The entire world we live in is quite literally too much to take in. Yet the brain does gather all of this information somehow and sorts it in real time, usually seamlessly and correctly. Whereas the physical reality consists of comparable amounts of signal and noise for many of the sounds and sights around you, your perception is that the conversation or object that interests you remains in clear focus.

So how does the brain accomplish this feat? One critical component is that our neural circuits simplify the problem by actively ignoring—suppressing—anything that is not task-relevant. Our brain picks its battles. It stomps out irrelevant information so that the good stuff has a better chance of rising to awareness. This process, colloquially called attention, is how the brain sorts the wheat from the chaff. © 2014 Scientific American
by Andy Coghlan To catch agile prey on the wing, dragonflies rely on the same predictive powers we use to catch a ball: that is, anticipating by sight where the ball will go and readying body and hand to snatch it from mid-air. Until now, dragonflies were thought to catch their prey without this predictive skill, instead blindly copying every steering movement made by their prey, which can include flies and bees. Now, sophisticated laboratory experiments have tracked the independent body and eye movements of dragonflies as they pursue prey, showing for the first time that dragonflies second-guess where their prey will fly next and then steer their flight accordingly. Throughout the pursuit, they lock on to their target visually while they orient their bodies and flight path for ultimate interception, rather than copying each little deviation in their prey's flight path in the hope of ultimately catching up with it. "The dragonfly lines up its body axis in the flight direction of the prey, but keeps the eyes in its head firmly fixed on the prey," says Anthony Leonardo of the Howard Hughes Medical Institute in Ashburn, Virginia. "It enables the dragonfly to catch the prey from beneath and behind, the prey's blind spot," he says. © Copyright Reed Business Information Ltd.
Jia You Ever wonder how cockroaches scurry around in the dark while you fumble to switch on the kitchen light? Scientists know the insect navigates with its senses of touch and smell, but now they have found a new piece to the puzzle: A roach can also see its environment in pitch darkness, by pooling visual signals from thousands of light-sensitive cells in each of its compound eyes, known as photoreceptors. To test the sensitivity of roach vision, researchers created a virtual reality system for the bugs, knowing that when the environment around a roach rotates, the insect spins in the same direction to stabilize its vision. First, they placed the roach on a trackball, where it couldn’t navigate with its mouthpart or antennae. Then the scientists spun black and white gratings around the insect, illuminated by light at intensities ranging from a brightly lit room to a moonless night. The roach responded to its rotating environment in light as dim as 0.005 lux, when each of its photoreceptors was picking up only one photon every 10 seconds, the researchers report online today in The Journal of Experimental Biology. They suggest that the cockroach must rely on unknown neural processing in the deep ganglia, an area in the base of the brain involved in coordinating movements, to process such complex visual information. Understanding this mechanism could help scientists design better imaging systems for night vision. © 2014 American Association for the Advancement of Science.
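The study's numbers make the pooling argument easy to check with back-of-envelope arithmetic. A minimal sketch in Python, assuming roughly 10,000 photoreceptors per compound eye (the article says only "thousands", so that count is an assumption; the one-photon-every-10-seconds rate is from the study's dimmest condition, 0.005 lux):

```python
# Back-of-envelope sketch of visual pooling in the cockroach eye.
# Assumption: ~10,000 photoreceptors per compound eye (the article
# says only "thousands"). The photon rate comes from the reported
# dimmest condition: one photon per receptor every 10 seconds.

photoreceptors_per_eye = 10_000       # assumed order of magnitude
photons_per_receptor_per_s = 1 / 10   # one photon every 10 seconds

# A single receptor sees almost nothing on its own...
single_rate = photons_per_receptor_per_s

# ...but summing signals across the whole eye yields a usable rate,
# which is the point of pooling in downstream neural circuits.
pooled_rate = photoreceptors_per_eye * photons_per_receptor_per_s

print(f"single receptor: {single_rate:.1f} photons/s")
print(f"pooled eye:      {pooled_rate:.0f} photons/s")
```

Under these assumptions the eye as a whole collects on the order of a thousand photons per second even when each individual receptor is nearly dark, which is why summation across receptors can rescue vision at such low light levels.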
Katharine Sanderson Although we do not have X-ray vision like Superman, we have what could seem to be another superpower: we can see infrared light — beyond what was traditionally considered the visible spectrum. A series of experiments now suggests that this little-known, puzzling effect could occur when pairs of infrared photons simultaneously hit the same pigment protein in the eye, providing enough energy to set in motion chemical changes that allow us to see the light. Received wisdom, and the known chemistry of vision, say that human eyes can see light with wavelengths between 400 (blue) and 720 nanometres (red). Although this range is still officially known as the 'visible spectrum', the advent of lasers with very specific infrared wavelengths brought reports that people were seeing laser light with wavelengths above 1,000 nm as white, green and other colours. Krzysztof Palczewski, a pharmacologist at Case Western Reserve University in Cleveland, Ohio, says that he has seen light of 1,050 nm from a low-energy laser. “You see it with your own naked eye,” he says. To find out whether this ability is common or a rare occurrence, Palczewski scanned the retinas of 30 healthy volunteers with a low-energy beam of light, and changed its wavelength. As the wavelength increased into the infrared (IR), participants found the light at first harder to detect, but at around 1,000 nm the light became easier to see. How humans can do this has puzzled scientists for years. Palczewski wanted to test two leading hypotheses to explain infrared vision. © 2014 Nature Publishing Group,
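The two-photon hypothesis rests on simple photon-energy arithmetic: a photon's energy is E = hc/λ, so two 1,000 nm infrared photons absorbed together deliver the same energy as a single 500 nm (green) photon, which is consistent with reports of infrared laser light appearing green. A quick check in Python (the constants are standard physics, not taken from the article):

```python
# Photon energy E = h*c / wavelength, expressed in electronvolts.
H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
EV = 1.602e-19   # joules per electronvolt

def photon_energy_ev(wavelength_nm):
    """Energy of one photon of the given wavelength, in eV."""
    return H * C / (wavelength_nm * 1e-9) / EV

ir = photon_energy_ev(1000)    # one infrared photon, ~1.24 eV
green = photon_energy_ev(500)  # one green photon, ~2.48 eV

# Two IR photons absorbed together match one visible (green) photon,
# enough energy to trigger the same pigment-protein chemistry.
print(f"1000 nm photon: {ir:.2f} eV; two of them: {2 * ir:.2f} eV")
print(f" 500 nm photon: {green:.2f} eV")
```

Because energy scales as 1/λ, halving the wavelength exactly doubles the photon energy, so the two-photon sum works out for any infrared wavelength paired with its visible half-wavelength.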
By Amy Ellis Nutt Scientists say the "outdoor effect" on nearsighted children is real: natural light is good for the eyes. It's long been thought that kids are more at risk of nearsightedness, or myopia, if they spend hours and hours in front of computer screens or fiddling with tiny hand-held electronic devices. Not true, say scientists. But there is now research suggesting that children who are genetically predisposed to the visual deficit can improve their chances of avoiding eyeglasses just by stepping outside. Yep, sunshine is all they need -- more specifically, the natural light of outdoors -- and 14 hours a week of outdoor light should do it. Why this is the case is not exactly clear. "We don't really know what makes outdoor time so special," said Donald Mutti, the lead researcher of the study from Ohio State University College of Optometry, in a press release. "If we knew, we could change how we approach myopia." What is known is that UVB light (invisible ultraviolet B rays) plays a role in the cellular production of vitamin D, which is believed to help the eyes focus light on the retina. However, the Ohio State researchers think there is another possibility. "Between the ages of five and nine, a child's eye is still growing," said Mutti. "Sometimes this growth causes the distance between the lens and the retina to lengthen, leading to nearsightedness. We think these different types of outdoor light may help preserve the proper shape and length of the eye during that growth period."
By Amy Ellis Nutt In a novel use of video game playing, researchers at Ohio State have found that a Pac-Man-like game, when played repetitively, can improve vision in both children and adults who have "lazy eye" or poor depth perception. In the Pac-Man-style game, players wear red-green 3-D glasses that filter images to the right and left eyes. The lazy or weak eye sees two discs containing vertical, horizontal or diagonal lines superimposed on a background of horizontal lines. The dominant eye sees a screen of only horizontal lines. The player controls the larger, Pac-Man-like disc and chases the smaller one. In another game, the player must match discs with rows based on the orientation of their lines. Teng Leng Ooi, professor of optometry at Ohio State University, presented her research findings at last week's annual meeting of the Society for Neuroscience. Only a handful of test subjects were involved in the experimental training, but all saw weak-eye improvement to 20/20 vision or better, sustained for a period of at least eight months. Lazy eye, or amblyopia, affects between 2 and 3 percent of the U.S. population. The disorder usually occurs in infancy when the neural pathway between the brain and one eye (or sometimes both) fails to fully develop. Often the cause of lazy eye is strabismus, in which the eyes are misaligned or "crossed." To prevent double vision, the brain simply blocks the fuzzy images from one eye, thereby causing incomplete visual development. The result: lazy eye.
By Victoria Colliver Marianne Austin watched her mother go blind from age-related macular degeneration, an eye disease that affects about 10 million older Americans. Now that Austin has been diagnosed with the same condition, she wants to avoid her mother’s experience. “I’ve seen what can happen and the devastation it can cause,” said Austin, 67, of Atherton, who found out she had the disease last year. “I call it having seen the movie. I don’t like that ending, I want to change the movie, and I don’t want to wait 10 years until something is proven in research.” About 10 percent of patients diagnosed with age-related macular degeneration will develop the form of the disease that causes permanent blindness. It’s unclear just how much genetics plays a role, so there’s no definitive way to predict who will progress to that stage or when that would happen. But a team of Stanford doctors thinks it may have found a way. In a study, published this month in the medical journal Investigative Ophthalmology and Visual Science, researchers analyzed data from 2,146 retinal scans from 244 macular degeneration patients at Stanford from 2008 to 2013. They then created an algorithm that predicted whether a particular patient would be likely to develop the form of the disease that causes blindness within one year, within three years or within five years. For those with macular degeneration to go blind, the disease has to advance from what is known as the “dry” form to the “wet” form. The sooner a doctor can notice changes, the better chance there is to save a patient’s vision.
By Laura Geggel A major pathway of the human brain involved in visual perception, attention and movement — and overlooked by many researchers for more than a century — is finally getting its moment in the sun. In 2012, researchers made note of a pathway in a region of the brain associated with reading, but "we couldn't find it in any atlas," said Jason Yeatman, a research scientist at the University of Washington's Institute for Learning and Brain Sciences. "We'd thought we had discovered a new pathway that no one else had noticed before." A quick investigation showed that the pathway, known as the vertical occipital fasciculus (VOF), was not actually unknown. Famed neuroscientist Carl Wernicke discovered the pathway in 1881, during the dissection of a monkey brain that was most likely a macaque. [10 Things You Didn't Know About the Brain] But besides Wernicke's discovery, and a few other mentions throughout the years, the VOF is largely absent from studies of the human brain. This made Yeatman and his colleagues wonder, "How did a whole piece of brain anatomy get forgotten?" he said. The researchers immersed themselves in century-old brain atlases and studies, trying to decipher when and why the VOF went missing from mainstream scientific literature. They also scanned the brains of 37 individuals, and found an algorithm that can help present-day researchers pinpoint the elusive pathway.
By Paula Span A few days after I wrote about conditions that can mimic dementia, reader Sue Murray emailed me from Westchester County. Her subject line: “Have you heard of Charles Bonnet Syndrome?” I hadn’t, and until about six months ago, neither had Ms. Murray. Her mother Elizabeth, who is 91, has glaucoma and macular degeneration, and has been gradually losing her vision, Ms. Murray explained. So at first, her family was excited when Elizabeth seemed to be seeing things more clearly. Maybe, they thought, her vision was returning. But the things she was seeing — patterns and colors, strangers, a green man — weren’t there. She insisted that “there were people in the cellar, people on the porch, people in the house,” Ms. Murray said. “She’d point and say, ‘Don’t you see them?’ And she’d get mad when we didn’t.” Elizabeth and her husband Victor, 95, live in Connecticut, in a house they bought 50 years ago. For a while, the Green Man, as Elizabeth began calling him, seemed to have moved in, too. “She’d start hiding things in the closet so the Green Man wouldn’t take them,” Ms. Murray said. “There wasn’t any real fear; it was just, ‘Look at that!’” Elizabeth’s ophthalmologist promptly supplied the name for this condition: Charles Bonnet Syndrome, named for a Swiss philosopher who described such visual hallucinations in the 18th century. “We were relieved,” said Ms. Murray. What they feared, of course, was mental illness or dementia. “To have an eye doctor say, ‘I’m familiar with this,’ it’s still jarring but it’s not so terrible.” Bonnet Syndrome (pronounced Boh-NAY) isn’t terribly rare, it turns out. Oliver Sacks described several cases in his 2012 book, “Hallucinations.” Dr. Abdhish Bhavsar, a clinical spokesperson for the American Academy of Ophthalmology and a retina specialist in Minneapolis, estimates that he has probably seen about 200 patients with the syndrome over 17 years of practice. © 2014 The New York Times Company
By SINDYA N. BHANOO BERKELEY, CALIF. — Lilith Sadil, 12, climbs into an examination chair here at the Myopia Control Center at the University of California. “Do you know why you are here?” asks Dr. Maria Liu, an optometrist. “Because my eyes are changing fast,” Lilith says. “Do you read a lot?” Dr. Liu asks. “Yes.” “Do you use the computer a lot?” “Yes.” Lilith is an active child who practices taekwondo. But like an increasing number of children, she has myopia — she can see close up but not farther away. Her mother, Jinnie Sadil, has brought her to the center because she has heard about a new treatment that could help. Eye specialists are offering young patients special contact lenses worn overnight that correct vision for the next day. Myopia has become something of a minor epidemic: More than 40 percent of Americans are nearsighted, a 16 percent increase since the 1970s. People with so-called high myopia — generally, blurry vision beyond about five inches — face an increased likelihood of developing cataracts and glaucoma, and are at higher risk for retinal detachments that can result in blindness. Exactly what is causing the nationwide rise in nearsightedness is not known. “It can’t be entirely genetic, because genes don’t change that fast,” said Susan Vitale, an epidemiologist at the National Institutes of Health who studies myopia. “It’s probably something that’s environmental, or a combination of genetic and environmental factors.” Some research indicates that “near work” — reading, computer work, playing video games, and using tablets and smartphones — is contributing to the increase. A recent study found that the more educated a person is, the more likely he or she will be nearsighted. A number of other studies show that children who spend time outdoors are less likely to develop high myopia. But no one is certain whether the eye benefits from ultraviolet light or whether time outside simply means time away from near work. © 2014 The New York Times Company
Linda Carroll TODAY contributor For years Larry Hester lived in darkness, his sight stolen by a disease that destroyed the photoreceptor cells in his retinas. But last week, through the help of a “bionic eye,” Hester got a chance to once again glimpse a bit of the world around him. Hester is the seventh patient to receive an FDA-approved device that translates video signals into data the optic nerve can process. The images Hester and others “see” will be far from full sight, but experts hope it will be enough to give a little more autonomy to those who had previously been completely blind. Hester’s doctors at Duke University Eye Center believe that as time goes on the 66-year-old tire salesman from Raleigh, N.C., will be able to “see” more and more. After only five days, there has been remarkable progress. “I hope that [after some practice] he will be able to do things he can’t do today: maybe walk around a little more independently, see doorways or the straight line of a curb. We don’t expect him to be able to make out figures on TV. But we hope he’ll be more visually connected,” said Dr. Paul Hahn, an assistant professor of ophthalmology at the university in Durham.
by Andy Coghlan Seeing is definitely believing when it comes to stem cell therapy. A blind man has recovered enough sight to ride his horse. A woman who could see no letters at all on a standard eye test chart can now read the letters on the top four lines. Others have recovered the ability to see colour. All have had injections of specialised retinal cells in their eyes to replace ones lost through age or disease. A trial in 18 people with degenerative eye conditions is being hailed as the most promising yet for a treatment based on human embryonic stem cells. "We've been hearing about their potential for more than a decade, but the results have always been in mice and rats, and no one has shown they're safe or effective in humans long term," says Robert Lanza of Advanced Cell Technology in Marlborough, Massachusetts, the company that carried out the stem cell intervention. "Now, we've shown both that they're safe and that there's a real chance these cells can help people." Ten years ago, the team at Advanced Cell Technology announced that it had successfully converted human embryonic stem cells into retinal pigment epithelial cells. These cells help keep the eyes' light-detecting rods and cones healthy. But when retinal pigment epithelial cells deteriorate, blindness can occur. This happens in age-related macular degeneration and Stargardt's macular dystrophy. In a bid to reverse this, Lanza's team injected retinal cells into one eye of each of the 18 participants; half of them had age-related macular degeneration and half had Stargardt's. A year later, 10 people's eyes had improved, and the eyes of the others had stabilised. Untreated eyes had continued to deteriorate. © Copyright Reed Business Information Ltd.
David Cyranoski A Japanese patient with a debilitating eye disease is about to become the first person to be treated with induced pluripotent stem cells, which have generated enthusiastic expectations and earned their inventor a Nobel Prize. A health-ministry committee has vetted researchers' safety tests and cleared the team to begin the experimental procedure. Masayo Takahashi, an ophthalmologist at the RIKEN Center for Developmental Biology (CDB) in Kobe, has been using induced pluripotent stem (iPS) cells to prepare a treatment for age-related macular degeneration. Unlike embryonic stem cells, iPS cells are produced from adult cells, so they can be genetically tailored to each recipient. They are capable of becoming any cell type in the body, and have the potential to treat a wide range of diseases. The CDB trial will be the first opportunity for the technology to prove its clinical value. In age-related macular degeneration, extra blood vessels form in the eye, destabilizing a supportive base layer of the retina known as the retinal pigment epithelium. This results in the loss of the light-sensitive photoreceptors that are anchored in the epithelium, and often leads to blindness. Takahashi took skin cells from people with the disease and converted them to iPS cells. She then coaxed these cells to become retinal pigment epithelium cells, and then to grow into thin sheets that can be transplanted to the damaged retina. Takahashi and her collaborators have shown in monkey studies that iPS cells generated from the recipients' own cells do not provoke an immune reaction that causes them to be rejected. There have been concerns that iPS cells could cause tumours, but Takahashi's team has found that to be unlikely in mice and monkeys. © 2014 Nature Publishing Group
Corie Lok Tami Morehouse's vision was not great as a child, and as a teenager she noticed it slipping even further. The words she was trying to read began disappearing into the page and eventually everything faded to a dull, grey haze. The culprit was a form of Leber's congenital amaurosis (LCA), a group of genetic disorders in which light-sensing cells in the retina die off, usually resulting in total blindness by the time people reach their thirties or forties. But Morehouse got a reprieve. In 2009, at the age of 44, the social worker from Ashtabula, Ohio, became the oldest participant in a ground-breaking clinical trial to test a gene therapy for LCA. Now, she says, she can see her children's eyes, and the colours of the sunset seem brighter than before. Morehouse calls these improvements life-changing, but they are minor compared with the changes in some of the younger trial participants. Corey Haas was eight years old when he was treated in 2008 — the youngest person to receive the therapy. He went from using a white cane to riding a bicycle and playing softball. Morehouse often wonders what she would be able to see now if she had been closer to Haas's age when she had the therapy. “I was born a little too soon,” she says. Visual impairment affects some 285 million people worldwide, about 39 million of whom are considered blind, according to a 2010 estimate from the World Health Organization. Roughly 80% of visual impairment is preventable or curable, including operable conditions such as cataracts that account for much of the blindness in the developing world. But retinal-degeneration disorders — including age-related macular degeneration, the leading cause of blindness in the developed world — have no cure. © 2014 Nature Publishing Group
By Meeri Kim The pervasive glow of electronic devices may be an impediment to a good night’s sleep. That’s particularly noticeable now, when families are adjusting to early wake-up times for school. Teenagers can find it especially hard to get started in the morning. As lamps switch off in teens’ bedrooms across America, the lights from their computer screens, smartphones and tablets often stay on throughout the night. These devices emit light of all colors, but it’s the blues in particular that pose a danger to sleep. Blue light is especially good at preventing the release of melatonin, a hormone associated with nighttime. In nocturnal animals, melatonin spurs activity; for daytime species such as humans, it signals that it’s time to sleep. Ordinarily, the pineal gland, a pea-size organ in the brain, begins to release melatonin a couple of hours before your regular bedtime. The hormone is no sleeping pill, but it does reduce alertness and make sleep more inviting. However, light — particularly of the blue variety — can keep the pineal gland from releasing melatonin, thus warding off sleepiness. You don’t have to be staring directly at a television or computer screen: If enough blue light hits the eye, the gland can stop releasing melatonin. So easing into bed with a tablet or a laptop makes it harder to take a long snooze, especially for sleep-deprived teenagers, who are more vulnerable to the effects of light than adults. During adolescence, the circadian rhythm shifts, and teens feel more awake later at night. Switching on a TV show or video game just before bedtime will push sleepiness even later, even if they have to be up by 6 a.m. to get to school on time.
by Penny Sarchet It's a selfie that might save your sight. An implanted sensor could help people with glaucoma monitor the pressure in their eyes using a smartphone camera. The second biggest cause of blindness after cataracts, glaucoma occurs when fluid builds up in the eye. This raises the pressure, damaging the optic nerve. Accurate pressure readings are crucial for giving the right treatment, but one-off measurements during check-ups produce variable results and can be misleading. Yossi Mandel at Bar-Ilan University in Ramat Gan, Israel, and his colleagues have developed a pressure sensor which can be inserted into the eye during surgery to provide easy, regular monitoring from home. A few millimetres in length, the sensor can be embedded into the synthetic lenses used to replace the natural lenses of people with cataracts. It works like a miniature barometer, and contains a fluid column that rises with eye pressure. The level can be read at any time using a smartphone camera fitted with a special optical adapter. Software then analyses the image and calculates the reading. "Continuous monitoring is a clear unmet need in glaucoma," says Francesca Cordeiro, a glaucoma researcher at University College London. Mandel believes self-monitoring will lead to better treatment of glaucoma, and could enable people to skip unnecessary appointments when their eye pressures are on target. © Copyright Reed Business Information Ltd.
By ELEANOR LEW I was watching Diane Sawyer on the evening news, wondering how she manages year after year to look so young, when suddenly her face disappeared. Now you see. Now you don’t. One second. That’s all it took. A dense black inkblot shaped like a map of England and southern Norway suddenly blocked my view of Diane so that all I could see was her blond hair and shoulders. At first, I thought it was the television set. Changing channels didn’t bring her face back, nor did rubbing my eyes. “It’s permanent vision loss,” my ophthalmologist said. “Your optic nerve and retina buckled.” He drew a picture of the inside of my right eye, the affected one, and explained that my degenerative myopia, an inherited condition that is far less common than ordinary nearsightedness but still a leading cause of blindness worldwide, had caused my eyeball to elongate excessively. It looked like a house whose walls had been stretched so thin that the roof caved. The doctor didn’t say much else, didn’t make any recommendations for physical or occupational therapy, didn’t tell me to call him if I noticed any changes. I left his office shaken. “What if it happens in my other eye? What if…?” In the weeks that followed, I began to notice bizarre changes in my right eye. Frequent flashing lights, like a dying neon tube, sometimes flickering color or bright white light, so intense I swore I could hear them buzz. I observed my peripheral vision diminishing. England and Norway morphed into a large, bushy oak tree with a short and wide trunk. At a park, I came upon children playing. When I covered my good eye with my hand, I could see only a sliver of sky, and legs and shoes of children running in and out of the tree. I wrote off the psychedelic changes to the “buckling” and didn’t bother to call my ophthalmologist. But I was scared and needed help. © 2014 The New York Times Company
by Bethany Brookshire For most of us, where our birthday falls in the year doesn’t matter much in the grand scheme of things. A July baby doesn’t make more mistakes than a Christmas kid — at least, not because of their birthdays. But for neurons, birth date plays an important role in how these cells find their connections in the brain, a new study finds. Nerve cells that form early in development will make lots of connections — and lots of mistakes. Neurons formed later are much more precise in their targeting. The findings are an important clue to help scientists understand how the brain wires itself during development. And with more information on how the brain forms its network, scientists might begin to see what happens when that network is injured or malformed. Many, many brain cells are born as the brain develops. Each one has to reach out and make connections, sometimes to other cells around them and sometimes to other regions of the brain. To do this, these nerve cells send out axons, long, incredibly thin projections that reach out to other regions. How mammalian axons end up at their final destination in the growing brain remains a mystery. To find out how developing brains get wired up, Jessica Osterhout and colleagues at the University of California, San Diego started in the eye. They looked at retinal ganglion cells, neurons that connect the brain and the eye. “It’s easy to access,” explains Andrew Huberman, a neuroscientist at UC San Diego and an author on the paper. “Your retina is basically part of the central nervous system that got squeezed into your eye during development.” Retinal ganglion cells all have the same function: To convey visual information from the eyes to the brain. But they are not all the same. © Society for Science & the Public 2000 - 2013
By Emily Underwood Old age may make us wiser, but it rarely makes us quicker. In addition to slowing down physically, most people lose points on intelligence tests as they enter their golden years. Now, new research suggests the loss of certain types of cognitive skills with age may stem from problems with basic sensory tasks, such as making quick judgments based on visual information. Although there’s no clear causal link between the two types of thinking yet, the new work could provide a simple, affordable way to track mental decline in senior citizens, scientists say. Since the 1970s, researchers who study intelligence have hypothesized that smartness, as measured on standard IQ tests, may hinge on the ability to quickly and efficiently sample sensory information from the environment, says Stuart Ritchie, a psychologist at the University of Edinburgh in the United Kingdom. Today it’s well known that people who score high on such tests do, indeed, tend to process such information more quickly than those who do poorly, but it’s not clear how these measures change with age, Ritchie says. Studying older people over time can be challenging given their uncertain health, but Ritchie and his colleagues had an unusual resource in the Lothian Birth Cohort, a group of people born in 1936 whose mental function has been periodically tested by the Scottish government since 1947—their first IQ test was at age 11. After recruiting more than 600 cohort members for their study, Ritchie and colleagues tracked their scores on a simple visual task three times over 10 years, repeating the test at the mean ages of 70, 73, and 76. © 2014 American Association for the Advancement of Science
By Ingrid Wickelgren One important function of your inner ear is stabilizing your vision when your head is turning. When your head turns one way, your vestibular system moves your eyes in the opposite direction so that what you are looking at remains stable. To see for yourself how your inner ears make this adjustment, called the vestibulo-ocular reflex, hold your thumb upright at arm’s length. Shake your head back and forth about twice per second while looking at your thumb. See that your thumb remains in focus. Now create the same relative motion by swinging your arm back and forth about five inches at the same speed. Notice that your thumb is blurry. To see an object clearly, the image must remain stationary on your retina. When your head turns, your vestibular system very rapidly moves your eyes in the opposite direction to create this stability. When the thumb moves, your visual system similarly directs the eyes to follow, but the movement is too slow to track a fast-moving object, causing blur. © 2014 Scientific American