Chapter 10. Vision: From Eye to Brain
By Sam Wong Here’s looking at you, squid. Cock-eyed squid have one huge, bulging eye and another normal-sized eye, but the reason has remained a mystery. Now we have an answer. Kate Thomas of Duke University in North Carolina studied 161 videos of the creatures collected over 26 years by remotely operated submarines in Monterey Bay, California. The findings provide the first behavioural evidence that the two eyes are adapted to look in different directions. The large one points upwards to spot prey silhouetted against the sky. The smaller one points downwards to spot bioluminescent organisms against the darkness below. The squid, from the histioteuthid family, live at depths of 200 to 1000 metres, where little light penetrates. The videos show that the squid normally swims with its tail end pointing upwards, but tilted so the large eye is consistently oriented towards the sky. Based on measurements of the eyes and the light levels they would be exposed to, Thomas and her colleagues calculated that having a big upward-pointing eye greatly improves visual perception, while a downward-pointing eye would gain little from being large. “That gives you the context for how this trait might have evolved,” says Thomas. Some of the squid’s prey, such as lanternfish and shrimp, have luminescent undersides so they are camouflaged against the sunlight when seen from below. Yellow pigmentation in the lens of the squid’s large eye may help it distinguish between sunlight and bioluminescence. © Copyright Reed Business Information Ltd.
Noah Charney The Chinese government just arrested a group of people associated with a sham tourist attraction that had lured hundreds of sightseers to a fake Terracotta Warriors exhibit, composed entirely of modern replicas. Sotheby’s recently hired Jamie Martin of Orion Analytical, a forensic specialist in testing art, who then discovered that a Parmigianino painting recently sold is actually a modern forgery (Sotheby’s returned the buyer’s money and then sued the person for whom they sold it). And the Ringling Museum in Sarasota, Florida, is hoping that a painting of Philip IV of Spain in their collection will be definitively determined to be by Velazquez, and not a copy in the style of Velazquez. And that’s just in the last week or so. Art forgery and authenticity seem to be in the news just about every week (to my publicist’s delight). But I’m on a bit of a brainstorm. After my interview with Nobel Prize winner Dr. Eric Kandel on the neuroscience behind how we humans understand art, I’ve developed a keen interest in art and the mind. I tackled selfies, self-portraits and facial recognition recently, as well as what happens when the brain fails to function properly and neglects to recognize the value of art. Since my last book was a history of forgery, it was perhaps inevitable that I would wonder about the neurology of the recognition of originals versus copies. But while I looked into forgery from a wide variety of angles for the book, neuroscience was not one of them. © 2017 Salon Media Group, Inc.
by Laura Sanders Most nights I read a book in bed to wind down. But when I run out of my library supply, I read articles on my phone instead. I suspect that this digital substitution messes with my sleep. That’s not good for me — but it’s probably worse for the many children who have screens in their rooms at night. A team of researchers recently combed through the literature looking for associations between mobile devices in the bedroom and poor sleep. Biostatistician Ben Carter of King’s College London and colleagues found that kids between ages 6 and 19 who used screen-based media around bedtime slept worse and were more tired in the day. That’s not surprising: Phones, tablets and laptops make noise and emit blue light that can interfere with production of the sleep-inducing hormone melatonin. But things got interesting when the researchers compared kids who didn’t have screens in their bedrooms with kids who did have phones or tablets in their rooms but didn’t use them. You might think there wouldn’t be a sleep difference between those groups. None of these kids were up all night texting, gaming or swiping, so neither sounds nor blue light were messing with any of the kids’ sleep. Yet Carter and colleagues found a difference: Kids who had screen-based media in their bedroom, but didn’t use it, didn’t sleep as much as kids without the technology. What’s more, the sleep they did get was worse and they were more tired during the day, the researchers reported in the December JAMA Pediatrics. © Society for Science & the Public 2000 - 2017
By GRETCHEN REYNOLDS Being nearsighted is far more common than it once was. The prevalence of myopia, the condition’s medical name, in Americans has soared by 66 percent since the early 1970s, according to a 2009 study by the National Eye Institute; in China and other East Asian countries, as many as 90 percent of recent high school graduates are thought to be nearsighted. Myopia results when eyeballs are longer than normal, changing the angle at which light enters the eye and therefore the ability to focus on distant objects. The disorder involves a complex interplay of genetics and environment and usually begins before adolescence, when the eye is growing, but it can worsen in early adulthood. Some experts connect the elevated rates of myopia to the many hours young people stare at computers and other screens. But a recent study published in JAMA Ophthalmology suggests that a greater factor may be a side effect of all that screen-watching — it’s keeping children inside. This new study joins a growing body of research indicating that a lack of direct sunlight may reshape the human eye and impair vision. Researchers at King’s College London, the London School of Hygiene and Tropical Medicine and other institutions gave vision exams to more than 3,100 older European men and women and interviewed them at length about their education, careers and how often they remembered being outside during various stages of their lives. This biographical information was then cross-referenced with historical data about sunlight, originally compiled for research on skin cancer and other conditions. © 2017 The New York Times Company
By Rachael Lallensack A video game is helping researchers learn more about how tiny European starlings keep predators at bay. Their massive flocks, consisting of hundreds to thousands of birds, fly together in a mesmerizing, pulsating pattern called a murmuration. For a long time, researchers have suspected that the bigger the flock, the harder it is for predators like falcons and hawks to take down any one member, something known as “confusion effect.” Now, researchers have analyzed that effect—in human hunters. Using the first 3D computer program to simulate a murmuration, scientists tested how well 25 players, acting as flying predators, could target and pursue virtual starlings, whose movements were simulated based on data from real starling flocks (see video above). The team’s findings reaffirmed the confusion effect: The larger the simulated flocks, the harder it was for the “predators” to single out and catch individual prey, the researchers report this week in Royal Society Open Science. So maybe sometimes, it’s not so bad to get lost in a crowd. © 2017 American Association for the Advancement of Science.
By Anna Azvolinsky Hummingbirds are efficient hoverers, suspending their bodies midair using rapid forward and backward strokes. Aside from their unique ability to hover, the tiny avians are also the only known birds that can fly in any direction, including sideways. Hummingbird brains appear to be adapted for this flying ability, researchers have now shown. According to a study published today (January 5) in Current Biology, a highly conserved area of the brain—the lentiformis mesencephali (LM), which receives panoramic visual motion information directly from the retina—processes the movement of objects from all directions. In contrast, the LMs of other bird species and all other four-limbed vertebrates studied to date predominantly sense back-to-front motion. While the authors had predicted the neurons of this hummingbird brain region would be tuned to slow motion, they in fact found the opposite: LM neurons were sensitive to quick visual motion, most likely because hummingbirds must process and respond to their environments quickly to avoid collisions, both during hovering and in other modes of flight. “This ancient part of the brain the authors studied has one job: to detect the motion of the image in front of the eyes,” explained Michael Ibbotson, a neuroscientist at the University of Melbourne who penned an accompanying editorial but was not involved in the research. The results of this study suggest that “hummingbirds evolved this area of the brain to have fine motor control to be able to hover and push in every direction possible,” Ibbotson said. © 1986-2017 The Scientist
Link ID: 23064 - Posted: 01.07.2017
By Susana Martinez-Conde Our perceptual and cognitive systems like to keep things simple. We describe the line drawings below as a circle and a square, even though their imagined contours consist—in reality—of discontinuous line segments. The Gestalt psychologists of the 19th and early 20th century branded this perceptual legerdemain as the Principle of Closure, by which we tend to recognize shapes and concepts as complete, even in the face of fragmentary information. Now at the end of the year, it is tempting to seek a cognitive kind of closure: we want to close the lid on 2016, wrap it with a bow and start a fresh new year from a blank slate. Of course, it’s just an illusion, the Principle of Closure in one of its many incarnations. The end of the year is just as arbitrary as the end of the month, or the end of the week, or any other date we choose to highlight in the earth’s recurrent journey around the sun. But it feels quite different. That’s why we have lists of New Year’s resolutions, or why we start new diets or exercise regimes on Mondays rather than Thursdays. Researchers have also found that, even though we measure time in a continuous scale, we assign special meaning to idiosyncratic milestones such as entering a new decade. What should we do about our brain’s oversimplification tendencies concerning the New Year—if anything? One strategy would be to fight our feelings of closure and rebirth as we (in truth) seamlessly move from the last day of 2016 to the first day of 2017. But that approach is likely to fail. Try as we might, the Principle of Closure is just too ingrained in our perceptual and cognitive systems. In fact, if you already have the feeling that the beginning of the year is somewhat special (hey, it only happens once a year!), you might as well decide that resistance is futile, and not just embrace the illusion, but do your best to channel it. © 2017 Scientific American
By Stephen L. Macknik Masashi Atarashi, a physics high school teacher from Japan, submitted this wonderful winter illusion to the 2015 Best Illusion of the Year Contest, where it competed as a finalist. Atarashi discovered this effect serendipitously, while watching the snow fall through the venetian window blinds of his school’s faculty lounge—just like his students must sometimes do in the classroom during a lecture! Notice that as the blinds occupy more area on the screen, the speed of the snowfall seems to accelerate. A great illusion to ponder during our white holiday season. Nobody knows how Atarashi’s effect works, but our working hypothesis is that each time the snow disappears behind a blind, or reappears below it, it triggers transient increases in the activity of your visual system’s motion-sensitive neurons. Such transient surges in neural activity are perhaps misinterpreted by your brain as faster motion speed. © 2016 Scientific American,
Link ID: 23022 - Posted: 12.27.2016
Sarah Boseley Health editor The NHS is to pay for 10 people to be implanted with a “bionic eye”, a pioneering technology that can restore some sight to those who have been blind for years. Only a handful of people have undergone surgery in trials so far to equip them to use Argus II, which employs a camera mounted in a pair of glasses and a tiny computer to relay signals directly to the nerves controlling sight. The decision to fund the first 10 NHS patients to be given the bionic eye could pave the way for the life-changing technology to enter the mainstream. Those who will get the equipment can currently see nothing more than the difference between daylight and darkness. The system allows the brain to decode flashes of light, so that they can learn to see movement. One of three patients to have had the implant into the retina in trials at Manchester Royal Eye hospital is Keith Hayman, 68, from Lancashire, who has five grandchildren. He was diagnosed with retinitis pigmentosa in his 20s. The disease causes cells in the retina gradually to stop working and eventually die. Hayman, who was originally a butcher, was registered blind in 1981, and forced to give up all work. “Having spent half my life in darkness, I can now tell when my grandchildren run towards me and make out lights twinkling on Christmas trees,” he said. “I would be talking to a friend, who might have walked off and I couldn’t tell and kept talking to myself. This doesn’t happen anymore, because I can tell when they have gone.” They may seem like little things, he said, but “they make all the difference to me”. © 2016 Guardian News and Media Limited
Betsy Mason With virtual reality finally hitting the consumer market this year, VR headsets are bound to make their way onto a lot of holiday shopping lists. But new research suggests these gifts could also give some of their recipients motion sickness — especially if they’re women. In a test of people playing one virtual reality game using an Oculus Rift headset, more than half felt sick within 15 minutes, a team of scientists at the University of Minnesota in Minneapolis reports online December 3 in Experimental Brain Research. Among women, nearly four out of five felt sick. So-called VR sickness, also known as simulator sickness or cybersickness, has been recognized since the 1980s, when the U.S. military noticed that flight simulators were nauseating its pilots. In recent years, anecdotal reports began trickling in about the new generation of head-mounted virtual reality displays making people sick. Now, with VR making its way into people’s homes, there’s a steady stream of claims of VR sickness. “It's a high rate of people that you put in [VR headsets] that are going to experience some level of symptoms,” says Eric Muth, an experimental psychologist at Clemson University in South Carolina with expertise in motion sickness. “It’s going to mute the ‘Wheee!’ factor.” Oculus, which Facebook bought for $2 billion in 2014, released its Rift headset in March. The company declined to comment on the new research but says it has made progress in making the virtual reality experience comfortable for most people, and that developers are getting better at creating VR content. All approved games and apps get a comfort rating based on things like the type of movements involved, and Oculus recommends starting slow and taking breaks. But still some users report getting sick. © Society for Science & the Public 2000 - 2016.
Link ID: 22962 - Posted: 12.07.2016
By JOANNA KLEIN Set your meetings, phone calls and emails aside, at least for the next several minutes. That’s because today you’re a bee. It's time to leave your hive, or your underground burrow, and forage for pollen. Pollen is the stuff that flowers use to reproduce. But it’s also essential grub for you, other bees in your hive and your larvae. Once you’ve gathered pollen to take home, you or another bee will mix it with water and flower nectar that other bees have gathered and stored in the hive. But how do you decide which flowers to approach? What draws you in? In a review published last week in the journal Functional Ecology, researchers asked: What is a flower like from a bee’s perspective, and what does the pollinator experience as it gathers pollen? And that's why we're talking to you in the second person: to help you understand how bees like you, while hunting for pollen, use all of your senses — taste, touch, smell and more — to decide what to pick up and bring home. Maybe you're ready to go find some pollen. But do you even know where to look? © 2016 The New York Times Company
Hannah Devlin Science Correspondent Blind animals have had their vision partially restored using a revolutionary DNA editing technique that scientists say could in future be applied to a range of devastating genetic diseases. The study is the first to demonstrate that a gene editing tool, called Crispr, can be used to replace faulty genes with working versions in the cells of adults - in this case adult rats. Previously, the powerful procedure, in which strands of DNA are snipped out and replaced, had been used only in dividing cells - such as those in an embryo - and scientists had struggled to apply it to non-dividing cells that make up most adult tissue, including the brain, heart, kidneys and liver. The latest advance paves the way for Crispr to be used to treat a range of incurable illnesses, such as muscular dystrophy, haemophilia and cystic fibrosis, by overwriting aberrant genes with a healthy working version. Professor Juan Carlos Izpisua Belmonte, who led the work at the Salk Institute in California, said: “For the first time, we can enter into cells that do not divide and modify the DNA at will. The possible applications of this discovery are vast.” The technique could be trialled in humans in as little as one or two years, he predicted, adding that the team were already working on developing therapies for muscular dystrophy. Crispr, a tool sometimes referred to as “molecular scissors”, has already been hailed as a game-changer in genetics because it allows scientists to cut precise sections of DNA and replace them with synthetic, healthy replacements. © 2016 Guardian News and Media Limited
Link ID: 22882 - Posted: 11.17.2016
By ARNAUD COLINART, AMAURY LA BURTHE, PETER MIDDLETON and JAMES SPINNEY “What is the world of sound?” So begins a diary entry from April 1984, recorded on audiocassette, about the nature of acoustic experience. The voice on the tape is that of the writer and theologian John Hull, who at the time of the recording had been totally blind for almost two years. After losing his sight in his mid-40s, Dr. Hull, a newlywed with a young family, had decided that blindness would destroy him if he didn’t learn to understand it. For three years he recorded his experiences of sight loss, documenting “a world beyond sight.” We first met Dr. Hull in 2011, having read his acclaimed 1991 book “Touching The Rock: An Experience of Blindness,” which was transcribed from his audio diaries. We began collaborating with him on a series of films using his original recordings. These included an Emmy-winning Op-Doc in 2014 and culminated in the feature-length documentary “Notes on Blindness.” But we were also interested in how interactive forms of storytelling might further explore Dr. Hull’s vast and detailed account — in particular how new mediums like virtual reality could illuminate his investigations into auditory experience. The diaries describe his evolving appreciation of “the breadth and depth of three-dimensional world that is revealed by sound,” the awakening of an acoustic perception of space. The sound of falling rain, he said, “brings out the contours of what is around you”; wind brings leaves and trees to life; thunder “puts a roof over your head.” This interactive experience is narrated by Dr. Hull, using extracts from his diary recordings to consider the nature of acoustic space. Binaural techniques map the myriad details of everyday life (in this case, the noises that surround Dr. Hull in a park) within a 3-D sound environment, a “panorama of music and information,” rich in color and texture. The real-time animation visualizes this multilayered soundscape in which, Dr. 
Hull says, “every sound is a point of activity.” © 2016 The New York Times Company
By Simon Oxenham Isy Suttie has felt “head squeezing” since she was young. The comedian, best known for playing Dobbie in the BBC sitcom Peep Show, is one of many people who experience autonomous sensory meridian response (ASMR) – a tingly feeling often elicited by certain videos or particular mundane interactions. Growing up, Suttie says she had always assumed everyone felt it too. Not everyone feels it, but Suttie is by no means alone. On Reddit, a community of more than 100,000 members share videos designed to elicit the pleasurable sensation. The videos, often described as “whisper porn”, typically consist of people role-playing routine tasks, whispering softly into a microphone or making noises by crinkling objects such as crisp packets. The most popular ASMR YouTuber, “Gentle Whispering”, has over 250 million views. To most of us, the videos might seem strange or boring, but the clips frequently garner hundreds of thousands of views. These videos often mimic real-life situations that provoke ASMR in susceptible people. Suttie says her strongest real-world triggers occur during innocuous interactions with strangers, like talking about the weather – “it’s almost as if the more superficial the subject the better,” Suttie says. She feels the sensation particularly strongly when someone brushes past her. For Suttie, the feelings are so powerful that she often feels floored by them, and they even overcome pain and emotional distress. During a trip to the dentist, she still experiences the pleasurable tingles when the assistant brushes past her, she says. © Copyright Reed Business Information Ltd.
By Jessica Boddy Glasses may be trendy now, but for centuries they were the stodgy accessories of the elderly, worn only for failing eyes. Now, new research suggests that aging bonobos might also benefit from a pair of specs—not for reading, but for grooming. Many older bonobos groom their partners at arm’s length instead of just centimeters away, in the same way that older humans often hold newspapers farther out to read. This made researchers think the apes might also be losing their close-up vision as they age. To see whether their hypothesis held, the researchers took photos of 14 different bonobos of varying ages as they groomed one another (above) and measured the distance between their hands and faces. By analyzing how this so-called grooming distance varied from ape to ape, the researchers found that grooming distance increased exponentially with age, they report today in Current Biology. And because both humans and bonobos show signs of farsightedness around age 40, deterioration in human eyes might not be the mere result of staring at screens and small text, the scientists say. Rather, it might be a deep-rooted natural trait reaching back to a common ancestor. © 2016 American Association for the Advancement of Science.
Link ID: 22841 - Posted: 11.08.2016
By Diana Kwon Can you feel your heart beating? Most people cannot, unless they are agitated or afraid. The brain masks the sensation of the heart in a delicate balancing act—we need to be able to feel our pulse racing occasionally as an important signal of fear or excitement, but most of the time the constant rhythm would be distracting or maddening. A growing body of research suggests that because of the way the brain compensates for our heartbeat, it may be vulnerable to perceptual illusions—if they are timed just right. In a study published in May in the Journal of Neuroscience, a team at the Swiss Federal Institute of Technology in Lausanne conducted a series of studies on 143 participants and found that subjects took longer to identify a flashing object when it appeared in sync with the rhythm of their heartbeats. Using functional MRI, they also found that activity in the insula, a brain area associated with self-awareness, was suppressed when people viewed these synchronized images. The authors suggest that the flashing object was suppressed by the brain because it got lumped in with all the other bodily changes that occur with each heartbeat—the eyes make tiny movements, eye pressure changes slightly, the chest expands and contracts. “The brain knows that the heartbeat is coming from the self, so it doesn't want to be bothered by the sensory consequences of these signals,” says Roy Salomon, one of the study's co-authors. © 2016 Scientific American
By peering into the eyes of mice and tracking their ocular movements, researchers made an unexpected discovery: the visual cortex — a region of the brain known to process sensory information — plays a key role in promoting the plasticity of innate, spontaneous eye movements. The study, published in Nature, was led by researchers at the University of California, San Diego (UCSD) and the University of California, San Francisco (UCSF) and funded by the National Eye Institute (NEI), part of the National Institutes of Health. “This study elegantly shows how analysis of eye movement sheds more light on brain plasticity — an ability that is at the core of the brain’s capacity to adapt and function. More specifically, it shows how the visual cortex continues to surprise and to awe,” said Houmam Araj, Ph.D., a program director at NEI. Without our being aware of it, our eyes are in constant motion. As we rotate our heads and as the world around us moves, two ocular reflexes kick in to offset this movement and stabilize images projected onto our retinas, the light-sensitive tissue at the back of our eyes. The optokinetic reflex causes eyes to drift horizontally from side-to-side — for example, as we watch the scenery through a window of a moving train. The vestibulo-ocular reflex adjusts our eye position to offset head movements. Both reflexes are crucial to survival. These mechanisms allow us to see traffic while driving down a bumpy road, or a hawk in flight to see a mouse scurrying for cover.
Link ID: 22750 - Posted: 10.13.2016
By Dwayne Godwin, Jorge Cham The brain processes a wealth of visual information in parallel so that we perceive the world around us in the blink of an eye Dwayne Godwin is a neuroscientist at the Wake Forest University School of Medicine. Jorge Cham draws the comic strip Piled Higher and Deeper at www.phdcomics.com. © 2016 Scientific American
Link ID: 22689 - Posted: 09.24.2016
Alva Noë Eaters and cooks know that flavor, in the jargon of neuroscientists, is multi-modal. Taste is all important, to be sure. But so is the look of food and its feel in the mouth — not to mention its odor and the noisy crunch, or juicy squelch, that it may or may not make as we bite into it. The perception of flavor demands that we exercise a suite of not only gustatory, but also visual, olfactory, tactile and auditory sensitivities. Neuroscientists are now beginning to grasp some of the ways the brain enables our impressive perceptual power when it comes to food. Traditionally, scientists represent the brain's sensory function in a map where distinct cortical areas are thought of as serving the different senses. But it is increasingly appreciated that brain activity can't quite be segregated in this way. Cells in visual cortex may be activated by tactile stimuli. This is the case, for example, when Braille readers use their fingers to read. These blind readers aren't seeing with their fingers, rather, they are deploying their visual brains to perceive with their hands. And, in a famous series of studies that had a great influence on my thinking on these matters, Mriganka Sur at MIT showed that animals whose retinas were re-wired surgically to feed directly into auditory cortex do not hear lights and other visible objects presented to the eyes, rather, they see with their auditory brains. The brain is plastic, and different sensory modalities compete continuously for control over populations of cells. An exciting new paper on the gustatory cortex from the laboratory of Alfredo Fontanini at Stony Brook University shows that there are visual-, auditory-, olfactory- and touch-sensitive cells in the gustatory cortex of rats. There are even some cells that respond to stimuli in more than one modality. But what is more remarkable is that when rats learn to associate non-taste qualities — tones, flashes of lights, etc. 
— with food (sucrose in their study), there is a marked transformation in the gustatory cortex. © 2016 npr
By Colin Barras Subtract 8 from 52. Did you see the calculation in your head? While a leading theory suggests our visual experiences are linked to our understanding of numbers, a study of people who have been blind from birth suggests the opposite. The link between vision and number processing is strong. Sighted people can estimate the number of people in a crowd just by looking, for instance, while children who can mentally rotate an object and correctly imagine how it might look from a different angle often develop better mathematical skills. “It’s actually hard to think of a situation when you might process numbers through any modality other than vision,” says Shipra Kanjlia at Johns Hopkins University in Baltimore, Maryland. But blind people can do maths too. To understand how they might compensate for their lack of visual experience, Kanjlia and her colleagues asked 36 volunteers – 17 of whom had been blind at birth – to do simple mental arithmetic inside an fMRI scanner. To level the playing field, the sighted participants wore blindfolds. We know that a region of the brain called the intraparietal sulcus (IPS) is active when sighted people process numbers, and brain scans revealed that the same area is similarly active in blind people too. “It’s really surprising,” says Kanjlia. “It turns out brain activity is remarkably similar, at least in terms of classic number processing.” This may mean we have a deep understanding of how to handle numbers that is entirely independent of visual experience. This suggests we are all born with a natural understanding of numbers – an idea many researchers find difficult to accept. © Copyright Reed Business Information Ltd.