Chapter 10. Vision: From Eye to Brain
By JANE E. BRODY “Feeling My Way Into Blindness,” an essay published in The New York Times in November by Edward Hoagland, an 84-year-old nature and travel writer and novelist, expressed common fears about the effects of vision loss on quality of life. Mr. Hoagland, who became blind about four years ago, projected deep-seated sadness in describing the everyday challenges he faces: pouring coffee, not missing the toilet, locating a phone number, finding the food on his plate, and knowing to whom he is speaking, not to mention shopping and traveling, when he often must depend on the kindness of strangers. And, of course, he sorely misses nature’s inspiring vistas and inhabitants that fueled his writing, though he can still hear birds chatter in the trees, leaves rustle in the wind and waves crash on the shore. Mr. Hoagland is hardly alone in his distress. According to Action for Blind People, a British support organization, those who have lost some or all sight “struggle with a range of emotions — from shock, anger, sadness and frustration to depression and grief.” When eyesight fails, some people become socially disengaged, leading to isolation and loneliness. Anxiety about a host of issues — falls, medication errors, loss of employment, social blunders — is common. A recent study from researchers at the Wilmer Eye Institute at Johns Hopkins University School of Medicine found that most Americans regard loss of eyesight as the worst ailment that could happen to them, surpassing such conditions as loss of limb, memory, hearing or speech, or having H.I.V./AIDS. Indeed, low vision ranks behind arthritis and heart disease as the third most common chronic cause of impaired functioning in people over 70, Dr. Eric A. Rosenberg of Weill Cornell Medical College and Laura C. Sperazza, a New York optometrist, wrote in American Family Physician. © 2017 The New York Times Company
By Michael Price BOSTON--Among mammals, primates are unique in that certain species have three different types of light-sensitive cone cells in their eyes rather than two. This allows humans and their close relatives to see what we think of as the standard spectrum of color. (Humans with red-green color blindness, of course, see a different spectrum.) The standard explanation for why primates developed trichromacy, as this kind of vision is called, is that it allowed our early ancestors to see colorful ripe fruit more easily against a background of mostly green forest. A particular Old World monkey, the rhesus macaque, has a genetic distinction that offers a convenient natural test of this hypothesis: a common genetic variation makes some females have three types of cone cells and others have two. Studies with captive macaques have shown that trichromatic females are faster than their dichromatic peers at finding fruit, but attempts to see whether that’s true for wild monkeys have been complicated by the fact that macaques are hard to find, and age and rank also play big roles in determining who eats when. A vision researcher reported today at the annual meeting of AAAS, which publishes Science, that after making more than 20,000 individual observations of 80 different macaques feeding from 30 species of trees on Cayo Santiago, Puerto Rico, she can say with confidence that wild trichromatic female monkeys do indeed appear to locate and eat fruit more quickly than dichromatic ones, lending strong support to the idea that this advantage helped drive the evolution of trichromacy in humans and our relatives. © 2017 American Association for the Advancement of Science.
By LISA SANDERS, M.D. The 3-year-old girl was having a very bad day — a bad week, really. She’d been angry and irritable, screaming and kicking at her mother over nothing. Her mother was embarrassed by this unusual behavior, because her husband’s sister, Amber Bard, was visiting. Bard, a third-year medical student at Michigan State, was staying in the guest room while working with a local medical practice in Grand Rapids so that she could spend a little time with her niece. The behavior was strange, but the mother was more concerned about her child’s left eye. A few days earlier it was red and bloodshot. It no longer was, but now the girl had little bumps near the eye. The mother asked Bard whether she could look at the eye. “I’m a third-year medical student,” Bard told her. “I know approximately nothing.” But Bard was happy to try. She turned to the girl, who immediately averted her face. “Can you show me your eye?” she asked. The girl shouted: “No! No, no, no!” Eventually Bard was able to coax her into allowing her a quick look at the eye. She saw a couple of tiny pimples along the lower lid, near the lashes, and a couple more just next to the eye. The eye itself wasn’t red; the lid wasn’t swollen. She couldn’t see any discharge. Once the child was in bed, Bard opened her laptop and turned to a database she’d been using for the past week when she started to see patients. Called VisualDx, it’s one of a dozen or so programs known as decision-support software, designed to help doctors make a diagnosis. This one focuses mostly on skin findings.
Link ID: 23242 - Posted: 02.17.2017
By Amitha Kalaichandran When pain researcher Diane Gromala recounts how she started in the field of virtual reality, she seems reflective. She had been researching virtual reality for pain since the early 1990s, but her shift to focusing on how virtual reality could be used for chronic pain management began in 1999, when her own chronic pain became worse. Prior to that, her focus was on VR as entertainment. Gromala, 56, was diagnosed with chronic pain in 1984, but the left-sided pain that extended from her lower stomach to her left leg worsened over the next 15 years. "Taking care of my chronic pain became a full-time job. So at some point I had to make a choice — either stop working or charge full force ahead by making it a motivation for my research. You can guess what I chose," she said. Now she's finding that immersive VR technology may offer another option for chronic pain, which affects at least one in five Canadians, according to a 2011 University of Alberta study. "We know that there is some evidence supporting immersive VR for acute pain, so it's reasonable to look into how it could help patients that suffer from chronic pain." Gromala has a PhD in human computer interaction and holds the Canada Research Chair in Computational Technologies for Transforming Pain. She also directs the pain studies lab and the Chronic Pain Research Institute at Simon Fraser University in Burnaby, B.C. ©2017 CBC/Radio-Canada.
By Sam Wong Here’s looking at you, squid. Cock-eyed squid have one huge, bulging eye and another normal-sized eye, but the reason has remained a mystery. Now we have an answer. Kate Thomas of Duke University in North Carolina studied 161 videos of the creatures collected over 26 years by remotely operated submarines in Monterey Bay, California. The findings provide the first behavioural evidence that the two eyes are adapted to look in different directions. The large one points upwards to spot prey silhouetted against the sky. The smaller one points downwards to spot bioluminescent organisms against the darkness below. The squid, from the histioteuthid family, live at depths of 200 to 1000 metres, where little light penetrates. The videos show that the squid normally swims with its tail end pointing upwards, but tilted so the large eye is consistently oriented towards the sky. Based on measurements of the eyes and the light levels they would be exposed to, Thomas and her colleagues calculated that having a big upward-pointing eye greatly improves visual perception, while a downward-pointing eye would gain little from being large. “That gives you the context for how this trait might have evolved,” says Thomas. Some of the squid’s prey, such as lanternfish and shrimp, have luminescent undersides so they are camouflaged against the sunlight when seen from below. Yellow pigmentation in the lens of the squid’s large eye may help it distinguish between sunlight and bioluminescence. © Copyright Reed Business Information Ltd.
Noah Charney The Chinese government just arrested a group of people associated with a sham tourist attraction that had lured hundreds of sightseers to a fake Terracotta Warriors exhibit, composed entirely of modern replicas. Sotheby’s recently hired Jamie Martin of Orion Analytical, a forensic specialist at testing art, who then discovered that a Parmigianino painting recently sold is actually a modern forgery (Sotheby’s returned the buyer’s money and then sued the person for whom they sold it). And the Ringling Museum in Sarasota, Florida, is hoping that a painting of Philip IV of Spain in their collection will be definitively determined to be by Velazquez, and not a copy in the style of Velazquez. And that’s just in the last week or so. Art forgery and authenticity seem to be in the news just about every week (to my publicist’s delight). But I’m on a bit of a brainstorm. After my interview with Nobel Prize winner Dr. Eric Kandel on the neuroscience behind how we humans understand art, I’ve developed a keen interest in art and the mind. I tackled selfies, self-portraits and facial recognition recently, as well as what happens when the brain fails to function properly and neglects to recognize the value of art. Since my last book was a history of forgery, it was perhaps inevitable that I would wonder about the neurology of the recognition of originals versus copies. But while I looked into forgery from a wide variety of angles for the book, neuroscience was not one of them. © 2017 Salon Media Group, Inc.
by Laura Sanders Most nights I read a book in bed to wind down. But when I run out of my library supply, I read articles on my phone instead. I suspect that this digital substitution messes with my sleep. That’s not good for me — but it’s probably worse for the many children who have screens in their rooms at night. A team of researchers recently combed through the literature looking for associations between mobile devices in the bedroom and poor sleep. Biostatistician Ben Carter of King’s College London and colleagues found that kids between ages 6 and 19 who used screen-based media around bedtime slept worse and were more tired in the day. That’s not surprising: Phones, tablets and laptops make noise and emit blue light that can interfere with production of the sleep-inducing hormone melatonin. But things got interesting when the researchers compared kids who didn’t have screens in their bedrooms with kids who did have phones or tablets in their rooms but didn’t use them. You might think there wouldn’t be a sleep difference between those groups. None of these kids were up all night texting, gaming or swiping, so neither sounds nor blue light were messing with any of the kids’ sleep. Yet Carter and colleagues found a difference: Kids who had screen-based media in their bedroom, but didn’t use it, didn’t sleep as much as kids without the technology. What’s more, the sleep they did get was worse and they were more tired during the day, the researchers reported in the December JAMA Pediatrics. © Society for Science & the Public 2000 - 2017
By GRETCHEN REYNOLDS Being nearsighted is far more common than it once was. The prevalence of myopia, the condition’s medical name, in Americans has soared by 66 percent since the early 1970s, according to a 2009 study by the National Eye Institute; in China and other East Asian countries, as many as 90 percent of recent high school graduates are thought to be nearsighted. Myopia results when eyeballs are longer than normal, changing the angle at which light enters the eye and therefore the ability to focus on distant objects. The disorder involves a complex interplay of genetics and environment and usually begins before adolescence, when the eye is growing, but it can worsen in early adulthood. Some experts connect the elevated rates of myopia to the many hours young people stare at computers and other screens. But a recent study published in JAMA Ophthalmology suggests that a greater factor may be a side effect of all that screen-watching — it’s keeping children inside. This new study joins a growing body of research indicating that a lack of direct sunlight may reshape the human eye and impair vision. Researchers at King’s College London, the London School of Hygiene and Tropical Medicine and other institutions gave vision exams to more than 3,100 older European men and women and interviewed them at length about their education, careers and how often they remembered being outside during various stages of their lives. This biographical information was then cross-referenced with historical data about sunlight, originally compiled for research on skin cancer and other conditions. © 2017 The New York Times Company
By Rachael Lallensack A video game is helping researchers learn more about how tiny European starlings keep predators at bay. Their massive flocks, consisting of hundreds to thousands of birds, fly together in a mesmerizing, pulsating pattern called a murmuration. For a long time, researchers have suspected that the bigger the flock, the harder it is for predators like falcons and hawks to take down any one member, something known as “confusion effect.” Now, researchers have analyzed that effect—in human hunters. Using the first 3D computer program to simulate a murmuration, scientists tested how well 25 players, acting as flying predators, could target and pursue virtual starlings, whose movements were simulated based on data from real starling flocks (see video above). The team’s findings reaffirmed the confusion effect: The larger the simulated flocks, the harder it was for the “predators” to single out and catch individual prey, the researchers report this week in Royal Society Open Science. So maybe sometimes, it’s not so bad to get lost in a crowd. © 2017 American Association for the Advancement of Science.
By Anna Azvolinsky Hummingbirds are efficient hoverers, suspending their bodies midair using rapid forward and backward strokes. Aside from their unique ability to hover, the tiny avians are also the only known birds that can fly in any direction, including sideways. Hummingbird brains appear to be adapted for this flying ability, researchers have now shown. According to a study published today (January 5) in Current Biology, a highly conserved area of the brain—the lentiformis mesencephali (LM), which receives panoramic visual motion information directly from the retina—processes the movement of objects from all directions. In contrast, the LMs of other bird species and all other four-limbed vertebrates studied to date predominantly sense back-to-front motion. While the authors had predicted the neurons of this hummingbird brain region would be tuned to slow motion, they in fact found the opposite: LM neurons were sensitive to quick visual motion, most likely because hummingbirds must process and respond to their environments quickly to avoid collisions, both during hovering and in other modes of flight. “This ancient part of the brain the authors studied has one job: to detect the motion of the image in front of the eyes,” explained Michael Ibbotson, a neuroscientist at the University of Melbourne who penned an accompanying editorial but was not involved in the research. The results of this study suggest that “hummingbirds evolved this area of the brain to have fine motor control to be able to hover and push in every direction possible,” Ibbotson said. © 1986-2017 The Scientist
Link ID: 23064 - Posted: 01.07.2017
By Susana Martinez-Conde Our perceptual and cognitive systems like to keep things simple. We describe the line drawings below as a circle and a square, even though their imagined contours consist—in reality—of discontinuous line segments. The Gestalt psychologists of the 19th and early 20th century branded this perceptual legerdemain as the Principle of Closure, by which we tend to recognize shapes and concepts as complete, even in the face of fragmentary information. Now at the end of the year, it is tempting to seek a cognitive kind of closure: we want to close the lid on 2016, wrap it with a bow and start a fresh new year from a blank slate. Of course, it’s just an illusion, the Principle of Closure in one of its many incarnations. The end of the year is just as arbitrary as the end of the month, or the end of the week, or any other date we choose to highlight in the earth’s recurrent journey around the sun. But it feels quite different. That’s why we have lists of New Year’s resolutions, or why we start new diets or exercise regimes on Mondays rather than Thursdays. Researchers have also found that, even though we measure time in a continuous scale, we assign special meaning to idiosyncratic milestones such as entering a new decade. What should we do about our brain’s oversimplification tendencies concerning the New Year—if anything? One strategy would be to fight our feelings of closure and rebirth as we (in truth) seamlessly move from the last day of 2016 to the first day of 2017. But that approach is likely to fail. Try as we might, the Principle of Closure is just too ingrained in our perceptual and cognitive systems. In fact, if you already have the feeling that the beginning of the year is somewhat special (hey, it only happens once a year!), you might as well decide that resistance is futile, and not just embrace the illusion, but do your best to channel it. © 2017 Scientific American
By Stephen L. Macknik Masashi Atarashi, a physics high school teacher from Japan, submitted this wonderful winter illusion to the 2015 Best Illusion of the Year Contest, where it competed as a finalist. Atarashi discovered this effect serendipitously, while watching the snow fall through the venetian window blinds of his school’s faculty lounge—just like his students must sometimes do in the classroom during a lecture! Notice that as the blinds occupy more area on the screen, the speed of the snowfall seems to accelerate. A great illusion to ponder during our white holiday season. Nobody knows how Atarashi’s effect works, but our working hypothesis is that each time the snow disappears behind a blind, or reappears below it, it triggers transient increases in the activity of your visual system’s motion-sensitive neurons. Such transient surges in neural activity are perhaps misinterpreted by your brain as faster motion speed. © 2016 Scientific American,
Link ID: 23022 - Posted: 12.27.2016
Sarah Boseley Health editor The NHS is to pay for 10 people to be implanted with a “bionic eye”, a pioneering technology that can restore some sight to those who have been blind for years. Only a handful of people have undergone surgery in trials so far to equip them to use Argus II, which employs a camera mounted in a pair of glasses and a tiny computer to relay signals directly to the nerves controlling sight. The decision to fund the first 10 NHS patients to be given the bionic eye could pave the way for the life-changing technology to enter the mainstream. Those who will get the equipment can currently see nothing more than the difference between daylight and darkness. The system allows the brain to decode flashes of light, so that they can learn to see movement. One of three patients to have had the implant into the retina in trials at Manchester Royal Eye hospital is Keith Hayman, 68, from Lancashire, who has five grandchildren. He was diagnosed with retinitis pigmentosa in his 20s. The disease causes cells in the retina gradually to stop working and eventually die. Hayman, who was originally a butcher, was registered blind in 1981, and forced to give up all work. “Having spent half my life in darkness, I can now tell when my grandchildren run towards me and make out lights twinkling on Christmas trees,” he said. “I would be talking to a friend, who might have walked off and I couldn’t tell and kept talking to myself. This doesn’t happen anymore, because I can tell when they have gone.” They may seem like little things, he said, but “they make all the difference to me”. © 2016 Guardian News and Media Limited
Betsy Mason With virtual reality finally hitting the consumer market this year, VR headsets are bound to make their way onto a lot of holiday shopping lists. But new research suggests these gifts could also give some of their recipients motion sickness — especially if they’re women. In a test of people playing one virtual reality game using an Oculus Rift headset, more than half felt sick within 15 minutes, a team of scientists at the University of Minnesota in Minneapolis reports online December 3 in Experimental Brain Research. Among women, nearly four out of five felt sick. So-called VR sickness, also known as simulator sickness or cybersickness, has been recognized since the 1980s, when the U.S. military noticed that flight simulators were nauseating its pilots. In recent years, anecdotal reports began trickling in about the new generation of head-mounted virtual reality displays making people sick. Now, with VR making its way into people’s homes, there’s a steady stream of claims of VR sickness. “It's a high rate of people that you put in [VR headsets] that are going to experience some level of symptoms,” says Eric Muth, an experimental psychologist at Clemson University in South Carolina with expertise in motion sickness. “It’s going to mute the ‘Wheee!’ factor.” Oculus, which Facebook bought for $2 billion in 2014, released its Rift headset in March. The company declined to comment on the new research but says it has made progress in making the virtual reality experience comfortable for most people, and that developers are getting better at creating VR content. All approved games and apps get a comfort rating based on things like the type of movements involved, and Oculus recommends starting slow and taking breaks. But still some users report getting sick. © Society for Science & the Public 2000 - 2016.
Link ID: 22962 - Posted: 12.07.2016
By JOANNA KLEIN Set your meetings, phone calls and emails aside, at least for the next several minutes. That’s because today you’re a bee. It's time to leave your hive, or your underground burrow, and forage for pollen. Pollen is the stuff that flowers use to reproduce. But it’s also essential grub for you, other bees in your hive and your larvae. Once you’ve gathered pollen to take home, you or another bee will mix it with water and flower nectar that other bees have gathered and stored in the hive. But how do you decide which flowers to approach? What draws you in? In a review published last week in the journal Functional Ecology, researchers asked: What is a flower like from a bee’s perspective, and what does the pollinator experience as it gathers pollen? And that's why we're talking to you in the second person: to help you understand how bees like you, while hunting for pollen, use all of your senses — taste, touch, smell and more — to decide what to pick up and bring home. Maybe you're ready to go find some pollen. But do you even know where to look? © 2016 The New York Times Company
Hannah Devlin Science Correspondent Blind animals have had their vision partially restored using a revolutionary DNA editing technique that scientists say could in future be applied to a range of devastating genetic diseases. The study is the first to demonstrate that a gene editing tool, called Crispr, can be used to replace faulty genes with working versions in the cells of adults - in this case adult rats. Previously, the powerful procedure, in which strands of DNA are snipped out and replaced, had been used only in dividing cells - such as those in an embryo - and scientists had struggled to apply it to non-dividing cells that make up most adult tissue, including the brain, heart, kidneys and liver. The latest advance paves the way for Crispr to be used to treat a range of incurable illnesses, such as muscular dystrophy, haemophilia and cystic fibrosis, by overwriting aberrant genes with a healthy working version. Professor Juan Carlos Izpisua Belmonte, who led the work at the Salk Institute in California, said: “For the first time, we can enter into cells that do not divide and modify the DNA at will. The possible applications of this discovery are vast.” The technique could be trialled in humans in as little as one or two years, he predicted, adding that the team were already working on developing therapies for muscular dystrophy. Crispr, a tool sometimes referred to as “molecular scissors”, has already been hailed as a game-changer in genetics because it allows scientists to cut precise sections of DNA and replace them with synthetic, healthy replacements. © 2016 Guardian News and Media Limited
Link ID: 22882 - Posted: 11.17.2016
By ARNAUD COLINART, AMAURY LA BURTHE, PETER MIDDLETON and JAMES SPINNEY “What is the world of sound?” So begins a diary entry from April 1984, recorded on audiocassette, about the nature of acoustic experience. The voice on the tape is that of the writer and theologian John Hull, who at the time of the recording had been totally blind for almost two years. After losing his sight in his mid-40s, Dr. Hull, a newlywed with a young family, had decided that blindness would destroy him if he didn’t learn to understand it. For three years he recorded his experiences of sight loss, documenting “a world beyond sight.” We first met Dr. Hull in 2011, having read his acclaimed 1991 book “Touching The Rock: An Experience of Blindness,” which was transcribed from his audio diaries. We began collaborating with him on a series of films using his original recordings. These included an Emmy-winning Op-Doc in 2014 and culminated in the feature-length documentary “Notes on Blindness.” But we were also interested in how interactive forms of storytelling might further explore Dr. Hull’s vast and detailed account — in particular how new mediums like virtual reality could illuminate his investigations into auditory experience. The diaries describe his evolving appreciation of “the breadth and depth of three-dimensional world that is revealed by sound,” the awakening of an acoustic perception of space. The sound of falling rain, he said, “brings out the contours of what is around you”; wind brings leaves and trees to life; thunder “puts a roof over your head.” This interactive experience is narrated by Dr. Hull, using extracts from his diary recordings to consider the nature of acoustic space. Binaural techniques map the myriad details of everyday life (in this case, the noises that surround Dr. Hull in a park) within a 3-D sound environment, a “panorama of music and information,” rich in color and texture. The real-time animation visualizes this multilayered soundscape in which, Dr. Hull says, “every sound is a point of activity.” © 2016 The New York Times Company
By Simon Oxenham Isy Suttie has felt “head squeezing” since she was young. The comedian, best known for playing Dobbie in the BBC sitcom Peep Show, is one of many people who experience autonomous sensory meridian response (ASMR) – a tingly feeling often elicited by certain videos or particular mundane interactions. Growing up, Suttie says she had always assumed everyone felt it too. Not everyone feels it, but Suttie is by no means alone. On Reddit, a community of more than 100,000 members share videos designed to elicit the pleasurable sensation. The videos, often described as “whisper porn”, typically consist of people role-playing routine tasks, whispering softly into a microphone or making noises by crinkling objects such as crisp packets. The most popular ASMR YouTuber, “Gentle Whispering”, has over 250 million views. To most of us, the videos might seem strange or boring, but the clips frequently garner hundreds of thousands of views. These videos often mimic real-life situations that provoke ASMR in susceptible people. Suttie says her strongest real-world triggers occur during innocuous interactions with strangers, like talking about the weather – “it’s almost as if the more superficial the subject the better,” Suttie says. She feels the sensation particularly strongly when someone brushes past her. For Suttie, the feelings are so powerful that she often feels floored by them, and they even overcome pain and emotional distress. During a trip to the dentist, she still experiences the pleasurable tingles when the assistant brushes past her, she says. © Copyright Reed Business Information Ltd.
By Jessica Boddy Glasses may be trendy now, but for centuries they were the stodgy accessories of the elderly worn only for failing eyes. Now, new research suggests that aging bonobos might also benefit from a pair of specs—not for reading, but for grooming. Many older bonobos groom their partners at arm’s length instead of just centimeters away, in the same way that older humans often hold newspapers farther out to read. This made researchers think the apes might also be losing their close-up vision as they age. To see whether their hypothesis held, the researchers took photos of 14 different bonobos of varying ages as they groomed one another and measured the distance between their hands and faces. By analyzing how this so-called grooming distance varied from ape to ape, the researchers found that grooming distance increased exponentially with age, they report today in Current Biology. And because both humans and bonobos show signs of farsightedness around age 40, deterioration in human eyes might not be the mere result of staring at screens and small text, the scientists say. Rather, it might be a deep-rooted natural trait reaching back to a common ancestor. © 2016 American Association for the Advancement of Science.
Link ID: 22841 - Posted: 11.08.2016
By Diana Kwon Can you feel your heart beating? Most people cannot, unless they are agitated or afraid. The brain masks the sensation of the heart in a delicate balancing act—we need to be able to feel our pulse racing occasionally as an important signal of fear or excitement, but most of the time the constant rhythm would be distracting or maddening. A growing body of research suggests that because of the way the brain compensates for our heartbeat, it may be vulnerable to perceptual illusions—if they are timed just right. In a study published in May in the Journal of Neuroscience, a team at the Swiss Federal Institute of Technology in Lausanne conducted a series of studies on 143 participants and found that subjects took longer to identify a flashing object when it appeared in sync with the rhythm of their heartbeats. Using functional MRI, they also found that activity in the insula, a brain area associated with self-awareness, was suppressed when people viewed these synchronized images. The authors suggest that the flashing object was suppressed by the brain because it got lumped in with all the other bodily changes that occur with each heartbeat—the eyes make tiny movements, eye pressure changes slightly, the chest expands and contracts. “The brain knows that the heartbeat is coming from the self, so it doesn't want to be bothered by the sensory consequences of these signals,” says Roy Salomon, one of the study's co-authors. © 2016 Scientific American