Chapter 6. Hearing, Balance, Taste, and Smell
By ARNAUD COLINART, AMAURY LA BURTHE, PETER MIDDLETON and JAMES SPINNEY “What is the world of sound?” So begins a diary entry from April 1984, recorded on audiocassette, about the nature of acoustic experience. The voice on the tape is that of the writer and theologian John Hull, who at the time of the recording had been totally blind for almost two years. After losing his sight in his mid-40s, Dr. Hull, a newlywed with a young family, had decided that blindness would destroy him if he didn’t learn to understand it. For three years he recorded his experiences of sight loss, documenting “a world beyond sight.” We first met Dr. Hull in 2011, having read his acclaimed 1991 book “Touching The Rock: An Experience of Blindness,” which was transcribed from his audio diaries. We began collaborating with him on a series of films using his original recordings. These included an Emmy-winning Op-Doc in 2014 and culminated in the feature-length documentary “Notes on Blindness.” But we were also interested in how interactive forms of storytelling might further explore Dr. Hull’s vast and detailed account — in particular how new mediums like virtual reality could illuminate his investigations into auditory experience. The diaries describe his evolving appreciation of “the breadth and depth of three-dimensional world that is revealed by sound,” the awakening of an acoustic perception of space. The sound of falling rain, he said, “brings out the contours of what is around you”; wind brings leaves and trees to life; thunder “puts a roof over your head.” This interactive experience is narrated by Dr. Hull, using extracts from his diary recordings to consider the nature of acoustic space. Binaural techniques map the myriad details of everyday life (in this case, the noises that surround Dr. Hull in a park) within a 3-D sound environment, a “panorama of music and information,” rich in color and texture. The real-time animation visualizes this multilayered soundscape in which, Dr. 
Hull says, “every sound is a point of activity.” © 2016 The New York Times Company
By Amir Kheradmand When we spin—on an amusement park ride or the dance floor—we often become disoriented, even dizzy. So how do professional athletes, particularly figure skaters who spin at incredible speeds, avoid losing their balance? The short answer is training, but to really grasp why figure skaters can twirl without getting dizzy requires an understanding of the vestibular system, the apparatus in our inner ear that helps to keep us upright. This system contains special sensory nerve cells that can detect the speed and direction at which our head moves. These sensors are tightly coupled with our eye movements and with our perception of our body's position and motion through space. For instance, if we rotate our head to the right while our eyes remain focused on an object straight ahead, our eyes naturally move to the left at the same speed. This involuntary response allows us to stay focused on a stationary object. Spinning is more complicated. When we move our head during a spin, our eyes start to move in the opposite direction but reach their limit before our head completes a full 360-degree turn. So our eyes flick back to a new starting position midspin, and the motion repeats as we rotate. When our head rotation triggers this automatic, repetitive eye movement, called nystagmus, we get dizzy. © 2016 Scientific American
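The slow-drift-and-flick cycle described above can be sketched in a few lines of code. This is an illustrative toy model, not anything from the article: the 180-degree-per-second spin rate and the 40-degree eye-in-orbit limit are assumed numbers, chosen only to show the mechanism.

```python
# Toy simulation of rotational nystagmus: during head rotation the eyes
# counter-rotate at the same speed (slow phase), then flick back toward
# centre when they hit their mechanical limit (fast phase).
# The 40-degree orbital limit and 180 deg/s spin rate are assumptions.

HEAD_SPEED = 180.0   # head rotation, degrees per second (assumed)
EYE_LIMIT = 40.0     # how far the eye can deviate in the orbit (assumed)
DT = 0.01            # simulation time step, seconds

def simulate_nystagmus(duration=2.0):
    """Count the resetting eye flicks during `duration` seconds of spinning."""
    eye = 0.0                      # eye position relative to the head (degrees)
    fast_phases = 0
    for _ in range(round(duration / DT)):
        eye -= HEAD_SPEED * DT     # slow phase: eyes drift opposite the head
        if eye <= -EYE_LIMIT:      # limit reached: fast phase resets the eye
            eye = 0.0
            fast_phases += 1
    return fast_phases

print(simulate_nystagmus(2.0))  # number of resetting flicks in 2 s of spinning
```

With these assumed numbers the eye hits its limit roughly every quarter second, so a two-second spin produces a handful of flicks; it is exactly this repeated reset, rather than the smooth counter-rotation, that the article identifies with dizziness.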
Link ID: 22878 - Posted: 11.17.2016
Laura Sanders SAN DIEGO — Mice raised in cages bombarded with glowing lights and sounds have profound brain abnormalities and behavioral trouble. Hours of daily stimulation led to behaviors reminiscent of attention-deficit/hyperactivity disorder, scientists reported November 14 at the annual meeting of the Society for Neuroscience. Certain kinds of sensory stimulation, such as sights and sounds, are known to help the brain develop correctly. But scientists from Seattle Children’s Research Institute wondered whether too much stimulation or stimulation of the wrong sort could have negative effects on the growing brain. To mimic extreme screen exposure, mice were blasted with flashing lights and TV audio for six hours a day. The cacophony began when the mice were 10 days old and lasted for six weeks. After the end of the ordeal, scientists examined the mice’s brains. “We found dramatic changes everywhere in the brain,” said study coauthor Jan-Marino Ramirez. Mice that had been stimulated had fewer newborn nerve cells in the hippocampus, a brain structure important for learning and memory, than unstimulated mice, Ramirez said. The stimulation also made certain nerve cells more active in general. Stimulated mice also displayed behaviors similar to some associated with ADHD in children. These mice were noticeably more active and had trouble remembering whether they had encountered an object. The mice also seemed more inclined to take risks, venturing into open areas that mice normally shy away from, for instance. |© Society for Science & the Public 2000 - 2016.
Laura Sanders SAN DIEGO — A nerve-zapping headset caused people to shed fat in a small preliminary study. Six people who had received the stimulation lost on average about 8 percent of the fat on their trunks in four months, scientists reported November 12 at the annual meeting of the Society for Neuroscience. The headset stimulated the vestibular nerve, which runs just behind the ears. That nerve sends signals to the hypothalamus, a brain structure thought to control the body’s fat storage. By stimulating the nerve with an electrical current, the technique shifts the body away from storing fat toward burning it, scientists propose. Six overweight and obese people received the treatment, consisting of up to four one-hour-long sessions of stimulation a week. Because it activates the vestibular system, the stimulation evoked the sensation of gently rocking on a boat or floating in a pool, said study coauthor Jason McKeown of the University of California, San Diego. After four months, body scans measured the trunk body fat for the six people receiving the treatment and three people who received sham stimulation. All six in the treatment group lost some trunk fat, despite not having changed their activity or diet. In contrast, those in the sham group gained some fat. Researchers suspect that metabolic changes are behind the difference. “The results were a lot better than we thought they’d be,” McKeown said. |© Society for Science & the Public 2000 - 2016.
Link ID: 22869 - Posted: 11.15.2016
By Rachel Feltman and Sarah Kaplan Dear Science, I just got a new iPhone and can't decide what kind of headphones I should be using. I read somewhere that ear buds are worse for you than headphones that fit over your ear. Is that true? I don't want to damage my hearing by using the wrong thing. Here's what science has to say: At the end of the day, nothing really matters but volume. No pair of headphones is inherently “good” or “bad” for your hearing. But picking the right headphones can help you listen to your music more responsibly. The louder a sound is, the more quickly it can cause injury to your ears. If you're not careful, a powerful sound wave can actually tear right through your delicate eardrum, but that's unlikely to happen while blasting music. Most hearing loss is the result of nerve damage, and your smartphone is more than capable of wrecking your ears that way. You can be exposed to 85 decibels — the noise of busy city traffic — pretty much all day without causing nerve damage, but things quickly become dangerous once you get louder than that. At 115 decibels, which is about the noise level produced at a rock concert or by a chain saw, nerve damage can happen in less than a minute. You might not immediately notice significant hearing loss as the result of that nerve damage, but it will add up over time. Some smartphones can crank music to 120 decibels. If you listened to an entire album at that volume, you might have noticeable hearing loss by the time you took off your headphones. According to the World Health Organization, 1.1 billion teens and young adults globally are at risk of developing hearing loss because of these “personal audio devices.” You already know the solution, folks: Turn that music down. © 1996-2016 The Washington Post
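The article's figures line up with the standard 3-dB exchange rule used in hearing-conservation guidelines (a NIOSH-style convention; the article itself doesn't name it): permissible listening time halves for every 3 dB above the 85-decibel, 8-hour reference. A quick sketch of that arithmetic:

```python
# Rough safe-exposure estimate using a 3-dB exchange rate:
# 8 hours are treated as safe at 85 dB, and each 3-dB increase
# halves that time. This is a hearing-conservation rule of thumb,
# not a figure taken from the article itself.

REFERENCE_DB = 85.0        # reference sound level (dB)
REFERENCE_HOURS = 8.0      # permissible time at the reference level
EXCHANGE_RATE_DB = 3.0     # each step of this size halves the safe time

def safe_exposure_seconds(level_db):
    """Permissible daily exposure, in seconds, at a given sound level."""
    halvings = (level_db - REFERENCE_DB) / EXCHANGE_RATE_DB
    return REFERENCE_HOURS * 3600.0 / (2.0 ** halvings)

print(round(safe_exposure_seconds(115)))   # ~28 seconds
print(round(safe_exposure_seconds(120)))   # ~9 seconds at max phone volume
```

At 115 decibels the rule allows about 28 seconds, consistent with the article's "nerve damage in less than a minute"; at a phone's 120-decibel maximum, the allowance drops to under ten seconds.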
Link ID: 22867 - Posted: 11.15.2016
by Helen Thompson Narwhals use highly targeted beams of sound to scan their environment for threats and food. In fact, the so-called unicorns of the sea (for their iconic head tusks) may produce the most refined sonar of any living animal. A team of researchers set up 16 underwater microphones to eavesdrop on narwhal click vocalizations at 11 ice pack sites in Greenland’s Baffin Bay in 2013. The recordings show that narwhal clicks are extremely intense and directional — meaning they can widen and narrow the beam of sound to find prey over long and short distances. It’s the most directional sonar signal measured in a living species, the researchers report November 9 in PLOS ONE. The sound beams are also asymmetrically narrow on top. That minimizes clutter from echoes bouncing off the sea surface or ice pack. Finally, narwhals scan vertically as they dive, which could help them find patches of open water where they can surface and breathe amid sea ice cover. All this means that narwhals employ pretty sophisticated sonar. The audio data could help researchers tell the difference between narwhal vocalizations and those of neighboring beluga whales. It also provides a baseline for assessing the potential impact of noise pollution from increases in shipping traffic made possible by sea ice loss. |© Society for Science & the Public 2000 - 2016.
Link ID: 22856 - Posted: 11.12.2016
By Bob Holmes It’s not something to be sniffed at. Computers have cracked a problem that has stumped chemists for centuries: predicting a molecule’s odour from its structure. The feat may allow perfumers and flavour specialists to create new products with much less trial and error. Unlike vision and hearing, the result of which can be predicted by analysing wavelengths of light or sound, our sense of smell has long remained inscrutable. Olfactory chemists have never been able to predict how a given molecule will smell, except in a few special cases, because so many aspects of a molecule’s structure could be important in determining its odour. Andreas Keller and Leslie Vosshall at Rockefeller University in New York City decided to crowdsource the power of machine learning to address the problem. First, they had 49 volunteers rate the odour of 476 chemicals according to how intense and how pleasant the smell was, and how well it matched 19 other descriptors, such as garlic, spice or fruit. Then they released the data for 407 of the chemicals, along with 4884 different variables measuring chemical structure, and invited anyone to develop machine-learning algorithms that would make sense of the patterns. They used the remaining 69 chemicals to evaluate the accuracy of the algorithms of the 22 teams that took up the challenge. © Copyright Reed Business Information Ltd.
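The 407-train/69-holdout protocol described above is a standard supervised-learning setup and can be sketched generically. Everything below is synthetic and illustrative: the feature vectors stand in for the 4884 structural descriptors, the target for a pleasantness rating, and a simple 1-nearest-neighbour predictor stands in for the competing teams' actual models.

```python
# Sketch of the challenge setup: models train on 407 molecules with
# structural descriptors and smell ratings, then are scored on 69
# held-out molecules. Data here are synthetic; 1-nearest-neighbour
# is only a stand-in for the teams' real machine-learning models.
import random

random.seed(0)
N_FEATURES = 20          # stand-in for the 4884 structural descriptors

def random_molecule():
    features = [random.random() for _ in range(N_FEATURES)]
    pleasantness = sum(features[:5]) / 5.0   # synthetic "ground truth" rating
    return features, pleasantness

molecules = [random_molecule() for _ in range(476)]
train, test = molecules[:407], molecules[407:]   # the 407/69 split

def predict(features):
    """1-NN: rate a new molecule like its structurally closest neighbour."""
    def dist(other):
        return sum((a - b) ** 2 for a, b in zip(features, other[0]))
    return min(train, key=dist)[1]

errors = [abs(predict(f) - y) for f, y in test]
print(f"mean absolute error on {len(test)} held-out molecules: "
      f"{sum(errors) / len(errors):.3f}")
```

The point of releasing descriptors for only 407 molecules is visible here: the held-out 69 never influence training, so the error on them estimates how well a model generalises to genuinely new smells.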
Keyword: Chemical Senses (Smell & Taste)
Link ID: 22806 - Posted: 10.29.2016
By Jessica Boddy You’d probably never notice a jumping spider across your living room, but it would surely notice you. The arachnids are known for their brilliant eyesight, and a new study shows they have even greater sensory prowess than we thought: Jumping spiders can hear sounds even though they don’t have ears—or even eardrums. To find this out, researchers implanted tiny electrodes in a region of spiders’ brains that would show whether sound was being processed. Then they placed the spiders on a specially designed box to eliminate any vibrations from below—most spiders sense their surroundings through vibrations—and scared the heck out of them with a speaker-produced buzz of one of their predators, the mud dauber wasp. An out-of-earshot, high-frequency buzz and a silent control elicited no response from the spiders. But the 80-hertz wasp buzz made them freeze and look around, startled, just as they would do in the wild. What’s more, data from the electrodes showed a spike in brain activity with each buzz, revealing that spiders actually hear sounds, from a swooping mud dauber wasp to you crunching potato chips on your couch. The researchers, who publish their work today in Current Biology, say further study is needed to see exactly how spiders receive sounds without eardrums, but they believe sensitive hairs on their legs play a part. © 2016 American Association for the Advancement of Science.
Link ID: 22755 - Posted: 10.15.2016
By Virginia Morell Human-produced noise in the ocean is likely harming marine mammals in numerous unknown ways, according to a comprehensive new report from the National Academies of Sciences, Engineering, and Medicine. That’s because there are insufficient data to determine how the ill effects of noise created by ships, sonar signals, and other activities interact with other threats, including pollution, climate change, and the loss of prey due to fishing. The report, which was sponsored by several government agencies and released on 7 October, provides a new framework for researchers to begin exploring these cumulative impacts. “There’s a growing recognition that interactions between stressors on marine mammals can’t right now be accurately assessed,” said Peter Tyack, a marine mammal biologist at the University of St Andrews in the United Kingdom, in a webinar on the report. Tyack also chaired the committee that prepared the study, “Approaches to Understanding the Cumulative Effects of Stressors on Marine Mammals.” Killer whales, for instance, are known to swim away from areas where they have encountered sonar signals of about 142 decibels, a sound level lower than currently allowed by the U.S. Navy for its ships, Tyack said, referring to a 2014 study in The Journal of the Acoustical Society of America that determined the mammals’ likely response. But scientists don’t yet know how other marine mammals might respond. They also don’t know whether or how other factors, such as encountering an oil spill or colliding with a ship, would—or would not—compound the cetaceans’ response to these sounds; or how or whether such combined stressors matter to the animals’ long-term health and overall population. © 2016 American Association for the Advancement of Science.
By JAN HOFFMAN Our daily tug of leash war goes like this. I tell Chico we’re taking a left. He yanks right, wet black nostrils burrowing in loamy leaf piles. Me versus a 15-pound Havanese, incensed by scent. Today, I let him win. That’s because I have fresh appreciation for his sniffing behavior, after reading a new book, “Being a Dog: Following the Dog into a World of Smell,” by Alexandra Horowitz, a professor of cognitive science who runs the Dog Cognition Lab at Barnard College. In it, she explains the elegant engineering of the dog’s olfactory system and how familiar canine behaviors — licking, sneezing, tail-wagging — have associations with smell. Dr. Horowitz also describes how she trained herself to enhance her inferior human sniffing ability. On a recent afternoon at Riverside Park in Manhattan, I met Dr. Horowitz and Finn (short for Finnegan), her affable, glossy black 9-year-old mixed breed. There she — and he — shared some sniffing insights that have since made my walks with Chico more intriguing and fun. “There are many ways to sniff, and the human method is not the best,” Dr. Horowitz said. Sniff researchers (yes, you read that correctly) have found we have about six million olfactory receptors; dogs have 300 million. Humans sniff about once every second and a half; dogs, five to 10 times a second. “They even exhale better than we do,” Dr. Horowitz continued, describing a sort of doggy yoga breath. Dogs exhale through the side slits of their nostrils, so they keep a continuous flow of inhaled air in their snout for smelling. “This gives them a continuous olfactory view of the world.” © 2016 The New York Times Company
Keyword: Chemical Senses (Smell & Taste)
Link ID: 22742 - Posted: 10.11.2016
Annette Heist Nisha Pradhan is worried. The recent college graduate just turned 21 and plans to live on her own. But she's afraid she won't be able to stay safe. That's because Pradhan is anosmic — she isn't able to smell. She can't tell if milk is sour, or if she's burning something on the stove, or if there's a gas leak, and that worries her. "It actually didn't even strike me as being a big deal until I got to college," Pradhan says. Back home in Pennington, N.J., her family did her smelling for her, she says. She's moved in with them for now, but she's looking for a place of her own. "Now that I'm searching for ways or places to live as an independent person, I find more and more that the sense of smell is crucial to how we live our lives," Pradhan says. There's no good estimate for how many people live with smell loss. Congenital anosmia, being born without a sense of smell, is a rare condition. Acquired smell loss is more common. That loss can be total, or it can be hyposmia, a diminished sense of smell. Pradhan doesn't know how she lost her sense of smell. She thinks she was born able to smell, because as a child, she says, she liked to eat and ate a lot. But there came a point where she lost interest in food. "That's actually one of the first things that people notice whenever they have a smell problem, is food doesn't taste right anymore," says Beverly Cowart, a researcher at the Monell Chemical Senses Center in Philadelphia. That's because eating and smell go hand in hand. How food tastes often relies on what we smell. © 2016 npr
By Michelle Roberts Some people are genetically wired to prefer the taste of fatty foods, putting them at increased risk of obesity, according to UK researchers. The University of Cambridge team offered 54 volunteers unlimited portions of chicken korma, followed by an Eton mess-style dessert. Some of the meals were packed with fat while others were low-fat versions. Those with a gene already linked to obesity showed a preference for the high-fat food and ate more of it. The gene in question is called MC4R. It is thought about one in every 1,000 people carries a defective version of this gene, which controls hunger and appetite as well as how well we burn off calories. Mutations in MC4R are the most common genetic cause of severe obesity within families that has so far been identified. Humans probably evolved hunger genes to cope in times of famine, say experts. When food is scarce it makes sense to eat and store more fat to fend off starvation. But having a defect in the MC4R gene means hunger can become insatiable. In the study, published in the journal Nature Communications, the researchers created a test menu that varied only in fat or sugar content. The three versions of the main meal on offer - chicken korma - were identical in appearance, and as far as possible, taste, but ranged in fat from low to medium and high. The volunteers were offered a small sample of each and then left to eat as much as they liked of the three dishes. The same was then done for a pudding of strawberries, meringue and cream, but this time varying the sugar content rather than the fat. © 2016 BBC.
Alva Noë Eaters and cooks know that flavor, in the jargon of neuroscientists, is multi-modal. Taste is all important, to be sure. But so is the look of food and its feel in the mouth — not to mention its odor and the noisy crunch, or juicy squelch, that it may or may not make as we bite into it. The perception of flavor demands that we exercise a suite of not only gustatory, but also visual, olfactory, tactile and auditory sensitivities. Neuroscientists are now beginning to grasp some of the ways the brain enables our impressive perceptual power when it comes to food. Traditionally, scientists represent the brain's sensory function in a map where distinct cortical areas are thought of as serving the different senses. But it is increasingly appreciated that brain activity can't quite be segregated in this way. Cells in visual cortex may be activated by tactile stimuli. This is the case, for example, when Braille readers use their fingers to read. These blind readers aren't seeing with their fingers, rather, they are deploying their visual brains to perceive with their hands. And, in a famous series of studies that had a great influence on my thinking on these matters, Mriganka Sur at MIT showed that animals whose retinas were re-wired surgically to feed directly into auditory cortex do not hear lights and other visible objects presented to the eyes, rather, they see with their auditory brains. The brain is plastic, and different sensory modalities compete continuously for control over populations of cells. An exciting new paper on the gustatory cortex from the laboratory of Alfredo Fontanini at Stony Brook University shows that there are visual-, auditory-, olfactory- and touch-sensitive cells in the gustatory cortex of rats. There are even some cells that respond to stimuli in more than one modality. But what is more remarkable is that when rats learn to associate non-taste qualities — tones, flashes of lights, etc. 
— with food (sucrose in their study), there is a marked transformation in the gustatory cortex. © 2016 npr
Dean Burnett You remember that time a children’s TV presenter, one who has been working in children’s television for decades and is now employed on a channel aimed at under-8-year-olds, decided to risk it all and say one of the worst possible swear words on a show for pre-schoolers that he is famous for co-hosting? Remember how he took a huge risk for no appreciable gain and uttered a context-free profanity to an audience of toddlers? How he must have wanted to swear on children’s TV but paradoxically didn’t want anyone to notice so “snuck it in” as part of a song, where it would be more ambiguous? How all the editors and regulators at the BBC happened to completely miss it and allow it to be aired? Remember this happening? Well you shouldn’t, because it clearly didn’t. No presenter and/or channel would risk their whole livelihood in such a pointless, meaningless way, especially not the ever-pressured BBC. And, yet, an alarming number of people do think it happened. Apparently, there have been some “outraged parents” who are aghast at the whole thing. This seems reasonable in some respects; if your toddler was subjected to extreme cursing then as a parent you probably would object. On the other hand, if your very small child is able to recognise strong expletives, then perhaps misheard lyrics on cheerful TV shows aren’t the most pressing issue in their life. Regardless, a surprising number of people report that they did genuinely “hear” the c-word. This is less likely to be due to a TV presenter having some sort of extremely-fleeting breakdown, and more likely due to the quirks and questionable processing of our senses by our powerful yet imperfect brains. © 2016 Guardian News and Media Limited
By Jessica Hamzelou As any weight-watcher knows, carb cravings can be hard to resist. Now there’s evidence that carbohydrate-rich foods may elicit a unique taste too, suggesting that “starchy” could be a flavour in its own right. It has long been thought that our tongues register a small number of primary tastes: salty, sweet, sour and bitter. Umami – the savoury taste often associated with monosodium glutamate – was added to this list seven years ago, but there’s been no change since then. However, this list misses a major component of our diets, says Juyun Lim at Oregon State University in Corvallis. “Every culture has a major source of complex carbohydrate. The idea that we can’t taste what we’re eating doesn’t make sense,” she says. Complex carbohydrates such as starch are made of chains of sugar molecules and are an important source of energy in our diets. However, food scientists have tended to ignore the idea that we might be able to specifically taste them, says Lim. Because enzymes in our saliva break starch down into shorter chains and simple sugars, many have assumed we detect starch by tasting these sweet molecules. Her team tested this by giving a range of different carbohydrate solutions to volunteers – who it turned out were able to detect a starch-like taste in solutions that contained long or shorter carbohydrate chains. “They called the taste ‘starchy’,” says Lim. “Asians would say it was rice-like, while Caucasians described it as bread-like or pasta-like. It’s like eating flour.” © Copyright Reed Business Information Ltd.
By Alison F. Takemura A stationary Carolina sphinx moth (Manduca sexta) is the Cinderella of the animal kingdom. The hummingbird-size insect has dull, dark wings that are mottled like charred wood, and a plump body reminiscent of a small breakfast sausage. Casual observers of M. sexta often see little else. “They say, ‘Oh, it doesn’t look so nice. It’s just grey.’ But as soon as [the moths] start flying, they’re completely impressed,” says Danny Kessler, a pollination ecologist at the Max Planck Institute of Chemical Ecology in Germany. “They change their minds completely.” Hawkmoths, the group to which M. sexta belongs, whir their wings like hummingbirds as they flit between flowers, hovering to drink nectar. M. sexta’s proboscis, longer than its 2-inch body, stays unfurled, a straw ready to sip. Kessler studies the interaction between the Carolina sphinx moth, whose larvae are known as tobacco hornworms, and its preferred food source, the coyote tobacco plant (Nicotiana attenuata), to better understand how insect behavior affects a plant’s reproductive success. M. sexta adults drink nectar from tobacco’s skinny, white, trumpet-shape flowers, foraging from them at night and pollinating them in the process. Scientists have known for decades that the moth uses its antennae to detect the flowers’ scent—even from several miles away, Kessler says. © 1986-2016 The Scientist
Keyword: Chemical Senses (Smell & Taste)
Link ID: 22626 - Posted: 09.05.2016
By Simon Oxenham It can seem like barely a week goes by without a new study linking the stage in a woman’s monthly cycle to her preferences in a sexual partner. Reportedly, when women are ovulating they are attracted to men who are healthier, more dominant, more masculine, have higher testosterone levels – the list goes on. But do women really exhibit such behavioural changes – and why are we so fascinated by the idea that they do? A popular theory in evolutionary psychology is that women seek out men with better genes while they are ovulating to have short term affairs with, so as to produce healthier babies. These men may not necessarily stick around for the long haul, but appear particularly attractive when a woman is in the fertile stage of her cycle. During the non-fertile phase, the theory goes that women seek out men who are more likely to make reliable long-term partners and good fathers. But something smells a bit fishy here. Are women really evolutionarily hard-wired to cuckold their partners? Or might the attraction of a salacious hypothesis – with slightly sexist overtones – be shaping some of this research? Masculine all month A review of these kinds of studies is now challenging this often-told story. Wendy Wood at the University of Southern California and her team have analysed 58 studies – some of which were never published – and found that this theory is largely unsupported by evidence. © Copyright Reed Business Information Ltd.
By Virginia Morell Scientists have long worried whether animals can respond to the planet’s changing climate. Now, a new study reports that at least one species of songbird—and likely many more—already knows how to prep its chicks for a warming world. They do so by emitting special calls to the embryos inside their eggs, which can hear and learn external sounds. This is the first time scientists have found animals using sound to affect the growth, development, behavior, and reproductive success of their offspring, and adds to a growing body of research revealing that birds can “doctor” their eggs. “The study is novel, surprising, and fascinating, and is sure to lead to much more work on parent-embryo communication,” says Robert Magrath, a behavioral ecologist at the Australian National University in Canberra who was not involved in the study. The idea that the zebra finch (Taeniopygia guttata) parents were “talking to their eggs” occurred to Mylene Mariette, a behavioral ecologist at Deakin University in Waurn Ponds, Australia, while recording the birds’ sounds at an outdoor aviary. She noticed that sometimes when a parent was alone, it would make a rapid, high-pitched series of calls while sitting on the eggs. Mariette and her co-author, Katherine Buchanan, recorded the incubation calls of 61 female and 61 male finches inside the aviary. They found that parents of both sexes uttered these calls only during the end of the incubation period and when the maximum daily temperature rose above 26°C (78.8°F). © 2016 American Association for the Advancement of Science
Dean Burnett A lot of people, when they travel by car, ship, plane or whatever, end up feeling sick. They’re fine before they get into the vehicle, they’re typically fine when they get out. But whilst in transit, they feel sick. Particularly, it seems, in self-driving cars. Why? One theory is that it’s due to a weird glitch that means your brain gets confused and thinks it’s being poisoned. This may seem surprising; not even the shoddiest low-budget airline would get away with pumping toxins into the passengers (airline food doesn’t count, and that joke is out of date). So where does the brain get this idea that it’s being poisoned? Despite being a very “mobile” species, humans have evolved for certain types of movement. Specifically, walking, or running. Walking has a specific set of neurological processes tied into it, so we’ve had millions of years to adapt to it. Think of all the things going on in your body when you’re walking, and how the brain would pick up on these. There’s the steady thud-thud-thud and pressure on your feet and lower legs. There’s all the signals from your muscles and the movement of your body, meaning the motor cortex (which controls conscious movement of muscles) and proprioception (the sense of the arrangement of your body in space, hence you can know, for example, where your arm is behind your back without looking at it directly) are all supplying particular signals. © 2016 Guardian News and Media Limited
By Marlene Cimons Former president Jimmy Carter, 91, told the New Yorker recently that 90 percent of the arguments he has with Rosalynn, his wife of 70 years, are about hearing. “When I tell her, ‘Please speak more loudly,’ she absolutely refuses to speak more loudly, or to look at me when she talks,” he told the magazine. In response, the former first lady, 88, declared that having to repeat things “drives me up the wall.” Yet after both went to the doctor, much to her surprise, “I found out it was me!” she said. “I was the one who was deaf.” Hearing loss is like that. It comes on gradually, often without an individual’s realizing it, and it prompts a range of social and health consequences. “You don’t just wake up with a sudden hearing loss,” says Barbara Kelley, executive director of the Hearing Loss Association of America. “It can be insidious. It can creep up on you. You start coping, or your spouse starts doing things for you, like making telephone calls.” An estimated 25 percent of Americans between ages 60 and 69 have some degree of hearing loss, according to the President’s Council of Advisors on Science and Technology. That percentage grows to more than 50 percent for those age 70 to 79, and to almost 80 percent of individuals older than 80. That’s about 30 million people, a number likely to increase as our population ages. Behind these statistics are disturbing repercussions such as social isolation and the inability to work, travel or be physically active.
Link ID: 22561 - Posted: 08.16.2016