Links for Keyword: Hearing



Links 1 - 20 of 542

By recording from the brains of bats as they flew and landed, scientists have found that the animals have a "neural compass" - allowing them to keep track of exactly where and even which way up they are. These head-direction cells track bats in three dimensions as they manoeuvre. The researchers think a similar 3D internal navigation system is likely to be found throughout the animal kingdom. The findings are published in the journal Nature. Lead researcher Arseny Finkelstein, from the Weizmann Institute of Science in Rehovot, Israel, explained that this was the first time measurements had been taken from animals as they had flown around a space in any direction and even carried out their acrobatic upside-down landings. "We're the only lab currently able to conduct wireless recordings in flying animals," he told BBC News. "A tiny device attached to the bats allows us to monitor the activity of single neurons while the animal is freely moving." Decades of study of the brain's internal navigation system garnered three renowned neuroscientists this year's Nobel Prize in Physiology or Medicine. The research, primarily in rats, revealed how animals had "place" and "grid" cells - essentially building a map in the brain and coding for where on that map an animal was at any time. Mr Finkelstein and his colleagues' work in bats has revealed that their brains also have "pitch" and "roll" cells. These tell the animal whether it is pointing upwards or downwards and whether its head is tilted one way or the other. BBC © 2014

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 13: Memory, Learning, and Development
Link ID: 20393 - Posted: 12.04.2014

By Beth Winegarner When Beth Wankel’s son, Bowie, was a baby, he seemed pretty typical. But his “terrible twos” were more than terrible: In preschool, he would hit and push his classmates to a degree that worried his parents and teachers. As Bowie got a little older, he was able to tell his mom why he was so combative. “He would say things like, ‘I thought they were going to step on me or push me,’” Wankel said. “He was overly uncomfortable going into smaller spaces; it was just too much for him.” Among other things, he refused to enter the school bathroom if another student was inside. When Bowie was 3, he was formally evaluated by his preschool teachers. They said he appeared to be having trouble processing sensory input, especially when it came to figuring out where his body is in relation to other people and objects. He’s also very sensitive to touch and to the textures of certain foods, said Wankel, who lives with her family in San Francisco. Bowie has a form of what’s known as sensory processing disorder. As the name suggests, children and adults with the disorder have trouble filtering sights, smells, sounds and more from the world around them. While so-called neurotypicals can usually ignore background noise, clothing tags or cluttered visual environments, people with SPD notice all of those and more — and quickly become overwhelmed by the effort. Rachel Schneider, a mental-health expert and a blogger for adults with SPD, describes it as a “neurological traffic jam” or “a soundboard, except the sound technician is terrible at his job.”

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 5: The Sensorimotor System
Link ID: 20391 - Posted: 12.04.2014

By Joyce Cohen Like many people, George Rue loved music. He played guitar in a band. He attended concerts often. In his late 20s, he started feeling a dull ache in his ears after musical events. After a blues concert almost nine years ago, “I left with terrible ear pain and ringing, and my life changed forever,” said Mr. Rue, 45, of Waterford, Conn. He perceived all but the mildest sounds as not just loud, but painful. It hurt to hear. Now, he has constant, burning pain in his ears, along with ringing, or tinnitus, so loud it’s “like a laser beam cutting a sheet of steel.” Everyday noise, like a humming refrigerator, adds a feeling of “needles shooting into my ears,” said Mr. Rue, who avoids social situations and was interviewed by email because talking by phone causes pain. Mr. Rue was given a diagnosis of hyperacusis, a nonspecific term that has assorted definitions, including “sound sensitivity,” “decreased sound tolerance,” and “a loudness tolerance problem.” But hyperacusis sometimes comes with ear pain, too, a poorly understood medical condition that is beginning to receive more serious attention. “This is clearly an emerging field,” said Richard Salvi of the Department of Communicative Disorders and Sciences at the University at Buffalo and a scientific adviser to Hyperacusis Research, a nonprofit group that funds research on the condition. “Further work is required to understand the symptoms, etiology and underlying neural mechanisms.” Loud noises, even when they aren’t painful, can damage both the sensory cells and sensory nerve fibers of the inner ear over time, causing hearing impairment, said M. Charles Liberman, a professor of otology at Harvard Medical School, who heads a hearing research lab at the Massachusetts Eye and Ear Infirmary. And for some people who are susceptible, possibly because of some combination of genes that gives them “tender” ears, noise sets in motion “an anomalous response,” he said. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 20381 - Posted: 12.02.2014

By Sandra G. Boodman “That’s it — I’m done,” Rachel Miller proclaimed, the sting of the neurologist’s judgment fresh as she recounted the just-concluded appointment to her husband. Whatever was wrong with her, Miller decided after that 2009 encounter, she was not willing to risk additional humiliation by seeing another doctor who might dismiss her problems as psychosomatic. The Baltimore marketing executive had spent the previous two years trying to figure out what was causing her bizarre symptoms, some of which she knew made her sound delusional. Her eyes felt “weird,” although her vision was 20/20. Normal sounds seemed hugely amplified: at night when she lay in bed, her breathing and heartbeat were deafening. Water pounding on her back in the shower sounded like a roar. She was plagued by dizziness. “I had started to feel like a person in one of those stories where someone has been committed to a mental hospital by mistake or malice and they desperately try to appear sane,” recalled Miller, now 53. She began to wonder if she really was crazy; numerous tests had ruled out a host of possible causes, including a brain tumor. Continuing to look for answers seemed futile, since all the doctors she had seen had failed to come up with anything conclusive. “My attitude was: If it’s something progressive like MS [multiple sclerosis] or ALS [amyotrophic lateral sclerosis], it’ll get bad enough that someone will eventually figure it out.” Figuring it out would take nearly three more years and was partly the result of an oddity that Miller mentioned to another neurologist, after she lifted her moratorium on seeing doctors.

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 20353 - Posted: 11.25.2014

By Jyoti Madhusoodanan Eurasian jays are tricky thieves. They eavesdrop on the noises that other birds make while hiding food in order to steal the stash later, new research shows. Scientists trying to figure out if the jays (Garrulus glandarius) could remember sounds and make use of the information placed trays of two materials—either sand or gravel—in a spot hidden from a listening jay’s view. Other avian participants of the same species, which were given a nut, cached the treat in one of the two trays. Fifteen minutes later, the listening bird was permitted to hunt up the stash. When food lay buried in a less noisy material such as sand, jays searched randomly. But if they heard gravel being tossed around as treats were hidden, they headed to the pebbles to pilfer the goods. Previous studies have shown that jays—like crows, ravens, and other bird burglars that belong to the corvid family—can remember where they saw food being hidden and return to the spot to look for the cache. But these new results, published in Animal Cognition this month, provide the first evidence that these corvids can also recollect sounds to locate and steal stashes of food. In their forest homes, where birds are heard more often than they are seen, this sneaky strategy might give eavesdropping jays a better chance at finding hidden feasts.

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 20339 - Posted: 11.21.2014

By Abby Phillip You know the ones: They seem to be swaying to their own music or clapping along to a beat only they can hear. You may even think that describes you. The majority of humans, however, do this very well. We clap, dance, march in unison with few problems; that ability is part of what sets us apart from other animals. But it is true that rhythm — specifically, coordinating your movement with something you hear — doesn't come naturally to some people. Those people represent a very small sliver of the population and have a real disorder called "beat deafness." Unfortunately, your difficulty dancing or keeping time in band class probably doesn't quite qualify. A new study by McGill University researchers looked more closely at what might be going on with "beat deaf" individuals, and the findings may shed light on why some people seem to be rhythm masters while others struggle. Truly beat deaf people have a very difficult time clapping or tapping to an auditory beat or swaying to one. It's a problem that is far more severe than a lack of coordination. And it isn't attributable to motor skills, hearing problems or even a person's inability to create a regular rhythm. Illustrating how rare the disorder really is, McGill scientists received hundreds of inquiries from people who thought they were beat deaf, but only two qualified as having truly severe problems.

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 20304 - Posted: 11.13.2014

by Penny Sarchet It's frustrating when your smartphone loses its signal in the middle of a call or when downloading a webpage. But for a bat, a sudden loss of its sonar signal means missing an insect meal in mid-flight. Now there's evidence to suggest that bats are sneakily using sonar jamming techniques to make their fellow hunters miss their tasty targets. Like other bats, the Mexican free-tailed bat uses echolocation to pinpoint prey insects in the dark. But when many bats hunt in the same space, they can interfere with each other's echoes, making detection more difficult. Jamming happens when a sound disrupts a bat's ability to extract location information from the echoes returning from its prey, explains Aaron Corcoran of Johns Hopkins University in Baltimore, Maryland. Previous research has shown that Mexican free-tailed bats can get around this jamming by switching to higher pitches. Using different sound frequencies to map the hunting grounds around them allows many bats to hunt in the same space. In these studies, jamming of each other's signals was seemingly inadvertent – a simple consequence of two bats attempting to echolocate in close proximity. But Corcoran has found evidence of sneakier goings-on: a second type of sonar jamming in these bats – intentional sabotage of a fellow bat. "In this study, the jamming is on purpose and the jamming signal has been designed by evolution to maximally disrupt the other bat's echolocation," he says. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 20288 - Posted: 11.08.2014

By Meredith Levine Word went round Janice Mackay's quiet neighbourhood that she was hitting the bottle hard. She'd been seen more than once weaving along the sidewalk in front of her suburban home in Pickering, just outside Toronto, in a sad, drunken stagger. But Mackay wasn't drunk. As it turned out, her inner ear, the body's balance centre, had been destroyed by medication when she was hospitalized for over a month back in May 2005. At the time, Mackay was diagnosed with a life-threatening infection in one of her ovaries, and so was put on a cocktail of medication, including an IV drip of gentamicin, a well-known, inexpensive antibiotic that is one of the few that hasn't fallen prey to antibiotic-resistant bacteria. A few weeks later, the infection was almost gone when Mackay, still hospitalized, suddenly developed the bed spins and vomiting. Her medical team told her she'd been lying down too long and gave her Gravol, but the symptoms didn't go away. In a follow-up appointment after her discharge, Mackay was told that the dizziness was a side effect of the gentamicin, and that she would probably have to get used to it. But she didn't discover the extent of the damage until later when neurotologist Dr. John Rutka assessed her condition and concluded that the gentamicin had essentially destroyed her vestibular system, the body's motion detector, located deep within the inner ear. © CBC 2014

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 20198 - Posted: 10.13.2014

By Sarah C. P. Williams A wind turbine, a roaring crowd at a football game, a jet engine running full throttle: Each of these things produces sound waves that are well below the frequencies humans can hear. But just because you can’t hear the low-frequency components of these sounds doesn’t mean they have no effect on your ears. Listening to just 90 seconds of low-frequency sound can change the way your inner ear works for minutes after the noise ends, a new study shows. “Low-frequency sound exposure has long been thought to be innocuous, and this study suggests that it’s not,” says audiology researcher Jeffery Lichtenhan of the Washington University School of Medicine in St. Louis, who was not involved in the new work. Humans can generally sense sounds at frequencies between 20 and 20,000 cycles per second, or hertz (Hz)—although this range shrinks as a person ages. Prolonged exposure to loud noises within the audible range has long been known to cause hearing loss over time. But establishing the effect of sounds with frequencies under about 250 Hz has been harder. Even though they’re above the lower limit of 20 Hz, these low-frequency sounds tend to be either inaudible or barely audible, and people don’t always know when they’re exposed to them. For the new study, neurobiologist Markus Drexl and colleagues at the Ludwig Maximilian University in Munich, Germany, asked 21 volunteers with normal hearing to sit inside soundproof booths and then played a 30-Hz sound for 90 seconds. The deep, vibrating noise, Drexl says, is about what you might hear “if you open your car windows while you’re driving fast down a highway.” Then, they used probes to record the natural activity of the ear after the noise ended, taking advantage of a phenomenon dubbed spontaneous otoacoustic emissions (SOAEs) in which the healthy human ear itself emits faint whistling sounds. © 2014 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 20144 - Posted: 10.01.2014

By JOHN ROGERS LOS ANGELES (AP) — The founder of a Los Angeles-based nonprofit that provides free music lessons to low-income students from gang-ridden neighborhoods began to notice several years ago a hopeful sign: Kids were graduating high school and heading off to UCLA, Tulane and other big universities. That’s when Margaret Martin asked how the children in the Harmony Project were beating the odds. Researchers at Northwestern University in Illinois believe that the students’ music training played a role in their educational achievement: as Martin noticed, 90 percent of them graduated from high school, in neighborhoods where 50 percent or more of students did not. A two-year study of 44 children in the program shows that the training changes the brain in ways that make it easier for youngsters to process sounds, according to results reported in Tuesday’s edition of The Journal of Neuroscience. That increased ability, the researchers say, is linked directly to improved skills in such subjects as reading and speech. But, there is one catch: People have to actually play an instrument to get smarter. They can’t just crank up the tunes on their iPod. Nina Kraus, the study’s lead researcher and director of Northwestern’s auditory neuroscience laboratory, compared the difference to that of building up one’s body through exercise. ‘‘I like to say to people: You’re not going to get physically fit just watching sports,’’ she said.

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 13: Memory, Learning, and Development
Link ID: 20025 - Posted: 09.03.2014

Hearing voices is an experience that is very distressing for many people. Voices – or “auditory verbal hallucinations” – are one of the most common features of schizophrenia and other psychiatric disorders. But for a small minority of people, voice-hearing is a regular part of their lives, an everyday experience that isn’t associated with being unwell. It is only in the past 10 years that we have begun to understand what might be going on in “non-clinical” voice-hearing. Most of what we know comes from a large study conducted by Iris Sommer and colleagues at UMC Utrecht in the Netherlands. In 2006 they launched a nationwide attempt to find people who had heard voices before but didn’t have any sort of psychiatric diagnosis. From an initial response of over 4,000 people, they eventually identified a sample of 103 who heard voices at least once a month, but didn’t have psychosis. Their voice-hearing was also not caused by misuse of drugs or alcohol. Twenty-one of the participants were also given an MRI scan. When this group was compared with voice-hearers who did have psychosis, many of the same brain regions were active for both groups while they were experiencing auditory hallucinations, including the inferior frontal gyrus (involved in speech production) and the superior temporal gyrus (linked to speech perception). Subsequent studies with the same non-clinical voice-hearers have also highlighted differences in brain structure and functional connectivity (the synchronisation between different brain areas) compared with people who don’t hear voices. These results suggest that, on a neural level, the same sort of thing is going on in clinical and non-clinical voice-hearing. We know from first-person reports that the voices themselves can be quite similar, in terms of how loud they are, where they are coming from, and whether they speak in words or sentences. © 2014 Guardian News and Media Limited

Related chapters from BP7e: Chapter 16: Psychopathology: Biological Basis of Behavior Disorders; Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 12: Psychopathology: Biological Basis of Behavioral Disorders; Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 19958 - Posted: 08.14.2014

By NICHOLAS BAKALAR A new study reports that caffeine intake is associated with a reduced risk for tinnitus — ringing or buzzing in the ears. Researchers tracked caffeine use and incidents of tinnitus in 65,085 women in the Nurses’ Health Study II. They were 30 to 34 and without tinnitus at the start of the study. Over the next 18 years, 5,289 developed the disorder. The women recorded their use of soda, coffee and tea (caffeinated and not), as well as intake of candy and chocolate, which can contain caffeine. The results appear in the August issue of The American Journal of Medicine. Compared with women who consumed less than 150 milligrams of caffeine a day (roughly the amount in an eight-ounce cup of coffee), those who had 450 to 599 milligrams a day were 15 percent less likely to have tinnitus, and those who consumed 600 milligrams or more were 21 percent less likely. The association persisted after controlling for other hearing problems, hypertension, diabetes, use of anti-inflammatory Nsaid drugs, a history of depression and other factors. Decaffeinated coffee consumption had no effect on tinnitus risk. “We can’t conclude that caffeine is a cure for tinnitus,” said the lead author, Dr. Jordan T. Glicksman, a resident physician at the University of Western Ontario. “But our results should provide some assurance to people who do drink caffeine that it’s reasonable to continue doing so.” © 2014 The New York Times Company

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Link ID: 19955 - Posted: 08.14.2014

By Ingrid Wickelgren One important function of your inner ear is stabilizing your vision when your head is turning. When your head turns one way, your vestibular system moves your eyes in the opposite direction so that what you are looking at remains stable. To see for yourself how your inner ears make this adjustment, called the vestibulo-ocular reflex, hold your thumb upright at arm’s length. Shake your head back and forth about twice per second while looking at your thumb. See that your thumb remains in focus. Now create the same relative motion by swinging your arm back and forth about five inches at the same speed. Notice that your thumb is blurry. To see an object clearly, the image must remain stationary on your retina. When your head turns, your vestibular system very rapidly moves your eyes in the opposite direction to create this stability. When the thumb moves, your visual system similarly directs the eyes to follow, but the movement is too slow to track a fast-moving object, causing blur. © 2014 Scientific American

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 10: Vision: From Eye to Brain
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 7: Vision: From Eye to Brain
Link ID: 19895 - Posted: 07.30.2014

By James Phillips Our inner ear is a marvel. The labyrinthine vestibular system within it is a delicate, byzantine structure made up of tiny canals, crystals and pouches. When healthy, this system enables us to keep our balance and orient ourselves. Unfortunately, a study in the Archives of Internal Medicine found that 35 percent of adults over age 40 suffer from vestibular dysfunction. A number of treatments are available for vestibular problems. During an acute attack of vertigo, vestibular suppressants and antinausea medications can reduce the sensation of motion as well as nausea and vomiting. Sedatives can help patients sleep and rest. Anti-inflammatory drugs can reduce any damage from acute inflammation and antibiotics can treat an infection. If a structural change in the inner ear has loosened some of its particulate matter—for instance, if otolith (calcareous) crystals, which are normally in tilt-sensitive sacs, end up in the semicircular canals, making the canals tilt-sensitive—simple repositioning exercises in the clinic can shake the loose material, returning it where it belongs. After a successful round of therapy, patients no longer sense that they are tilting whenever they turn their heads. If vertigo is a recurrent problem, injecting certain medications can reduce or eliminate the fluctuating function in the affected ear. As a last resort, a surgeon can effectively destroy the inner ear—either by directly damaging the end organs or by cutting the eighth cranial nerve fibers, which carry vestibular information to the brain. The latter surgery involves removing a portion of the skull and shifting the brain sideways, so it is not for the faint of heart. © 2014 Scientific American

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 19886 - Posted: 07.28.2014

by Claudia Caruana GOT that ringing in your ears? Tinnitus, the debilitating condition that plagued Beethoven and Darwin, affects roughly 10 per cent of the world's population, including 30 million people in the US alone. Now, a device based on vagus nerve stimulation promises to eliminate the sounds for good by retraining the brain. At the moment, many chronic sufferers turn to state-of-the-art hearing aids configured to play specific tones meant to cancel out the tinnitus. But these do not always work because they just mask the noise. The new device, developed by MicroTransponder in Dallas, Texas, works in an entirely different way. The Serenity System uses a transmitter connected to the vagus nerve in the neck – the vagus nerve connects the brain to many of the body's organs. The thinking goes that most cases of chronic tinnitus result from changes in the signals sent from the ear to neurons in the brain's auditory cortex. This device is meant to retrain those neurons to forget the annoying noise. To use the system, a person wears headphones and listens to computer-generated sounds. First, they listen to tones that trigger the tinnitus before being played different frequencies close to the problematic one. Meanwhile, the implant stimulates the vagus nerve with small pulses. The pulses trigger the release of chemicals that increase the brain's ability to reconfigure itself. The process has already worked in rats (Nature, doi.org/b63kt9) and in a small human trial this year, where it helped around half of the participants. "Vagus nerve stimulation takes advantage of the brain's neuroplasticity – the ability to reconfigure itself," says Michael Kilgard at the University of Texas at Dallas, and a consultant to MicroTransponder. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 19880 - Posted: 07.26.2014

Philip Ball Lead guitarists usually get to play the flashy solos while the bass player gets only to plod to the beat. But this seeming injustice could have been determined by the physiology of hearing. Research published today in the Proceedings of the National Academy of Sciences suggests that people’s perception of timing in music is more acute for lower-pitched notes. Psychologist Laurel Trainor of McMaster University in Hamilton, Canada, and her colleagues say that their findings explain why in the music of many cultures the rhythm is carried by low-pitched instruments while the melody tends to be taken by the highest pitched. This is as true for the low-pitched percussive rhythms of Indian classical music and Indonesian gamelan as it is for the walking double bass of a jazz ensemble or the left-hand part of a Mozart piano sonata. Earlier studies have shown that people have better pitch discrimination for higher notes — a reason, perhaps, that saxophonists and lead guitarists often have solos at a squealing register. It now seems that rhythm works best at the other end of the scale. Trainor and colleagues used the technique of electroencephalography (EEG) — electrical sensors placed on the scalp — to monitor the brain signals of people listening to streams of two simultaneous piano notes, one high-pitched and the other low-pitched, at equally spaced time intervals. Occasionally, one of the two notes was played slightly earlier, by just 50 milliseconds. The researchers studied the EEG recordings for signs that the listeners had noticed. © 2014 Nature Publishing Group,

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 19776 - Posted: 07.01.2014

by Frank Swain WHEN it comes to personal electronics, it's difficult to imagine iPhones and hearing aids in the same sentence. I use both and know that hearing aids have a well-deserved reputation as deeply uncool lumps of beige plastic worn mainly by the elderly. Apple, on the other hand, is the epitome of cool consumer electronics. But the two are getting a lot closer. The first "Made for iPhone" hearing aids have arrived, allowing users to stream audio and data between smartphones and the device. It means hearing aids might soon be desirable, even to those who don't need them. A Bluetooth wireless protocol developed by Apple last year lets the prostheses connect directly to Apple devices, streaming audio and data while using a fraction of the power consumption of conventional Bluetooth. LiNX, made by ReSound (pictured), and Halo hearing aids made by Starkey – both international firms – use the iPhone as a platform to offer users new features and added control over their hearing aids. "The main advantage of Bluetooth is that the devices are talking to each other, it's not just one way," says David Nygren, UK general manager of ReSound. This is useful as hearing aids have long suffered from a restricted user interface – there's not much room for buttons on a device the size of a kidney bean. This is a major challenge for hearing-aid users, because different environments require different audio settings. Some devices come with preset programmes, while others adjust automatically to what their programming suggests is the best configuration. This is difficult to get right, and often devices calibrated in the audiologist's clinic fall short in the real world. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 19757 - Posted: 06.23.2014

Regina Nuzzo Gene therapy delivered to the inner ear can help shrivelled auditory nerves to regrow — and in turn, improve bionic ear technology, researchers report today in Science Translational Medicine. The work, conducted in guinea pigs, suggests a possible avenue for developing a new generation of hearing prosthetics that more closely mimics the richness and acuity of natural hearing. Sound travels from its source to ears, and eventually to the brain, through a chain of biological translations that convert air vibrations to nerve impulses. When hearing loss occurs, it’s usually because crucial links near the end of this chain — between the ear’s cochlear cells and the auditory nerve — are destroyed. Cochlear implants are designed to bridge this missing link in people with profound deafness by implanting an array of tiny electrodes that stimulate the auditory nerve. Although cochlear implants often work well in quiet situations, people who have them still struggle to understand music or follow conversations amid background noise. After long-term hearing loss, the ends of the auditory nerve bundles are often frayed and withered, so the electrode array implanted in the cochlea must blast a broad, strong signal to try to make a connection, instead of stimulating a more precise array of neurons corresponding to particular frequencies. The result is an ‘aural smearing’ that obliterates fine resolution of sound, akin to forcing a piano player to wear snow mittens or a portrait artist to use finger paints. To try to repair auditory nerve endings and help cochlear implants to send a sharper signal to the brain, researchers turned to gene therapy. Their method took advantage of the electrical impulses delivered by the cochlear-implant hardware, rather than viruses often used to carry genetic material, to temporarily turn inner-ear cells porous. This allowed DNA to slip in, says lead author Jeremy Pinyon, an auditory scientist at the University of New South Wales in Sydney, Australia. © 2014 Nature Publishing Group

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 19533 - Posted: 04.24.2014

By KATHERINE BOUTON Like almost all newborns in this country, Alex Justh was given a hearing test at birth. He failed, but his parents were told not to worry: He was a month premature and there was mucus in his ears. A month later, an otoacoustic emission test, which measures the response of hair cells in the inner ear, came back normal. Alex was the third son of Lydia Denworth and Mark Justh (pronounced Just), and at first they “reveled at what a sweet and peaceful baby he was,” Ms. Denworth writes in her new book, “I Can Hear You Whisper: An Intimate Journey Through the Science of Sound and Language,” being published this week by Dutton. But Alex began missing developmental milestones. He was slow to sit up, slow to stand, slow to walk. His mother felt a “vague uneasiness” at every delay. He seemed not to respond to questions, the kind one asks a baby: “Can you show me the cow?” she’d ask, reading “Goodnight, Moon.” Nothing. No response. At 18 months Alex unequivocally failed a hearing test, but there was still fluid in his ears, so the doctor recommended a second test. It wasn’t until 2005, when Alex was 2 ½, that they finally realized he had moderate to profound hearing loss in both ears. This is very late to detect deafness in a child; the ideal time is before the first birthday. Alex’s parents took him to Dr. Simon Parisier, an otolaryngologist at New York Eye and Ear Infirmary, who recommended a cochlear implant as soon as possible. “Age 3 marked a critical juncture in the development of language,” Ms. Denworth writes. “I began to truly understand that we were not just talking about Alex’s ears. We were talking about his brain.” © 2014 The New York Times Company

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 15: Language and Our Divided Brain
Link ID: 19485 - Posted: 04.15.2014

Nicola Davis The moment when 40-year-old Joanne Milne, who has been deaf since birth, first hears sound is a heart-wrenching scene. Amateur footage showing her emotional reaction has taken social media by storm and touched viewers across the world, reinforcing the technological triumph of cochlear implants. It’s a story I have touched on before. Earlier this month I wrote about how cochlear implants changed the lives of the Campbells, whose children Alice and Oliver were born with the condition auditory neuropathy spectrum disorder (ANSD). Implants, together with auditory verbal therapy, have allowed them to embrace the hearing world. It was incredibly moving to glimpse the long and difficult journey this family had experienced, and the joy that hearing - a sense so many of us take for granted - can bring. Cochlear implants are not a ‘cure’ for deafness. They use electrodes to directly stimulate auditory nerve fibres in the cochlea of the inner ear, creating a sense of sound that is not the same as what hearing people experience, but that nevertheless allows users to perceive speech, develop language and often enjoy music. As an adult, Milne, who was born with the rare condition Usher syndrome, is unusual in receiving cochlear implants on both sides. Such bilateral implantation enables users to work out where sounds are coming from, enhances speech perception in bustling environments and means that, should something go wrong with one device, the user isn’t cut off from the hearing world. © 2014 Guardian News

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 19421 - Posted: 03.29.2014