Links for Keyword: Hearing



Links 1 - 20 of 562

By Sarah Lewin Evolutionary biologists have long wondered why the eardrum—the membrane that relays sound waves to the inner ear—looks remarkably similar in humans and other mammals to the one in reptiles and birds. Did the membrane, and therefore the ability to hear, evolve in these groups from a common ancestor? Or did the auditory systems evolve independently to perform the same function, a phenomenon called convergent evolution? A recent set of experiments performed at the University of Tokyo and the RIKEN Evolutionary Morphology Laboratory in Japan resolves the issue. When the scientists genetically inhibited lower jaw development in both fetal mice and chickens, the mice formed neither eardrums nor ear canals. In contrast, the birds grew two upper jaws, from which two sets of eardrums and ear canals sprouted. The results, published in Nature Communications, confirm that the middle ear grows out of the lower jaw in mammals but emerges from the upper jaw in birds—supporting the hypothesis that the similar anatomy evolved independently in mammals and in reptiles and birds. (Scientific American is part of Springer Nature.) Fossils of auditory bones had supported this conclusion as well, but eardrums do not fossilize and so could not be examined directly. © 2015 Scientific American

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 21098 - Posted: 06.27.2015

By Rachel Feltman Here's why you hate the sound of your own voice (1:07) If you've ever listened to a recording of your voice, chances are you didn't like what you heard. So, why do most people hate the sound of their own voice? The answer: It's all in how sound travels to your ears. (Pamela Kirkland/The Washington Post) Whether you've heard yourself talking on the radio or just gabbing in a friend's Instagram video, you probably know the sound of your own voice -- and chances are pretty good that you hate it. As the video above explains, your voice as you hear it when you speak out loud is very different from the voice the rest of the world perceives. That's because it comes to you via a different channel than everyone else. When sound waves from the outside world -- someone else's voice, for example -- hit the outer ear, they're siphoned straight through the ear canal to the eardrum, creating vibrations that the brain will translate into sound. When we talk, our eardrums and inner ears vibrate from the sound waves we're putting out into the air. But they also have another source of vibration -- the movements caused by the production of the sound itself. Our vocal cords and airways are trembling, too, and those vibrations make their way to auditory processing as well. Your body is better at carrying low, rich tones than the air is. So when those two sources of sound get combined into one perception of your own voice, it sounds lower and richer. That's why hearing the way your voice sounds without all the body vibes can be off-putting -- it's unfamiliar -- or even unpleasant, because of the relative tinniness.

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 21062 - Posted: 06.17.2015

How echolocation really works By Dwayne Godwin and Jorge Cham © 2015 Scientific American

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 21017 - Posted: 06.06.2015

Lauren Silverman Jiya Bavishi was born deaf. For five years, she couldn't hear and she couldn't speak at all. But when I first meet her, all she wants to do is say hello. The 6-year-old is bouncing around the room at her speech therapy session in Dallas. She's wearing a bright pink top; her tiny gold earrings flash as she waves her arms. "Hi," she says, and then uses sign language to ask who I am and talk about the ice cream her father bought for her. Jiya is taking part in a clinical trial testing a new hearing technology. At 12 months, she was given a cochlear implant. These surgically implanted devices send signals directly to the nerves used to hear. But cochlear implants don't work for everyone, and they didn't work for Jiya. "The physician was able to get all of the electrodes into her cochlea," says Linda Daniel, a certified auditory-verbal therapist and rehabilitative audiologist with HEAR, a rehabilitation clinic in Dallas. Daniel has been working with Jiya since she was a baby. "However, you have to have a sufficient or healthy auditory nerve to connect the cochlea and the electrodes up to the brainstem." But Jiya's connection between the cochlea and the brainstem was too thin. There was no way for sounds to make that final leg of the journey and reach her brain. © 2015 NPR

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 21005 - Posted: 06.01.2015

By Meeri Kim The dangers of concussions, which are caused by traumatic stretching of and damage to nerve cells in the brain and lead to dizziness, nausea and headache, have been well documented. But ear damage that is sometimes caused by a head injury has symptoms so similar to the signs of a concussion that doctors may misdiagnose it and administer the wrong treatment. A perilymph fistula is a tear or defect in the small, thin membranes that normally separate the air-filled middle ear from the inner ear, which is filled with a fluid called perilymph. When a fistula forms, tiny amounts of this fluid leak out of the inner ear, an organ crucial not only for hearing but also for balance. Losing even a few small drops of perilymph leaves people disoriented, nauseous and often with a splitting headache, vertigo and memory loss. While most people with a concussion recover within a few days, a perilymph fistula can leave a person disabled for months. There is some controversy around perilymph fistula because it is difficult to diagnose: the leak is not directly observable but is instead inferred from its symptoms. However, it is generally accepted as a real condition by otolaryngologists and sports physicians, and is typically known to follow a traumatic event. But concussions — as well as post-concussion syndrome, which is marked by dizziness, headache and other symptoms that can last even a year after the initial blow — also occur as the result of such an injury.

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 15: Language and Our Divided Brain
Link ID: 20968 - Posted: 05.23.2015

Jon Hamilton When Sam Swiller used hearing aids, his musical tastes ran to AC/DC and Nirvana – loud bands with lots of drums and bass. But after Swiller got a cochlear implant in 2005, he found that sort of music less appealing. "I was getting pushed away from sounds I used to love," he says, "but also being more attracted to sounds that I never appreciated before." So he began listening to folk and alternative music, including the Icelandic singer Bjork. There are lots of stories like this among people who get cochlear implants. And there's a good reason. A cochlear implant isn't just a fancy hearing aid. "A hearing aid is really just an amplifier," says Jessica Phillips-Silver, a neuroscience researcher at Georgetown University. "The cochlear implant is actually bypassing the damaged part of the ear and delivering electrical impulses directly to the auditory nerve." As a result, the experience of listening to music or any other sound through the ear, with or without a hearing aid, can be completely unlike the experience of listening through a cochlear implant. "You're basically remapping the audio world," Swiller says. Swiller is 39 years old and lives in Washington, D.C. He was born with an inherited disorder that caused him to lose much of his hearing by his first birthday. That was in the 1970s, and cochlear implants were still considered experimental devices. So Swiller got hearing aids. They helped, but Swiller still wasn't hearing what other people were. © 2015 NPR

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 20945 - Posted: 05.18.2015

By Chris Cesare For bats, too many echoes can be like blurry vision. That’s because the nocturnal creatures navigate by bouncing ultrasonic sound off of their surroundings, a technique known as echolocation. In cramped spots, these sounds can reverberate, creating a noisy background that clouds the mammals’ sonic sight. Now, new research published online before print in the Proceedings of the National Academy of Sciences reveals one way that bats overcome this auditory ambush. Scientists found that the animals modify the width of their navigation pulses on the fly by adjusting the size of their mouth gape. The researchers used an array of cameras, flashes, and ultrasonic recorders to take snapshots of bats while they swooped down to take a sip at a desert pond in Israel. As the bats descended toward the confined banks of the pond, they opened their mouths wider to more tightly focus their sound pulses. As the bats left, they narrowed their mouths, projecting an ultrasonic beam up to four times wider than on the descending leg. These counterintuitive effects were due to diffraction, which causes sound waves traveling through a smaller hole to spread out more. The researchers repeated the experiment with captive bats and found the same effect, controlling for the possibility that they had observed a behavior tied to drinking. The team writes that these changes in gape allow the animals to “zoom in” on their view of an area, potentially reducing the amount of distracting echoes in a tight space. © 2015 American Association for the Advancement of Science
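The diffraction effect described in the story (a smaller opening spreads sound over a wider angle, so a wider mouth gape focuses the beam) can be sketched with the textbook circular-aperture formula, sin θ ≈ 1.22 λ/d, where θ is the half-angle to the first null of the beam. The call frequency and gape sizes below are illustrative assumptions, not measurements from the study.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def beam_half_angle_deg(frequency_hz: float, aperture_m: float) -> float:
    """Half-angle (degrees) to the first null of a circular-aperture beam.

    Returns 180.0 when the aperture is too small relative to the
    wavelength to form a null, i.e. the beam is nearly omnidirectional.
    """
    wavelength = SPEED_OF_SOUND / frequency_hz
    s = 1.22 * wavelength / aperture_m
    if s >= 1.0:
        return 180.0
    return math.degrees(math.asin(s))

# Hypothetical numbers: a 40 kHz call through a 6 mm vs. a 12 mm mouth gape.
narrow_gape = beam_half_angle_deg(40_000, 0.006)  # small mouth -> very wide beam
wide_gape = beam_half_angle_deg(40_000, 0.012)    # larger mouth -> narrower beam
```

With these assumed values, doubling the gape shrinks the beam from essentially omnidirectional to a cone of roughly 60 degrees half-angle, which is the direction of the effect the researchers report.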

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 20901 - Posted: 05.09.2015

by Clare Wilson WHAT is it like to be a bat? It's a question philosophers interested in consciousness like to ponder. Yet a few people already have something of a bat's world view. Brian Borowski, a 59-year-old Canadian who was born blind, began teaching himself to echolocate aged 3. He clicks with his tongue or snaps his fingers as he moves about, unconsciously decoding the echoes. Although many blind people get information from sounds around them, few turn this into a supersense by making sounds to help themselves get around. "When I'm walking down a sidewalk and I pass trees, I can hear the tree: the vertical trunk of the tree and maybe the branches above me," says Borowski. "I can hear a person in front of me and go around them." Borowski, who works as a programmer at Western University in London, Ontario, suspects he experiences "images" in a similar way to people who can see, just with less detail. "I store maps of information in my head and I compare what I have in my memory with what I'm hearing around me," he says. "I am matching images of some sort." This probably isn't too far from the truth – we know from brain scans of Borowski and another echolocator that the strategy co-opts the same parts of the brain that usually deal with visual information. For his latest scientific collaboration, he helped a team of researchers to explore how well echolocators can determine the relative sizes and distances of objects. © Copyright Reed Business Information Ltd

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 20898 - Posted: 05.08.2015

by Helen Thomson Tinnitus is the debilitating sensation of a high-pitched noise without any apparent source. It can be permanent or fleeting, and affects at least 25 million people in the US alone. To understand more about the condition, William Sedley at the University of Newcastle, UK, and his colleagues took advantage of a rare opportunity to study brain activity in a man with tinnitus who was undergoing surgery for epilepsy. Surgeons placed recording electrodes in several areas of his brain to identify the source of his seizures. The man – whom they knew as Bob (not his real name) – was awake during the procedure, which allowed Sedley's team to manipulate his tinnitus while recording from his brain. First they played him 30 seconds of white noise, which suppressed his tinnitus for about 10 seconds before it gradually returned. Bob was asked to rate the loudness of his tinnitus before the experiment started, as well as immediately after the white noise finished and 10 seconds later. This protocol was then repeated many times over two days. "Normally, studies compare brain activity of people with and without tinnitus using non-invasive techniques," says Sedley. "Not only are these measurements less precise, but the people with tinnitus might be concentrating on the sound, while the ones without tinnitus might be thinking about their lunch." This, he says, can make the results hard to interpret. © Copyright Reed Business Information Ltd

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 20847 - Posted: 04.25.2015

Hannah Devlin, science correspondent They may stop short of singing The Bells of Saint Mary’s, as demonstrated by the mouse organ in Monty Python, but scientists have discovered that male mice woo females with ultrasonic songs. The study shows for the first time that mouse song varies depending on the context and that male mice have a specific style of vocalisation reserved for when they smell a female in the vicinity. In turn, females appear to be more interested in this specific style of serenade than in other types of squeak that male mice produce. “It was surprising to me how much change occurs to these songs in different social contexts, when the songs are thought to be innate,” said Erich Jarvis, who led the work at Duke University in North Carolina. “It is clear that the mouse’s ability to vocalise is a lot more limited than a songbird’s or human’s, and yet it’s remarkable that we can find these differences in song complexity.” The findings place mice in an elite group of animal vocalisers once thought to be limited to birds, whales and some primates. Mouse song is too high-pitched for the human ear to detect, but when listened to at a lower frequency, it sounds somewhere between birdsong and the noise of clean glass being scrubbed. The Duke University team recorded the male mice when they were roaming around their cages, when they were exposed to the smell of female urine and when they were placed in the presence of a female mouse. They found that males sing louder and more complex songs when they smell a female but don’t see her. By comparison, the songs were longer and simpler when they were directly addressing their potential mate, according to the findings published in Frontiers in Behavioral Neuroscience. © 2015 Guardian News and Media Limited

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 12: Sex: Evolutionary, Hormonal, and Neural Bases
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 8: Hormones and Sex
Link ID: 20751 - Posted: 04.02.2015

by Bethany Brookshire Music displays all the harmony and discord the auditory world has to offer. The perfect pair of notes at the end of the Kyrie in Mozart’s Requiem fills churches and concert halls with a single chord of ringing, echoing consonance. Composers such as Arnold Schönberg explored the depths of dissonance — groups of notes that, played together, exist in unstable antagonism, their frequencies crashing and banging against each other. Dissonant chords are difficult to sing and often painful to hear. But they may get less painful with age. As we age, our brains may lose the clear-cut representations of these consonant and dissonant chords, a new study shows. The loss may affect how older people engage with music and shows that age-related hearing loss is more complex than just having to reach for the volume controls. The main mechanism behind age-related hearing loss is the deterioration of the outer hair cells in the cochlea, a coiled structure within our inner ear. When sound waves enter the ear, a membrane vibrates, pulling the hair cells to and fro and kicking off a series of events that produce electrical signals that will be sent onward to the brain. As we age, we lose some of these outer hair cells, and with them goes our ability to hear extremely high frequencies. In a new study, researchers tested how people perceive consonant pairs of musical notes, which are harmonious and generally pleasing, or dissonant ones, which can be harsh and tense. © Society for Science & the Public 2000 - 2015

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 13: Memory, Learning, and Development
Link ID: 20737 - Posted: 03.31.2015

By Victoria Gill Science reporter, BBC News Researchers in Denmark have revealed how porpoises finely adjust the beams of sound they use to hunt. The animals hunt with clicks and buzzes - detecting the echoes from their prey. This study showed them switching from a narrow to a wide beam of sound - "like adjusting a flashlight" - as they homed in on a fish. Researchers think that other whales and dolphins may use the same technique to trap a fish in their beam of sound in the final phase of an attack, making it harder for prey to evade capture. By revealing these acoustic secrets in detail, researchers are hoping to develop ways to prevent porpoises, and other toothed whales, from becoming trapped in fishing nets. The study, published in the journal eLife, was led by Danuta Wisniewska of Aarhus University. She and her colleagues worked with harbour porpoises in a semi-natural enclosure on the coast of Denmark. "The facility is quite exceptional," explained Dr Wisniewska. "The animals still have access to the seafloor and are only separated from the harbour by a net. Fish are able to come in, so they're still hunting." In this unique environment, the researchers were able to fit the porpoises with sound-detecting tags, and to place an array of microphones to pick up sound around their enclosure. The team carried out a series of these experiments to work out where the sound energy the porpoises produced was being directed. In one experiment, researchers dropped fish into the water to tempt the porpoises to hunt. As echolocating porpoises, whales and dolphins hunt, they switch from an exploratory clicking to a more intense, high frequency buzz - to elicit a continuous echo from the fish they are pursuing. Their beam can be envisaged as a cone of sound, said Dr Wisniewska, comparing it to the cone-shaped beam of light from a torch. © 2015 BBC.

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 20731 - Posted: 03.30.2015

By Virginia Morell Children and parrot and songbird chicks share a rare talent: They can mimic the sounds that adults of their species make. Now, researchers have discovered this vocal learning skill in baby Egyptian fruit bats (Rousettus aegyptiacus, pictured), a highly social species found from Africa to Pakistan. Only a handful of other mammals, including cetaceans and certain insectivorous bats, are vocal learners. The adult fruit bats have a rich vocal repertoire of mouselike squeaks and chatter (listen to a recording here), and the scientists suspected the bat pups had to learn these sounds. To find out, they placed baby bats with their mothers in isolation chambers for 5 months and made video and audio recordings of each pair. Lacking any other adults to vocalize to, the mothers were silent, and their babies made only isolation calls and babbling sounds, the researchers report today in Science Advances. As a control, the team raised another group of bat pups with their mothers and fathers, who chattered to each other. Soon, the control pups’ babbling gave way to specific sounds that matched those of their mothers. But the isolated pups quickly overcame the vocal gap after the scientists united both sets of bats—suggesting that unlike many songbird species (and more like humans), the fruit bats don’t have a limited period for vocal learning. Although the bats’ vocal learning is simple compared with that of humans, it could provide a useful model for understanding the evolution of language, the scientists say. © 2015 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 13: Memory, Learning, and Development
Link ID: 20726 - Posted: 03.28.2015

Lights, sound, action: we are constantly learning how to incorporate outside sensations into our reactions in specific situations. In a new study, brain scientists have mapped changes in communication between nerve cells as rats learned to make specific decisions in response to particular sounds. The team then used this map to accurately predict the rats’ reactions. These results add to our understanding of how the brain processes sensations and forms memories to inform behavior. “We’re reading the memories in the brain,” said Anthony Zador, M.D., Ph.D., professor at Cold Spring Harbor Laboratory, New York, and senior author of the study, published in Nature. The work was funded by the National Institutes of Health and led by Qiaojie Xiong, Ph.D., a former postdoctoral researcher in Dr. Zador’s laboratory. “For decades scientists have been trying to map memories in the brain,” said James Gnadt, Ph.D., a program director at the National Institute of Neurological Disorders and Stroke (NINDS), one of the NIH institutes that funded the study. “This study shows that scientists can begin to pinpoint the precise synapses where certain memories form and learning occurs.” The communication points, or synapses, that Dr. Zador’s lab studied were in the striatum, an integrating center located deep inside the brain that is known to play an important role in coordinating the translation of thoughts and sensations into actions. Problems with striatal function are associated with certain neurological disorders such as Huntington’s disease in which affected individuals have severely impaired skill learning.

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 13: Memory, Learning, and Development
Link ID: 20649 - Posted: 03.04.2015

By Barron H. Lerner, M.D. I can’t stand it when someone behind me at a movie chews popcorn with his or her mouth open. I mean, I really can’t stand it. I have misophonia, a condition in which certain sounds can drive someone into a burst of rage or disgust. Although only identified and named in the last 20 years, misophonia has been enthusiastically embraced, with websites, Facebook pages and conferences drawing small armies of frustrated visitors. As a primary care physician, I find that misophonia can present some special challenges: At times, my patients can be the source of annoying sounds. At other times, the condition can be a source of special bonding if I realize that a patient is a fellow sufferer. But some experts question whether misophonia really exists. By naming it, are we giving too much credence to a series of symptoms that are no big deal? Coined by the married researchers Margaret and Pawel Jastreboff of Emory University in 2002, misophonia (“hatred of sound”) is sometimes referred to as selective sound sensitivity syndrome. Like me, those with the disorder identify a series of specific sounds that bother them. A 2013 study by Arjan Schröder and his colleagues at the University of Amsterdam identified the most common irritants as eating sounds, including lip smacking and swallowing; breathing sounds, such as nostril noises and sneezing; and hand sounds, such as typing and pen clicking. The range of responses to these noises is broad, from irritation to disgust to anger. Some sufferers even respond with verbal or physical aggression to those making the noises. One woman reported wanting to strangle her boyfriend in response to his chewing. © 2015 The New York Times Company

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 11: Emotions, Aggression, and Stress
Link ID: 20609 - Posted: 02.24.2015

By Warren Cornwall The green wings of the luna moth, with their elegant, long tails, aren’t just about style. New research finds they also help save the insect from becoming a snack for a bat. The fluttering tails appear to create an acoustic signal that is attractive to echolocating bats, causing the predators to zero in on the wings rather than more vital body parts. Scientists pinned down the tails’ lifesaving role by taking 162 moths and plucking the tails off 75 of them. They used fishing line to tether two moths—one with tails, the other without—to the ceiling of a darkened room. Then, they let loose a big brown bat. The bats caught 81% of the tailless moths, but just 35% of those with fully intact wings, they report in a study published online today in the Proceedings of the National Academy of Sciences. High-speed cameras helped show why. In 55% of attacks on moths with tails, the bats went after the tails, often missing the body. It’s the first well-documented example of an organism using body shape to confuse predators that use echolocation, the researchers say—the equivalent of fish and insects that display giant eyespots for visual trickery. © 2015 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 20587 - Posted: 02.18.2015

Madeline Bonin Bats and moths have been evolving to one-up each other for 65 million years. Many moths can hear bats’ ultrasonic echolocation calls, making it easy for the insects to avoid this predator. A few species of bat have developed echolocation calls that are outside the range of the moths’ hearing, making it harder for the moths to evade them. But humans short-circuit this evolutionary arms race every time they turn on a porch light, according to a study in the Journal of Applied Ecology. In field experiments, ecologist Corneile Minnaar of the University of Pretoria and his colleagues examined the diet of Cape serotine bats (Neoromicia capensis) both in the dark and under artificial light in a national park near Pretoria. The bat, an insect-eating species common in South Africa, has an echolocation call that moths can hear. Minnaar and his team determined both the species and quantity of available insect prey at the test sites using a hand-held net and a stationary trap. Cape serotine bats do not normally eat many moths. As the scientists expected, they caught more during the lighted trials than in the dark. What was surprising, however, was the discovery that the insects formed a greater share of the bats' diet during the lighted trials. The percentage of moths eaten in bright areas was six times larger than in dark zones, even though moths represented a smaller share of the total insect population under the lights than in the shade. © 2015 Nature Publishing Group

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 20565 - Posted: 02.09.2015

By Monique Brouillette When the first four-legged creatures emerged from the sea roughly 375 million years ago, the transition was anything but smooth. Not only did they have to adjust to the stress of gravity and the dry environment, but they also had to wait another 100 million years to evolve a fully functional ear. But two new studies show that these creatures weren’t deaf; instead, they may have used their lungs to help them hear. Fish hear easily underwater, as sound travels in a wave of vibration that freely passes into their inner ears. If you put a fish in air, however, the difference in the density of the air and tissue is so great that sound waves will mostly be reflected. The modern ear adapted by channeling sound waves onto an elastic membrane (the eardrum), causing it to vibrate. But without this adaptation, how did the first land animals hear? To answer this question, a team of Danish researchers looked at one of the closest living relatives of early land animals, the African lungfish (Protopterus annectens). As its name suggests, the lungfish is equipped with a pair of air-breathing lungs. But like the first animals to walk on land, it lacks a middle ear. The researchers wanted to determine if the fish could sense sound pressure waves underwater, so they filled a long metal tube with water and placed a loudspeaker at one end. They played sounds into the tube in a range of frequencies and carefully positioned the lungfish in areas of the tube where the sound pressure was high. Monitoring the brain stem and auditory nerve activity in the lungfish, the researchers were surprised to discover that the fish could detect pressure waves in frequencies above 200 Hz. © 2015 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 20551 - Posted: 02.05.2015

By ANDREW POLLACK Driving to a meeting in 2008, Jay Lichter, a venture capitalist, suddenly became so dizzy he had to pull over and call a friend to take him to the emergency room. The diagnosis: Ménière’s disease, a disorder of the inner ear characterized by debilitating vertigo, hearing loss and tinnitus, or ringing in the ears. But from adversity can spring opportunity. When Mr. Lichter learned there were no drugs approved to treat Ménière’s, tinnitus or hearing loss, he started a company, Otonomy. It is one of a growing cadre of start-ups pursuing drugs for the ear, an organ once largely neglected by the pharmaceutical industry. Two such companies, Otonomy and Auris Medical, went public in 2014. Big pharmaceutical companies like Pfizer and Roche are also exploring the new frontier. A clinical trial recently began of a gene therapy being developed by Novartis that is aimed at restoring lost hearing. The sudden flurry of activity has not yet produced a drug that improves hearing or silences ringing in the ears, but some companies are reporting hints of promise in early clinical trials. There is a huge need, some experts say. About 48 million Americans have a meaningful hearing loss in at least one ear; 30 million of them have it in both ears, said Dr. Frank R. Lin, an associate professor of otolaryngology and geriatric medicine at Johns Hopkins University. That figure is expected to increase as baby boomers grow older. © 2015 The New York Times Company

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 20466 - Posted: 01.10.2015

By Susan Milius In nighttime flying duels, Mexican free-tailed bats make short, wavering sirenlike waaoo-waaoo sounds that jam each other’s sonar. These “amazing aerial battles” mark the first examples of echolocating animals routinely sabotaging the sonar signals of their own kind, says Aaron Corcoran of Wake Forest University in Winston-Salem, N.C. Many bats, like dolphins, several cave-dwelling birds and some other animals, locate prey and landscape features by pinging out sounds and listening for echoes. Some prey, such as tiger moths, detect an incoming attack and make frenzied noises that can jam bat echolocation, Corcoran and his colleagues showed in 2009 (SN: 1/31/09, p. 10). And hawkmoths under attack make squeaks with their genitals in what also may be defensive jamming (SN Online: 7/3/13). But Corcoran didn’t expect bat-on-bat ultrasonic warfare. He was studying moths dodging bats in Arizona’s Chiricahua Mountains when his equipment picked up a feeding buzz high in the night sky. A free-tailed bat was sending faster and faster echolocation calls to refine the target position during the final second of an attack. (Bats, the only mammals known with superfast muscles, can emit more than 150 sounds a second.) Then another free-tailed bat gave a slip-sliding call. Corcoran, in a grad student frenzy of seeing his thesis topic as relevant to everything, thought the call would be a fine way to jam a buzz. “Then I totally told myself that’s impossible — that’s too good to be true.” Five years later he concluded he wasn’t just hearing things. He and William Conner, also of Wake Forest, report in the Nov. 7 Science that the up-and-down call can cut capture success by about 70 percent. Using multiple microphones, he found that one bat jams another, swoops toward the moth and gets jammed itself. © Society for Science & the Public 2000 - 201

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 20435 - Posted: 12.20.2014