Links for Keyword: Hearing



Links 81 - 100 of 525

By Christie Wilcox Music has a remarkable ability to affect and manipulate how we feel. Simply listening to songs we like stimulates the brain’s reward system, creating feelings of pleasure and comfort. But music goes beyond our hearts to our minds, shaping how we think. Scientific evidence suggests that even a little music training when we’re young can shape how brains develop, improving the ability to differentiate sounds and speech. With education funding constantly on the rocks and tough economic times tightening many parents’ budgets, students often end up with only a few years of music education. Studies to date have focused on neurological benefits of sustained music training, and found many upsides. For example, researchers have found that musicians are better able to process foreign languages because of their ability to hear differences in pitch, and have incredible abilities to detect speech in noise. But what about the kids who only get sparse musical tutelage? Does picking up an instrument for a few years have any benefits? The answer from a study just published in the Journal of Neuroscience is a resounding yes. The team of researchers from Northwestern University’s Auditory Neuroscience Laboratory tested the responses of forty-five adults to different complex sounds ranging in pitch. The adults were grouped based on how much music training they had as children, either having no experience, one to five years of training, or six to eleven years of music instruction. © 2012 Scientific American

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 13: Memory, Learning, and Development
Link ID: 17188 - Posted: 08.22.2012

by Nicholas St. Fleur With their trumpet-like calls, elephants may seem like some of the loudest animals on Earth. But we can't hear most of the sounds they make. The creatures produce low-frequency noises between 1 and 20 hertz, known as infrasounds, that help them keep in touch over distances as large as 10 kilometers. A new study reveals for the first time how elephants produce these low notes. Scientists first discovered that elephants made infrasounds in the 1980s. The head female in a herd may produce the noises to guide her group's movements, whereas a male who’s in a mating state called musth might use the calls to thwart competition from other males. Mother elephants even rely on infrasounds to keep tabs on a separated calf, exchanging "I'm here" calls with the wayward offspring in a fashion similar to a game of Marco Polo. These noises, which fall below the hearing range for humans, are often accompanied by strong rumbles with slightly higher frequencies that people can hear. By recording the rumbles and then speeding up the playback, the scientists can increase the frequency of the infrasounds, making them audible. Researchers have speculated that the noises come from vibrations in the vocal folds of the elephant larynx. This could happen in two ways. In the first, called active muscular contraction (AMC), neural signals cause the muscles in the larynx to contract in a constant rhythm. Cats do this when they purr. The second possibility is known as the myoelastic-aerodynamic (MEAD) method, and it occurs when air flows through the vocal folds, causing them to vibrate—this also happens when humans talk. © 2010 American Association for the Advancement of Science
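The playback trick described above is easy to picture in code: playing the same samples back at a higher frame rate multiplies every frequency by the same factor, lifting an infrasonic rumble into the audible range. Below is a minimal sketch of that idea, not the researchers' own processing; the file names and the 10x factor are assumptions for illustration.

```python
# Minimal sketch of the speed-up playback trick: rewriting the same audio
# samples with a 10x frame rate shifts a 15 Hz infrasonic rumble up to an
# audible 150 Hz. File names and the factor are hypothetical.
import wave

SPEEDUP = 10  # assumed factor; the article does not specify one

with wave.open("elephant_rumble.wav", "rb") as src:
    params = src.getparams()
    frames = src.readframes(src.getnframes())

with wave.open("elephant_rumble_10x.wav", "wb") as dst:
    dst.setnchannels(params.nchannels)
    dst.setsampwidth(params.sampwidth)
    # A higher declared frame rate makes players step through the samples
    # faster, raising every frequency by the same factor.
    dst.setframerate(params.framerate * SPEEDUP)
    dst.writeframes(frames)
```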

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 17125 - Posted: 08.04.2012

by Nicholas St. Fleur A house fly couple settles down on the ceiling of a manure-filled cowshed for a romantic night of courtship and copulation. Unbeknownst to the infatuated insects, their antics have attracted the acute ears of a lurking Natterer's bat. But this eavesdropper is no pervert—he's a predator set on a two-for-one dinner special. As a new study reveals, the hungry bat swoops in on the unsuspecting flies, guided by the sound of their precoital "clicks." Previous studies of freshwater amphipods, water striders, and locusts have shown that mating can make animals more vulnerable to predators, but these studies did not determine why. A team from the Max Planck Institute for Ornithology in Germany, led by the late Björn Siemers, found that the bat-fly interactions in the cowshed provided clues for understanding what tips off a predator to a mating couple. The researchers observed a teenage horror film-like scene as Natterer's bats (Myotis nattereri) preyed on mating house flies (Musca domestica). Bats find prey primarily through two methods: echolocation and passive acoustics. For most bats, echolocation is the go-to tracking tool. They send out a series of high-frequency calls and listen for the echoes produced when the waves hit something. The researchers found that by using echolocation, bats could easily find and catch house flies midflight, yet they had difficulty hunting stationary house flies. © 2010 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 12: Sex: Evolutionary, Hormonal, and Neural Bases
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 8: Hormones and Sex
Link ID: 17085 - Posted: 07.24.2012

By WILLIAM J. BROAD Scientists have long known that man-made, underwater noises — from engines, sonars, weapons testing, and such industrial tools as air guns used in oil and gas exploration — are deafening whales and other sea mammals. The Navy estimates that loud booms from just its underwater listening devices, mainly sonar, result in temporary or permanent hearing loss for more than a quarter-million sea creatures every year, a number that is rising. Now, scientists have discovered that whales can decrease the sensitivity of their hearing to protect their ears from loud noise. Humans tend to do this with index fingers; scientists haven’t pinpointed how whales do it, but they have seen the first evidence of the behavior. “It’s equivalent to plugging your ears when a jet flies over,” said Paul E. Nachtigall, a marine biologist at the University of Hawaii who led the discovery team. “It’s like a volume control.” The finding, while preliminary, is already raising hopes for the development of warning signals that would alert whales, dolphins and other sea mammals to auditory danger. Peter Madsen, a professor of marine biology at Aarhus University in Denmark, said he applauded the Hawaiian team for its “elegant study” and the promise of innovative ways of “getting at some of the noise problems.” But he cautioned against letting the discovery slow global efforts to reduce the oceanic roar, which would aid the beleaguered sea mammals more directly. © 2012 The New York Times Company

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 17048 - Posted: 07.17.2012

People who are born deaf process the sense of touch differently than people who are born with normal hearing, according to research funded by the National Institutes of Health. The finding reveals how the early loss of a sense — in this case hearing — affects brain development. It adds to a growing list of discoveries that confirm the impact of experiences and outside influences in molding the developing brain. The study is published in the July 11 online issue of The Journal of Neuroscience. The researchers, Christina M. Karns, Ph.D., a postdoctoral research associate in the Brain Development Lab at the University of Oregon, Eugene, and her colleagues, show that deaf people use the auditory cortex to process touch stimuli and visual stimuli to a much greater degree than occurs in hearing people. The finding suggests that since the developing auditory cortex of profoundly deaf people is not exposed to sound stimuli, it adapts and takes on additional sensory processing tasks. "This research shows how the brain is capable of rewiring in dramatic ways," said James F. Battey, Jr., M.D., Ph.D., director of the National Institute on Deafness and Other Communication Disorders (NIDCD). "This will be of great interest to other researchers who are studying multisensory processing in the brain." Previous research, including studies performed by the lab director, Helen Neville, Ph.D., has shown that people who are born deaf are better at processing peripheral vision and motion. Deaf people may process vision using many different brain regions, especially auditory areas, including the primary auditory cortex. However, no one has tackled whether vision and touch together are processed differently in deaf people, primarily because in experimental settings it is more difficult to produce the kind of precise tactile stimuli needed to answer this question.

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 5: The Sensorimotor System
Link ID: 17035 - Posted: 07.12.2012

by Elizabeth Preston It's 20 million years ago in the forests of Argentina, and Homunculus patagonicus is on the move. The monkey travels quickly, swinging between tree branches as it goes. Scientists have a good idea of how Homunculus got around thanks to a new fossil analysis of its ear canals and those of 15 other ancient primates. These previously hidden passages reveal some surprises about the locomotion of extinct primates—including hints that our own ancestors spent their lives moving at a higher velocity than today's apes. Wherever skeletons of ancient primates exist, anthropologists have minutely analyzed arm, leg, and foot bones to learn about the animals' locomotion. Some of these primates seem to have bodies built for leaping. Others look like they moved more deliberately. But in species such as H. patagonicus, there's hardly anything to go on aside from skulls. That's where the inner ear canals come in. "The semicircular canals function essentially as angular accelerometers for the head," helping an animal keep its balance while its head jerks around, says Timothy Ryan, an anthropologist at Pennsylvania State University, University Park. In the new study, he and colleagues used computed tomography scans to peer inside the skulls of 16 extinct primates, spanning 35 million years of evolution, and reconstruct the architecture of their inner ears. © 2010 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 16951 - Posted: 06.23.2012

Children exposed to HIV in the womb may be more likely to experience hearing loss by age 16 than are their unexposed peers, according to scientists in a National Institutes of Health research network. The researchers estimated that hearing loss affects 9 to 15 percent of HIV-infected children and 5 to 8 percent of children who did not have HIV at birth but whose mothers had HIV infection during pregnancy. Study participants ranged from 7 to 16 years old. The researchers defined hearing loss as a detection threshold, averaged over four frequencies important for speech understanding (500, 1000, 2000, and 4000 Hertz), that was 20 decibels or more above the normal hearing level for adolescents or young adults in either ear. "Children exposed to HIV before birth are at higher risk for hearing difficulty, and it's important for them—and the health providers who care for them—to be aware of this," said George K. Siberry, M.D., of the Pediatric, Adolescent, and Maternal AIDS Branch of the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), the NIH institute that leads the research network. Compared to national averages for other children their age, children with HIV infection were about 200 to 300 percent more likely to have a hearing loss. Children whose mothers had HIV during pregnancy but who themselves were born without HIV were 20 percent more likely than their peers to have hearing loss. The study was published online in The Pediatric Infectious Disease Journal.
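To make the criterion above concrete, here is a minimal sketch (not the study's code) that averages detection thresholds over the four speech frequencies and applies the 20-decibel cutoff in either ear; the threshold values in the example are hypothetical.

```python
# Minimal sketch of the hearing-loss criterion described above: average the
# detection thresholds at 500, 1000, 2000, and 4000 Hz for each ear and flag
# a hearing loss if either ear's average is 20 dB or more above the normal
# hearing level. All numbers below are hypothetical, for illustration only.
SPEECH_FREQS_HZ = (500, 1000, 2000, 4000)
CUTOFF_DB_HL = 20  # dB above the normal level for adolescents/young adults

def pure_tone_average(thresholds_db_hl):
    """Average threshold (dB HL) over the four speech frequencies."""
    return sum(thresholds_db_hl[f] for f in SPEECH_FREQS_HZ) / len(SPEECH_FREQS_HZ)

def has_hearing_loss(left_ear, right_ear):
    """True if the averaged threshold in either ear meets the cutoff."""
    return any(pure_tone_average(ear) >= CUTOFF_DB_HL for ear in (left_ear, right_ear))

# Example: elevated thresholds in the left ear only.
left = {500: 25, 1000: 20, 2000: 30, 4000: 35}   # average = 27.5 dB HL
right = {500: 10, 1000: 5, 2000: 10, 4000: 15}   # average = 10.0 dB HL
print(has_hearing_loss(left, right))  # True
```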

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 13: Memory, Learning, and Development
Link ID: 16947 - Posted: 06.21.2012

By Susan Milius ALBUQUERQUE — Unbeknownst to humans, peacocks may be having infrasonic conversations. New recordings reveal that males showing off their feathers make deep rumbling sounds that are too low-pitched for humans to hear. Other peacocks hear it, though, Angela Freeman reported June 13 at the annual meeting of the Animal Behavior Society. When she played recordings of the newly discovered sound to peafowl, females looked alert and males were likely to shriek out a (human-audible) call. Peacocks are thus the first birds known to make and perceive noises below human hearing, Freeman said. “Really exciting,” said Roslyn Dakin of Queen’s University in Kingston, Canada, who studies the visual allure of peacock courtship. If peacocks can rumble, she suspects that other birds may be able to, too. “I don’t think this is a weird case,” she said. Such infrasound, or noise below 20 hertz, extends below the limit of human hearing. Biologists watched creatures such as elephants for centuries before recording technology uncovered the infrasound side of those animal conversations. But making infrasound doesn’t always mean communicating with it. Recordings have picked up infrasound from another bird, the capercaillie, but playing back the sounds to those birds has so far revealed no sign that they hear or care about their own infrasound. Freeman, an animal behaviorist at the University of Manitoba, was inspired to make detailed recordings of male peacocks by her coauthor’s impression that their fanned-out feather display curved slightly forward like a shallow satellite dish. © Society for Science & the Public 2000 - 2012

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 12: Sex: Evolutionary, Hormonal, and Neural Bases
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 8: Hormones and Sex
Link ID: 16936 - Posted: 06.20.2012

By Susan Milius New high-speed video of tropical bats swooping toward various frogs and toads shows that the predators deploy a sequence of senses during an attack, updating their judgment of prey so they avoid eating a toxic amphibian, says behavioral ecologist Rachel Page of the Smithsonian Tropical Research Institute in Gamboa, Panama. The bats proved hard to fool even when researchers played the call of a favorite edible frog while offering up another species, Page and her colleagues report in an upcoming Naturwissenschaften. In the tropics, various bats will nab a frog if given half a chance, but only the fringe-lipped species (Trachops cirrhosus) is known to follow frog calls, such as the “tuuun chuck” call of the túngara frog (Engystomops pustulosus). In tests in Panama, Page and her colleagues found that fringe-lipped bats turned aside in mid-air if researchers broadcast enticing túngara calls but offered up a cane toad (Rhinella marina), which is way too big for a bat to carry off. The possibility that incoming bats might use echolocation to avoid overweight prey intrigues bat specialist Brock Fenton at the University of Western Ontario in Canada. Early studies of these bats largely ignored possible last-minute echolocation, he says. The new tests also revealed that playing túngara calls while offering a right-sized but toxic leaf litter toad (Rhinella alata) led bats to catch and then drop the unpleasant prey. (Both bats and toads survived.) © Society for Science & the Public 2000 - 2012

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 16840 - Posted: 05.26.2012

By Rebecca Cheung Constant low-level noise might cause hearing problems, a new study in rats finds. The discovery, published online May 15 in Nature Communications, suggests that extended exposure to noise at levels usually deemed safe for human ears could actually impair sound perception. The findings are “definitely a warning flag,” says study coauthor Michael Merzenich, an integrative neuroscientist at the University of California, San Francisco. He adds that it will be important to find out whether people employed at factories where continuous low-intensity noise is emitted throughout the workday experience similar consequences. “The big picture is that there is no safe sound,” says Jos Eggermont, an auditory neuroscientist at the University of Calgary in Canada. Even sounds considered safe can cause damage if delivered in a repetitive way, he says. “There might be not-so-subtle effects that accumulate and affect communication and speech understanding.” It’s common knowledge that sustained exposure to louder noises — such as those above 85 decibels — or brief exposures to very loud noises above 100 decibels can cause inner ear damage and hearing impairments. But until recently, the impact of chronic, quieter sound hasn’t been well studied. In the new study, Merzenich and his colleague Xiaoming Zhou of East China Normal University in Shanghai exposed adult rats to 65-decibel sound — roughly at the higher end of normal human speech volume — for 10 hours daily. © Society for Science & the Public 2000 - 2012

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 16802 - Posted: 05.16.2012

DULL fingers? Blame your genes. It has just been discovered that sensitivity to touch is heritable, and apparently linked to hearing as well. Gary Lewin and colleagues at the Max Delbrück Center for Molecular Medicine in Berlin, Germany, measured touch in 100 healthy pairs of fraternal and identical twins. They tested finger sensitivity in two ways: by response to a high-frequency vibration and by the ability to identify the orientation of a very fine grating. Lewin's team found that up to 50 per cent of the variation in sensitivity to touch was genetically determined. Audio tests also showed that those with good hearing were more likely to have sensitive touch. The link between the two is logical, as both touch and hearing rely on sensory cells that detect mechanical forces. Next the researchers studied touch sensitivity in students with congenital deafness. They found that 1 in 5 also had impaired touch, indicating that some genes causing deafness may also dull the sense of touch. When they looked at a subset of individuals who were deaf and blind due to Usher syndrome, they found that mutations in a single gene, USH2A, caused both the disease and reduced sensitivity to touch (PLoS Biology, DOI: 10.1371/journal.pbio.1001318). The next step is to try to identify more genes that affect our sense of touch. "There are many more genes than just the one we found," says Lewin, adding that finding them "will hopefully show us more about the biology of touch". © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 5: The Sensorimotor System
Link ID: 16785 - Posted: 05.14.2012

By ANNE EISENBERG DIGITAL hearing aids can do wonders for faded hearing. But other devices can help, too, as audio technology adds new options to help people converse at a noisy restaurant, or talk quietly with a pharmacist at a crowded drugstore counter. Richard Einhorn, a composer who suddenly lost much of his hearing two years ago, relies on his hearing aid, of course, for general use. But when he is meeting friends at a busy coffee shop — where his hearing aid is not always good at distinguishing their voices amid the clatter — he removes it. He has a better solution. He pops on a pair of in-ear earphones and snaps a directional mike on his iPhone, which has an app to amplify and process sound. “I put the iPhone on the table,” he said. “I point it at whoever’s talking, and I can have conversations with them. Soon we forget the iPhone is sitting there.” Mr. Einhorn’s ad hoc solution to restaurant racket is a feasible one, said Jay T. Rubinstein, a professor of bioengineering and otolaryngology at the University of Washington. “It makes sense when you need to capture a speaker’s voice in a noisy environment,” he said. “A system that gives you a high-quality directional mike and good earphones can help people hear in a complex setting.” © 2012 The New York Times Company

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 16755 - Posted: 05.07.2012

by Jane J. Lee Whales use sound to communicate over entire oceans, search for food, and coordinate attacks. But just how baleen whales—a group that uses comblike projections from the roof of their mouth to catch food—heard these grunts and moans was something of a mystery. Toothed whales, including dolphins and porpoises, use lobes of fat connected to their jawbones and ears to pick up sounds. But in-depth analyses of baleen whales weren't previously possible because their sheer size made them impossible to fit into scanners that use computed tomography and magnetic resonance imaging, which analyze soft tissues. So in a new study, published online this month in The Anatomical Record, researchers focused on one of the smaller species, minke whales (Balaenoptera acutorostrata). They found that triangular patches of fat surrounding minke whale ears could be key to how they hear. They scanned seven minke whale heads in CT and MRI machines, created computer models of the ears and surrounding soft tissue, and dissected the whale noggins to reveal ear fat running from blubber just under the skin to the ear bones. This is similar to the arrangement found in toothed whales. The novel analysis allowed the authors to speculate that the ear fat in both toothed and baleen whales could have shared a common evolutionary origin. © 2010 American Association for the Advancement of Science.

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 16684 - Posted: 04.21.2012

By Larry Greenemeier Loud, concussive explosions on the battlefield may last only a few seconds, but many soldiers returning from combat in the Middle East are experiencing lingering symptoms that cause them to perceive sounds even when it is quiet. Doctors can do little to treat the problem—typically described as a ringing in the ears—because they lack an effective way of delivering medication to the inner ear. That could change in a few years, in the form of an implantable polymer-based microscale drug-release system that delivers medicine to the inner ear. Called tinnitus, the condition afflicts at least one in every 10 American adults and is the most common disability among Afghanistan and Iraq war veterans, according to the U.S. Department of Veterans Affairs (VA). Up to 40 percent of all veterans may be suffering from tinnitus, and the VA spends about $1 billion annually on disability payments for tinnitus, according to a study published last year in Nature. (Scientific American is part of Nature Publishing Group.) To address the problem, the U.S. Department of Defense has commissioned Draper Laboratory in Cambridge, Mass., to spend the next year fleshing out a concept for a small delivery device inserted near the membrane-covered window—no more than three millimeters in diameter—separating the middle ear from the inner ear. Once at the membrane, the device (essentially a polymer capsule, although Draper is not developing any of the medicines that might be placed inside) would release a drug into the cochlea, the tubular organ residing in the inner ear that enables us to hear. The plan is to embed wireless communications into the capsule so that a patient or doctor can control the dosage. After the capsule finishes delivering its supply of drugs, it would dissolve. © 2012 Scientific American

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 16565 - Posted: 03.24.2012

by Zoë Corbyn A tarsier could be screaming its head off and you would never know it. Uniquely among primates, some of the diminutive mammal's calls are made up of pure ultrasound. Marissa Ramsier of Humboldt State University in California and her colleagues were puzzled to sometimes hear no sound when Philippine tarsiers (Tarsius syrichta) opened their mouths as if to call. Placing 35 wild animals in front of an ultrasound detector revealed that what they assumed to be yawns were high-pitched screams beyond the range of human hearing. While some primates can emit and respond to calls with ultrasonic components, none are known to use only ultrasonic frequencies in a call. The dominant frequency of the Philippine tarsier's ultrasonic call was 70 kilohertz, amongst the highest recorded for any terrestrial mammal. They can hear up to 91 kHz, well beyond the 20 kHz limit of human hearing. Whales, dolphins, domestic cats and some bats and rodents are the only other mammals known to communicate in this way. Having the equivalent of a private communication channel could help tarsiers warn others of predators such as lizards, snakes and birds which can't detect such frequencies, says Ramsier. Eavesdropping on insects could also help them locate their prey. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 15: Language and Our Divided Brain
Link ID: 16361 - Posted: 02.09.2012

by Helen Fields Scientists have long wondered whether propeller and engine noises from big ships stress whales out. Now, thanks to a poop-sniffing dog and an accidental experiment born of a national tragedy, they may finally have their answer. Baleen whales use low-frequency sounds to communicate in the ocean. "They're in an environment where there's not a lot of light; they're underwater. They can't rely on eyesight like we do," says veterinarian Roz Rolland of the New England Aquarium in Boston. Some studies have found that whales alter their behavior and vocalizations when noise increases, and it stands to reason, she says, that noise pollution would hinder their ability to communicate and cause them stress. But because scientists can't control the amount of noise in the sea, that's been very hard to prove. Researchers couldn't stop traffic, but the September 2001 terrorist attacks did. At the time, Rolland was collecting feces of right whales in the Bay of Fundy in Canada so she could try to develop pregnancy tests and other ways to study the animals' reproduction. Animals break up their hormones and get rid of the leftovers in their poop, so feces can show whether an animal is pregnant and reveal its levels of stress. Blood samples would do the same, but feces are much easier to collect. In the first few days after the terrorist attacks, ship traffic in the region decreased dramatically. "There was nobody else there. It was like being on the primal ocean," Rolland says. The whales seem to have noticed the difference, too. The levels of stress hormones in their feces went down, suggesting that ship noise places whales chronically under strain. © 2010 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 11: Emotions, Aggression, and Stress
Link ID: 16360 - Posted: 02.09.2012

By JANE E. BRODY Hearing loss, a disability currently untreated in about 85 percent of those affected, may be the nation’s most damaging and costly sensory handicap. It is a hidden disability, often not obvious to others or even to those who have it. Its onset is usually insidious, gradually worsening over years and thus easily ignored. Most of those affected can still hear sounds and think the real problem is that people aren’t speaking clearly. They often ask others to speak up, repeat what was said or speak more slowly. Or they pretend they can hear, but their conversations may be filled with non sequiturs. As hearing worsens, they are likely to become increasingly frustrated and socially isolated. Unable to hear well in social settings, they gradually stop going to the theater, movies, places of worship, senior centers or parties or out to restaurants with friends or family. Social isolation, in turn, has been linked to depression and an increased risk of death from conditions like heart disease. And now there is another major risk associated with hearing problems: dementia and Alzheimer’s disease. This finding alone should prompt more people to get their hearing tested and, if found impaired, get properly fitted with aids that can help to keep them cognitively engaged. © 2012 The New York Times Company

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 13: Memory, Learning, and Development
Link ID: 16263 - Posted: 01.17.2012

By Dwayne Godwin and Jorge Cham © 2012 Scientific American

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 16260 - Posted: 01.16.2012

By Sarah C.P. Williams When a rattlesnake shakes its tail, does it hear the rattling? Scientists have long struggled to understand how snakes, which lack external ears, sense sounds. Now, a new study shows that sound waves cause vibrations in a snake’s skull that are then “heard” by the inner ear. “There’s been this enduring myth that snakes are deaf,” says neurobiologist Bruce Young of the University of Massachusetts, Lowell, who was not involved in the new research. “Behavioral studies have suggested that snakes can in fact hear, and now this work has gone one step further and explained how.” In humans, sound waves traveling through the air hit the eardrum, causing the movement of tiny bones and vibrations of tiny hair cells in the inner ear. These vibrations are then translated into nerve impulses that travel to the brain. Snakes have fully formed inner ear structures but no eardrum. Instead, their inner ear is connected directly to their jawbone, which rests on the ground as they slither. Previous studies have shown that vibrations traveling through the ground—such as the footsteps of predators or prey—cause vibrations in a snake’s jawbone, relaying a signal to the brain via that inner ear. It was still unclear, however, whether snakes could hear sounds traveling through the air. So biologist Christian Christensen of Aarhus University in Denmark took a closer look at one particular type of snake, the ball python (Python regius). Studying them wasn’t easy. “You can’t train snakes to respond to sounds with certain behaviors, like you might be able to do with mice,” says Christensen. Instead, he and his colleagues used electrodes attached to the reptiles’ heads to monitor the activity of neurons connecting the snakes’ inner ears to their brains. © 1996-2012 The Washington Post

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 16207 - Posted: 01.03.2012

by Sarah C. P. Williams When a rattlesnake shakes its tail, does it hear the rattling? Scientists have long struggled to understand how snakes, which lack external ears, sense sounds. Now, a new study shows that sound waves cause vibrations in a snake's skull that are then "heard" by the inner ear. "There's been this enduring myth that snakes are deaf," says neurobiologist Bruce Young of the University of Massachusetts, Lowell, who was not involved in the new research. "Behavioral studies have suggested that snakes can in fact hear, and now this work has gone one step further and explained how." In humans, sound waves traveling through the air hit the eardrum, causing the movement of tiny bones and vibrations of tiny hair cells in the inner ear. These vibrations are then translated into nerve impulses that travel to the brain. Snakes have fully formed inner ear structures but no eardrum. Instead, their inner ear is connected directly to their jawbone, which rests on the ground as they slither. Previous studies have shown that vibrations traveling through the ground—such as the footsteps of predators or prey—cause vibrations in a snake's jawbone, relaying a signal to the brain via that inner ear. It was still unclear, however, whether snakes could hear sounds traveling through the air. So biologist Christian Christensen of Aarhus University in Denmark took a closer look at one particular type of snake, the ball python (Python regius). Studying them wasn't easy. "You can't train snakes to respond to sounds with certain behaviors, like you might be able to do with mice," says Christensen. Instead, he and his colleagues used electrodes attached to the reptiles' heads to monitor the activity of neurons connecting the snakes' inner ears to their brains. Each time a sound was played through a speaker suspended above the snake's cage, the researchers measured whether the nerve relayed an electrical pulse (the snakes showed no outward response to the sounds). The nerve pulses were strongest, the researchers found, with frequencies between 80 and 160 hertz—around the frequency for the lowest notes of a cello, though not necessarily sounds that snakes encounter often in the wild. © 2010 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 16176 - Posted: 12.23.2011