Links for Keyword: Hearing

Links 81 - 100 of 723

Dana Boebinger Roughly 15 percent of Americans report some sort of hearing difficulty; trouble understanding conversations in noisy environments is one of the most common complaints. Unfortunately, there’s not much doctors or audiologists can do. Hearing aids can amplify things for ears that can’t quite pick up certain sounds, but they don’t distinguish between the voice of a friend at a party and the music in the background. The problem is not only one of technology, but also of brain wiring. Most hearing aid users say that even with their hearing aids, they still have difficulty communicating in noisy environments. As a neuroscientist who studies speech perception, I see this issue throughout my own research, as well as that of many others. The reason isn’t that these listeners can’t hear the sounds; it’s that their brains can’t pick out the conversation from the background chatter. Harvard neuroscientists Dan Polley and Jonathon Whitton may have found a solution, by harnessing the brain’s incredible ability to learn and change itself. They have discovered that it may be possible for the brain to relearn how to distinguish between speech and noise. And the key to learning that skill could be a video game. People with hearing aids often report being frustrated with how their hearing aids handle noisy situations; it’s a key reason many people with hearing loss don’t wear hearing aids, even if they own them. People with untreated hearing loss – including those who don’t wear their hearing aids – are at increased risk of social isolation, depression and even dementia. © 2010–2018, The Conversation US, Inc.

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 13: Memory and Learning
Link ID: 24618 - Posted: 02.06.2018

By Jim Daley Researchers at the D’Or Institute for Research and Education in Brazil have created an algorithm that can use functional magnetic resonance imaging (fMRI) data to identify which musical pieces participants are listening to. The study, published last Friday (February 2) in Scientific Reports, involved six participants listening to 40 pieces of music from various genres, including classical, rock, pop, and jazz. “Our approach was capable of identifying musical pieces with improving accuracy across time and spatial coverage,” the researchers write in the paper. “It is worth noting that these results were obtained for a heterogeneous stimulus set . . . including distinct emotional categories of joy and tenderness.” The researchers first played different musical pieces for the participants and used fMRI to measure the neural signatures of each song. With that data, they taught a computer to identify brain activity that corresponded with the musical dimensions of each piece, including tonality, rhythm, and timbre, as well as a set of lower-level acoustic features. Then, the researchers played the pieces for the participants again while the computer tried to identify the music each person was listening to, based on fMRI responses. The computer was successful in decoding the fMRI information and identifying the musical pieces around 77 percent of the time when it had two options to choose from. When the researchers presented 10 possibilities, the computer was correct 74 percent of the time. © 1986-2018 The Scientist
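
The identification test described here amounts to a pairwise comparison: predict the fMRI pattern each candidate piece should evoke, then pick the candidate whose prediction best matches the measured response. Below is a minimal sketch of that logic using made-up data; the array sizes, the correlation metric, and the function name are assumptions for illustration, not the authors' actual pipeline.

```python
# Toy two-alternative identification: choose the candidate piece whose predicted
# voxel pattern correlates best with the measured fMRI response.
import numpy as np

def identify(measured, candidate_predictions):
    """Return the index of the candidate whose predicted pattern best matches."""
    scores = [np.corrcoef(measured, pred)[0, 1] for pred in candidate_predictions]
    return int(np.argmax(scores))

rng = np.random.default_rng(0)
true_response = rng.normal(size=500)                      # measured fMRI pattern for piece A
pred_a = true_response + rng.normal(scale=0.5, size=500)  # model's prediction for piece A
pred_b = rng.normal(size=500)                             # model's prediction for another piece
print(identify(true_response, [pred_a, pred_b]))          # usually 0, i.e. piece A
```

With ten candidate predictions in the list instead of two, the same comparison gives the ten-alternative version of the test.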

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell; Chapter 2: Functional Neuroanatomy: The Cells and Structure of the Nervous System
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 1: Cells and Structures: The Anatomy of the Nervous System
Link ID: 24617 - Posted: 02.06.2018

By Matt Warren The cheetah is built for running, with long limbs and powerful muscles that propel it along as it chases down its prey. But a new study has found that the world’s fastest land mammal has another, less obvious adaptation hidden away in its inner ear. Scientists suspected that the cheetah might also rely on a specialized vestibular system, the part of the inner ear that detects head movements and helps animals maintain their gaze and posture. Using computerized tomography scans, they created detailed 3D images of the inner ear from the skulls of cheetahs and other cat species, from leopards to domestic cats. They found that the vestibular system took up a much greater part of the inner ear in cheetahs than in any other cat. The cheetahs also had elongated semicircular canals, parts of the system involved in head movement and eye direction. These features help the animal catch dinner by letting it keep its head still and its eyes on the prize, even when the rest of its body is rapidly moving, the researchers write in Scientific Reports. The extinct giant cheetah did not have the same features, suggesting that the distinct vestibular system evolved fairly recently, they say. © 2018 American Association for the Advancement of Science

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 24607 - Posted: 02.03.2018

By Kimberly Hickok A rooster’s crow is so loud, it can deafen you if you stand too close. So how do the birds keep their hearing? To find out, researchers attached recorders to the heads of three roosters, just below the base of their skulls. The crows lasted 1 to 2 seconds and averaged more than 130 decibels. That’s about the same intensity as standing 15 meters away from a jet taking off. One rooster’s crows reached more than 143 decibels, which is more like standing in the middle of an active aircraft carrier. The researchers then used a micro–computerized tomography scan to create a 3D x-ray image of the birds’ skulls. When a rooster’s beak is fully open, as it is when crowing, a quarter of the ear canal completely closes and soft tissue covers 50% of the eardrum, the team reports in a paper in press at Zoology. This means roosters aren’t capable of hearing their own crows at full strength. The intensity of a rooster’s crow diminishes greatly with distance, so it probably doesn’t cause significant hearing loss in nearby hens. But even if it did, a hen would likely be OK. Unlike mammals, birds can quickly regenerate hair cells in the inner ear if they become damaged. © 2018 American Association for the Advancement of Science.
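
As a rough illustration of how quickly a crow's level falls off with distance, the sketch below applies the standard free-field rule of about 6 dB lost per doubling of distance. The 130 dB starting level and the 0.25 m reference distance are assumptions for illustration only; real farmyard acoustics (ground reflections, obstacles) will differ.

```python
# Back-of-the-envelope free-field attenuation: level(d) = level(d_ref) - 20*log10(d/d_ref).
import math

def spl_at_distance(spl_ref_db, d_ref_m, d_m):
    """Estimated sound pressure level at d_m metres, given spl_ref_db at d_ref_m metres."""
    return spl_ref_db - 20 * math.log10(d_m / d_ref_m)

for d in (0.25, 0.5, 1.0, 2.0, 4.0, 8.0):
    print(f"{d:>5.2f} m: {spl_at_distance(130.0, 0.25, d):6.1f} dB")
```

Under these assumed numbers, the estimate at 8 m is already about 30 dB below the level at the bird's own ear, consistent with the point that nearby hens receive a much weaker dose.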

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 24542 - Posted: 01.20.2018

Alison Abbott The brain’s navigation system — which keeps track of where we are in space — also monitors the movements of others, experiments in bats and rats suggest. In a study published in Science on 11 January, neuroscientists in Israel pinpoint individual brain cells that seem specialized to track other animals or objects. These cells occur in the same region of the brain — the hippocampus — as cells that are known to map a bat’s own location. In a second paper, scientists in Japan report finding similar brain activity when rats watched other rats moving. The unexpected findings deepen insight into the mammalian brain’s complex navigation system. Bats and rats are social animals that, like people, need to know the locations of other members of their group so that they can interact, learn from each other and move around together. Researchers have already discovered several different types of cell whose signals combine to tell an animal where it is: ‘place’ cells, for example, fire when animals are in a particular location, whereas other types correspond to speed or head direction, or even act as a kind of compass. The latest reports mark the first discovery of cells that are attuned to other animals, rather than the self. “Obviously, the whereabouts of others must be encoded somewhere in the brain, but it is intriguing to see that it seems to be in the same area that tracks self,” says Edvard Moser, a neuroscientist at the Kavli Institute for Systems Neuroscience in Trondheim, Norway, who shared the 2014 Nobel Prize in Physiology or Medicine for revealing elements of the navigation system. © 2018 Macmillan Publishers Limited
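
As a toy illustration of what it means for a cell to track another animal rather than the self, the sketch below compares how strongly a simulated neuron's firing depends on each animal's position. The binning, the variance-based tuning score, and the simulated data are assumptions for illustration, not the analyses used in either study.

```python
# Compare how much a cell's firing rate varies with "self" position versus "other" position.
import numpy as np

def position_tuning_score(spike_counts, positions, n_bins=10):
    """Variance of the mean firing rate across position bins; higher = more tuned."""
    edges = np.linspace(positions.min(), positions.max(), n_bins + 1)
    bin_idx = np.digitize(positions, edges)
    rates = [spike_counts[bin_idx == b].mean() for b in np.unique(bin_idx)]
    return float(np.var(rates))

rng = np.random.default_rng(1)
self_pos = rng.uniform(0, 100, size=5000)    # fake positions of the recorded animal (cm)
other_pos = rng.uniform(0, 100, size=5000)   # fake positions of the other animal (cm)
# Simulate an "other-tuned" cell that fires mostly when the *other* animal is near 30 cm.
spikes = rng.poisson(0.1 + 2.0 * np.exp(-((other_pos - 30.0) ** 2) / 50.0))

print("tuning to self: ", position_tuning_score(spikes, self_pos))    # near zero
print("tuning to other:", position_tuning_score(spikes, other_pos))   # much larger
```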

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell; Chapter 19: Language and Lateralization
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 15: Language and Lateralization
Link ID: 24523 - Posted: 01.12.2018

Hannah Devlin Science correspondent Deafness has been prevented in mice using gene editing for the first time, in an advance that could transform future treatment of genetic hearing loss. The study found that a single injection of a gene editing cocktail prevented progressive deafness in baby animals that were destined to lose their hearing. “We hope that the work will one day inform the development of a cure for certain forms of genetic deafness in people,” said Prof David Liu, who led the work at Harvard University and MIT. Nearly half of all cases of deafness have a genetic root, but current treatment options are limited. However, the advent of new high-precision gene editing tools such as Crispr has raised the prospect of a new class of therapies that target the underlying problem. The study, published in the journal Nature, focused on a mutation in a gene called Tmc1 – a single wrong letter in the genetic code – that causes the loss of the inner ear’s hair cells over time. The delicate hairs, which sit in a spiral-shaped organ called the cochlea, vibrate in response to sound waves. Nerve cells pick up the physical motion and transmit it to the brain, where it is perceived as sound. If a child inherits one copy of the mutated Tmc1 gene, they will suffer progressive hearing loss, normally starting in the first decade of life and resulting in profound deafness within 10 to 15 years. However, since most people affected by the mutation will also have a healthy version of the gene, inherited from their other parent, the scientists wanted to explore whether deleting the faulty version worked as a treatment. © 2017 Guardian News and Media Limited

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 24449 - Posted: 12.21.2017

by Ben Guarino Each year between February and June, the fish gather to spawn in Mexico's Colorado River Delta. The fish, a type of croaker called the Gulf corvina, meet in water as cloudy as chocolate milk. It's a reunion for the entire species, all members of which reproduce within a dozen-mile stretch of the delta. When the time is right, a few days before the new or full moons, the male fish begin to sing. To humans, the sound is machine guns going off just below the waterline. To female fish, the rapid burr-burr-burr is a Bing Crosby croon. Make that Bing cranked up to 11. Marine biologists who recorded the sound describe the animals as the “loudest fish ever documented,” said Timothy J. Rowell of the Scripps Institution of Oceanography in California. Rowell and Brad E. Erisman, a University of Texas at Austin fisheries scientist, spent four days in 2014 snooping on the fish with sonar and underwater microphones. The land surrounding the delta is desolate, Rowell said. Fresh water that once fed wild greenery has been diverted to faucets and hoses. But the delta is alive with the sound of fish. “When you arrive at the channels of the delta, you can hear it in the air even while the engine is running on the boat,” Rowell said. © 1996-2017 The Washington Post

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 24443 - Posted: 12.20.2017

Scientists have found a new way to explain the hearing loss caused by cisplatin, a powerful drug used to treat many forms of cancer. Using a highly sensitive technique to measure and map cisplatin in mouse and human inner ear tissues, researchers found that forms of cisplatin build up in the inner ear. They also found a region in the inner ear that could be targeted for efforts to prevent hearing loss from cisplatin. The study is published in Nature Communications, and was supported by the National Institute on Deafness and Other Communication Disorders (NIDCD), part of the National Institutes of Health. Cisplatin and similar platinum-based drugs are prescribed for an estimated 10 to 20 percent of all cancer patients. The NIH’s National Cancer Institute supported research that led to the 1965 discovery of cisplatin and continued development leading to its success as an essential weapon in the battle against cancer. The drugs cause permanent hearing loss in 40 to 80 percent of adult patients and at least half of children who receive the drug. The new findings help explain why cisplatin is so toxic to the inner ear, and why hearing loss gets worse after each treatment, can occur long after treatment, and is more severe in children than adults. “Hearing loss can have a major impact on a person’s life,” said James F. Battey, Jr., M.D., Ph.D., director of NIDCD. “Many adults with hearing loss struggle with social isolation and depression, among other conditions. Children who lose their hearing often have problems with social development and keeping up at school. Helping to preserve hearing in cancer patients who benefit from these drugs would be a major contribution to the quality of their lives.”

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 24442 - Posted: 12.20.2017

Can you hear this gif? Remember the white and gold dress that some internet users were certain was actually blue and black? Well, this time the dilemma being discussed online is whether you can hear anything in a silent animation of skipping pylons. The gif was created in 2008 by @IamHappyToast as part of a photoshop challenge on the boards of b3ta.com and has been circulating online since then - such as on Reddit's r/noisygifs subreddit in 2013. Many social media users have discussed the noisy-gif phenomenon, as on Imgur in 2011, for example, where it was titled an "optical illusion for the ears". It resurfaced last weekend when Dr Lisa DeBruine from the Institute of Neuroscience & Psychology at the University of Glasgow posted it on Twitter, asking her followers to describe whether they experienced any auditory sensations while watching it. One person who suffers from ringing ears replied: "I hear a vibrating thudding sound, and it also cuts out my tinnitus during the camera shake." Others offered explanations as to why. One suggested it may have something to do with correlated neuronal activity: "The brain is 'expecting/predicting' what is coming visually and then fires a version of what it expects across the relevant senses. Also explains why some might 'feel' a physical shake." "My gut says the camera shake is responsible for the entire effect. Anything that shook the camera like that, would probably make the 'thud' sound," posted another Twitter user.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 24401 - Posted: 12.07.2017

By Marilla Steuter-Martin, CBC News By stimulating neural pathways, a team of Quebec researchers was able in a recent experiment to influence how much a group of twenty-somethings enjoyed their favourite music. The results of the Montreal Neurological Institute study might spur feelings of mistrust or dystopian images of mass-marketing mind control. But as McGill professor Alain Dagher tells it, the chances of this kind of technology being used to win over consumers are slim to none. "We don't have an interest to use this method to help sell music," Dagher, a co-author of the study, told CBC's All in a Weekend. Instead, he's hoping the process can be adapted to serve a more noble cause: in this case as an alternative treatment for mental illnesses such as depression or addiction. Dagher and his co-authors published the study last month in the journal Nature Human Behaviour. A group of 20 subjects in their 20s listened to music selected by the researchers and rated how much they enjoyed it, how it made them feel and how likely they would be to go out and buy it. Their brain responses were also measured. ©2017 CBC/Radio-Canada.

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 24386 - Posted: 12.04.2017

By Deirdre Sackett A few years ago, I watched a YouTube video called “Virtual Barbershop.” It was one of those viral videos that attempted to be somewhat educational. It featured (somewhat silly) barbershop sounds recorded with a special microphone that made the sounds appear as if in 3-D, to demonstrate how the brain localizes sounds. Although it was meant to be funny and a bit of a gag video, I noticed that some of the 3-D sounds actually relaxed me. In fact, I realized it was the same calming feeling I got when watching, of all things, Bob Ross’ “Joy of Painting” videos. Curious, I watched some of Bob’s YouTube videos, and sure enough, his soothing voice, brushing and tapping sounds, and calm, deliberate actions had me nearly falling asleep. By some happy little accident, I noticed a “recommended” video in the YouTube side bar called “Oh, such a good 3-D ASMR video.” I immediately felt relaxed upon hearing the sounds in the video, and even felt a small “tingle” in my head. That’s how I discovered that I had ASMR. ASMR? It sounds like some horrible affliction—an acronym for a weird, one-in-100 million condition. “Hi, I’m Deirdre, and I have ASMR.” What is it—and why is my brain tingling? © 2017 Scientific American,

Related chapters from BN: Chapter 15: Emotions, Aggression, and Stress; Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 11: Emotions, Aggression, and Stress; Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 24244 - Posted: 10.26.2017

By Frank Swain Just what you need in the age of ubiquitous surveillance: the latest cochlear implants will allow users to stream audio directly from their iPhone into their cochlear nerve. Apple and implant manufacturer Cochlear have made “Made for iPhone” connectivity available for any hearing implants that use the next-generation Nucleus 7 sound processor. The advance means that these implants can also stream music and Netflix shows. The technology was first unveiled in 2014 when it was added to hearing aids such as the Starkey Halo and ReSound LiNX. But this is the first time it’s been linked into the central nervous system. While some cochlear implants already offer Bluetooth connectivity, these often require users to wear extra dongles or other intermediary devices to pick up digital signals, and then rebroadcast them to the hearing aid as radio. This technology simply beams the signal right into the brain. It’s also a better way to use Bluetooth. Bluetooth headsets have been commonplace since the early 2000s, but the energy-sapping technology has meant they are typically clunky devices with poor battery life. In 2014, Apple technicians developed a way to stream audio over the low energy Bluetooth format used by wearables such as FitBits. Now, tiny devices like hearing aids – and Apple’s AirPods – can stream audio signals for up to a week on a battery the size of an aspirin. © Copyright New Scientist Ltd.

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 24163 - Posted: 10.09.2017

Amber Dance Ninad Kothari’s workplace looks like something out of a sci-fi film. The graduate student at Johns Hopkins University works in a darkened, red-lit room, where he trains bats to fly through obstacle courses. Shielding within the walls keeps radio and other human-made signals from interfering with the tiny electrical signals he’s recording from the bats’ brains as the animals bob and weave. Layers of foam further insulate the cavelike lab against sound waves. An array of cameras and microphones completes the futuristic scene. The high-tech setup has its homemade touches, too: In one obstacle course, bats dodge dangling Quaker oatmeal cylinders. Kothari is part of a small cadre of neuroscientists who are getting the best sense yet of how bat brains work at a cellular level, thanks to modern technologies. Eavesdropping tools, which rely on tiny probes that track the activities of individual nerve cells, or neurons, are now miniaturized enough to outfit bats with head-mounted, wireless sensors. As the animals fly freely around the lab, the researchers can listen in on neurons. By allowing the bats to behave naturally, unencumbered by bulky equipment, scientists will discover exciting new facets of how bat brains work, says neuroscientist Nachum Ulanovsky of the Weizmann Institute of Science in Rehovot, Israel, who invented the new wireless sensors with colleagues. He and others, studying several different species of bats, are investigating how the flying mammals perceive their environment and navigate through it. © Society for Science & the Public 2000 - 2017

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 24089 - Posted: 09.21.2017

Patrick Barkham Humans trying to chat each other up in a noisy nightclub may find verbal communication futile. But it appears even more pointless for pumpkin toadlets after scientists discovered that females have lost the ability to hear the sound of male mating calls. An international team from Brazil, Denmark and the UK has discovered that the males of two species of tiny orange frogs continue to make high-pitched calls despite neither females nor males being able to hear them. It is believed to be the first case in the animal kingdom of a communication signal enduring even after its target audience has lost the ability to detect it. Field studies began in Brazil’s Atlantic forest by playing frog calls to determine how these species, which possess a middle ear, could hear their own calls. Lead researcher Dr Sandra Goutte at the Universidade Estadual de Campinas, São Paulo, was surprised to find the frogs refused to respond to her playback communication, didn’t change their calling behaviour and didn’t even orient themselves towards the sounds. “I thought I would find the sound transmission pathway from the outside to the middle ear,” she said. “We didn’t think it would be possible that they would not be able to hear their own calls.” © 2017 Guardian News and Media Limited

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 14: Attention and Higher Cognition
Link ID: 24088 - Posted: 09.21.2017

by Helen Thompson Barn owl ears age well. Unlike other animals, the birds don’t suffer from hearing loss as a hallmark of aging, a new study suggests. Beyond people, age-related hearing loss has been documented in mice, gerbils and chinchillas. Those deficits are linked to deterioration of the tiny hair cells that line the sensory layer of the inner ear. But some evidence hints that birds may not suffer from dips in hearing. Bianca Krumm and her colleagues at the University of Oldenburg in Germany tested the ear sensitivity of seven barn owls (Tyto alba) grouped by age. There weren’t significant differences in what 2-year-old owls could hear versus those age 13 or older, suggesting the birds’ ears remain intact despite age, the researchers conclude September 20 in Proceedings of the Royal Society B. While the exact mechanism for this apparent ear agelessness remains elusive, the researchers suspect that the birds must continuously regenerate sensory ear tissue — a process that wanes with age in other species. © Society for Science & the Public 2000 - 2017

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 24078 - Posted: 09.20.2017

Bruno Martin “I heard a thud behind me,” says zoologist Stefan Greif, recalling the first time he noticed a bat crash into a metal plate propped up against a wall in his lab’s flight chamber. Now, in a study published on 7 September in Science, a team led by Greif — of the Max Planck Institute for Ornithology in Seewiesen, Germany — explains why bats often slam into vertical panes, such as glass windows. These smooth surfaces interfere with bats’ echolocation by reflecting sound away from the creatures. Bats rely on echolocation to navigate in the dark. They locate and identify objects by sending out shrill calls and listening to the echoes that bounce back. Greif and his colleagues tested the echolocation of 21 wild-caught greater mouse-eared bats (Myotis myotis) in the lab. The researchers placed a featureless metal plate on a side wall at the end of a flight tunnel. The bats interpreted the smooth surface — but not the adjacent, felt-covered walls — as a clear flight path. Over an average of around 20 trials for each bat, 19 of them crashed into the panel at least once. The researchers also put up smooth, vertical plates near wild bat colonies, and saw similar results. The animals became confused owing to a property of smooth surfaces called ‘acoustic mirroring’. Whereas rough objects bounce some echoes back towards the bat, says Greif, a smooth surface reflects all echolocation calls away from the source. This makes a smooth wall appear as empty space to the bats, until they are directly in front of it. Only once a bat is facing the surface are its perpendicular echoes reflected back, which alerts the bat to its mistake. This explains why some bats attempted to swerve out of harm’s way at the last second — but often too late. © 2017 Macmillan Publishers Limited
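
A toy geometric sketch of the 'acoustic mirroring' idea: a smooth surface reflects a call specularly (angle in equals angle out), so an echo only heads back toward the bat when the call strikes the surface nearly head-on, whereas a rough surface scatters some energy back at any angle. The 5-degree tolerance and the test angles below are illustrative assumptions, not measured values from the study.

```python
# Does any detectable echo return to the caller, for a smooth vs. rough surface?
def echo_returns_to_bat(angle_from_perpendicular_deg, smooth, tolerance_deg=5.0):
    """True if some echo travels back toward the calling bat."""
    if smooth:
        # Specular reflection: a return path exists only near perpendicular incidence.
        return abs(angle_from_perpendicular_deg) <= tolerance_deg
    # Diffuse reflection off a rough surface: some backscatter at every angle.
    return True

for angle in (0, 15, 45, 75):
    smooth = echo_returns_to_bat(angle, smooth=True)
    rough = echo_returns_to_bat(angle, smooth=False)
    print(f"{angle:2d} deg off perpendicular -> smooth surface: {smooth}, rough surface: {rough}")
```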

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 24048 - Posted: 09.08.2017

By Clare Wilson Some people who are blind can echolocate like bats, making clicks with their mouths that help them understand the environment around them. Now researchers are beginning to understand how this works, so non-sighted people may one day be able to learn the technique. While many people who are blind get information from ambient echoes, only a few make noises themselves to echolocate. Some, such as Daniel Kish, are so proficient they can draw a sketch of a room after clicking their way around it, or even go mountain biking along unfamiliar routes. Previous research revealed that this human echolocation involves some brain areas that are used for vision in sighted people. Kish, who was blind almost from birth, thinks he experiences the sensations as something akin to images. “It’s not computational. There’s a real palpable experience of the image as a spatial representation – here are walls, here are the corners, here is the presence of objects.” In the latest study, Lore Thaler of Durham University, UK, and her team carried out the first in-depth acoustic analysis of the mouth clicks. They worked with Kish and two other blind echolocators from the Netherlands and Austria. © Copyright New Scientist Ltd.

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 24021 - Posted: 09.01.2017

Andrea Hsu Dan Fabbio was 25 and working on a master's degree in music education when he stopped being able to hear music in stereo. Music no longer felt the same to him. When he was diagnosed with a brain tumor, he immediately worried about cancer. Fortunately, his tumor was benign. Unfortunately, it was located in a part of the brain known to be active when people listen to and make music. Fabbio told his surgeon that music was the most important thing in his life. It was his passion as well as his profession. His surgeon understood. He's someone whose passion has been mapping the brain so he can help patients retain as much function as possible. Dr. Web Pilcher, chair of the Department of Neurosurgery at the University of Rochester Medical Center, and his colleague Brad Mahon, a cognitive neuroscientist, had developed a brain mapping program. Since 2011, they've used the program to treat all kinds of patients with brain tumors: mathematicians, lawyers, a bus driver, a furniture maker. Fabbio was their first musician. The idea behind the program is to learn as much as possible about the patient's life and the patient's brain before surgery to minimize damage to it during the procedure. "Removing a tumor from the brain can have significant consequences depending upon its location," Pilcher says. "Both the tumor itself and the operation to remove it can damage tissue and disrupt communication between different parts of the brain." © 2017 npr

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell; Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 5: The Sensorimotor System
Link ID: 24002 - Posted: 08.26.2017

Susan Milius Sonar pings from a hungry bat closing in can inspire hawkmoths to get their genitals trilling. The ultrasonic “eeeee” of scraping moth sex organs may serve as a last-second acoustic defense, says behavioral ecologist Jesse Barber of Boise State University in Idaho. In theory, the right squeak could jam bats’ targeting sonar, remind them of a noisy moth that tasted terrible or just startle them enough for the hawkmoth to escape. Males of at least three hawkmoth species in Malaysia squeak in response to recorded echolocation sounds of the final swoop in a bat attack, Barber and Akito Kawahara of the University of Florida in Gainesville report July 3 in Biology Letters. Female hawkmoths are hard to catch, but the few Barber and Kawahara have tested squeak too. Although they’re the same species as the males, they use their genitals in a different way to make ultrasound. Squeak power may have arisen during courtship and later proved useful during attacks. Until now, researchers knew of only two insect groups that talk back to bats: some tiger moths and tiger beetles. Neither is closely related to hawkmoths, so Barber speculates that anti-bat noises might be widespread among insects. Slowed-down video shows first the male and then the female hawkmoth creating ultrasonic trills at the tips of their abdomens. Males use a pair of claspers that grasp females in mating. To sound off, these quickly slide in and out of the abdomen, rasping specialized scales against the sides. Females rub the left and right sides of their abdominal structures together. J. Barber and A.Y. Kawahara. Hawkmoths produce anti-bat ultrasound. Biology Letters. Posted July 3, 2013. doi: 10.1098/rsbl.2013.0161 © Society for Science & the Public 2000 - 2017.

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 23864 - Posted: 07.24.2017

By Aylin Woodward See, hear. Our eardrums appear to move to shift our hearing in the same direction as our eyes are looking. Why this happens is unclear, but it may help us work out which objects we see are responsible for the sounds we can hear. Jennifer Groh at Duke University in Durham, North Carolina, and her team have been using microphones inserted into people’s ears to study how their eardrums change during saccades – the movement that occurs when we shift visual focus from one place to another. You won’t notice it, but our eyes go through several saccades a second to take in our surroundings. Examining 16 people, the team detected changes in ear canal pressure that were probably caused by middle-ear muscles tugging on the eardrum. These pressure changes indicate that when we look left, for example, the drum of our left ear gets pulled further into the ear and that of our right ear pushed out, before they both swing back and forth a few times. These changes to the eardrums began as early as 10 milliseconds before the eyes even started to move, and continued for a few tens of milliseconds after the eyes stopped. “We think that before actual eye movement occurs, the brain sends a signal to the ear to say ‘I have commanded the eyes to move 12 degrees to the right’,” says Groh. The eardrum movements that follow the change in focus may prepare our ears to hear sounds from a particular direction. © Copyright New Scientist Ltd.
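
Signals this small are typically pulled out of noise by averaging the microphone trace time-locked to each saccade onset, so activity consistently coupled to the eye movement survives while unrelated fluctuations cancel. The sketch below shows that generic event-triggered-averaging idea with fabricated data; the sampling rate, window, and saccade times are assumptions for illustration, not the authors' pipeline.

```python
# Generic event-triggered average of an ear-canal microphone signal around saccade onsets.
import numpy as np

def saccade_triggered_average(mic, onset_samples, fs, pre_s=0.05, post_s=0.10):
    """Average mic snippets from pre_s before to post_s after each saccade onset."""
    pre, post = int(pre_s * fs), int(post_s * fs)
    snippets = [mic[t - pre:t + post]
                for t in onset_samples
                if t - pre >= 0 and t + post <= len(mic)]
    return np.mean(snippets, axis=0)

fs = 10_000                                            # assumed sampling rate, Hz
rng = np.random.default_rng(2)
mic = rng.normal(size=fs * 60)                         # fake 60 s ear-canal recording
onsets = np.arange(fs, len(mic) - fs, fs // 3)         # fake saccade onsets, roughly 3 per second
avg = saccade_triggered_average(mic, onsets, fs)
print(avg.shape)                                       # (1500,): -50 ms to +100 ms around onset
```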

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 7: Vision: From Eye to Brain
Link ID: 23860 - Posted: 07.22.2017