Links for Keyword: Hearing



Links 1 - 20 of 632

By Frank Swain Just what you need in the age of ubiquitous surveillance: the latest cochlear implants will allow users to stream audio directly from their iPhone into their cochlear nerve. Apple and implant manufacturer Cochlear have made “Made for iPhone” connectivity available for any hearing implants that use the next-generation Nucleus 7 sound processor. The advance means that these implants can also stream music and Netflix shows. The technology was first unveiled in 2014 when it was added to hearing aids such as the Starkey Halo and ReSound LiNX. But this is the first time it’s been linked into the central nervous system. While some cochlear implants already offer Bluetooth connectivity, these often require users to wear extra dongles or other intermediary devices to pick up digital signals, and then rebroadcast them to the hearing aid as radio. This technology simply beams the signal right into the brain. It’s also a better way to use Bluetooth. Bluetooth headsets have been commonplace since the early 2000s, but the energy-sapping technology has meant they are typically clunky devices with poor battery life. In 2014, Apple technicians developed a way to stream audio over the low-energy Bluetooth format used by wearables such as Fitbits. Now, tiny devices like hearing aids – and Apple’s AirPods – can stream audio signals for up to a week on a battery the size of an aspirin. © Copyright New Scientist Ltd.

Related chapters from BN8e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 24163 - Posted: 10.09.2017

Amber Dance Ninad Kothari’s workplace looks like something out of a sci-fi film. The graduate student at Johns Hopkins University works in a darkened, red-lit room, where he trains bats to fly through obstacle courses. Shielding within the walls keeps radio and other human-made signals from interfering with transmissions of the tiny electrical signals he’s recording from the bats’ brains as the animals bob and weave. Layers of foam further insulate the cavelike lab against sound waves. An array of cameras and microphones completes the futuristic scene. The high-tech setup has its homemade touches, too: In one obstacle course, bats dodge dangling Quaker oatmeal cylinders. Kothari is part of a small cadre of neuroscientists who are getting the best sense yet of how bat brains work at a cellular level, thanks to modern technologies. Eavesdropping tools, which rely on tiny probes that track the activities of individual nerve cells, or neurons, are now miniaturized enough to outfit bats with head-mounted, wireless sensors. As the animals fly freely around the lab, the researchers can listen in on neurons. By allowing the bats to behave naturally, unencumbered by bulky equipment, scientists will discover exciting new facets of how bat brains work, says neuroscientist Nachum Ulanovsky of the Weizmann Institute of Science in Rehovot, Israel, who invented the new wireless sensors with colleagues. He and others, studying several different species of bats, are investigating how the flying mammals perceive their environment and navigate through it. © Society for Science & the Public 2000 - 2017

Related chapters from BN8e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 24089 - Posted: 09.21.2017

Patrick Barkham Humans trying to chat each other up in a noisy nightclub may find verbal communication futile. But it appears even more pointless for pumpkin toadlets after scientists discovered that females have lost the ability to hear the sound of male mating calls. An international team from Brazil, Denmark and the UK has discovered that the males of two species of tiny orange frogs continue to make high-pitched calls despite neither females nor males being able to hear them. It is believed to be the first case in the animal kingdom of a communication signal enduring even after its target audience has lost the ability to detect it. Field studies began in Brazil’s Atlantic forest by playing frog calls to determine how these species, which possess a middle ear, could hear their own calls. Lead researcher Dr Sandra Goutte at the Universidade Estadual de Campinas, São Paulo, was surprised to find that the frogs did not respond to her playbacks: they didn’t change their calling behaviour and didn’t even orient themselves towards the sounds. “I thought I would find the sound transmission pathway from the outside to the middle ear,” she said. “We didn’t think it would be possible that they would not be able to hear their own calls.” © 2017 Guardian News and Media Limited

Related chapters from BN8e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 14: Attention and Consciousness
Link ID: 24088 - Posted: 09.21.2017

by Helen Thompson Barn owl ears age well. Unlike other animals, the birds don’t suffer from hearing loss as a hallmark of aging, a new study suggests. Beyond people, age-related hearing loss has been documented in mice, gerbils and chinchillas. Those deficits are linked to deterioration of the tiny hair cells that line the sensory tissue of the inner ear. But some evidence hints that birds may not suffer from dips in hearing. Bianca Krumm and her colleagues at the University of Oldenburg in Germany tested the ear sensitivity of seven barn owls (Tyto alba) grouped by age. There weren’t significant differences in what 2-year-old owls could hear versus those age 13 or older, suggesting the birds’ ears remain intact despite age, the researchers conclude September 20 in Proceedings of the Royal Society B. While the exact mechanism for this apparent ear agelessness remains elusive, the researchers suspect that the birds must continuously regenerate sensory ear tissue — a process that wanes with age in other species. © Society for Science & the Public 2000 - 2017

Related chapters from BN8e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 24078 - Posted: 09.20.2017

Bruno Martin “I heard a thud behind me,” says zoologist Stefan Greif, recalling the first time he noticed a bat crash into a metal plate propped up against a wall in his lab’s flight chamber. Now, in a study published on 7 September in Science, a team led by Greif — of the Max Planck Institute for Ornithology in Seewiesen, Germany — explains why bats often slam into vertical panes, such as glass windows. These smooth surfaces interfere with bats’ echolocation by reflecting sound away from the creatures. Bats rely on echolocation to navigate in the dark. They locate and identify objects by sending out shrill calls and listening to the echoes that bounce back. Greif and his colleagues tested the echolocation of 21 wild-caught greater mouse-eared bats (Myotis myotis) in the lab. The researchers placed a featureless metal plate on a side wall at the end of a flight tunnel. The bats interpreted the smooth surface — but not the adjacent, felt-covered walls — as a clear flight path. Over an average of around 20 trials for each bat, 19 of them crashed into the panel at least once. The researchers also put up smooth, vertical plates near wild bat colonies, and saw similar results. The animals became confused owing to a property of smooth surfaces called ‘acoustic mirroring’. Whereas rough objects bounce some echoes back towards the bat, says Greif, a smooth surface reflects all echolocation calls away from the source. This makes a smooth wall appear as empty space to the bats, until they are directly in front of it. Only once a bat is facing the surface are its perpendicular echoes reflected back, which alerts the bat to its mistake. This explains why some bats attempted to swerve out of harm’s way at the last second — but often too late. © 2017 Macmillan Publishers Limited
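
As a rough illustration of the ‘acoustic mirroring’ idea described above (a toy model of my own, not taken from the study), the sketch below contrasts specular reflection from a smooth surface, which returns essentially no echo unless the call strikes close to head-on, with diffuse scattering from a rough surface, which returns some energy at any angle. The cosine-lobe model and its exponent are arbitrary illustrative choices.

```python
import math

# Toy model (not from the study): relative echo strength returned to a bat
# from a surface, as a function of the angle between the call direction and
# the surface normal (0 degrees = head-on).

def echo_strength(angle_deg: float, smooth: bool) -> float:
    """Crude relative echo strength on a 0-1 scale."""
    angle = math.radians(angle_deg)
    if smooth:
        # Specular ("mirror-like") reflection: energy bounces away unless the
        # call is nearly perpendicular to the surface; model as a narrow lobe.
        return max(0.0, math.cos(angle)) ** 20
    # Rough surface: diffuse scattering returns some energy at any incidence angle.
    return max(0.0, math.cos(angle))

for angle in (0, 30, 60):
    print(angle, "smooth:", round(echo_strength(angle, smooth=True), 3),
          "rough:", round(echo_strength(angle, smooth=False), 3))
```

At oblique angles the smooth surface returns almost nothing, so it sounds like open space until the bat is directly in front of it, which matches the behaviour the article describes.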

Related chapters from BN8e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 24048 - Posted: 09.08.2017

By Clare Wilson Some people who are blind can echolocate like bats, making clicks with their mouths that help them understand the environment around them. Now researchers are beginning to understand how this works, so more non-sighted people may one day be able to learn the technique. While many people who are blind get information from ambient echoes, only a few make noises themselves to echolocate. Some, such as Daniel Kish, are so proficient they can draw a sketch of a room after clicking their way around it, or even go mountain biking along unfamiliar routes. Previous research revealed that this human echolocation involves some brain areas that are used for vision in sighted people. Kish, who was blind almost from birth, thinks he experiences the sensations as something akin to images. “It’s not computational. There’s a real palpable experience of the image as a spatial representation – here are walls, here are the corners, here is the presence of objects.” In the latest study, Lore Thaler of Durham University, UK, and her team carried out the first in-depth acoustic analysis of the mouth clicks. They worked with Kish and two other blind echolocators from the Netherlands and Austria. © Copyright New Scientist Ltd.

Related chapters from BN8e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 24021 - Posted: 09.01.2017

Andrea Hsu Dan Fabbio was 25 and working on a master's degree in music education when he stopped being able to hear music in stereo. Music no longer felt the same to him. When he was diagnosed with a brain tumor, he immediately worried about cancer. Fortunately, his tumor was benign. Unfortunately, it was located in a part of the brain known to be active when people listen to and make music. Fabbio told his surgeon that music was the most important thing in his life. It was his passion as well as his profession. His surgeon understood. He's someone whose passion has been mapping the brain so he can help patients retain as much function as possible. Dr. Web Pilcher, chair of the Department of Neurosurgery at the University of Rochester Medical Center, and his colleague Brad Mahon, a cognitive neuroscientist, had developed a brain mapping program. Since 2011, they've used the program to treat all kinds of patients with brain tumors: mathematicians, lawyers, a bus driver, a furniture maker. Fabbio was their first musician. The idea behind the program is to learn as much as possible about the patient's life and the patient's brain before surgery to minimize damage to it during the procedure. "Removing a tumor from the brain can have significant consequences depending upon its location," Pilcher says. "Both the tumor itself and the operation to remove it can damage tissue and disrupt communication between different parts of the brain." © 2017 npr

Related chapters from BN8e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 5: The Sensorimotor System
Link ID: 24002 - Posted: 08.26.2017

Susan Milius Sonar pings from a hungry bat closing in can inspire hawkmoths to get their genitals trilling. The ultrasonic “eeeee” of scraping moth sex organs may serve as a last-second acoustic defense, says behavioral ecologist Jesse Barber of Boise State University in Idaho. In theory, the right squeak could jam bats’ targeting sonar, remind them of a noisy moth that tasted terrible or just startle them enough for the hawkmoth to escape. Males of at least three hawkmoth species in Malaysia squeak in response to recorded echolocation sounds of the final swoop in a bat attack, Barber and Akito Kawahara of the University of Florida in Gainesville report July 3 in Biology Letters. Female hawkmoths are hard to catch, but the few Barber and Kawahara have tested squeak too. Although they’re the same species as the males, they use their genitals in a different way to make ultrasound. Squeak power may have arisen during courtship and later proved useful during attacks. Until now, researchers knew of only two insect groups that talk back to bats: some tiger moths and tiger beetles. Neither is closely related to hawkmoths, so Barber speculates that anti-bat noises might be widespread among insects. Slowed-down video shows first the male and then the female hawkmoth creating ultrasonic trills at the tips of their abdomens. Males use a pair of claspers that grasp females in mating. To sound off, these quickly slide in and out of the abdomen, rasping specialized scales against the sides. Females rub the left and right sides of their abdominal structures together. J. Barber and A.Y. Kawahara. Hawkmoths produce anti-bat ultrasound. Biology Letters. Posted July 3, 2013. doi: 10.1098/rsbl.2013.0161 © Society for Science & the Public 2000 - 2017.

Related chapters from BN8e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 23864 - Posted: 07.24.2017

By Aylin Woodward See, hear. Our eardrums appear to move to shift our hearing in the same direction as our eyes are looking. Why this happens is unclear, but it may help us work out which objects we see are responsible for the sounds we can hear. Jennifer Groh at Duke University in Durham, North Carolina, and her team have been using microphones inserted into people’s ears to study how their eardrums change during saccades – the movement that occurs when we shift visual focus from one place to another. You won’t notice it, but our eyes go through several saccades a second to take in our surroundings. Examining 16 people, the team detected changes in ear canal pressure that were probably caused by middle-ear muscles tugging on the eardrum. These pressure changes indicate that when we look left, for example, the drum of our left ear gets pulled further into the ear and that of our right ear pushed out, before they both swing back and forth a few times. These changes to the eardrums began as early as 10 milliseconds before the eyes even started to move, and continued for a few tens of milliseconds after the eyes stopped. “We think that before actual eye movement occurs, the brain sends a signal to the ear to say ‘I have commanded the eyes to move 12 degrees to the right’,” says Groh. The eardrum movements that follow the change in focus may prepare our ears to hear sounds from a particular direction. © Copyright New Scientist Ltd.

Related chapters from BN8e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 10: Vision: From Eye to Brain
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 7: Vision: From Eye to Brain
Link ID: 23860 - Posted: 07.22.2017

Nicola Davis People who experience hearing loss could be at greater risk of memory and thinking problems later in life than those without auditory issues, research suggests. The study focused on people who were at risk of Alzheimer’s disease, revealing that those who were diagnosed with hearing loss had a higher risk of “mild cognitive impairment” four years later. “It’s really not mild,” said Clive Ballard, professor of age-related disease at the University of Exeter. “They are in the lowest 5% of cognitive performance and about 50% of those individuals will go on to develop dementia.” In findings presented at the Alzheimer’s Association International Conference in London, researchers from the US looked at the memory and thinking skills of 783 cognitively healthy participants in late middle age, more than two-thirds of whom had at least one parent who had been diagnosed with Alzheimer’s disease. The team carried out a range of cognitive tests on the participants over a four-year period, aimed at probing memory and mental processing, revealing that those who had hearing loss at the start of the study were more than twice as likely to be found to have mild cognitive impairment four years later than those with no auditory problems, once a variety of other risk factors were taken into account. Taylor Fields, a PhD student at the University of Wisconsin who led the research, said that the findings suggest hearing loss could be an early warning sign that an individual might be at greater risk of future cognitive impairment - but added more research was necessary to unpick the link. “There is something here and it should be looked into,” she said. © 2017 Guardian News and Media Limited

Related chapters from BN8e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 23840 - Posted: 07.17.2017

By Mo Costandi You can’t teach an old dog new tricks—or can you? Textbooks tell us that early infancy offers a narrow window of opportunity during which sensory experience shapes the way neuronal circuits wire up to process sound and other inputs. A lack of proper stimulation during this “critical period” has a permanent and detrimental effect on brain development. But new research shows the auditory system in the adult mouse brain can be induced to revert to an immature state similar to that in early infancy, improving the animals’ ability to learn new sounds. The findings, published Thursday in Science, suggest potential new ways of restoring brain function in human patients with neurological diseases—and of improving adults’ ability to learn languages and musical instruments. In mice, a critical period occurs during which neurons in a portion of the brain’s wrinkled outer surface, the cortex, are highly sensitized to processing sound. This state of plasticity allows them to strengthen certain connections within brain circuits, fine-tuning their auditory responses and enhancing their ability to discriminate between different tones. In humans, a comparable critical period may mark the beginning of language acquisition. But heightened plasticity declines rapidly, and this continues throughout life, making it increasingly difficult to learn. In 2011 Jay Blundon, a developmental neurobiologist at Saint Jude Children's Research Hospital, and his colleagues reported that the critical periods for circuits connecting the auditory cortex and the thalamus occur at about the same time. © 2017 Scientific American,

Related chapters from BN8e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 13: Memory, Learning, and Development
Link ID: 23793 - Posted: 06.30.2017

Elizabeth Hellmuth Margulis Whether tapping a foot to samba or weeping at a ballad, the human response to music seems almost instinctual. Yet few can articulate how music works. How do strings of sounds trigger emotion, inspire ideas, even define identities? Cognitive scientists, anthropologists, biologists and musicologists have all taken a crack at that question (see go.nature.com/2sdpcb5), and it is into this line that Adam Ockelford steps. Comparing Notes draws on his experience as a composer, pianist, music researcher and, most notably, a music educator working for decades with children who have visual impairments or are on the autistic spectrum, many with extraordinary musical abilities. Through this “prism of the overtly remarkable”, Ockelford seeks to shed light on music perception and cognition in all of us. Existing models based on neurotypical children could overlook larger truths about the human capacity to learn and make sense of music, he contends. Some of the children described in Comparing Notes might (for a range of reasons) have trouble tying their shoelaces or carrying on a basic conversation. Yet before they hit double digits in age, they can hear a complex composition for the first time and immediately play it on the piano, their fingers flying to the correct notes. This skill, Ockelford reminds us, eludes many adults with whom he studied at London's Royal Academy of Music. Weaving together the strands that let these children perform such stunning feats, Ockelford constructs an argument for rethinking conventional wisdom on music education. He positions absolute pitch (AP) as central to these abilities to improvise, listen and play. © 2017 Macmillan Publishers Limited

Related chapters from BN8e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 23744 - Posted: 06.15.2017

Paula Span A few years hence, when you’ve finally tired of turning up the TV volume and making dinner reservations at 5:30 p.m. because any later and the place gets too loud, you may go shopping. Perhaps you’ll head to a local boutique called The Hear Better Store, or maybe Didja Ear That? (Reader nominees for kitschy names invited.) Maybe you’ll opt for a big-box retailer or a kiosk at your local pharmacy. If legislation now making its way through Congress succeeds, these places will all offer hearing aids. You’ll try out various models — they’ll all meet newly established federal requirements — to see what seems to work and feel best. Your choices might include products from big consumer electronics specialists like Apple, Samsung and Bose. If you want assistance, you might pay an audiologist to provide customized services, like adjusting frequencies or amplification levels. But you won’t need to go through an audiologist-gatekeeper, as you do now, to buy hearing aids. The best part of this over-the-counter scenario: Instead of spending an average of $1,500 to $2,000 per device (and nearly everyone needs two), you’ll find that the price has plummeted. You might pay $300 per ear, maybe even less. So many people will be using these new over-the-counter hearing aids — along with the hordes wearing earbuds for other reasons — that you won’t feel self-conscious. You’ll blend right in. That, at least, represents the future envisioned by supporters of the Over-the-Counter Hearing Aid Act of 2017, which would give the Food and Drug Administration three years to create a regulatory category for such devices and to establish standards for safety, effectiveness and labeling.

Related chapters from BN8e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 23736 - Posted: 06.13.2017

By Lore Thaler, Liam Norman Echolocation is probably most associated with bats and dolphins. These animals emit bursts of sounds and listen to the echoes that bounce back to detect objects in their environment and to perceive properties of the objects (e.g. location, size, material). Bats, for example, can tell the distance of objects with high precision using the time delay between emission and echo, and are able to determine a difference in distance as small as one centimeter. This is needed for them to be able to catch insects in flight. People, remarkably, can also echolocate. By making mouth clicks, for example, and listening for the returning echoes, they can perceive their surroundings. Humans, of course, cannot hear ultrasound, which may put them at a disadvantage. Nonetheless, some people have trained themselves to an extraordinary level. Daniel Kish, who is blind and is a well-known expert echolocator, is able to ride his bicycle, hike in unfamiliar terrain, and travel in unfamiliar cities on his own. Daniel is the founder and president of World Access for the Blind, a non-profit charity in the US that offers training in echolocation alongside training in other mobility techniques such as the long cane. Since 2011, the scientific interest in human echolocation has gained momentum. For example, technical advances have made it feasible to scan people’s brains while they echolocate. This research has shown that people who are blind and have expertise in echolocation use ‘visual’ parts of their brain to process information from echoes. It has also been found that anyone with normal hearing can learn to use echoes to determine the sizes, locations, or distance of objects or to use it to avoid obstacles during walking. Remarkably, both blind and sighted people can improve their ability to interpret and use sound echoes within a session or two. © 2017 Scientific American
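
To make the one-centimeter figure above concrete, here is a minimal sketch (my own illustration, not from the article) of the echo-delay arithmetic: with sound travelling at roughly 343 m/s in air, a one-centimeter difference in target distance changes the round-trip echo delay by only about 58 microseconds, which is the timing difference an echolocator would have to resolve. The constant and function names are just for illustration.

```python
# Illustrative sketch (not from the article): ranging by echolocation comes
# down to converting a round-trip echo delay into a distance. Speed of sound
# in air at room temperature is assumed to be ~343 m/s.

SPEED_OF_SOUND_M_PER_S = 343.0

def echo_delay(distance_m: float) -> float:
    """Round-trip delay, in seconds, for an echo from a target distance_m metres away."""
    return 2.0 * distance_m / SPEED_OF_SOUND_M_PER_S

# A 1 cm difference in distance changes the round-trip delay by only ~58 microseconds,
# which is the timing resolution needed to tell the two targets apart.
delta = echo_delay(1.01) - echo_delay(1.00)
print(f"Extra delay for a target 1 cm farther away: {delta * 1e6:.1f} microseconds")
```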

Related chapters from BN8e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 23570 - Posted: 05.04.2017

By Chris Baraniuk Bat-detecting drones could help us find out what the animals get up to when flying. Ultrasonic detectors on drones in the air and on the water are listening in on bat calls, in the hope of discovering more about the mammals’ lives beyond the reach of ground-based monitoring devices. Drone-builder Tom Moore and bat enthusiast Tom August have developed three different drones to listen for bat calls while patrolling a pre-planned route. Since launching the scheme, known as Project Erebus, in 2014, they have experimented with two flying drones and one motorised boat, all equipped with ultrasonic detectors. The pair’s latest tests have demonstrated the detection capabilities of the two airborne drone models: a quadcopter and a fixed-wing drone. Last month, the quadcopter successfully followed a predetermined course and picked up simulated bat calls produced by an ultrasonic transmitter. Moore says one of the major hurdles is detecting the call of bats over the noise of the drones’ propellers, which emit loud ultrasonic frequencies. They overcame this with the quadcopter by dangling the detector underneath the body and rotors of the drone. This is not such a problem for the water-based drone. Last year, Moore and August tested a remote-controlled boat in Oxfordshire, UK, and picked up bat calls thought to belong to common pipistrelle and Daubenton’s bats. The different species often emit different ultrasonic frequencies. © Copyright Reed Business Information Ltd.

Related chapters from BN8e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 23524 - Posted: 04.22.2017

By C. Claiborne Ray The yellow stuff in the outer part of the ear canal, scientifically named cerumen, is only partly a waxy substance, according to the National Institute on Deafness and Other Communication Disorders. The rest of the so-called wax is an accretion of some dust and lots of dead skin cells, which normally collect in the passage as they are shed. The waxy part, which holds the compacted waste together and smooths the way for it to leave the ear, comes from the ceruminous glands, which secrete lipids and other substances. They are specialized sweat glands just under the surface of the skin in the outer part of the canal. Besides lubricating the skin of the canal while keeping it dry, the lipids also help maintain a protective acidic coating, which helps kill bacteria and fungi that can cause infection and irritation. The normal working of muscles in the head, especially those that move the jaw, helps guide the wax outward along the ear canal. The ceruminous glands commonly shrink in old age, producing less of the lipids and making it harder for waste to leave the ear. Excess wax buildup can usually be safely softened with warm olive or almond oil or irrigated with warm water, though specialized softening drops are also sold. Take care not to compress the buildup further with cotton swabs or other tools. If it cannot be safely removed, seek medical help. © 2017 The New York Times Company

Related chapters from BN8e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 23472 - Posted: 04.11.2017

By David Owen When my mother’s mother was in her early twenties, a century ago, a suitor took her duck hunting in a rowboat on a lake near Austin, Texas, where she grew up. He steadied his shotgun by resting the barrel on her right shoulder—she was sitting in the bow—and when he fired he not only missed the duck but also permanently damaged her hearing, especially on that side. The loss became more severe as she got older, and by the time I was in college she was having serious trouble with telephones. (“I’m glad it’s not raining!” I’d shout, for the third or fourth time, while my roommates snickered.) Her deafness probably contributed to one of her many eccentricities: ending phone conversations by suddenly hanging up. I’m a grandparent myself now, and lots of people I know have hearing problems. A guy I played golf with last year came close to making a hole in one, then complained that no one in our foursome had complimented him on his shot—even though, a moment before, all three of us had complimented him on his shot. (We were walking behind him.) The man who cuts my wife’s hair began wearing two hearing aids recently, to compensate for damage that he attributes to years of exposure to professional-quality blow-dryers. My sister has hearing aids, too. She traces her problem to repeatedly listening at maximum volume to Anne’s Angry and Bitter Breakup Song Playlist, which she created while going through a divorce. My ears ring all the time—a condition called tinnitus. I blame China, because the ringing started, a decade ago, while I was recovering from a monthlong cold that I’d contracted while breathing the filthy air in Beijing, and whose symptoms were made worse by changes in cabin pressure during the long flight home. Tinnitus is almost always accompanied by hearing loss. My internist ordered an MRI, to make sure I didn’t have a brain tumor, and held up a vibrating tuning fork and asked me to tell him when I could no longer hear it. After a while, he leaned forward to make sure the tuning fork was still humming, since he himself could no longer hear it. (We’re about the same age.) © 2017 Condé Nast.

Related chapters from BN8e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 23434 - Posted: 03.31.2017

By Catherine Offord Recognizing when you’re singing the right notes is a crucial skill for learning a melody, whether you’re a human practicing an aria or a bird rehearsing a courtship song. But just how the brain executes this sort of trial-and-error learning, which involves comparing performances to an internal template, is still something of a mystery. “It’s been an important question in the field for a long time,” says Vikram Gadagkar, a postdoctoral neurobiologist in Jesse Goldberg’s lab at Cornell University. “But nobody’s been able to find out how this actually happens.” Gadagkar suspected, as others had hypothesized, that internally driven learning might rely on neural mechanisms similar to traditional reward learning, in which an animal learns to anticipate a treat based on a particular stimulus. When an unexpected outcome occurs (such as receiving no treat when one was expected), the brain takes note via changes in dopamine signaling. So Gadagkar and his colleagues investigated dopamine signaling in a go-to system for studying vocal learning, male zebra finches. First, the researchers used electrodes to record the activity of dopaminergic neurons in the ventral tegmental area (VTA), a brain region important in reward learning. Then, to mimic singing errors, they used custom-written software to play over, and thus distort, certain syllables of that finch’s courtship song while the bird practiced. “Let’s say the bird’s song is ABCD,” says Gadagkar. “We distort one syllable, so it sounds like something between ABCD and ABCB.” © 1986-2017 The Scientist
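
For readers unfamiliar with the reward-prediction-error idea mentioned above, here is a textbook-style sketch (my own illustration, not the study's model or data) of how a dopamine-like error signal can be computed: the signal is the difference between the reward received and the reward expected, and withholding an expected reward produces a negative dip. The learning rate and trial counts are arbitrary.

```python
# Textbook-style sketch (not the study's model): a Rescorla-Wagner style
# prediction error, the quantity dopamine neurons are thought to track.

EXPECTED = {"cue": 0.0}   # learned value of the cue, updated across trials
LEARNING_RATE = 0.1

def trial(reward: float) -> float:
    """Run one cue-reward trial; return the prediction error (the 'dopamine' signal)."""
    error = reward - EXPECTED["cue"]          # positive: better than expected; negative: worse
    EXPECTED["cue"] += LEARNING_RATE * error  # update the expectation toward the outcome
    return error

# The cue is rewarded repeatedly, so the expectation climbs toward 1...
for _ in range(30):
    trial(reward=1.0)

# ...then the expected reward is withheld: the error dips strongly negative,
# analogous to the dopamine dip when an expected outcome fails to occur.
print(round(trial(reward=0.0), 3))
```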

Related chapters from BN8e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 12: Sex: Evolutionary, Hormonal, and Neural Bases
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 8: Hormones and Sex
Link ID: 23426 - Posted: 03.30.2017

By Tim Falconer I’ve spent my career bothering people. As a journalist and author, I hang around and watch what folks do, and I ask too many questions, some better than others. Later, I have follow-up queries and clarification requests, and I bug them for those stats they promised to provide me. But something different happened when I started researching congenital amusia, the scientific term for tone deafness present at birth, for my new book, Bad Singer. The scientists were as interested in me as I was in them. My idea was to learn to sing and then write about the experience as a way to explore the science of singing. After my second voice lesson, I went to the Université de Montréal’s International Laboratory for Brain, Music, and Sound Research (BRAMS). I fully expected Isabelle Peretz, a pioneer in amusia research, to say I was just untrained. Instead, she diagnosed me as amusic. “So this means what?” I asked. “We would love to test you more.” The BRAMS researchers weren’t alone. While still at Harvard’s Music and Neuroimaging Lab, Psyche Loui—who now leads Wesleyan University’s Music, Imaging, and Neural Dynamics (MIND) Lab—identified a neural pathway called the arcuate fasciculus as the culprit of congenital amusia. So I emailed her to set up an interview. She said sure—and asked if I’d be willing to undergo an fMRI scan. And I’d barely started telling my story to Frank Russo, who runs Ryerson University’s Science of Music, Auditory Research, and Technology (SMART) Lab in Toronto, before he blurted out, “Sorry, I’m restraining myself from wanting to sign you up for all kinds of research and figuring what we can do with you.” © 1986-2017 The Scientist

Related chapters from BN8e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 23425 - Posted: 03.30.2017

By Bob Grant In the past decade, some bat species have been added to the ranks of “singing” animals, with complex, mostly ultrasonic vocalizations that, when slowed down, rival the tunes of some songbirds. Like birds, bats broadcast chirps, warbles, and trills to attract mates and defend territories. There are about 1,300 known bat species, and the social vocalizations of about 50 have been studied. Of those, researchers have shown that about 20 species seem to be singing, with songs that are differentiated from simpler calls by both their structural complexity and their function. Bats don’t sound like birds to the naked ear; most singing species broadcast predominately in the ultrasonic range, undetectable by humans. And in contrast to the often lengthy songs of avian species, the flying mammals sing in repeated bursts of only a few hundred milliseconds. Researchers must first slow down the bat songs—so that their frequencies drop into the audible range—to hear the similarities. Kirsten Bohn, a behavioral biologist at Johns Hopkins University, first heard Brazilian free-tailed bats (Tadarida brasiliensis) sing more than 10 years ago, when she was a postdoc in the lab of Mike Smotherman at Texas A&M University. “I started hearing a couple of these songs slowed down,” she recalls. “And it really was like, ‘Holy moly—that’s a song! That sounds like a bird.’” The neural circuitry used to learn and produce song may also share similarities between bats and birds. Bohn and Smotherman say they’ve gathered some tantalizing evidence that bats use some of the same brain regions—namely, the basal ganglia and prefrontal cortex—that birds rely upon to produce, process, and perhaps even learn songs. “We have an idea of how the neural circuits control vocalizing in the bats and how they might be adapted to produce song,” Smotherman says. © 1986-2017 The Scientist

Related chapters from BN8e: Chapter 9: Hearing, Vestibular Perception, Taste, and Smell; Chapter 19: Language and Lateralization
Related chapters from MM:Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 15: Brain Asymmetry, Spatial Cognition, and Language
Link ID: 23369 - Posted: 03.17.2017