Links for Keyword: Hearing



Links 1 - 20 of 722

By Lisa Sanders, M.D. “We were thinking about going bowling with the kids tomorrow,” the woman told her 43-year-old brother as they settled into their accustomed spots in the living room of their mother’s home in Chicago. It was late — nearly midnight — and he had arrived from Michigan to spend the days between Christmas and New Year’s with this part of his family. She and her husband and her brother grew up together and spent many late nights laughing and talking. She knew her brother was passionate about bowling. He had spent almost every day in his local alley two summers ago. So she was taken by surprise when he answered, “I can’t do that anymore.” Certainly, her brother had had a tough year. It seemed to start with his terrible heartburn. For most of his life, he had what he described as run-of-the-mill heartburn, usually triggered by eating late at night, and he would have to take a couple of antacid tablets. But that year his heartburn went ballistic. His mouth always tasted like metal. And the reflux of food back up the esophagus would get so bad that it would make him vomit. Nothing seemed to help. He quit drinking coffee. Quit drinking alcohol. Stopped eating spicy foods. He told his doctor, who started him on a medication known as a proton pump inhibitor (P.P.I.) to reduce the acid or excess protons his stomach made. That pill provided relief from the burning pain. But he still had the metallic taste in his mouth, still felt sick after eating. He still vomited several times a week. When he discovered that he wouldn’t throw up when he drank smoothies, he almost completely gave up solid foods. When he was still feeling awful after weeks on the P.P.I., his gastroenterologist used a tiny camera to take a look at his esophagus. His stomach looked fine, but the region where the esophagus entered the stomach was a mess. Normally the swallowing tube ends with a tight sphincter that stays closed to protect delicate tissue from the harsh acid of the stomach. It opens when swallowing, to let the food pass. But his swallowing tube was wide open and the tissue around the sphincter was red and swollen. © 2024 The New York Times Company

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 29137 - Posted: 02.08.2024

By Gina Kolata Aissam Dam, an 11-year-old boy, grew up in a world of profound silence. He was born deaf and had never heard anything. While living in a poor community in Morocco, he expressed himself with a sign language he invented and had no schooling. Last year, after moving to Spain, his family took him to a hearing specialist, who made a surprising suggestion: Aissam might be eligible for a clinical trial using gene therapy. On Oct. 4, Aissam was treated at the Children’s Hospital of Philadelphia, becoming the first person to get gene therapy in the United States for congenital deafness. The goal was to provide him with hearing, but the researchers had no idea if the treatment would work or, if it did, how much he would hear. The treatment was a success, introducing a child who had known nothing of sound to a new world. “There’s no sound I don’t like,” Aissam said, with the help of interpreters during an interview last week. “They’re all good.” While hundreds of millions of people in the world live with hearing loss that is defined as disabling, Aissam is among those whose deafness is congenital. His is an extremely rare form, caused by a mutation in a single gene, otoferlin. Otoferlin deafness affects about 200,000 people worldwide. The goal of the gene therapy is to replace the mutated otoferlin gene in patients’ ears with a functional gene. Although it will take years for doctors to sign up many more patients — and younger ones — to further test the therapy, researchers said that success for patients like Aissam could lead to gene therapies that target other forms of congenital deafness. © 2024 The New York Times Company

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 29119 - Posted: 01.27.2024

By Shaena Montanari Around 2012, Jennifer Groh and her colleagues began a series of experiments investigating the effect of eye movements on auditory signals in the brain. It wasn’t until years later that they noticed something curious in their data: In both an animal model and in people, eye movements coincide with ripples across the eardrum. The finding, published in 2018, seemed “weird,” says Groh, professor of psychology and neuroscience at Duke University — and ripe for further investigation. “You can go your whole career never studying something that is anywhere near as beautifully regular and reproducible,” she says. “Signals that are really robust are unlikely to be just random.” A new experiment from Groh’s lab has now taken her observation a step further and suggests the faint sounds — dubbed “eye movement-related eardrum oscillations,” or EMREOs for short — serve to link two sensory systems. The eardrum oscillations contain “clean and precise” information about the direction of eye movements and, according to Groh’s working hypothesis, help animals connect sound with a visual scene. “The basic problem is that the way we localize visual information and the way we localize sounds leads to two different reference frames,” Groh says. EMREOs, she adds, play a part in relating those frames. The brain, and not the eyes, must generate the oscillations, Groh and her colleagues say, because they happen at the same time as eye movements, or sometimes even before. To learn more about the oscillations, the team placed small microphones in the ears of 10 volunteers, who then performed visual tasks while the researchers tracked their eye movements. The group published their results in Proceedings of the National Academy of Sciences in November. © 2024 Simons Foundation
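The kind of analysis described here can be pictured as a simple epoch-and-average computation: cut the ear-canal microphone recording into windows aligned to saccade onsets and average them, so any eardrum signal locked to eye movements survives while unrelated noise cancels. The sketch below is illustrative only; the microphone trace, saccade times, and the injected 30 Hz oscillation are simulated placeholders, not the study's data or parameters.

```python
# Illustrative epoch-and-average sketch with simulated data (not the
# study's recordings or parameters).
import numpy as np

fs = 10_000                                  # assumed microphone sample rate (Hz)
duration_s = 60
rng = np.random.default_rng(1)
mic = 0.1 * rng.standard_normal(fs * duration_s)   # placeholder ear-canal recording

# Simulated saccade onsets (in samples) and an injected low-frequency
# oscillation after each one, standing in for an EMREO.
onsets = np.arange(fs, fs * duration_s - fs, fs // 2)
window = np.arange(int(0.08 * fs))           # 80 ms analysis window
emreo = 0.05 * np.sin(2 * np.pi * 30 * window / fs)
for onset in onsets:
    mic[onset:onset + window.size] += emreo

# Epoch around each saccade and average: the saccade-locked oscillation
# stands out even though it is smaller than the background noise.
epochs = np.stack([mic[o:o + window.size] for o in onsets])
average = epochs.mean(axis=0)
print(f"peak of saccade-locked average: {np.abs(average).max():.3f}")
```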

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 29115 - Posted: 01.27.2024

Allison Aubrey Among the roughly 40 million adults in the U.S. who have hearing loss, most don't use hearing aids. This means they may be missing out on more than just good hearing. Research shows hearing loss, if left untreated, can increase the risk of frailty, falls, social isolation, depression and cognitive decline. One study from scientists at Johns Hopkins University found that even people with mild hearing loss doubled their risk of dementia. Now a new study finds that treating hearing loss with hearing aids may lengthen people's lives. Dr. Janet Choi, an otolaryngologist with Keck Medicine of USC, wanted to evaluate whether restoring hearing with hearing aids may increase the chances of living longer. Using data from the National Health and Nutrition Examination Survey, a large, national study, Choi and her colleagues tracked the status of nearly 1,900 adults who had been shown to have hearing loss during screenings. The participants completed questionnaires about their use of hearing aids. "The group of patients who were using hearing aids regularly had a 24% lower risk of mortality compared to the group who never use hearing aids," Choi says. Meaning, the participants who were in the habit of wearing hearing aids were significantly less likely to die early. The researchers had hypothesized this would be the case given all the studies pointing to the negative impacts of untreated hearing loss. But Choi says they did not expect such a big difference in mortality risk. "We were surprised," she says. Prior research has shown that age-related hearing loss – if untreated – can take its toll on physical and mental health. And a recent study found restoring hearing with hearing aids may slow cognitive decline among people at high risk. © 2024 npr

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 29079 - Posted: 01.06.2024

By Henkjan Honing In 2009, my research group found that newborns possess the ability to discern a regular pulse—the beat—in music. It’s a skill that might seem trivial to most of us but that’s fundamental to the creation and appreciation of music. The discovery sparked a profound curiosity in me, leading to an exploration of the biological underpinnings of our innate capacity for music, commonly referred to as “musicality.” In a nutshell, the experiment involved playing drum rhythms, occasionally omitting a beat, and observing the newborns’ responses. Astonishingly, these tiny participants displayed an anticipation of the missing beat, as their brains exhibited a distinct spike, signaling a violation of their expectations when a note was omitted. Yet, as with any discovery, skepticism emerged (as it should). Some colleagues challenged our interpretation of the results, suggesting alternate explanations rooted in the acoustic nature of the stimuli we employed. Others argued that the observed reactions were a result of statistical learning, questioning the validity of beat perception being a separate mechanism essential to our musical capacity. Infants actively engage in statistical learning as they acquire a new language, enabling them to grasp elements such as word order and common accent structures in their native language. Why would music perception be any different? To address these challenges, in 2015, our group decided to revisit and overhaul our earlier beat perception study, expanding its scope, method and scale, and, once more, decided to include, next to newborns, adults (musicians and non-musicians) and macaque monkeys. The results, recently published in Cognition, confirm that beat perception is a distinct mechanism, separate from statistical learning. The study provides converging evidence on newborns’ beat perception capabilities. In other words, the study was not simply a replication but utilized an alternative paradigm leading to the same conclusion. © 2023 NautilusNext Inc., All rights reserved.

Related chapters from BN: Chapter 8: General Principles of Sensory Processing, Touch, and Pain; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 5: The Sensorimotor System; Chapter 14: Attention and Higher Cognition
Link ID: 29067 - Posted: 12.27.2023

By Esther Landhuis When Frank Lin was in junior high, his grandma started wearing hearing aids. During dinner conversations, she was often painfully silent, and communicating by phone was nearly impossible. As a kid, Lin imagined “what her life would be like if she wasn’t always struggling to communicate.” It was around that time that Lin became interested in otolaryngology, the study of the ears, nose, and throat. He would go on to study to be an ENT physician, which, he hoped, could equip him to help patients with similar age-related hardships. Those aspirations sharpened during his residency at Johns Hopkins University School of Medicine in the late 2000s. Administering hearing tests in the clinic, Lin noticed that his colleagues had vastly different reactions to the same results in young versus old patients. If mild deficits showed up in a kid, “it would be like, ‘Oh, that hearing is critically important,’” said Lin, who today is the director of the Cochlear Center for Hearing and Public Health at Hopkins. But when they saw that same mild to moderate hearing loss in a 70-something patient, many would downplay the findings. Yet today, research increasingly suggests that untreated hearing loss puts people at higher risk for cognitive decline and dementia. And, unlike during Lin’s early training, many patients can now do something about it: They can assess their own hearing using online tests or mobile phone apps, and purchase over-the-counter hearing aids, which are generally more affordable than their predecessors and came under regulation by the Food and Drug Administration in October 2022. Despite this expanded accessibility, interest in direct-to-consumer hearing devices has lagged thus far — in part, experts suggest, due to physician inattention to adult hearing health, inadequate insurance coverage for hearing aids, and lingering stigma around the issue. (As Lin put it: “There’s always been this notion that everyone has it as you get older, how can it be important?”) Even now, hearing tests aren’t necessarily recommended for individuals unless they report a problem.

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell; Chapter 19: Language and Lateralization
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 15: Language and Lateralization
Link ID: 29064 - Posted: 12.27.2023

By Carolyn Wilke Newborn bottlenose dolphins sport a row of hairs along the tops of their jaws. But once the animals are weaned, the whiskers fall out. “Everybody thought these structures are vestigial — so without any function,” said Guido Dehnhardt, a marine mammal zoologist at the University of Rostock in Germany. But Dr. Dehnhardt and his colleagues have discovered that the pits left by those hairs can perceive electricity with enough sensitivity that they may help the dolphins snag fish or navigate the ocean. The team reported its findings Thursday in The Journal of Experimental Biology. Dr. Dehnhardt first studied the whisker pits of a different species, the Guiana dolphin. He expected to find the typical structures of hair follicles, but those were missing. Yet the pits were loaded with nerve endings. He and his colleagues realized that the hairless follicles looked like the electricity-sensing structures on sharks and found that one Guiana dolphin responded to electrical signals. They wondered whether other toothed cetaceans, including bottlenose dolphins, could also sense electricity. For the new study, the researchers trained two bottlenose dolphins to rest their jaws, or rostrums, on a platform and swim away anytime they experienced a sensory cue like a sound or a flash of light. If they didn’t detect one of these signals, the dolphins were to stay put. “It’s basically the same as when we go to the doctor’s and do a hearing test — we have to press a button as soon as we hear a sound,” said Tim Hüttner, a biologist at the Nuremberg Zoo in Germany and a study co-author. Once trained, the dolphins also received electrical signals. “The dolphins responded correctly on the first trial,” Dr. Hüttner said. The animals were able to transfer what they had learned, revealing that they could also detect electric fields. Further study showed that the dolphins’ sensitivity to electricity was similar to that of the platypus, which is thought to use its electrical sense for foraging. © 2023 The New York Times Company

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell; Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 5: The Sensorimotor System
Link ID: 29037 - Posted: 12.09.2023

By Paula Span A year ago, the Food and Drug Administration announced new regulations allowing the sale of over-the-counter hearing aids and setting standards for their safety and effectiveness. That step — which was supposed to take three years but required five — portended cheaper, high-quality hearing aids that people with mild to moderate hearing loss could buy online or at local pharmacies and big stores. So how’s it going? It’s a mixed picture. Manufacturers and retailers have become serious about making hearing aids more accessible and affordable. Yet the O.T.C. market remains confusing, if not downright chaotic, for the mostly older consumers the new regulations were intended to help. The past year also brought renewed focus on the importance of treating hearing loss, which affects two-thirds of people over age 70. Researchers at Johns Hopkins University published the first randomized clinical trial showing that hearing aids could help reduce the pace of cognitive decline. Some background: In 2020, the influential Lancet Commission on Dementia Prevention, Intervention and Care identified hearing loss as the greatest potentially modifiable risk factor for dementia. Previous studies had demonstrated a link between hearing loss and cognitive decline, said Dr. Frank Lin, an otolaryngologist and epidemiologist at Johns Hopkins and lead author of the new research. “What remained unanswered was, If we treat hearing loss, does it actually reduce cognitive loss?” he said. The ACHIEVE study (for Aging and Cognitive Health Evaluation in Elders) showed that, at least for a particular group of older adults, it could. Of nearly 1,000 people ages 70 to 84 with untreated mild to moderate hearing loss, half received hearing assessments from audiologists, were fitted with midpriced hearing aids and were counseled on how to use them for several months. The control group participated in a health education program. Over three years, the study found that hearing-aid use had scant effect on healthy volunteers at low risk of cognitive loss. But among participants who were older and less affluent, hearing aids reduced the rate of cognitive decline by 48 percent, compared with the control group, a difference the researchers deemed “clinically meaningful.” © 2023 The New York Times Company

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 13: Memory and Learning
Link ID: 28979 - Posted: 11.01.2023

By Claudia López Lloreda In what seems like something out of a sci-fi movie, scientists have plucked the famous Pink Floyd song “Another Brick in the Wall” from individuals’ brains. Using electrodes, computer models and brain scans, researchers previously have been able to decode and reconstruct individual words and entire thoughts from people’s brain activity (SN: 11/15/22; SN: 5/1/23). The new study, published August 15 in PLOS Biology, adds music into the mix, showing that songs can also be decoded from brain activity and revealing how different brain areas pick up an array of acoustical elements. The finding could eventually help improve devices that allow communication from people with paralysis or other conditions that limit one’s ability to speak. To decode the song, neuroscientist Ludovic Bellier of the University of California, Berkeley and colleagues analyzed the brain activity recorded by electrodes implanted in the brains of 29 individuals with epilepsy. While in the hospital undergoing monitoring for the disorder, the individuals listened to the 1979 rock song. People’s nerve cells, particularly those in auditory areas, responded to hearing the song, and the electrodes detected not only neural signals associated with words but also rhythm, harmony and other musical aspects, the team found. With that information, the researchers developed a computer model to reconstruct sounds from the brain activity data, and found that they could produce sounds that resemble the song. © Society for Science & the Public 2000–2023.
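For readers curious how a computer model can turn recorded brain activity back into something resembling sound, a common starting point is time-lagged linear regression from electrode features to an audio spectrogram. The sketch below is a minimal, hypothetical illustration with random placeholder data; the study's actual features, models, and preprocessing are considerably more elaborate.

```python
# Minimal decoding sketch: predict spectrogram frames from time-lagged
# neural features with ridge regression. All data are random placeholders.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_samples, n_electrodes, n_freq_bins, n_lags = 2000, 29, 32, 5

neural = rng.standard_normal((n_samples, n_electrodes))       # electrode features
spectrogram = rng.standard_normal((n_samples, n_freq_bins))   # target audio spectrogram

# Stack lagged copies of the neural features so each spectrogram frame is
# predicted from a short window of brain activity (edge wrap-around from
# np.roll is ignored in this toy example).
lagged = np.hstack([np.roll(neural, lag, axis=0) for lag in range(n_lags)])

train, test = slice(0, 1500), slice(1500, 2000)
model = Ridge(alpha=1.0).fit(lagged[train], spectrogram[train])
reconstruction = model.predict(lagged[test])

# Correlating predicted and actual frames is a standard way to score the
# reconstruction; it is near zero here because the placeholders are random.
corr = np.corrcoef(reconstruction.ravel(), spectrogram[test].ravel())[0, 1]
print(f"reconstruction correlation: {corr:.2f}")
```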

Related chapters from BN: Chapter 2: Functional Neuroanatomy: The Cells and Structure of the Nervous System; Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 2: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 28876 - Posted: 08.19.2023

By Elizabeth Preston Some things need no translation. No matter what language you speak, you can probably recognize a fellow human who is cheering in triumph or swearing in anger. If you are a crocodile, you may recognize the sound of a young animal crying in distress, even if that animal is a totally different species — like, say, a human baby. That sound means you are close to a meal. In a study published Wednesday in Proceedings of the Royal Society B, researchers put speakers near crocodiles and played recordings of human, bonobo and chimpanzee infants. The crocodiles were attracted to the cries, especially shrieks that sounded more distressed. “That means that distress is something that is shared by species that are really, really distant,” said Nicolas Grimault, a bioacoustic research director at the French National Centre for Scientific Research and one of the paper’s authors. “You have some kind of emotional communication between crocodiles and humans.” These infant wails most likely drew crocodiles because they signaled an easy meal nearby, the authors say. But in some cases, the opposite may have been true: The crocs were trying to help. The animals in the study were Nile crocodiles, African predators that can reach up to 18 feet long. Understandably, the researchers kept their distance. They visited the reptiles at a Moroccan zoo and placed remote-controlled loudspeakers on the banks of outdoor ponds. The researchers played recordings of cries from those speakers while groups of up to 25 crocodiles were nearby. Some cries came from infant chimpanzees or bonobos calling to their mothers. Others were human babies, recorded either at bath time or in the doctor’s office during a vaccination. Nearly all of the recordings prompted some crocodiles to look or to move toward the speaker. When they heard the sounds of human babies getting shots, for example, almost half the crocodiles in a group responded. Dr. Grimault said the reptiles seemed most tempted by cries with a harsh quality that other studies have linked to distress in mammals. © 2023 The New York Times Company

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 28871 - Posted: 08.09.2023

By Freda Kreier Pregnancy can do weird things to the body. For some bats, it can hamper their ability to “see” the world around them. Kuhl’s pipistrelle bats (Pipistrellus kuhlii) echolocate less frequently while pregnant, researchers report March 28 in BMC Biology. The change may make it harder for the tiny bats to detect prey and potential obstacles in the environment. The study is among the first to show that pregnancy can shape how nonhuman mammals sense their surroundings, says Yossi Yovel, a neuroecologist at Tel Aviv University in Israel. Nocturnal bats like Kuhl’s pipistrelles famously use sound to navigate and hunt prey in the dark (SN: 9/20/17). Their calls bounce off whatever is nearby and bats use the echoes to reconstruct what’s around them, a process aptly named echolocation. The faster a bat makes calls, the better it can make out its surroundings. But rapid-fire calling requires breathing deeply, which is something that pregnancy can get in the way of. “Although I’ve never been pregnant, I know that when I eat a lot, it’s more difficult to breathe,” Yovel says. So pregnancy — which can add a full gram to a 7-gram Kuhl’s pipistrelle and may push up on the lungs — might hamper echolocation. Yovel and colleagues tested their hypothesis by capturing 10 Kuhl’s pipistrelles, five of whom were pregnant, and training the bats to find and land on a platform. Recordings of the animals’ calls revealed that bats that weren’t pregnant made around 130 calls on average while searching for the platform. But bats that were pregnant made only around 110 calls, or 15 percent fewer. © Society for Science & the Public 2000–2023.

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell; Chapter 12: Sex: Evolutionary, Hormonal, and Neural Bases
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 8: Hormones and Sex
Link ID: 28774 - Posted: 05.10.2023

By Neelam Bohra Ayla Wing’s middle school students don’t always know what to make of their 26-year-old teacher’s hearing aids. The most common response she hears: “Oh, my grandma has them, too.” But grandma’s hearing aids were never like this: Bluetooth-enabled and connected to her phone, they allow Ms. Wing to toggle with one touch between custom settings. She can shut out the world during a screeching subway ride, hear her friends in noisy bars during a night out and even understand her students better by switching to “mumbly kids.” A raft of new hearing aids have hit the market in recent years, offering greater appeal to a generation of young adults that some experts say is both developing hearing problems earlier in life and — perhaps paradoxically — becoming more comfortable with an expensive piece of technology pumping sound into their ears. Some of the new models, including Ms. Wing’s, are made by traditional prescription brands, which usually require a visit to a specialist. But the Food and Drug Administration opened up the market last year when it allowed the sale of hearing aids over the counter. In response, brand names like Sony and Jabra began releasing their own products, adding to the new wave of designs and features that appeal to young consumers. “These new hearing aids are sexy,” said Pete Bilzerian, a 25-year-old in Richmond, Va., who has worn the devices since he was 7. He describes his early models as distinctly unsexy: “big, funky, tan-colored hearing aids with the molding that goes all around the ear.” But increasingly, those have given way to sleeker, smaller models with more technological capabilities. Nowadays, he said, no one seems to notice the electronics in his ear. “If it ever does come up as a topic, I just brush it off and say, ‘Hey, I got these very expensive AirPods.’” More people in Mr. Bilzerian’s age group might need the equivalent of expensive AirPods, experts say. By the time they turn 30, about a fifth of Americans today have had their hearing damaged by noise, the Centers for Disease Control and Prevention recently estimated. This number adds to the already substantial population of young people with hearing loss tied to genetics or medical conditions. © 2023 The New York Times Company

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 28770 - Posted: 05.06.2023

Nicola Davis Science Correspondent If the sound of someone chewing gum or slurping their tea gets on your nerves, you are not alone. Researchers say almost one in five people in the UK has strong negative reactions to such noises. Misophonia is a disorder in which people feel strong emotional responses to certain sounds, feeling angry, distressed or even unable to function in social or work settings as a result. But just how common the condition is has been a matter of debate. Now researchers say they have found 18.4% of the UK population have significant symptoms of misophonia. “This is the very first study where we have a representative sample of the UK population,” said Dr Silia Vitoratou, first author of the study at King’s College London. “Most people with misophonia think they are alone, but they are not. This is something we need to know [about] and make adjustments if we can.” Writing in the journal Plos One, the team report how they gathered responses from 768 people using metrics including the selective sound sensitivity syndrome scale. This included one questionnaire probing the sounds that individuals found triggering, such as chewing or snoring, and another exploring the impact of such sounds – including whether they affected participants’ social life and whether the participant blamed the noise-maker – as well as the type of emotional response participants felt to the sounds and the intensity of their emotions. As a result, each participant was given an overall score. The results reveal more than 80% of participants had no particular feelings towards sounds such as “normal breathing” or “yawning” but this plummeted to less than 25% when it came to sounds including “slurping”, “chewing gum” and “sniffing”. © 2023 Guardian News & Media Limited

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 14: Attention and Higher Cognition
Link ID: 28712 - Posted: 03.23.2023

By Erin Blakemore Tinnitus — a ringing or whistling sound in the ears — plagues millions worldwide. Though the estimates of those bothered by the condition vary, a new study suggests they may have something in common: exposure to road traffic noise at home. The paper, published in Environmental Health Perspectives, looked to Denmark to find a potential link between road noise and tinnitus levels. The nationwide study included data on 3.5 million Danish residents who were 30 and older between 2000 and 2017. Over that time, 40,692 were diagnosed with tinnitus. When the researchers calculated likely traffic and noise levels at the quietest facade of their residences in that period, they found those living with louder road noise were more likely to be diagnosed with tinnitus than those who lived in quieter areas. People’s risk rose 6 percent with every 10-decibel increase in road traffic noise compared with controls. Levels rose the longer a person had been exposed to higher road traffic noise. Women, people without a previous history of hearing loss, and people with higher education and income were at increased risk. The study did not find an association between railway noise and tinnitus diagnoses. Though the paper shows an association between tinnitus and traffic noise, it does not prove that one causes the other. The researchers say it’s important to learn more about the potential effects of residential noise exposure — and posit that if traffic noise does cause tinnitus, it might do so by disrupting people’s sleep. “We know that traffic noise can make us stressed and affect our sleep. And that tinnitus can get worse when we live under stressful situations and we do not sleep well,” said Jesper Hvass Schmidt, an associate professor at the University of Southern Denmark and the paper’s co-author, in a news release.
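To put the reported effect size in context, a 6 percent risk increase per 10 decibels compounds multiplicatively over larger differences in exposure. The arithmetic below is purely illustrative and assumes the per-10-dB figure applies proportionally; it is not a reanalysis of the Danish data.

```python
# Rough scaling of the reported 6% per 10 dB association (illustrative only).
per_10_db = 1.06
for delta_db in (10, 20, 30):
    rr = per_10_db ** (delta_db / 10)
    print(f"{delta_db} dB more road noise -> ~{(rr - 1) * 100:.0f}% higher tinnitus risk")
```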

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 28661 - Posted: 02.11.2023

Niyazi Arslan Cochlear implants are among the most successful neural prostheses on the market. These artificial ears have allowed nearly 1 million people globally with severe to profound hearing loss to either regain access to the sounds around them or experience the sense of hearing for the first time. However, the effectiveness of cochlear implants varies greatly across users because of a range of factors, such as hearing loss duration and age at implantation. Children who receive implants at a younger age may be able to acquire auditory skills similar to their peers with natural hearing. I am a researcher studying pitch perception with cochlear implants. Understanding the mechanics of this technology and its limitations can help lead to potential new developments and improvements in the future. In fully-functional hearing, sound waves enter the ear canal and are converted into neural impulses as they move through hairlike sensory cells in the cochlea, or inner ear. These neural signals then travel through the auditory nerve behind the cochlea to the central auditory areas of the brain, resulting in a perception of sound. People with severe to profound hearing loss often have damaged or missing sensory cells and are unable to convert sound waves into electrical signals. Cochlear implants bypass these hairlike cells by directly stimulating the auditory nerve with electrical pulses. Cochlear implants consist of an external part worn behind the ear and an internal part implanted under the skin. © 2010–2023, The Conversation US, Inc.
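The signal chain this passage describes, sound in and band-specific electrical pulses out, can be illustrated with a simple envelope-based simulation. The code below is a hypothetical teaching example, not any manufacturer's processing strategy; the band edges, filter order, and pulse rate are assumed values.

```python
# Simplified, hypothetical cochlear-implant-style processing: filter the
# audio into bands, extract each band's envelope, and sample the envelopes
# at a fixed pulse rate to set per-electrode stimulation levels.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 16_000                                   # audio sample rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
audio = np.sin(2 * np.pi * 440 * t)           # placeholder input: a 440 Hz tone

band_edges = [200, 400, 800, 1600, 3200, 6400]   # assumed analysis bands (Hz)
pulse_rate = 900                                  # assumed pulses per second per electrode

envelopes = []
for low, high in zip(band_edges[:-1], band_edges[1:]):
    sos = butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
    band = sosfiltfilt(sos, audio)
    envelopes.append(np.abs(hilbert(band)))   # slowly varying band envelope

# Sample each envelope at the pulse rate; in a real device these values
# would set the current of biphasic pulses on the matching electrode.
idx = (np.arange(0, 1.0, 1 / pulse_rate) * fs).astype(int)
pulse_amplitudes = np.stack([env[idx] for env in envelopes])
print(pulse_amplitudes.shape)                 # (n_electrodes, n_pulses)
```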

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 28639 - Posted: 01.25.2023

Miryam Naddaf Stimulating neurons that are linked to alertness helps rats with cochlear implants learn to quickly recognize tunes, researchers have found. The results suggest that activity in a brain region called the locus coeruleus (LC) improves hearing perception in deaf rodents. Researchers say the insights are important for understanding how the brain processes sound, but caution that the approach is a long way from helping people. “It’s like we gave them a cup of coffee,” says Robert Froemke, an otolaryngologist at New York University School of Medicine and a co-author of the study, published in Nature on 21 December. Cochlear implants use electrodes in the inner-ear region called the cochlea, which is damaged in people who have severe or total hearing loss. The device converts acoustic sounds into electrical signals that stimulate the auditory nerve, and the brain learns to process these signals to make sense of the auditory world. Some people with cochlear implants learn to recognize speech within hours of the device being implanted, whereas others can take months or years. “This problem has been around since the dawn of cochlear implants, and it shows no signs of being resolved,” says Gerald Loeb at the University of Southern California in Los Angeles, who helped to develop one of the first cochlear implants. Researchers say that a person’s age, the duration of their hearing loss and the type of processor and electrodes in the implant don’t account for this variation, but suggest that the brain could be the source of the differences. “It’s sort of the black box,” says Daniel Polley, an auditory neuroscientist at Harvard Medical School in Boston, Massachusetts. Most previous research has focused on improving the cochlear device and the implantation procedure. Attempts to improve the brain’s ability to use the device open up a way to improve communication between the ear and the brain, says Polley. © 2022 Springer Nature Limited

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 28615 - Posted: 12.28.2022

Hannah Devlin Science correspondent Music makes you lose control, Missy Elliott once sang on a hit that is almost impossible to hear without bopping along. Now scientists have discovered that rats also find rhythmic beats irresistible, showing how they instinctively move in time to music. This ability was previously thought to be uniquely human and scientists say the discovery provides insights into the animal mind and the origins of music and dance. “Rats displayed innate – that is, without any training or prior exposure to music – beat synchronisation,” said Dr Hirokazu Takahashi of the University of Tokyo. “Music exerts a strong appeal to the brain and has profound effects on emotion and cognition,” he added. While there have been previous demonstrations of animals dancing along to music – TikTok has a wealth of examples – the study is one of the first scientific investigations of the phenomenon. In the study, published in the journal Science Advances, 10 rats were fitted with wireless, miniature accelerometers to measure the slightest head movements. They were then played one-minute excerpts from Mozart’s Sonata for Two Pianos in D Major, at four different tempos: 75%, 100%, 200% and 400% of the original speed. Twenty human volunteers also participated. The scientists thought it possible that rats would prefer faster music as their bodies, including heartbeat, work at a faster pace. By contrast, the time constant of the brain is surprisingly similar across species. © 2022 Guardian News & Media Limited
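One way to quantify "moving in time to music" from accelerometer recordings is to ask how much of the head-movement signal's power sits at the music's beat frequency at each playback speed. The sketch below uses simulated data, assumes an original tempo of about 120 beats per minute, and builds in stronger beat-locked movement near the original speed purely for illustration; it is not the study's analysis.

```python
# Illustrative beat-synchronization measure on simulated accelerometer data.
import numpy as np

fs = 100                                     # assumed accelerometer sample rate (Hz)
t = np.arange(0, 60, 1 / fs)                 # one minute of data
base_beat_hz = 2.0                           # assumed original tempo (~120 bpm)

def beat_band_power(signal, beat_hz, fs):
    """Fraction of spectral power within 0.1 Hz of the beat frequency."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    near_beat = np.abs(freqs - beat_hz) < 0.1
    return power[near_beat].sum() / power.sum()

rng = np.random.default_rng(2)
for speed in (0.75, 1.0, 2.0, 4.0):          # the four playback tempos
    beat_hz = base_beat_hz * speed
    # Simulated head movement: beat-locked bobbing (strongest near the
    # original speed in this toy model) plus broadband noise.
    locking = np.exp(-((speed - 1.0) ** 2))
    head = locking * np.sin(2 * np.pi * beat_hz * t) + rng.standard_normal(t.size)
    print(f"{int(speed * 100)}% speed: beat-band power = {beat_band_power(head, beat_hz, fs):.2f}")
```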

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 28547 - Posted: 11.13.2022

Elizabeth Pennisi Think of the chattiest creatures in the animal kingdom and songbirds, dolphins, and—yes—humans probably come to mind. Turtles probably don’t register. But these charismatic reptiles also communicate using a large repertoire of clicks, snorts, and chortles. Now, by recording the “voices” of turtles and other supposedly quiet animals, scientists have concluded that all land vertebrate vocalizations—from the canary’s song to the lion’s roar—have a common root that dates back more than 400 million years. The findings imply animals began to vocalize very early in their evolutionary history—even before they possessed well-developed ears, says W. Tecumseh Fitch, a bioacoustician at the University of Vienna who was not involved with the work. “It suggests our ears evolved to hear these vocalizations.” Several years ago, University of Arizona evolutionary ecologist John Wiens and his graduate student Zhuo Chen started looking into the evolutionary roots of acoustic communication—basically defined as the sounds animals make with their mouths using their lungs. Combing the scientific literature, the duo compiled a family tree of all the acoustic animals known at the time, eventually concluding such soundmaking abilities arose multiple times in vertebrates between 100 million and 200 million years ago. But Gabriel Jorgewich-Cohen, an evolutionary biologist at the University of Zürich, noticed an oversight: turtles. Though Wiens and Chen had found that only two of 14 families of turtles made sounds, he was finding a lot more. He spent 2 years recording 50 turtle species in the act of “speaking.”

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 28530 - Posted: 10.28.2022

By Paula Span The world of hearing health will change on Oct. 17, when the Food and Drug Administration’s new regulations, announced in August, will make quality hearing aids an over-the-counter product. It just won’t transform as quickly or as dramatically, at least at first, as advocates, technology and consumer electronics companies and people with mild to moderate hearing loss have been hoping. “It finally, actually happened after all these years,” said Dr. Frank Lin, the director of the Johns Hopkins Cochlear Center for Hearing and Public Health and a longtime supporter of the regulations, which Congress authorized five years ago. “Ninety-plus percent of adults with hearing loss have needs that can be served by over-the-counter hearing aids,” he said. For decades, the sale of hearing aids was restricted to licensed audiologists and other professionals; that has kept prices high — prescription hearing aids can cost $4,000 to $5,000 — and access limited. In contrast, the regulations provide “a clear glide path for new companies to enter this field,” Dr. Lin said. But, he quickly added, “it may be the Wild West for the next few years.” Barbara Kelley, the executive director of the Hearing Loss Association of America, concurred: “It’s a new frontier, and it is confusing. We need time to see how the market settles out.” In an ideal scenario, a person would be able to walk into almost any pharmacy or big-box store and buy a sophisticated pair of hearing aids for a few hundred dollars, no prescription required. But the shift won’t materialize right away, experts say. In 2017, Congress granted the F.D.A. three years to develop standards for safe and effective over-the-counter hearing aids. The agency took five years instead, and the long delay and continued industry opposition made manufacturers skittish about investing, Dr. Lin said. © 2022 The New York Times Company

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 28509 - Posted: 10.13.2022

By Carolyn Gramling Hot or not? Peeking inside an animal’s ear — even a fossilized one — may tell you whether it was warm- or cold-blooded. Using a novel method that analyzes the size and shape of the inner ear canals, researchers suggest that mammal ancestors abruptly became warm-blooded about 233 million years ago, the team reports in Nature July 20. Warm-bloodedness, or endothermy, isn’t unique to mammals — birds, the only living dinosaurs, are warm-blooded, too. But endothermy is one of mammals’ key features, allowing the animals to regulate their internal body temperatures by controlling their metabolic rates. This feature allowed mammals to occupy environmental niches from pole to equator, and to weather the instability of ancient climates (SN: 6/7/22). When endothermy evolved, however, has been a mystery. Based on fossil analyses of growth rates and oxygen isotopes in bones, researchers have proposed dates for its emergence as far back as 300 million years ago. The inner ear structures of mammals and their ancestors hold the key to solving that mystery, says Ricardo Araújo, a vertebrate paleontologist at the University of Lisbon. In all vertebrates, the labyrinth of semicircular canals in the inner ear contains a fluid that responds to head movements, brushing against tiny hair cells in the ear and helping to maintain a sense of balance. That fluid can become thicker or thinner depending on body temperature. “Mammals have very unique inner ears,” Araújo says. Compared with cold-blooded vertebrates of similar size, the dimensions of mammals’ semicircular canals — such as thickness, length and radius of curvature — are particularly small, he says. “The ducts are very thin and tend to be very circular compared with other animals.” By contrast, fish have the largest for their body size. © Society for Science & the Public 2000–2022.

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 28408 - Posted: 07.23.2022