Chapter 6. Hearing, Balance, Taste, and Smell
By Ingrid Wickelgren
One important function of your inner ear is stabilizing your vision when your head is turning. When your head turns one way, your vestibular system moves your eyes in the opposite direction so that what you are looking at remains stable. To see for yourself how your inner ears make this adjustment, called the vestibulo-ocular reflex, hold your thumb upright at arm's length. Shake your head back and forth about twice per second while looking at your thumb. Notice that your thumb remains in focus. Now create the same relative motion by swinging your arm back and forth about five inches at the same speed. Notice that your thumb is blurry. To see an object clearly, its image must remain stationary on your retina. When your head turns, your vestibular system moves your eyes in the opposite direction quickly enough to create this stability. When the thumb moves instead, your visual system directs the eyes to follow it, but this pursuit is too slow to track a fast-moving object, causing blur. © 2014 Scientific American
By STEPHANIE FAIRYINGTON
A few months ago, I was on a Manhattan-bound D train heading to work when a man with a chunky, noisy newspaper got on and sat next to me. As I watched him softly turn the pages of his paper, a chill spread like carbonated bubbles through the back of my head, instantly relaxing me and bringing me to the verge of sweet slumber. It wasn't the first time I'd felt this sensation at the sound of rustling paper — I've experienced it as far back as I can remember. But it suddenly occurred to me that, as a lifelong insomniac, I might be able to put it to use by reproducing the experience digitally whenever sleep refused to come.

Under the sheets of my bed that night, I plugged in some earphones, opened the YouTube app on my phone and searched for "Sound of pages." What I discovered stunned me. There were nearly 2.6 million videos depicting a phenomenon called autonomous sensory meridian response, or A.S.M.R., designed to evoke a tingling sensation that travels over the scalp or other parts of the body in response to auditory, olfactory or visual forms of stimulation.

The sound of rustling pages, it turns out, is just one of many A.S.M.R. triggers. The most popular stimuli include whispering; tapping or scratching; performing repetitive, mundane tasks like folding towels or sorting baseball cards; and role-playing, where the videographer, usually a breathy woman, softly talks into the camera and pretends to give a haircut, for example, or an eye examination. The videos span 30 minutes on average, but some last more than an hour.

For those not wired for A.S.M.R. — and even for those who, like me, apparently are — the videos and the cast of characters who produce them — sometimes called "ASMRtists" or "tingle-smiths" — can seem weird, creepy or just plain boring. (Try pitching the pleasures of watching a nerdy German guy slowly and silently assemble a computer for 30 minutes.) © 2014 The New York Times Company
By James Phillips
Our inner ear is a marvel. The labyrinthine vestibular system within it is a delicate, byzantine structure made up of tiny canals, crystals and pouches. When healthy, this system enables us to keep our balance and orient ourselves. Unfortunately, a study in the Archives of Internal Medicine found that 35 percent of adults over age 40 suffer from vestibular dysfunction.

A number of treatments are available for vestibular problems. During an acute attack of vertigo, vestibular suppressants and antinausea medications can reduce the sensation of motion as well as nausea and vomiting. Sedatives can help patients sleep and rest. Anti-inflammatory drugs can reduce any damage from acute inflammation, and antibiotics can treat an infection. If a structural change in the inner ear has loosened some of its particulate matter—for instance, if otolith (calcareous) crystals, which normally sit in tilt-sensitive sacs, end up in the semicircular canals, making the canals tilt-sensitive—simple repositioning exercises in the clinic can shake the loose material back to where it belongs. After a successful round of therapy, patients no longer sense that they are tilting whenever they turn their heads.

If vertigo is a recurrent problem, injecting certain medications into the affected ear can reduce or eliminate its fluctuating function. As a last resort, a surgeon can effectively destroy the inner ear—either by directly damaging the end organs or by cutting the eighth cranial nerve fibers, which carry vestibular information to the brain. The latter surgery involves removing a portion of the skull and shifting the brain sideways, so it is not for the faint of heart. © 2014 Scientific American
Link ID: 19886 - Posted: 07.28.2014
by Claudia Caruana
GOT that ringing in your ears? Tinnitus, the debilitating condition that plagued Beethoven and Darwin, affects roughly 10 per cent of the world's population, including 30 million people in the US alone. Now a device based on vagus nerve stimulation promises to eliminate the sounds for good by retraining the brain.

At the moment, many chronic sufferers turn to state-of-the-art hearing aids configured to play specific tones meant to cancel out the tinnitus. But these do not always work, because they merely mask the noise. The new device, developed by MicroTransponder in Dallas, Texas, works in an entirely different way. The Serenity System uses a transmitter connected to the vagus nerve in the neck – the nerve that connects the brain to many of the body's organs. The thinking goes that most cases of chronic tinnitus result from changes in the signals sent from the ear to neurons in the brain's auditory cortex. The device is meant to retrain those neurons to forget the annoying noise.

To use the system, a person wears headphones and listens to computer-generated sounds: first the tones that trigger the tinnitus, then a range of frequencies close to the problematic one. Meanwhile, the implant stimulates the vagus nerve with small pulses. The pulses trigger the release of chemicals that increase the brain's ability to reconfigure itself. The process has already worked in rats (Nature, doi.org/b63kt9) and in a small human trial this year, where it helped around half of the participants. "Vagus nerve stimulation takes advantage of the brain's neuroplasticity – the ability to reconfigure itself," says Michael Kilgard at the University of Texas at Dallas, a consultant to MicroTransponder. © Copyright Reed Business Information Ltd.
by Helen Thomson
How do you smell after a drink? Quite well, it turns out: a modest amount of alcohol boosts your sense of smell.

It is well known that we can improve our sense of smell through practice. But a few people have also experienced a boost after drug use or brain damage. This suggests our sensitivity to smell may be damped by some sort of inhibition in the brain, which can be lifted under certain circumstances, says Yaara Endevelt of the Weizmann Institute of Science in Rehovot, Israel. To explore this notion, Endevelt and her colleagues investigated whether drinking alcohol – known to lower inhibitory signals in the brain – affects the sense of smell.

In one odour-discrimination test, 20 volunteers were asked to smell three different liquids. Two were a mixture of the same six odours; the third contained a similar mixture with one odour replaced. Each volunteer was given 2 seconds to smell each of the liquids and say which was the odd one out. The test was repeated six times with each of three trios of liquids. The volunteers were then given a drink consisting of 35 millilitres of vodka and sweetened grape juice, or the juice alone, before repeating the experiment with the same set of liquids. In a second experiment with a similar drinking protocol, the same volunteers were asked which of three liquids had a rose-like odour. The researchers increased the concentration of the odour until the volunteers got the right answer three times in a row. © Copyright Reed Business Information Ltd.
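The rose-odour threshold procedure described above (raising the concentration until the volunteer answers correctly three times in a row) is a classic ascending psychophysics staircase. The sketch below illustrates only that stopping rule; the `detect` function and the concentration values are invented stand-ins, not the study's actual protocol:

```python
def find_threshold(detect, concentrations, criterion=3):
    """Ascending staircase: step up through odour concentrations and
    return the first level at which the observer identifies the
    rose-like sample correctly `criterion` times in a row."""
    for c in concentrations:
        # Any wrong answer short-circuits `all`, and we move up a level
        if all(detect(c) for _ in range(criterion)):
            return c
    return None  # criterion never met within the tested range

# Toy observer: reliably detects the odour at concentration 0.4 or above
threshold = find_threshold(lambda c: c >= 0.4, [0.1, 0.2, 0.3, 0.4, 0.5])
print(threshold)  # 0.4
```

A real staircase would also randomise which of the three liquids carries the target odour and include catch trials; this sketch captures only the three-in-a-row criterion.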
Keyword: Chemical Senses (Smell & Taste)
Link ID: 19875 - Posted: 07.24.2014
By Virginia Morell
Dogs, most of us think, have the best noses on the planet. But a new study reveals that this honor actually goes to elephants. The power of a mammal's sniffer hinges on the number and type of its olfactory receptor genes. These genes are expressed in sensory cells that line the nasal cavity and are specialized for detecting odor molecules. When triggered, they set off a cascade of signals from the nose to the brain, enabling an animal to identify a particular smell.

In the new study, scientists identified and examined olfactory receptor genes from 13 mammalian species. The researchers found that every species has a highly distinctive repertoire of such genes: of the 10,000 functioning olfactory receptor genes the team studied, only three are shared among all 13 species. Perhaps not surprisingly, given the length of its trunk, the African elephant has the largest number of such genes—nearly 2000, the scientists report online today in Genome Research. In contrast, dogs have only 1000, and humans and chimpanzees fewer than 400—possibly because higher primates rely more on their vision and less on their sense of smell.

The discovery fits with another recent study showing that Asian elephants are as good as mice (which have nearly 1300 olfactory receptor genes) at discriminating between odors; dogs and elephants have yet to be put to a nose-to-trunk sniffer test. Other research has also shown just how important a superior sense of smell is to the behemoths. A slight whiff is all that elephants need, for instance, to distinguish between two Kenyan ethnic groups—the Maasai, who sometimes spear them, and the Kamba, who rarely pose a threat. They can also recognize as many as 30 different family members from cues in their urine. © 2014 American Association for the Advancement of Science.
Keyword: Chemical Senses (Smell & Taste)
Link ID: 19868 - Posted: 07.23.2014
By PETER ANDREY SMITH
Sweet, salty, sour and bitter — every schoolchild knows these are the building blocks of taste. Our delight in every scrumptious bonbon, every sizzling hot dog, derives in part from the tongue's ability to recognize and signal just four types of taste. But are there really just four? Over the last decade, research challenging the notion has been piling up. Today savory, also called umami, is widely recognized as a basic taste, the fifth. And now other candidates, perhaps as many as 10 or 20, are jockeying for entry into this exclusive club. "What started off as a challenge to the pantheon of basic tastes has now opened up, so that the whole question is whether taste is even limited to a very small number of primaries," said Richard D. Mattes, a professor of nutrition science at Purdue University.

Taste plays an intrinsic role as a chemical-sensing system, helping us find what is nutritious (stimulatory) and defending us against what is poisonous (aversive). When we put food in our mouths, chemicals wash over taste buds embedded in the tongue and palate. As the buds respond, we are thrilled or repulsed by what we're eating. But the body's reaction may not always be a conscious one.

In the late 1980s, in a windowless laboratory at Brooklyn College, the psychologist Anthony Sclafani was investigating the attractive power of sweets. His lab rats loved Polycose, a maltodextrin powder, even preferring it to sugar. That was puzzling for two reasons: maltodextrin is rarely found in plants that rats might feed on naturally, and when human subjects tried it, the stuff had no obvious taste. More than a decade later, a team of exercise scientists discovered that maltodextrin improved athletic performance — even when the tasteless additive was merely swished around in the mouth and spit back out. Our tongues report nothing; our brains, it seems, sense the incoming energy. © 2014 The New York Times Company
Keyword: Chemical Senses (Smell & Taste)
Link ID: 19867 - Posted: 07.22.2014
By Roni Jacobson
Last week, nine-year-old Hally Yust died after contracting a rare brain-eating amoeba infection while swimming near her family's home in Kansas. The organism responsible, Naegleria fowleri, dwells in warm freshwater lakes and rivers and usually targets children and young adults. Once in the brain it causes a swelling called primary amebic meningoencephalitis. The infection is almost universally fatal: it kills more than 97 percent of its victims within days.

Although deadly, infections are exceedingly uncommon—there were only 34 reported in the U.S. during the past 10 years—but evidence suggests they may be increasing. Prior to 2010 more than half of cases came from Florida, Texas and other southern states. Since then, however, infections have popped up as far north as Minnesota. "We're seeing it in states where we hadn't seen cases before," says Jennifer Cope, an epidemiologist and expert in amoeba infections at the U.S. Centers for Disease Control and Prevention. The expanding range of Naegleria infections could be related to climate change, she adds, as the organism thrives in warmer temperatures. "It's something we're definitely keeping an eye on."

Still, "when it comes to Naegleria there's a lot we don't know," Cope says—including why it chooses its victims. The amoeba has strategies to evade the immune system, and treatment options are meager, partly because of how fast the infection progresses. But research suggests that the infection can be stopped if it is caught soon enough. So what happens during an N. fowleri infection? © 2014 Scientific American
Keyword: Chemical Senses (Smell & Taste)
Link ID: 19855 - Posted: 07.21.2014
By Virginia Morell
Many moth species sing courtship songs, and until now scientists knew of only two types of such melodies. Some species imitate attacking bats, causing a female to freeze in place, whereas others croon tunes that directly woo the ladies. But the male yellow peach moth (Conogethes punctiferalis) belts out a combination song, scientists report online today in the Proceedings of the Royal Society B. These tiny troubadours, which are found throughout Asia, emit ultrasonic refrains composed of short and long pulses by contracting their abdominal tymbals, sound-producing membranes. The short pulses, the scientists say, are similar to the hunting calls of insectivorous horseshoe bats. However, unlike other moth species, these males aren't directing the batlike tunes at females, but rather at rival males. Using playback experiments, the scientists showed that a male drives away competitors with the short pulses of his ditty, while inducing a female to mate with the long note. Indeed, a receptive virgin female moth (1 to 3 days old) typically raises her wings after hearing this part of the male's song—a sign that she accepts the male, the scientists say. It is thus the first moth species known to have a dual-purpose melody. © 2014 American Association for the Advancement of Science
Philip Ball
Lead guitarists usually get to play the flashy solos while the bass player gets only to plod to the beat. But this seeming injustice could have been determined by the physiology of hearing. Research published today in the Proceedings of the National Academy of Sciences suggests that people's perception of timing in music is more acute for lower-pitched notes.

Psychologist Laurel Trainor of McMaster University in Hamilton, Canada, and her colleagues say that their findings explain why in the music of many cultures the rhythm is carried by low-pitched instruments while the melody tends to be taken by the highest pitched. This is as true for the low-pitched percussive rhythms of Indian classical music and Indonesian gamelan as it is for the walking double bass of a jazz ensemble or the left-hand part of a Mozart piano sonata. Earlier studies have shown that people have better pitch discrimination for higher notes — a reason, perhaps, that saxophonists and lead guitarists often have solos at a squealing register. It now seems that rhythm works best at the other end of the scale.

Trainor and colleagues used electroencephalography (EEG) — electrical sensors placed on the scalp — to monitor the brain signals of people listening to streams of two simultaneous piano notes, one high-pitched and the other low-pitched, repeated at equally spaced time intervals. Occasionally, one of the two notes was played slightly early, by just 50 milliseconds. The researchers studied the EEG recordings for signs that the listeners had noticed. © 2014 Nature Publishing Group
Link ID: 19776 - Posted: 07.01.2014
Nicola Davis
The old adage that we eat with our eyes appears to be correct, according to research suggesting diners rate an artistically arranged meal as more tasty – and are prepared to pay more for it. A team at Oxford University tested the idea by gauging the reactions of diners to food presented in different ways. Inspired by Wassily Kandinsky's "Painting Number 201", Franco-Colombian chef Charles Michel, one of the authors of the study, designed a salad resembling the abstract artwork to explore how the presentation of food affects the dining experience. "A number of chefs now are realising that they are being judged by how their foods photograph – be it in the fancy cookbooks [or], more often than not, when diners instagram their friends," explains Professor Charles Spence, experimental psychologist at the University of Oxford and a co-author of the study.

Thirty men and 30 women were each presented with one of three salads containing identical ingredients, arranged either to resemble the Kandinsky painting, a regular tossed salad, or a "neat" formation in which each component was spaced apart from the others. Seated alone at a table mimicking a restaurant setting, and unaware that other versions of the salad were on offer, each participant completed two questionnaires rating various aspects of the dish on a 10-point scale, one before and one after tucking into the salad. Before participants sampled their plateful, the Kandinsky-inspired dish was rated higher for complexity, artistic presentation and general liking. Participants were prepared to pay twice as much for it as for either the regular or "neat" arrangements. © 2014 Guardian News and Media Limited
by Frank Swain
WHEN it comes to personal electronics, it's difficult to imagine iPhones and hearing aids in the same sentence. I use both and know that hearing aids have a well-deserved reputation as deeply uncool lumps of beige plastic worn mainly by the elderly. Apple, on the other hand, is the epitome of cool consumer electronics. But the two are getting a lot closer. The first "Made for iPhone" hearing aids have arrived, allowing users to stream audio and data between smartphone and device. It means hearing aids might soon be desirable even to those who don't need them.

A Bluetooth wireless protocol developed by Apple last year lets the prostheses connect directly to Apple devices, streaming audio and data while using a fraction of the power consumed by conventional Bluetooth. The LiNX, made by ReSound, and the Halo, made by Starkey – both international firms – use the iPhone as a platform to offer users new features and added control over their hearing aids. "The main advantage of Bluetooth is that the devices are talking to each other; it's not just one way," says David Nygren, UK general manager of ReSound. This is useful because hearing aids have long suffered from a restricted user interface – there's not much room for buttons on a device the size of a kidney bean.

This is a major challenge for hearing-aid users, because different environments require different audio settings. Some devices come with preset programmes, while others adjust automatically to what their programming suggests is the best configuration. This is difficult to get right, and devices calibrated in the audiologist's clinic often fall short in the real world. © Copyright Reed Business Information Ltd.
Link ID: 19757 - Posted: 06.23.2014
By Ian Randall The human tongue may have a sixth sense—and no, it doesn’t have anything to do with seeing ghosts. Researchers have found that in addition to recognizing sweet, sour, salty, savory, and bitter tastes, our tongues can also pick up on carbohydrates, the nutrients that break down into sugar and form our main source of energy. Past studies have shown that some rodents can distinguish between sugars of different energy densities, while others can still tell carbohydrate and protein solutions apart even when their ability to taste sweetness is lost. A similar ability has been proposed in humans, with research showing that merely having carbohydrates in your mouth can improve physical performance. How this works, however, has been unclear. In the new study, to be published in Appetite, the researchers asked participants to squeeze a sensor held between their right index finger and thumb when shown a visual cue. At the same time, the participants’ tongues were rinsed with one of three different fluids. The first two were artificially sweetened—to identical tastes—but with only one containing carbohydrate; the third, a control, was neither sweet nor carb-loaded. When the carbohydrate solution was used, the researchers observed a 30% increase in activity for the brain areas that control movement and vision. This reaction, they propose, is caused by our mouths reporting that additional energy in the form of carbs is coming. The finding may explain both why diet products are often viewed as not being as satisfying as their real counterparts and why carbohydrate-loaded drinks seem to immediately perk up athletes—even before their bodies can convert the carbs to energy. Learning more about how this “carbohydrate sense” works could lead to the development of artificially sweetened foods, the researchers propose, “as hedonistically rewarding as the real thing.” © 2014 American Association for the Advancement of Science
Keyword: Chemical Senses (Smell & Taste)
Link ID: 19700 - Posted: 06.06.2014
by Catherine de Lange
Could your ideal diet be written in your genes? That's the promise of nutrigenomics, which looks for genetic differences in the way people's bodies process food so that diets can be tailored accordingly. The field had a rocky start after companies overhyped its potential, but with advances in genetic sequencing, and a slew of new studies, the concept is in for a reboot.

Last week, Nicola Pirastu at the University of Trieste, Italy, and his colleagues told the European Society of Human Genetics meeting in Milan that diets tailored to genes related to metabolism can help people lose weight. The team used the results of a genetic test to design specific diets for 100 obese people that also provided them with 600 fewer calories than usual. A control group was placed on a 600-calorie-deficit, untailored diet. After two years, both groups had lost weight, but those in the nutrigenetic group lost 33 per cent more. They also took only a year to lose as much weight as the group on the untailored diet lost in two.

If this is shown to work in bigger, randomised trials, it would be fantastic, says Ana Valdes, a genetic epidemiologist at the University of Nottingham, UK. Some preliminary information will soon be available from Europe's Food4Me project, a study of 1200 people across several countries who were given either standard nutrition advice or a similarly genetically tailored diet. "It's testing whether we can get bigger changes in diet using a personalised approach, and part of that is using genetic information," says team member John Mathers, director of the Human Nutrition Research Centre at Newcastle University, UK. © Copyright Reed Business Information Ltd.
By Christie Nicholson
Conventional wisdom once had it that each brain region is responsible for a specific task. And so we have the motor cortex for handling movements, and the visual cortex for processing sight. Scientists also thought that such regions remained fixed for those tasks beyond the age of three. But within the past decade researchers have realized that some brain regions can pinch-hit for other regions, for example after a damaging stroke. And now new research finds that the visual cortex is constantly doing double duty—it has a role in processing not just sight but sound. When we hear [siren sound], we see a siren. In the study, scientists scanned the brains of blindfolded participants as the subjects listened to three sounds: [audio of birds, audio of traffic, audio of a talking crowd]. And the scientists could tell which sounds the subjects were hearing just by analyzing the brain activity in the visual cortex. [Petra Vetter, Fraser W. Smith and Lars Muckli, Decoding Sound and Imagery Content in Early Visual Cortex, in Current Biology] The next step is to determine why the visual cortex is horning in on the audio action. The researchers think the additional role conferred an evolutionary advantage: having a visual system primed by sound to see the source of that sound could have given humans an extra step in the race for survival. © 2014 Scientific American
Taste is a privilege. Oral sensations not only satisfy foodies; on a primal level, they protect animals from toxic substances. Yet cetaceans—whales and dolphins—may lack this crucial ability, according to a new study. Mutations in a cetacean ancestor obliterated the basic machinery for four of the five primary tastes, making cetaceans the first group of mammals known to have lost the majority of this sensory system.

The five primary tastes are sweet, bitter, umami (savory), sour, and salty. These flavors are recognized by taste receptors—proteins that coat neurons embedded in the tongue. For the most part, taste receptor genes are present across all vertebrates. Except, it seems, in cetaceans. Researchers uncovered a massive loss of taste receptors in these animals by screening the genomes of 15 species. The investigation spanned the two major lineages of cetaceans: krill-loving baleen whales—such as bowheads and minkes—were surveyed along with toothed species such as bottlenose dolphins and sperm whales. The taste genes weren't gone per se, but were irreparably damaged by mutations, the team reports online this month in Genome Biology and Evolution. Genes encode proteins, which in turn execute certain functions in cells. Certain errors in the code can derail protein production—at which point the gene becomes a "pseudogene," a lingering shell of a forgotten trait. Identical pseudogene remnants were discovered across the different cetacean species for the sweet, bitter, umami, and sour taste receptors. Salty was the only exception. © 2014 American Association for the Advancement of Science.
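The pseudogene signature the team screened for, a receptor gene whose coding sequence is broken by disabling mutations, can be illustrated with a toy scan for premature stop codons. The sequences below are invented 12-base examples for illustration only; they are not real cetacean receptor genes:

```python
STOP_CODONS = {"TAA", "TAG", "TGA"}

def has_premature_stop(cds):
    """Return True if an in-frame stop codon occurs before the final
    codon of a coding sequence: one hallmark of a pseudogene."""
    codons = [cds[i:i + 3] for i in range(0, len(cds) - 2, 3)]
    return any(codon in STOP_CODONS for codon in codons[:-1])

# Invented toy 'receptor' sequences, for illustration only
intact = "ATGGCTGTTTAA"  # ATG GCT GTT TAA: stop codon only at the end
broken = "ATGTAAGTTTAA"  # ATG TAA ...: in-frame stop at the second codon
print(has_premature_stop(intact), has_premature_stop(broken))  # False True
```

A real screen would also look for frameshifting insertions and deletions, and, as the team did, check whether the same disabling mutation recurs across species, which points to loss in a common ancestor.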
Jessica Morrison
Interference from electronics and AM radio signals can disrupt the internal magnetic compasses of migratory birds, researchers report today in Nature. The work raises the possibility that cities have significant effects on bird migration patterns.

Decades of experiments have shown that migratory birds can orient themselves on migration paths using internal compasses guided by Earth's magnetic field. But until now, there has been little evidence that electromagnetic radiation created by humans affects the process. Like most biologists studying magnetoreception, co-author Henrik Mouritsen used to work at rural field sites far from cities teeming with electromagnetic noise. But in 2002, he moved to the University of Oldenburg, in a German city of around 160,000 people. As part of work to identify the part of the brain in which compass information is processed, he kept migratory European robins (Erithacus rubecula) inside wooden huts — a standard procedure that allows researchers to investigate magnetic navigation while being sure that the birds are not getting cues from the Sun or stars. But he found that on the city campus, the birds could not orient themselves in their proper migratory direction. "I tried all kinds of stuff to make it work, and I couldn't make it work," Mouritsen says, "until one day we screened the wooden hut with aluminium."

Mouritsen and his colleagues covered the huts with aluminium plates and electrically grounded them to cut out electromagnetic noise at frequencies from 50 kilohertz to 5 megahertz — a range that includes AM radio transmissions. The shielding reduced the intensity of the noise by about two orders of magnitude. Under those conditions, the birds were able to orient themselves. © 2014 Nature Publishing Group
Keyword: Animal Migration
Link ID: 19590 - Posted: 05.08.2014
Humans stink, and it's wonderful. A few whiffs of a pillow in the morning can revive memories of a lover. The sweaty stench of a gym puts us in the mood to exercise. Odors define us, yet the scientific zeitgeist is that we don't communicate through pheromones—scents that influence behavior. A new study challenges that thinking, finding that scent can change whether we think someone is masculine or feminine.

Humans carry more secretion and sweat glands in their skin than any other primate. Yet 70% of people lack a vomeronasal organ, a crescent-shaped bundle of neurons at the base of each nostril that allows a variety of species—from reptiles to nonprimate mammals—to pick up on pheromones. (If you've ever seen your cat huff something, he's using this organ.) Still, scientists have continued to hunt for examples of pheromones that humans might sense. Two strong candidates are androstadienone (andro) and estratetraenol (estra). Men secrete andro in their sweat and semen, while estra is primarily found in female urine. Researchers have found hints that both trigger arousal—by improving moods and switching on the brain's "urge" center, the hypothalamus—in the opposite sex. Yet to be true pheromones, these chemicals must shape how people perceive the sexes.

That's exactly what they do, researchers from the Chinese Academy of Sciences in Beijing report online today in Current Biology. The team split men and women into groups of 24 and then had them watch virtual simulations of a human figure walking, with the head, pelvis, and major joints of each figure replaced by moving dots. Subjects in prior studies had ranked these walking figures as feminine or masculine: one prototype, for instance, was gauged as having a quintessential female strut, with a distinctive swagger in the "hip" dots that contrasted with the flat gait of the "male" prototype. © 2014 American Association for the Advancement of Science
by Bethany Brookshire
When you are waiting with a friend to cross a busy intersection, car engines running, horns honking and the city humming all around you, your brain is busy processing all those sounds. Somehow, though, the human auditory system can filter out the extraneous noise and let you hear what your friend is telling you. But if you tried to ask your iPhone a question, Siri might have a tougher time. A new study shows how the mammalian brain can distinguish the signal from the noise. Brain cells in the primary auditory cortex can both turn down the noise and increase the gain on the signal. The results show how the brain processes sound in noisy environments, and might eventually help in the development of better voice-recognition devices, including improvements to cochlear implants for people with hearing loss. Not to mention getting Siri to understand you on a chaotic street corner.

Nima Mesgarani and colleagues at the University of Maryland in College Park were interested in how mammalian brains separate speech from background noise. Ferrets have an auditory system that is extremely similar to ours, so the researchers looked at the A1 area of the ferret cortex, which corresponds to our auditory A1 region. Equipped with carefully implanted electrodes, the alert ferrets listened to both ferret sounds and snippets of human speech. The sounds were presented alone, against a background of white noise, against pink noise (noise with equal energy in every octave, which sounds lower in pitch than white noise) and against reverberation. The researchers then took the neural signals recorded from the electrodes and used a computer simulation to reconstruct the sounds the animal was hearing. In results published April 21 in Proceedings of the National Academy of Sciences, the researchers show that the ferret brain is quite good at detecting both ferret sounds and speech in all three noisy conditions.
“We found that the noise is drastically decreased, as if the brain of the ferret filtered it out and recovered the cleaned speech,” Mesgarani says. © Society for Science & the Public 2000 - 2013.
Regina Nuzzo
Gene therapy delivered to the inner ear can help shrivelled auditory nerves to regrow — and in turn improve bionic ear technology, researchers report today in Science Translational Medicine. The work, conducted in guinea pigs, suggests a possible avenue for developing a new generation of hearing prosthetics that more closely mimics the richness and acuity of natural hearing.

Sound travels from its source to the ears, and eventually to the brain, through a chain of biological translations that convert air vibrations into nerve impulses. When hearing loss occurs, it is usually because crucial links near the end of this chain — between the ear's cochlear cells and the auditory nerve — are destroyed. Cochlear implants are designed to bridge this missing link in people with profound deafness by implanting an array of tiny electrodes that stimulate the auditory nerve. Although cochlear implants often work well in quiet situations, people who have them still struggle to understand music or follow conversations amid background noise.

After long-term hearing loss, the ends of the auditory nerve bundles are often frayed and withered, so the electrode array implanted in the cochlea must blast a broad, strong signal to try to make a connection, instead of stimulating a more precise set of neurons corresponding to particular frequencies. The result is an 'aural smearing' that obliterates fine resolution of sound, akin to forcing a piano player to wear snow mittens or a portrait artist to use finger paints. To try to repair auditory nerve endings and help cochlear implants send a sharper signal to the brain, the researchers turned to gene therapy. Their method took advantage of the electrical impulses delivered by the cochlear-implant hardware, rather than the viruses often used to carry genetic material, to temporarily make inner-ear cells porous.
This allowed DNA to slip in, says lead author Jeremy Pinyon, an auditory scientist at the University of New South Wales in Sydney, Australia. © 2014 Nature Publishing Group