Chapter 16.
Michael Graziano Ever since Charles Darwin published On the Origin of Species in 1859, evolution has been the grand unifying theory of biology. Yet one of our most important biological traits, consciousness, is rarely studied in the context of evolution. Theories of consciousness come from religion, from philosophy, from cognitive science, but not so much from evolutionary biology. Maybe that’s why so few theories have been able to tackle basic questions such as: What is the adaptive value of consciousness? When did it evolve and what animals have it? The Attention Schema Theory (AST), developed over the past five years, may be able to answer those questions. The theory suggests that consciousness arises as a solution to one of the most fundamental problems facing any nervous system: Too much information constantly flows in to be fully processed. The brain evolved increasingly sophisticated mechanisms for deeply processing a few select signals at the expense of others, and in the AST, consciousness is the ultimate result of that evolutionary sequence. If the theory is right—and that has yet to be determined—then consciousness evolved gradually over the past half billion years and is present in a range of vertebrate species. Even before the evolution of a central brain, nervous systems took advantage of a simple computing trick: competition. Neurons act like candidates in an election, each one shouting and trying to suppress its fellows. At any moment only a few neurons win that intense competition, their signals rising up above the noise and impacting the animal’s behavior. This process is called selective signal enhancement, and without it, a nervous system can do almost nothing. © 2016 by The Atlantic Monthly Group
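The "election among neurons" that Graziano describes — selective signal enhancement through mutual suppression — can be sketched as a toy winner-take-all simulation. This is an illustrative model only, not code from the article; the signal values, inhibition strength, and iteration count are all hypothetical:

```python
import numpy as np

def selective_signal_enhancement(signals, inhibition=0.05, steps=100):
    """Toy winner-take-all dynamics: on each step, every unit is
    suppressed in proportion to the summed activity of its rivals,
    and units driven below zero are silenced (clipped at 0)."""
    x = np.asarray(signals, dtype=float).copy()
    for _ in range(steps):
        rivals = x.sum() - x               # total activity of the other units
        x = np.maximum(x - inhibition * rivals, 0.0)
    return x

competing = [1.0, 0.8, 0.3]                # hypothetical signal strengths
winner = selective_signal_enhancement(competing)
# after the competition, only the strongest signal remains above zero
```

Because each unit's inhibition scales with its rivals' activity, small leads compound: weaker signals are driven to zero one by one, and the strongest signal alone "rises above the noise," as in the article's description.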
Tina Hesman Saey Gut microbes cause obesity by sending messages via the vagus nerve to pack on pounds, new research in rodents suggests. Bacteria in the intestines produce a molecule called acetate, which works through the brain and nervous system to make rats and mice fat, researchers report in the June 9 Nature. If the results hold up in humans, scientists would understand one mechanism by which gut microbes induce obesity: First, the microbes convert fats in food to a short-chain fatty acid called acetate. Acetate in the blood somehow makes its way to the brain. The brain sends a signal through the vagus nerve to the pancreas to increase insulin production. Insulin tells fat cells to store more energy. Fat builds up, leading to obesity. Acetate also increases levels of a hunger hormone called ghrelin, which could lead animals and people to eat even more, says Yale University endocrinologist Gerald Shulman, who led the study. “This is a tour-de-force paper,” says biochemist Jonathan Schertzer of McMaster University in Hamilton, Canada. Most studies that examine the health effects of intestinal microbes just list which bacteria, viruses, fungi and other microorganisms make up the gut microbiome, Schertzer says. But a catalog of differences between lean and obese individuals doesn’t address what those microbes do, he says. “What’s in a name?” he asks. “When you find a factor that actually influences metabolism, that’s important.” © Society for Science & the Public 2000 - 2016.
Link ID: 22305 - Posted: 06.09.2016
By Stephen L. Macknik Every few decades there’s a major new neuroscience discovery that changes everything. I’m not talking about your garden-variety discovery. Those happen frequently (this is the golden age of neuroscience after all). But no, what I’m talking about are the holy-moly, scales-falling-from-your-eyes, time-to-rewrite-the-textbooks, game-changing discoveries. Well, one was reported just last month—simultaneously by two separate labs—and it redefines the primary organizational principle of the visual system in the cortex of the brain. This may sound technical, but it concerns how we see light and dark, and the perception of contrast. Since all sensation functions at the pleasure of contrast, these new discoveries impact neuroscience and psychology as a whole. I’ll explain below. The old way of thinking about how the wiring of the visual cortex was organized orbited around the concept of visual-edge orientation. David Hubel (my old mentor) and Torsten Wiesel (my current fellow Brooklynite)—who shared the Nobel Prize in Physiology or Medicine in 1981—arguably made the first major breakthrough concerning how information was organized in the cortex versus earlier stages of visual processing. Before their discovery, the retina (and the whole visual system) was thought to be a kind of neural camera that communicated its image into the brain. The optic nerves connect the eyes’ retinas to the thalamus at the center of the brain—and then the thalamus connects to the visual cortex at the back of the brain through a neural information superhighway called the optic radiations. Scientists knew, even way back then, that the neurons responding to a given point of the visual scene lie physically next to those responding to the neighboring piece of the scene.
The discovery of this so-called retinotopic map in the primary visual cortex (by Talbot and Marshall) was of course important, but because it matched the retinotopic mapping of the retina and thalamus, it didn’t constitute a new way of thinking. It wasn’t a game-changing discovery. © 2016 Scientific American
Link ID: 22301 - Posted: 06.09.2016
By BENEDICT CAREY Jerome S. Bruner, whose theories about perception, child development and learning informed education policy for generations and helped launch the modern study of creative problem solving, known as the cognitive revolution, died on Sunday at his home in Manhattan. He was 100. His death was confirmed by his partner, Eleanor M. Fox. Dr. Bruner was a researcher at Harvard in the 1940s when he became impatient with behaviorism, then a widely held theory, which viewed learning in terms of stimulus and response: the chime of a bell before mealtime and salivation, in Ivan Pavlov’s famous dog experiments. Dr. Bruner believed that behaviorism, rooted in animal experiments, ignored many dimensions of human mental experience. In one 1947 experiment, he found that children from low-income households perceived a coin to be larger than it actually was — their desires apparently shaping not only their thinking but also the physical dimensions of what they saw. In subsequent work, he argued that the mind is not a passive learner — not a stimulus-response machine — but an active one, bringing a full complement of motives, instincts and intentions to shape comprehension, as well as perception. His writings — in particular the book “A Study of Thinking” (1956), written with Jacqueline J. Goodnow and George A. Austin — inspired a generation of psychologists and helped break the hold of behaviorism on the field. To build a more complete theory, he and the experimentalist George A. Miller, a Harvard colleague, founded the Center for Cognitive Studies, which supported investigation into the inner workings of human thought. Much later, this shift in focus from behavior to information processing came to be known as the cognitive revolution. © 2016 The New York Times Company
Keyword: Development of the Brain
Link ID: 22300 - Posted: 06.09.2016
By Rachel Feltman Archerfish are already stars of the animal kingdom for their stunning spit-takes. They shoot high-powered water jets from their mouths to stun prey, making them one of just a few fish species known to use tools. But by training Toxotes chatareus to direct those jets of spit at certain individuals, scientists have shown that the little guys have another impressive skill: They seem to be able to distinguish one human face from another, something never before witnessed in fish and spotted just a few times in non-human animals. The results, published Tuesday in the Nature journal Scientific Reports, could help us understand how humans got so good at telling each other apart. Or how most people got to be good at that, anyway. I'm terrible at it. It's generally accepted that the fusiform gyrus, a brain structure located in the neocortex, allows humans to tell one another apart with a speed and accuracy that other species can't manage. But there's some debate over whether human faces are so innately complex — and distinguishing them so much more difficult than other feats of memory or pattern recognition — that this region of the brain evolved especially as a necessary facilitator of the skill. Birds, which have been shown to distinguish humans from one another, have the same structure. But some researchers still think that facial recognition might be something that humans learn — it's not an innate skill — and that the fusiform gyrus is just the spot where we happen to process all the necessary information.
Jean Fain When Sandra Aamodt talks about dieting, people listen ... or, they stick their fingers in their ears and go la, la, la. Aamodt's neuroscientific take on why diets backfire is that divisive. Aamodt is a neuroscientist, book author and former editor of a leading brain research journal. She also has become a prominent evangelist of the message that traditional diets just don't work and often leave the dieter worse off than before. And she's an enthusiastic proponent of mindful eating. "I define it as eating with attention and joy, without judgment," Aamodt said in an interview. "That includes attention to hunger and fullness, to the experience of eating and to its effects on our bodies." Even if you've never heard of her, you likely will soon. Her new book, Why Diets Make Us Fat, is bound to change the weight-loss conversation, if not dismantle Biggest Loser-sized dreams. I am a therapist specializing in eating issues, and among my clients, Aamodt has already shifted the focus from weight loss to self-care. Most clients are reluctant to accept her central argument: That our body weight tends to settle at "set points" — that 10- to 15-pound range the brain maintains despite repeated efforts to lower it. However, once they see how the set-point theory reflects their dieting experience, they realize that although they don't have the final say on their weight (their brain does), they do have real influence — through exercise and other health-affirming activities — over their health and well-being. © 2016 npr
Link ID: 22298 - Posted: 06.08.2016
By Virginia Morell Sex is never simple—even among lizards. Unlike mammals, the sex of central bearded dragons, large lizards found in eastern Australia, is determined by their chromosomes and the environment. If the eggs are incubated in high temperatures, male embryos turn into females. Such sex-reversed lizards still retain the chromosomal makeup of a male, but they develop into functional superfemales, whose output of eggs exceeds that of the regular females. Now, a new study predicts that—in some cases—these superfemales may be able to drive regular ones to extinction. That’s because superfemales not only produce more eggs, but they’re also exceptionally bold. Looking at the shape, physiology, and behavior of 20 sex-reversed females, 55 males, and 40 regular females, scientists found that the sex-reversed dragons were physically similar to regular males: They had a male dragon’s long tail and high body temperature. They were also behaviorally similar, acting like bold, active males—even as they produced viable eggs. Indeed, the scientists report in the current issue of the Proceedings of the Royal Society B that these sex-reversed females were behaviorally more malelike than the genetic males. Because of these advantages, this third sex could reproductively outcompete normal females, the scientists say, possibly causing some populations to lose the female sex chromosome. (Females are the heterogametic sex, like human males.) In such a population, the dragons’ sex would then be determined solely by temperature instead of genetics—something that’s occurred in the lab within a single generation. Could it happen in the wild? The scientists are still investigating. © 2016 American Association for the Advancement of Science
By Julia Shaw A cure for almost every memory ailment seems to be just around the corner. Alzheimer’s affected brains can have their memories restored, we can create hippocampal implants to give us better memory, and we can effectively implant false memories with light. Except that we can’t really do any of these things, at least not in humans. We sometimes forget that developments in memory science need to go through a series of stages in order to come to fruition, each of which requires tremendous knowledge and skill. From coming up with a new idea, to designing an appropriate methodology, obtaining ethical approval, getting research funding, recruiting research assistants and test subjects, conducting the experiment(s), completing complex statistical analysis for which computer code is often required, writing a manuscript, surviving the peer review process, and finally effectively distributing the findings, each part of the process is incredibly complex and takes a long time. On top of it all, this process, which can take decades to complete, typically results in incremental rather than monumental change. Rather than creating massive leaps in technology, in the vast majority of instances, studies add a teeny tiny bit of insight to the greater body of knowledge. These incremental achievements in science are often blown out of proportion by the media. As John Oliver recently said “…[Science] deserves better than to be twisted out of proportion and be turned into morning show gossip.” Moving from science fiction to science fact is harder than the media makes it seem. © 2016 Scientific American,
By Karin Brulliard Think about how most people talk to babies: Slowly, simply, repetitively, and with an exaggerated tone. It’s one way children learn the uses and meanings of language. Now scientists have found that some adult birds do that when singing to chicks — and it helps the baby birds better learn their song. The subjects of the new study, published last week in the journal Proceedings of the National Academy of Sciences, were zebra finches. They’re good for this because they breed well in a lab environment, and “they’re just really great singers. They sing all the time,” said McGill University biologist and co-author Jon Sakata. The males, he means — they’re the singers, and they do it for fun and when courting ladies, as well as around baby birds. Never mind that their melody is more “tinny,” according to Sakata, than pretty. Birds in general are helpful for vocal acquisition studies because they, like humans, are among the few species that actually have to learn how to make their sounds, Sakata said. Cats, for example, are born knowing how to meow. But just as people pick up speech and bats learn their calls, birds also have to figure out how to sing their special songs. Sakata and his colleagues were interested in how social interactions between adult zebra finches and chicks influence that learning process. Is face-to-face — or, as it may be, beak-to-beak — learning better? Does simply hearing an adult sing work as well as watching it do so? Do daydreaming baby birds learn as well as their more focused peers? © 1996-2016 The Washington Post
By LISA FELDMAN BARRETT WHEN the world gets you down, do you feel just generally “bad”? Or do you have more precise emotional experiences, such as grief or despair or gloom? In psychology, people with finely tuned feelings are said to exhibit “emotional granularity.” When reading about the abuses of the Islamic State, for example, you might experience creeping horror or fury, rather than general awfulness. When learning about climate change, you could feel alarm tinged with sorrow and regret for species facing extinction. Confronted with this year’s presidential campaign, you might feel astonished, exasperated or even embarrassed on behalf of the candidates — an emotion known in Mexico as “pena ajena.” Emotional granularity isn’t just about having a rich vocabulary; it’s about experiencing the world, and yourself, more precisely. This can make a difference in your life. In fact, there is growing scientific evidence that precisely tailored emotional experiences are good for you, even if those experiences are negative. According to a collection of studies, finely grained, unpleasant feelings allow people to be more agile at regulating their emotions, less likely to drink excessively when stressed and less likely to retaliate aggressively against someone who has hurt them. Perhaps surprisingly, the benefits of high emotional granularity are not only psychological. People who achieve it are also likely to have longer, healthier lives. They go to the doctor and use medication less frequently, and spend fewer days hospitalized for illness. Cancer patients, for example, have lower levels of harmful inflammation when they more frequently categorize, label and understand their emotions. © 2016 The New York Times Company
Link ID: 22285 - Posted: 06.06.2016
By DENISE GRADY Muhammad Ali, who died on Friday after a long struggle with Parkinson’s disease, was given the diagnosis in 1984 when he was 42. The world witnessed his gradual decline over the decades as tremors and stiffness set in, replacing his athletic stride with a shuffle, silencing his exuberant voice and freezing his face into an expressionless mask.
What is Parkinson’s disease? It is a progressive, incurable deterioration of the part of the brain that produces a chemical needed to carry signals to the regions that control movement.
How common is Parkinson’s? About one million people in the United States, and between seven million and 10 million worldwide, are thought to have Parkinson’s, according to the Parkinson’s Disease Foundation.
What causes it? Was boxing a factor for Ali? The exact cause is not known. As with many disorders, experts suspect a combination of genes and environment, meaning that people with a particular genetic makeup may be predisposed to the disease if they are exposed to certain environmental factors. Head injuries, such as those sustained repeatedly in boxing, are among the possible risk factors listed by the National Parkinson Foundation. So is exposure to certain pesticides. These factors have both been suggested as possible contributors in Muhammad Ali’s case.
Can Parkinson’s disease be treated? Medication can ease the symptoms for a time, but the disease continues to progress. In some cases, implanted devices called deep-brain stimulators can also help with symptoms. But Parkinson’s is not curable. © 2016 The New York Times Company
Link ID: 22284 - Posted: 06.06.2016
By Hanoch Ben-Yami Adam Bear opens his article, “What Neuroscience Says about Free Will,” by mentioning a few cases such as pressing snooze on the alarm clock or picking a shirt out of the closet. He continues with an assertion about these cases, and with a question: In each case, we conceive of ourselves as free agents, consciously guiding our bodies in purposeful ways. But what does science have to say about the true source of this experience? This is a bad start. To be aware of ourselves as free agents is not to have an experience. There’s no special tickle which tells you you’re free, no "freedom itch." Rather, to be aware of the fact that you acted freely is, among other things, to know that had you preferred to do something else in those circumstances, you would have done it. And in many circumstances we clearly know that this is the case, so in many circumstances we are aware that we act freely. No experience is involved, and so far there’s no question in Bear’s article for science to answer. Continuing with his alleged experience, Bear writes: …the psychologists Dan Wegner and Thalia Wheatley made a revolutionary proposal: The experience of intentionally willing an action, they suggested, is often nothing more than a post hoc causal inference that our thoughts caused some behavior. More than a revolutionary proposal, this is an additional confusion. What might "intentionally willing an action" mean? Is it to be contrasted with non-intentionally willing an action? But what could this stand for? © 2016 Scientific American
Link ID: 22282 - Posted: 06.04.2016
Scientists say they have found a gene that causes a rare but inherited form of multiple sclerosis. It affects about one in every thousand MS patients and, according to the Canadian researchers, is proof that the disease is passed down generations. Experts have long suspected there's a genetic element to MS, but had thought there would be lots of genes involved, as well as environmental factors. The finding offers hope of targeted screening and therapy, Neuron reports. The University of British Columbia studied the DNA of hundreds of families affected by MS to hunt for a culprit gene. They found it in two sets of families containing several members with a rapidly progressive type of MS. In these families, 70% of the people with the mutation developed the disease. Although other factors may still be important and necessary to trigger the disease process, the gene itself is a substantial causative risk factor that is passed down from parents to their children, say the researchers. The mutation is in a gene called NR1H3, which makes a protein that acts as a switch controlling inflammation. In MS the body's immune system mistakenly attacks the protective layer of myelin that surrounds nerve fibres in the brain and spinal cord, leading to muscle weakness and other symptoms. Studies in mice show that knocking out the function of the same gene leads to neurological problems and decreased myelin production. © 2016 BBC.
By Simon Makin Other species are capable of displaying dazzling feats of intelligence. Crows can solve multistep problems. Apes display numerical skills and empathy. Yet, neither species has the capacity to conduct scientific investigations into other species' cognitive abilities. This type of behavior provides solid evidence that humans are by far the smartest species on the planet. Besides just elevated IQs, however, humans set themselves apart in another way: Their offspring are among the most helpless of any species. A new study, published recently in Proceedings of the National Academy of Sciences (PNAS), draws a link between human smarts and an infant’s dependency, suggesting one thing led to the other in a spiraling evolutionary feedback loop. The study, from psychologists Celeste Kidd and Steven Piantadosi at the University of Rochester, represents a new theory about how humans came to possess such extraordinary smarts. Like a lot of evolutionary theories, this one can be couched in the form of a story—and like a lot of evolutionary stories, this one is contested by some scientists. Kidd and Piantadosi note that, according to a previous theory, early humans faced selection pressures for both large brains and the capacity to walk upright as they moved from forest to grassland. Larger brains require a wider pelvis to give birth whereas being bipedal limits the size of the pelvis. These opposing pressures—biological anthropologists call them the “obstetric dilemma”—could have led to giving birth earlier when infants’ skulls were still small. Thus, newborns arrive more immature and helpless than those of most other species. Kidd and Piantadosi propose that, as a consequence, the cognitive demands of child care increased and created evolutionary pressure to develop higher intelligence. © 2016 Scientific American
Amanda Aronczyk At first Giselle wasn't sure what to put on her medical school application. She wanted to be a doctor, but she also wanted people to know about her own health: years of depression, anxiety and a suicide attempt. (We're using only her first name in this story, out of concern for her future career.) "A lot of people were like, you don't say that at all," she said. "Do not mention that you have any kind of weakness." Giselle remembers having her first intense suicidal thoughts when she was 10 years old. Her parents had split up and she had moved from the coast of Colombia to Chicago. She started having extreme mood swings and fighting with her mom. And then, when she was 16 years old, she tried to kill herself. "Yeah, lots of pills." After her suicide attempt she began therapy and eventually started taking antidepressants. That worked extremely well. After finishing high school, she took an unconventional route. She went to Brazil to work with a women's community health group, worked as a research assistant for a doctor, and trained as a doula to assist women in labor. It was while working as a doula and witnessing what she saw as insensitive behavior from a doctor that she resolved her own career indecision: She would become a different kind of doctor. When she applied to medical school, she told them this whole story in her application. In the fall of 2014, she started at the University of Wisconsin School of Medicine and Public Health. © 2016 npr
Link ID: 22276 - Posted: 06.02.2016
By Mark Gollom Anti-smoking advocates who support the Liberal government's proposal to require plain packaging on tobacco products argue that Australia's implementation of similar regulations has had a significant effect on smoking rates in that country. "Australia has seen the biggest decline in smoking prevalence that they've ever recorded after plain packaging [was introduced]," said David Hammond, an associate professor of public health and health systems at the University of Waterloo. "All the data we have suggest that plain packaging has reduced smoking in Australia." Rob Cunningham, senior policy analyst for the Canadian Cancer Society, agrees and says research supports the effectiveness of plain packaging. "If it wasn't effective, the tobacco companies wouldn't be so strongly opposed," he said. "And it's precisely because it's going to have an effect on sales that they are going to lobby hard against it, threaten legal cases." But not everyone believes that Australia's policy of imposing bland tobacco branding has done much to deter smoking, which has been steadily declining for decades, according to Julian Morris, vice-president of research at the libertarian think tank the Reason Foundation. "The decline in smoking seems to have been continuous and not dramatically affected, one way or the other, by the introduction of plain packaging," he said. ©2016 CBC/Radio-Canada.
Keyword: Drug Abuse
Link ID: 22274 - Posted: 06.02.2016
By Ann Lukits Teens who baby-sit may not only gain confidence in caring for young children, they may also alter their brain chemistry in a way that could make them better parents, suggests an animal study in Developmental Psychobiology. Young female rats housed with various groups of unrelated rat pups had fully developed mothering skills as adults, compared with control rats without caregiving, or alloparenting, experience. The early caregivers had significantly higher concentrations of tryptophan hydroxylase-2 (TPH2) in the brain, an enzyme associated with increased production of serotonin, a chemical involved in mood and social behavior. Previous research has associated baby-sitting experience in humans with greater confidence in new mothers, researchers said. Experiments at Michigan State University involved two groups of juvenile or adolescent female rats from 16 litters. In one group, 24 rats were housed in separate cages with a different group of week-old pups each day. A second group of 24 controls were given pink pup-size pencil erasers. The experiments continued for 14 days. Eight mature rats from both groups were subsequently exposed to new groups of pups. Six rats with alloparenting experience acted maternally toward the pups, whereas none of the control rats exhibited maternal behavior. Rats with alloparenting experience also displayed less anxiety during behavioral testing. The animals were euthanized after testing and TPH2 levels measured in a section of the brain called the dorsal raphe nucleus. ©2016 Dow Jones & Company, Inc
Keyword: Sexual Behavior
Link ID: 22273 - Posted: 06.01.2016
By David Z. Hambrick If you’re a true dog lover, you take it as one of life’s simple truths that all dogs are good, and you have no patience for scientific debate over whether dogs really love people. Of course they do. What else could explain the fact that your dog runs wildly in circles when you get home from work, and, as your neighbors report, howls inconsolably for hours on end when you leave? What else could explain the fact that your dog insists on sleeping in your bed, under the covers—in between you and your partner? At the same time, there’s no denying that some dogs are smarter than others. Not all dogs can, like a border collie mix named Jumpy, do a back flip, ride a skateboard, and weave through pylons on his front legs. A study published in the journal Intelligence by British psychologists Rosalind Arden and Mark Adams confirms as much. Consistent with over a century of research on human intelligence, Arden and Adams found that a dog that excels in one test of cognitive ability will likely excel in other tests of cognitive ability. In more technical terms, the study reveals that there is a general factor of intelligence in dogs—a canine “g” factor. For their study, Arden and Adams devised a battery of canine cognitive ability tests. All of the tests revolved around—you guessed it—getting a treat. In the detour test, the dog’s objective was to navigate around barriers arranged in different configurations to get to a treat. In the point-following test, a researcher pointed to one of two inverted beakers concealing a treat, and recorded whether the dog went to that beaker or the other one. Finally, the quantity discrimination test required the dog to choose between a small treat (a glob of peanut butter) and a larger one (the “correct” answer). Arden and Adams administered the battery to 68 border collies from Wales; all had been bred and trained to do herding work on a farm, and thus had similar backgrounds. © 2016 Scientific American
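The "general factor" the article describes falls out of a simple statistical pattern: dogs who score well on one task tend to score well on the others, so the tasks intercorrelate positively and a single factor captures much of the variance. The sketch below illustrates the idea on simulated scores — these are invented numbers, not Arden and Adams's data, and the analysis is a bare-bones first-principal-component version of factor extraction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 68 dogs whose performance on three tasks (detour,
# point-following, quantity discrimination) is driven by one
# latent general ability "g" plus task-specific noise.
g = rng.normal(size=68)
scores = np.column_stack(
    [g + 0.5 * rng.normal(size=68) for _ in range(3)]
)

# A positive manifold: every pair of tasks correlates positively.
corr = np.corrcoef(scores, rowvar=False)

# The first eigenvector of the correlation matrix plays the role of
# the general factor; its eigenvalue's share of the total is the
# proportion of variance a single "g" accounts for.
eigvals, eigvecs = np.linalg.eigh(corr)     # ascending order
g_share = eigvals[-1] / eigvals.sum()
```

With data generated this way, one factor accounts for well over half the variance across tasks, which is the kind of result that licenses talk of a canine "g."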
By Frances Marcellin A shirt and cap that can diagnose epilepsy quickly and easily has been approved for use by European health services, including the UK’s NHS. Epileptic seizures are the result of excessive electrical discharges in the brain. The World Health Organization estimates that over 50 million people worldwide have the condition, including 6 million in Europe, making it one of the world’s most common serious neurological conditions. Brain implants and apps have been developed to warn of oncoming seizures. But to diagnose the condition, someone must typically have a seizure recorded by an EEG machine in a hospital – with sensors and wires attached to the scalp. “An EEG reading is at the heart of a reliable diagnosis,” says Françoise Thomas-Vialettes, president of French epilepsy society EFAPPE. But seizures rarely coincide with hospital appointments. “The diagnosis can take several years and is often imprecise.” Seizures are so difficult to record that 30 per cent of people with epilepsy in Europe are misdiagnosed. In developing countries that lack medical equipment and healthcare the situation is even worse. To make diagnosis easier, French start-up BioSerenity has developed a smart outfit called the Neuronaute that monitors people as they go about their day. The shirt and cap are embedded with biometric sensors that record the electrical activity of the wearer’s brain, heart and muscles. If a seizure occurs, the outfit can send an EEG recording of the brain to doctors via a smartphone. © Copyright Reed Business Information Ltd.
Link ID: 22271 - Posted: 06.01.2016
Amy McDermott Giant pandas have better ears than people — and polar bears. Pandas can hear surprisingly high frequencies, conservation biologist Megan Owen of the San Diego Zoo and colleagues report in the April Global Ecology and Conservation. The scientists played a range of tones for five zoo pandas trained to nose a target in response to sound. Training, which took three to six months for each animal, demanded serious focus and patience, says Owen, who called the effort “a lot to ask of a bear.” Both males and females heard into the range of a “silent” ultrasonic dog whistle. Polar bears, the only other bears scientists have tested, are less sensitive to sounds at or above 14 kilohertz. Researchers still don’t know why pandas have ultrasonic hearing. The bears are a vocal bunch, but their chirps and other calls have never been recorded at ultrasonic levels, Owen says. Great hearing may be a holdover from the bears’ ancient past. Citations M.A. Owen et al. Hearing sensitivity in context: Conservation implications for a highly vocal endangered species. Global Ecology and Conservation. Vol. 6, April 2016, p. 121. doi: 10.1016/j.gecco.2016.02.007. © Society for Science & the Public 2000 - 2016.
Link ID: 22269 - Posted: 06.01.2016