Chapter 16.
By Erin Allday When the United States’ top public health and political leaders declared the 1990s the “decade of the brain,” Dr. Pratik Mukherjee couldn’t help but feel a little dubious. “I was kind of laughing, because I didn’t think we’d make much progress in just a decade,” said Mukherjee, a neuroradiologist at UCSF. Twenty-four years later, Mukherjee said he and his peers around the country are primed to plunge into what he’d like to call the century of the brain — a deep dive into the basic biology and mechanics of the impossibly complex organ that controls our every thought, action, behavior and mood. The National Institutes of Health last week announced $47 million in grants as part of President Obama’s Brain Initiative, a project announced 18 months ago to, in the simplest language, reverse-engineer the human brain. The grants were among the first in a roughly 11-year plan that could cost more than $3 billion. Most of the projects involve developing new technologies to help map the brain and study its mechanics — how cells communicate, what makes them turn on and off, and how large regions of the brain interact, for example. Ultimately, scientists hope these tools will help the next generation of neuroscientists solve the brain-centric disorders — from autism and Alzheimer’s to depression and schizophrenia — that have confounded doctors for centuries.
Keyword: Brain imaging
Link ID: 20183 - Posted: 10.09.2014
Patients with schizophrenia are already known to have higher rates of premature death than the general population. The study found that elevated risks of heart disease and metabolic issues such as high blood sugar in people with first episode psychosis are due to an interaction of mental illness, unhealthy lifestyle behaviors and antipsychotic medications that may accelerate these risks. Patients entered treatment with significant health concerns – including excess weight, smoking, and metabolic issues – despite an average age of only 24 years. The study identifies key opportunities for health care systems to improve the treatment of such patients with first episode psychosis. The research was funded by the National Institute of Mental Health (NIMH), part of the National Institutes of Health. Christoph Correll, M.D., of The Zucker Hillside Hospital, Hofstra North Shore-Long Island Jewish School of Medicine, New York, and colleagues, report their findings on Oct. 8, 2014 in JAMA Psychiatry. The study is among the first of several to report results from the Recovery After an Initial Schizophrenia Episode (RAISE) project, which was developed by NIMH to examine first episode psychosis before and after specialized treatment was offered in community settings. The researchers studied nearly 400 individuals between the ages of 15 and 40 with first episode psychosis, who presented for treatment at 34 community-based clinics across 21 states. The frequency of obesity was similar to that in the same age group in the general population. However, smoking and metabolic syndrome (a combination of conditions including obesity, high blood pressure, high blood sugar, and abnormal blood fats, such as cholesterol and triglycerides) were much more common.
Link ID: 20182 - Posted: 10.09.2014
By Tara Haelle The first step to treating or preventing a disease is often finding out what drives it. In the case of neurodegenerative disorders, the discovery two decades ago of what drives them changed the field: all of them—including Alzheimer's, Parkinson's, Huntington's and amyotrophic lateral sclerosis (ALS or Lou Gehrig's disease)—involve the accumulation of misfolded proteins in brain cells. Typically when a protein misfolds, the cell destroys it, but as a person ages, this quality-control mechanism starts to fail and the rogue proteins build up. In Huntington's, for example, huntingtin protein—used for many cell functions—misfolds and accumulates. Symptoms such as muscular difficulties, irritability, declining memory, poor impulse control and cognitive deterioration accompany the buildup. Mounting evidence suggests that not only does the accumulation of misfolded proteins mark neurodegenerative disease but that the spread of the proteins from one cell to another causes the disease to progress. Researchers have seen misfolded proteins travel between cells in Alzheimer's and Parkinson's. A series of experiments reported in Nature Neuroscience in August suggests the same is true in Huntington's. In their tests, researchers in Switzerland showed that mutated huntingtin protein in diseased brain tissue could invade healthy brain tissue when the two were placed together. And when the team injected the mutated protein into a live mouse's brain, it spread through the neurons within a month—similar to the way prions spread, says Francesco Paolo Di Giorgio of the Novartis Institutes for BioMedical Research in Basel, who led the research. Prions are misfolded proteins that travel through the body and confer their disease-causing characteristics onto other proteins, as seen in mad cow disease. But it is not known if misfolded proteins involved in Huntington's convert other proteins as true prions do, according to Di Giorgio. © 2014 Scientific American
Link ID: 20181 - Posted: 10.08.2014
David Cyranoski Unlike its Western counterparts, Japan’s effort will be based on a rare resource — a large population of marmosets that its scientists have developed over the past decade — and on new genetic techniques that might be used to modify these highly social animals. The goal of the ten-year Brain/MINDS (Brain Mapping by Integrated Neurotechnologies for Disease Studies) project is to map the primate brain to accelerate understanding of human disorders such as Alzheimer’s disease and schizophrenia. On 11 September, the Japanese science ministry announced the names of the group leaders — and how the project would be organized. Funded at ¥3 billion (US$27 million) for the first year, probably rising to about ¥4 billion for the second, Brain/MINDS is a fraction of the size of the European Union’s Human Brain Project and the United States’ BRAIN (Brain Research through Advancing Innovative Neurotechnologies) Initiative, both of which are projected to receive at least US$1 billion over the next decade. But researchers involved in those efforts say that Brain/MINDS fills a crucial gap between disease models in smaller animals that too often fail to mimic human brain disorders, and models of the human brain that need validating data. “It is essential that we have a genetic primate model to study cognition and cognitive brain disorders such as schizophrenia and depression, for which we do not have good mouse models,” says neuroscientist Terry Sejnowski at the Salk Institute in La Jolla, California, who is a member of the National Institutes of Health BRAIN Initiative Working Group. “Other groups in the United States and China have started transgenic-primate projects, but none is as large or as well organized as the Japanese effort.” © 2014 Nature Publishing Group,
By Virginia Morell Two years ago, scientists showed that dolphins imitate the sounds of whales. Now, it seems, whales have returned the favor. Researchers analyzed the vocal repertoires of 10 captive orcas (Orcinus orca), three of which lived with bottlenose dolphins (Tursiops truncatus) and the rest with their own kind. Of the 1551 vocalizations the latter seven orcas made, more than 95% were the typical pulsed calls of killer whales. In contrast, the three orcas that had only dolphins as pals busily whistled and emitted dolphinlike click trains and terminal buzzes, the scientists report in the October issue of The Journal of the Acoustical Society of America. (Watch a video as bioacoustician and co-author Ann Bowles describes the difference between killer whale and dolphin whistles.) The findings make orcas one of the few species of animals that, like humans, are capable of vocal learning—a talent considered a key underpinning of language. © 2014 American Association for the Advancement of Science.
By CLAIRE MALDARELLI Whether it’s lying wide awake in the middle of the night or falling asleep at an international business meeting, many of us have experienced the funk of jet lag. New research has uncovered some of the mysteries behind how our cells work together to maintain one constant daily rhythm, offering the promise of defense against this disorienting travel companion. Many organisms, including humans and fruit flies, have pacemaker neurons — specialized cells in the brain that have their own molecular clocks and oscillate in 24-hour cycles. But in order for an organism to regulate itself, all of these internal clocks must tick together to create one master clock. While scientists understood how individual neurons set their own clock, they didn’t know how that master clock was set. Working with young fruit flies, whose neuronal system is simpler than that of adults, with fewer cells that are easier to study, the researchers found that two types of neurons, which they called dawn cells and dusk cells, maintain a continuous cycle. As the sun rises, special “timeless” proteins, as they’re called, help the dawn cells to first signal to each other and then signal to the dusk cells. Then as the sun sets, proteins help the dusk cells signal to each other and then signal back to the dawn cells. Each signal tells the cells to synchronize with each other. Together, these two distinct signals drive the daily sleep and wake cycle. “This really shifts our view of these cells as super strong, independent oscillators to much more of a collective group working together to keep time,” said Justin Blau, a neurobiologist at New York University and co-author of the study. © 2014 The New York Times Company
By Brian Bienkowski and Environmental Health News On his farm in Iowa, Matt Peters worked from dawn to dusk planting his 1,500 acres of fields with pesticide-treated seeds. “Every spring I worried about him,” said his wife, Ginnie. “Every spring I was glad when we were done.” In the spring of 2011, Ginnie Peters' “calm, rational, loving” husband suddenly became depressed and agitated. “He told me ‘I feel paralyzed’,” she said. “He couldn’t sleep or think. Out of nowhere he was depressed.” A clinical psychologist spoke to him on the phone and urged him to get medical help. “He said he had work to do, and I told him if it’s too wet in the morning to plant beans come see me,” Mike Rossman said. “And the next day I got the call.” Peters took his own life. He was 55 years old. No one knows what triggered Peters’ sudden shift in mood and behavior. But since her husband’s death, Ginnie Peters has been on a mission to not only raise suicide awareness in farm families but also draw attention to the growing evidence that pesticides may alter farmers’ mental health. “These chemicals that farmers use, look what they do to an insect. It ruins their nervous system,” Peters said. “What is it doing to the farmer?” Farming is a stressful job – uncontrollable weather, physical demands and economic woes intertwine with a personal responsibility for land that often is passed down through generations. But experts say that some of the chemicals used to control pests may make matters worse by changing farmers’ brain chemistry. © 2014 Scientific American
By LAWRENCE K. ALTMAN A British-American scientist and a pair of Norwegian researchers were awarded this year’s Nobel Prize in Physiology or Medicine on Monday for discovering “an inner GPS in the brain” that enables virtually all creatures to navigate their surroundings. John O’Keefe, 75, will receive half of the $1.1 million prize; the other half will be shared by May-Britt Moser, 51, and Edvard I. Moser, 52, only the second married couple to win a Nobel in medicine. The three scientists’ discoveries “have solved a problem that has occupied philosophers and scientists for centuries — how does the brain create a map of the space surrounding us and how can we navigate our way through a complex environment?” said the Karolinska Institute in Sweden, which chooses the laureates. The positioning system they discovered helps us know where we are, find our way from place to place and store the information for the next time, said Goran K. Hansson, secretary of the Karolinska’s Nobel Committee. The researchers documented that certain cells are responsible for the higher cognitive function that steers the navigational system. Dr. O’Keefe began using neurophysiological methods in the late 1960s to study how the brain controls behavior and sense of direction. In 1971, he discovered the first component of the inner navigational system in rats. He identified nerve cells in the hippocampus region of the brain that were always activated when a rat was at a certain location. © 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 20169 - Posted: 10.07.2014
By Clare Wilson If you’re facing surgery, this may well be your worst nightmare: waking up while under the knife without medical staff realizing. The biggest-ever study of this phenomenon is shedding light on what such an experience feels like and is causing debate about how best to prevent it. For a one-year period starting in 2012, an anesthetist at every hospital in the United Kingdom and Ireland recorded every case where a patient told a staff member that he had been awake during surgery. Prompted by these reports, the researchers investigated 300 cases, interviewing the patient and doctors involved. One of the most striking findings, says the study’s lead author, Jaideep Pandit of Oxford University Hospitals, was that pain was not generally the worst part of the experience: It was paralysis. For some operations, paralyzing drugs are given to relax muscles and stop reflex movements. “Pain was something they understood, but very few of us have experienced what it’s like to be paralyzed,” Pandit says. “They thought they had been buried alive.” “I thought I was about to die,” says Sandra, who regained consciousness but was unable to move during a dental operation when she was 12 years old. “It felt as though nothing would ever work again — as though the anesthetist had removed everything apart from my soul.”
Link ID: 20168 - Posted: 10.07.2014
Aaron E. Carroll For a drug to be approved by the Food and Drug Administration, it must prove itself better than a placebo, or fake drug. This is because of the “placebo effect,” in which patients often improve just because they think they are being treated with something. If we can’t compare a new drug with a placebo, we can’t be sure that the benefit seen from it is anything more than wishful thinking. But when it comes to medical devices and surgery, the requirements aren’t the same. Placebos aren’t required. That is probably a mistake. At the turn of this century, arthroscopic surgery for osteoarthritis of the knee was common. Basically, surgeons would clean out the knee using arthroscopic devices. Another common procedure was lavage, in which a needle would inject saline into the knee to irrigate it. The thought was that these procedures would remove fragments of cartilage and calcium phosphate crystals that were causing inflammation. A number of studies had shown that people who had these procedures improved more than people who did not. However, a growing number of people were concerned that this was really no more than a placebo effect. And in 2002, a study was published that proved it. A total of 180 patients who had osteoarthritis of the knee were randomly assigned (with their consent) to one of three groups. The first had a standard arthroscopic procedure, and the second had lavage. The third, however, had sham surgery. They had an incision, and a procedure was faked so that they didn’t know that they actually had nothing done. Then the incision was closed. The results were stunning. Those who had the actual procedures did no better than those who had the sham surgery. They all improved the same amount. The results were all in people’s heads. © 2014 The New York Times Company
Keyword: Pain & Touch
Link ID: 20167 - Posted: 10.07.2014
Fiona Fox Last week the UK Home Office published the findings of its investigations into allegations of animal suffering, made after undercover infiltrations at two animal research facilities. You will not find coverage of any of the conclusions in the national news media. Instead any search for media coverage will unearth the original infiltration stories under headlines such as: “Horrific video shows distress of puppies and kittens waiting to be dissected at animal testing lab”; “Graphic content: horrifying video shows puppies and kittens tested at UK laboratory”; and “Rats beheaded with scissors and kept in ‘pitiful state’.” These “shocking exposés”, brought to the newspapers by the animal rights group BUAV, include distressing images, links to videos that are difficult to watch, and quote allegedly secretly recorded researchers saying terrible things about the animals in their care. The newspapers seem in no doubt that the allegations they are carrying add up to “appalling suffering on a very large scale”, and appear to be proud of their role in bringing the abuses to light: “The Sunday Express today publishes details of an undercover investigation … that shines a light on the secret world of vivisection laboratories.” You may well see these articles as reassuring evidence that we still have public interest journalism in the UK. These animal rights supporters have done exactly what investigative journalists used to do in a time when newspapers had enough money to shine a light on the darker corners of our institutions and uncover hidden abuses. And you would be right, but for one thing: we now know that the stories were largely untrue. © 2014 Guardian News and Media Limited
Keyword: Animal Rights
Link ID: 20165 - Posted: 10.07.2014
by Michael Marshall When we search for the seat of humanity, are we looking at the wrong part of the brain? Most neuroscientists assume that the neocortex, the brain's distinctive folded outer layer, is the thing that makes us uniquely human. But a new study suggests that another part of the brain, the cerebellum, grew much faster in our ape ancestors. "Contrary to traditional wisdom, in the human lineage the cerebellum was the part of the brain that accelerated its expansion most rapidly, rather than the neocortex," says Rob Barton of Durham University in the UK. With Chris Venditti of the University of Reading in the UK, Barton examined how the relative sizes of different parts of the brain changed as primates evolved. During the evolution of monkeys, the neocortex and cerebellum grew in tandem, a change in one being swiftly followed by a change in the other. But starting with the first apes around 25 million years ago through to chimpanzees and humans, the cerebellum grew much faster. As a result, the cerebellums of apes and humans contain far more neurons than the cerebellum of a monkey, even if that monkey were scaled up to the size of an ape. "The difference in ape cerebellar volume, relative to a scaled monkey brain, is equal to 16 billion extra neurons," says Barton. "That's the number of neurons in the entire human neocortex." © Copyright Reed Business Information Ltd.
By Kevin Hartnett You may have seen that deliberately annoying “View of the World from Ninth Avenue” map featured on the cover of the New Yorker a while back. It shows the distorted way geography appears to a Manhattanite: 9th and 10th avenues are the center of the world, New Jersey appears, barely, and everywhere else is just a blip if it registers at all. As it turns out, a similar kind of map exists for the human body — with at least some basis in neuroscience. In August I wrote a story for Ideas on the rise of face transplants and spoke to Michael Sims, author of the book, “Adam’s Navel: A Natural and Cultural History of the Human Form.” During our conversation Sims mentioned an odd diagram published in 1951 by a neurosurgeon named Wilder Penfield. The diagram is known as “Homunculus” (a name taken from a weird and longstanding art form that depicts small human beings); it shows the human body scaled according to the amount of brain tissue dedicated to each part, and arranged according to the locations in the brain that control them. In the diagram, the eyes, lips, nose, and tongue appear grotesquely large, indicating that we devote an outsized amount of brain tissue to operating and receiving sensation from these parts of the body. (Sims’s point was that we devote a lot of processing power to the face, and for that reason find it biologically disorienting that faces could be changeable.) The hand is quite large, too, while the toes, legs, trunk, shoulders, and arms are tiny, the equivalents of Kansas City and Russia on the New Yorker map. “Homunculus” seems like the kind of thing that would have long since been superseded by modern brain science, but it actually continues to have a surprising amount of authority, and often appears in neuroscience textbooks.
Keyword: Pain & Touch
Link ID: 20158 - Posted: 10.04.2014
By John Bohannon The victim peers across the courtroom, points at a man sitting next to a defense lawyer, and confidently says, "That's him!" Such moments have a powerful sway on jurors who decide the fate of thousands of people every day in criminal cases. But how reliable is eyewitness testimony? A new report concludes that the use of eyewitness accounts needs tighter control, and among its recommendations is a call for a more scientific approach to how eyewitnesses identify suspects during the classic police lineup. For decades, researchers have been trying to nail down what influences eyewitness testimony and how much confidence to place in it. After a year of sifting through the scientific evidence, a committee of psychologists and criminologists organized by the U.S. National Research Council (NRC) has now gingerly weighed in. "This is a serious issue with major implications for our justice system," says committee member Elizabeth Phelps, a psychologist at New York University in New York City. Their 2 October report, Identifying the Culprit: Assessing Eyewitness Identification, is likely to change the way that criminal cases are prosecuted, says Elizabeth Loftus, a psychologist at the University of California, Irvine, who was an external reviewer of the report. As Loftus puts it, "just because someone says something confidently doesn't mean it's true." Jurors can't help but find an eyewitness’s confidence compelling, even though experiments have shown that a person's confidence in their own memory is sometimes undiminished even in the face of evidence that their memory of an event is false. © 2014 American Association for the Advancement of Science.
Keyword: Learning & Memory
Link ID: 20157 - Posted: 10.04.2014
Carl Zimmer As much as we may try to deny it, Earth’s cycle of day and night rules our lives. When the sun sets, the encroaching darkness sets off a chain of molecular events spreading from our eyes to our pineal gland, which oozes a hormone called melatonin into the brain. When the melatonin latches onto neurons, it alters their electrical rhythm, nudging the brain into the realm of sleep. At dawn, sunlight snuffs out the melatonin, forcing the brain back to its wakeful pattern again. We fight these cycles each time we stay up late reading our smartphones, suppressing our nightly dose of melatonin and waking up grumpy the next day. We fly across continents as if we could instantly reset our inner clocks. But our melatonin-driven sleep cycle lags behind, leaving us drowsy in the middle of the day. Scientists have long wondered how this powerful cycle got its start. A new study on melatonin hints that it evolved some 700 million years ago. The authors of the study propose that our nightly slumbers evolved from the rise and fall of our tiny oceangoing ancestors, as they swam up to the surface of the sea at twilight and then sank in a sleepy fall through the night. To explore the evolution of sleep, scientists at the European Molecular Biology Laboratory in Germany study the activity of genes involved in making melatonin and other sleep-related molecules. Over the past few years, they’ve compared the activity of these genes in vertebrates like us with their activity in a distantly related invertebrate — a marine worm called Platynereis dumerilii. The scientists studied the worms at an early stage, when they were ball-shaped 2-day-old larvae. The ocean swarms with juvenile animals like these. Many of them spend their nights near the ocean surface, feeding on algae and other bits of food. Then they spend the day at lower depths, where they can hide from predators and the sun’s ultraviolet rays. © 2014 The New York Times Company
By CATHERINE SAINT LOUIS Driven by a handful of reports of poliolike symptoms in children, federal health officials have asked the nation’s physicians to report cases of children with limb weakness or paralysis along with specific spinal-cord abnormalities on a magnetic resonance imaging test. As a respiratory illness known as enterovirus 68 is sickening thousands of children from coast to coast, officials are trying to figure out if the weakness could be linked to the virus. The emergence of several cases of limb weakness among children in Colorado put doctors on alert in recent months. The Centers for Disease Control and Prevention issued an advisory on Friday, and this week, other cases of unexplained muscle weakness or paralysis came to light in Michigan, Missouri and Massachusetts. The C.D.C. is investigating the cases of 10 children hospitalized at Children’s Hospital Colorado with unexplained arm or leg weakness since Aug. 9. Some of the children, who range in age from 1 to 18, also developed symptoms like facial drooping, double vision, or difficulty swallowing or talking. Four of them tested positive for enterovirus 68, also known as enterovirus D68, which has recently caused severe respiratory illness in children in 41 states and the District of Columbia. One tested positive for rhinovirus, which can cause the common cold. Two tested negative. Two patients’ specimens are still being processed; another was never tested. It is unclear whether the muscle weakness is connected to the viral outbreak. “It’s one possibility we are looking at, but certainly not the only possibility,” said Mark Pallansch, director of the C.D.C.’s division of viral diseases. © 2014 The New York Times Company
Keyword: Movement Disorders
Link ID: 20150 - Posted: 10.02.2014
Have you ever wrongly suspected that other people are out to harm you? Have you been convinced that you’re far more talented and special than you really are? Do you sometimes hear things that aren’t actually there? These experiences – paranoia, grandiosity and hallucinations in the technical jargon – are more common among the general population than is usually assumed. But are people who are susceptible simply “made that way”? Are they genetically predisposed, in other words, or have their life experiences made them more vulnerable to these things? It’s an old debate: which is more important, nature or nurture? Scientists nowadays tend to agree that human psychology is a product of a complex interaction between genes and experience – which is all very well, but where does the balance lie? Scientists (including one of the authors of this blog) recently conducted the first ever study among the general population of the relative contributions of genes and environment to the experience of paranoia, grandiosity and hallucinations. How did we go about the research? First, it is important to be clear about the kinds of experience we measured. By paranoia, we mean the unfounded or excessive fear that other people are out to harm us. Grandiosity denotes an unrealistic conviction of one’s abilities and talents. Hallucinations are sensory experiences (hearing voices, for instance) that aren’t caused by external events. Led by Dr Angelica Ronald at Birkbeck, University of London, the team analysed data on almost 5,000 pairs of 16-year-old twins. This is the classical twin design, a standard method for gauging the relative influence of genes and environment. Looking simply at family traits isn’t sufficient: although family members share many genes, they also tend to share many of the same experiences. This is why studies involving twins are so useful. © 2014 Guardian News and Media Limited
Link ID: 20147 - Posted: 10.02.2014
by Jason M. Breslow As the NFL nears an end to its long-running legal battle over concussions, new data from the nation’s largest brain bank focused on traumatic brain injury has found evidence of a degenerative brain disease in 76 of the 79 former players it’s examined. The findings represent a more than twofold increase in the number of cases of chronic traumatic encephalopathy, or CTE, that have been reported by the Department of Veterans Affairs’ brain repository in Bedford, Mass. Researchers there have now examined the brain tissue of 128 football players who, before their deaths, played the game professionally, semi-professionally, in college or in high school. Of that sample, 101 players, or just under 80 percent, tested positive for CTE. To be sure, players represented in the data represent a skewed population. CTE can only be definitively identified posthumously, and many of the players who have donated their brains for research suspected that they may have had the disease while still alive. For example, former Chicago Bears star Dave Duerson committed suicide in 2011 by shooting himself in the chest, reportedly to preserve his brain for examination. Nonetheless, Dr. Ann McKee, the director of the brain bank, believes the findings suggest a clear link between football and traumatic brain injury. “Obviously this high percentage of living individuals is not suffering from CTE,” said McKee, a neuropathologist who directs the brain bank as part of a collaboration between the VA and Boston University’s CTE Center. But “playing football, and the higher the level you play football and the longer you play football, the higher your risk.” ©1995-2014 WGBH Educational Foundation
Keyword: Brain Injury/Concussion
Link ID: 20146 - Posted: 10.01.2014
By Sarah C. P. Williams A wind turbine, a roaring crowd at a football game, a jet engine running full throttle: Each of these things produces sound waves that are well below the frequencies humans can hear. But just because you can’t hear the low-frequency components of these sounds doesn’t mean they have no effect on your ears. Listening to just 90 seconds of low-frequency sound can change the way your inner ear works for minutes after the noise ends, a new study shows. “Low-frequency sound exposure has long been thought to be innocuous, and this study suggests that it’s not,” says audiology researcher Jeffery Lichtenhan of the Washington University School of Medicine in St. Louis, who was not involved in the new work. Humans can generally sense sounds at frequencies between 20 and 20,000 cycles per second, or hertz (Hz)—although this range shrinks as a person ages. Prolonged exposure to loud noises within the audible range has long been known to cause hearing loss over time. But establishing the effect of sounds with frequencies under about 250 Hz has been harder. Even though they’re above the lower limit of 20 Hz, these low-frequency sounds tend to be either inaudible or barely audible, and people don’t always know when they’re exposed to them. For the new study, neurobiologist Markus Drexl and colleagues at the Ludwig Maximilian University in Munich, Germany, asked 21 volunteers with normal hearing to sit inside soundproof booths and then played a 30-Hz sound for 90 seconds. The deep, vibrating noise, Drexl says, is about what you might hear “if you open your car windows while you’re driving fast down a highway.” Then, they used probes to record the natural activity of the ear after the noise ended, taking advantage of a phenomenon dubbed spontaneous otoacoustic emissions (SOAEs) in which the healthy human ear itself emits faint whistling sounds. © 2014 American Association for the Advancement of Science
Link ID: 20144 - Posted: 10.01.2014
It's not just humans who want the latest gadget. Wild chimpanzees that see a friend making and using a nifty new kind of tool are likely to make one for themselves, scientists report. "Our study adds new evidence supporting the hypothesis that some of the behavioural diversity seen in wild chimpanzees is the result of social transmission and can therefore be interpreted as cultural," an international research team writes today in the journal PLOS ONE. The findings suggest that the ability of individuals to learn from one another originated long ago in a common ancestor of chimpanzees and humans, the researchers add. "This study tells us that chimpanzee culture changes over time, little by little, by building on previous knowledge found within the community," said Thibaud Gruber, a co-author of the study, in a statement. "This is probably how our early ancestors' cultures also changed over time." Scientists already knew that chimpanzees in different groups have certain behaviours unique to their group, such as using a particular kind of tool. They suspected that wild chimpanzees learn those behaviours from other chimpanzees within their group, as scientists have observed in captive chimps. But they could never be sure. The new study documents the spread of two new behaviours among chimpanzees living in Uganda's Budongo Forest. It shows that chimps learned one of them — the making and use of a new tool called a moss sponge — by observing other chimps who had already adopted the behaviour. Chimps dip the tool in water and then put it in their mouth to drink. © CBC 2014
Link ID: 20141 - Posted: 10.01.2014