Most Recent Links



Links 1 - 20 of 20797

Sara Reardon Antipsychotic drugs are widely used to blunt aggressive behaviour in people with intellectual disabilities who have no history of mental illness, a UK survey of medical records finds, even though the medicines may not have a calming effect. The finding is worrisome because antipsychotic drugs can cause severe side effects such as obesity or diabetes. Psychiatry researcher Rory Sheehan and colleagues at University College London studied data from 33,016 people with intellectual disabilities from general-care practices in the United Kingdom over a period of up to 15 years. The researchers found that 71% of 9,135 people who were treated with antipsychotics had never been diagnosed with a severe mental illness, and that the drugs were more likely to be prescribed to those who displayed problematic behaviours. “We suspected that this would be the case, but we didn’t know the true extent,” Sheehan says. “We should be worried because the rates are high,” says James Harris, a psychiatrist at Johns Hopkins University in Baltimore, Maryland. But he adds that it is hard to determine whether treatment with antipsychotics is appropriate without knowing what other forms of treatment were available to people in the study. It is possible that medication was the only option available or that it was used to dampen a person's behaviour enough that they could participate in therapy or other types of treatment. Evidence suggests that the drugs are not effective at treating aggressive and disruptive behaviour, says psychiatrist Peter Tyrer of Imperial College London. © 2015 Nature Publishing Group

Keyword: Schizophrenia; Aggression
Link ID: 21376 - Posted: 09.02.2015

Aftab Ali People who were born prematurely are less intelligent later in life and earn less money as a result, according to a new study by the University of Warwick. Researchers at the Coventry-based institution said they found a link connecting pre-term birth with low reading and, in particular, maths skills, which in turn affect the amount of wealth accumulated as adults. Funded by the Nuffield Foundation, the researchers examined data from two other large studies, following children born more than a decade apart, with one group from 1958 and the other from 1970. In total, more than 15,000 individuals were surveyed; each of the two studies recruited all children born in a single week in England, Scotland, and Wales. Data were examined for all individuals who were born at between 28 and 42 weeks gestational age, and who had available wealth information at the age of 42. Participants who were born pre-term – at less than 37 weeks – were compared with those who were born full-term; in both groups, mathematical ability in childhood had a direct effect on how much they earned as adults, regardless of later educational qualifications. In order to measure adult wealth, the researchers looked at factors including family income and social class, housing and employment status, and participants' own perceptions of their financial situation. To gauge academic abilities, they examined validated measures of mathematics, reading, and intelligence, along with ratings from teachers and parents. © independent.co.uk

Keyword: Development of the Brain; Intelligence
Link ID: 21375 - Posted: 09.02.2015

Shankar Vedantam Girls often outperform boys in science and math at an early age but are less likely to choose tough courses in high school. An Israeli experiment demonstrates how the biases of teachers affect students. RENEE MONTAGNE, HOST: At early ages, girls often outperform boys in math and science classes. Later, something changes. By the time they get into high school, girls are less likely than boys to take difficult math courses and less likely, again, to go into careers in science, technology, engineering or medicine. To learn more about this, David Greene spoke with NPR social science correspondent Shankar Vedantam. SHANKAR VEDANTAM, BYLINE: Well, the new study suggests, David, that some of these outcomes might be driven by the unconscious biases of elementary school teachers. What's remarkable about the new work is that it doesn't just theorize about the gender gap; it actually has very hard evidence. Edith Sand at Tel Aviv University and her colleague, Victor Lavy, analyzed the math test scores of about 3,000 students in Tel Aviv. When the students were in sixth grade, the researchers got two sets of math test scores. One set of scores was given by the classroom teachers, who obviously knew the children whom they were grading. The second set of scores was from external teachers who did not know whether the children they were grading were boys or girls. So the external teachers were blind to the gender of the children. © 2015 NPR

Keyword: Sexual Behavior; Attention
Link ID: 21374 - Posted: 09.02.2015

By Simon Makin Scientists claim to have discovered the first new human prion in almost 50 years. Prions are misfolded proteins that make copies of themselves by inducing others to misfold. By so doing, they multiply and cause disease. The resulting illness in this case is multiple system atrophy (MSA), a neurodegenerative disease similar to Parkinson's. The study, published August 31 in Proceedings of the National Academy of Sciences, adds weight to the idea that many neurodegenerative diseases are caused by prions. In the 1960s researchers led by Carleton Gajdusek at the National Institutes of Health transmitted kuru, a rare neurodegenerative disease found in Papua New Guinea, and Creutzfeldt–Jakob disease (CJD), a rare human dementia, to chimpanzees by injecting samples from victims' brains directly into those of chimps. It wasn't until 1982, however, that Stanley Prusiner coined the term prion (for “proteinaceous infectious particle”) to describe the self-propagating protein responsible. Prusiner and colleagues at the University of California, San Francisco, showed this process caused a whole class of diseases, called spongiform encephalopathies (for the spongelike appearance of affected brains), including the bovine form known as “mad cow” disease. The same protein, PrP, is also responsible for kuru, which was spread by cannibalism; variant-CJD, which over 200 people developed after eating beef infected with the bovine variety; and others. The idea that a protein could transmit disease was radical at the time but the work eventually earned Prusiner the 1997 Nobel Prize in Physiology or Medicine. He has long argued prions may underlie other neurodegenerative diseases but the idea has been slow to gain acceptance. © 2015 Scientific American

Keyword: Prions; Parkinsons
Link ID: 21373 - Posted: 09.02.2015

By Gretchen Reynolds At the age of 93, Olga Kotelko — one of the most successful and acclaimed nonagenarian track-and-field athletes in history — traveled to the University of Illinois to let scientists study her brain. Ms. Kotelko held a number of world records and had won hundreds of gold medals in masters events. But she was of particular interest to the scientific community because she hadn’t begun serious athletic training until age 77. So scanning her brain could potentially show scientists what late-life exercise might do for brains. Ms. Kotelko died last year at the age of 95, but the results of that summer brain scan were published last month in Neurocase. And indeed, Ms. Kotelko’s brain looked quite different from those of other volunteers aged 90-plus who participated in the study, the scans showed. The white matter of her brain — the fiber tracts that connect neurons and help to transmit messages from one part of the brain to another — showed fewer abnormalities than the brains of other people her age. And her hippocampus, a portion of the brain involved in memory, was larger than that of similarly aged volunteers (although it was somewhat shrunken in comparison to the brains of volunteers decades younger than her). Over all, her brain seemed younger than her age. But because the scientists didn’t have a scan showing Ms. Kotelko’s brain before she began training, it’s impossible to know whether becoming an athlete late in life improved her brain’s health or whether her naturally healthy brain allowed her to become a stellar masters athlete. © 2015 The New York Times Company

Keyword: Learning & Memory; Development of the Brain
Link ID: 21372 - Posted: 09.02.2015

Ian Sample Science editor People who get too little sleep are more likely to catch a cold, according to US scientists who suspect that a good night’s sleep is crucial for the body’s immune defences. Those who slept six hours a night or less were four times more likely to catch a cold when they were exposed to the virus than people who spent more than seven hours a night asleep, their study found. The findings, reported in the journal Sleep, build on previous studies that suggest that the sleep-deprived are more susceptible to infectious diseases and recover more slowly when they do fall ill. “It goes beyond feeling groggy or irritable,” said Aric Prather, a health psychologist at the University of California in San Francisco. “Not getting enough sleep affects your physical health.” The scientists recruited 94 men and 70 women, with an average age of 30, for the study and subjected them to two months of health screening, interviews and questionnaires to establish their baseline stress levels, temperament and usage of alcohol and tobacco. The volunteers then spent a week wearing a wrist-mounted sleep sensor that tracked the duration and quality of their sleep each night. To see how well they fought off infections, the participants were taken to a hotel and given nasal drops containing the cold virus. Doctors monitored them closely for a week after, collecting mucus samples to work out if and when the virus took hold. © 2015 Guardian News and Media Limited

Keyword: Sleep; Neuroimmunology
Link ID: 21371 - Posted: 09.01.2015

Boer Deng Palaeontologist Stephen Gatesy wants to bring extinct creatures to life — virtually speaking. When he pores over the fossilized skeletons of dinosaurs and other long-dead beasts, he tries to imagine how they walked, ran or flew, and how those movements evolved into the gaits of their modern descendants. “I'm a very visual guy,” he says. But fossils are lifeless and static, and can only tell Gatesy so much. So instead, he relies on XROMM, a software package that he developed with his colleagues at Brown University in Providence, Rhode Island. XROMM (X-ray Reconstruction of Moving Morphology) borrows from the technology of motion capture, in which multiple cameras film a moving object from different angles, and markers on the object are rendered into 3D by a computer program. The difference is that XROMM uses not cameras, but X-ray machines that make videos of bones and joints moving inside live creatures such as pigs, ducks and fish. Understanding how the movements relate to the animals' bone structure can help palaeontologists to determine what movements would have been possible for fossilized creatures. “It's a completely different approach” to studying evolution, says Gatesy. XROMM, released to the public in 2008 as an open-source package, is one of a number of software tools that are expanding what researchers know about how animals and humans walk, crawl and, in some cases, fly (see ‘Movement from inside and out’). That has given the centuries-old science of animal motion relevance to a wide range of fields, from studying biodiversity to designing leg braces, prostheses and other assistive medical devices. “We're in an intense period of using camera-based and computer-based approaches to expand the questions we can ask about motion,” says Michael Dickinson, a neuroscientist at the California Institute of Technology in Pasadena. © 2015 Nature Publishing Group

Keyword: Movement Disorders
Link ID: 21370 - Posted: 09.01.2015

By LISA FELDMAN BARRETT Boston — Is psychology in the midst of a research crisis? An initiative called the Reproducibility Project at the University of Virginia recently reran 100 psychology experiments and found that over 60 percent of them failed to replicate — that is, their findings did not hold up the second time around. The results, published last week in Science, have generated alarm (and in some cases, confirmed suspicions) that the field of psychology is in poor shape. But the failure to replicate is not a cause for alarm; in fact, it is a normal part of how science works. Suppose you have two well-designed, carefully run studies, A and B, that investigate the same phenomenon. They perform what appear to be identical experiments, and yet they reach opposite conclusions. Study A produces the predicted phenomenon, whereas Study B does not. We have a failure to replicate. Does this mean that the phenomenon in question is necessarily illusory? Absolutely not. If the studies were well designed and executed, it is more likely that the phenomenon from Study A is true only under certain conditions. The scientist’s job now is to figure out what those conditions are, in order to form new and better hypotheses to test. A number of years ago, for example, scientists conducted an experiment on fruit flies that appeared to identify the gene responsible for curly wings. The results looked solid in the tidy confines of the lab, but out in the messy reality of nature, where temperatures and humidity varied widely, the gene turned out not to reliably have this effect. In a simplistic sense, the experiment “failed to replicate.” But in a grander sense, as the evolutionary biologist Richard Lewontin has noted, “failures” like this helped teach biologists that a single gene produces different characteristics and behaviors, depending on the context. © 2015 The New York Times Company

Keyword: Attention
Link ID: 21369 - Posted: 09.01.2015

When we move our head, the whole visual world moves across our eyes. Yet we can still make out a bee buzzing by or a hawk flying overhead, thanks to unique cells in the eye called object motion sensors. A new study on mice helps explain how these cells do their job, and may bring scientists closer to understanding how complex circuits are formed throughout the nervous system. The study was funded by the National Institutes of Health, and was published online in Nature. “Understanding how neurons are wired together to form circuits in the eye is fundamental for advancing potential new therapies for blinding eye diseases,” said Paul A. Sieving, M.D., Ph.D., director of NIH’s National Eye Institute (NEI). “Research aimed at regenerating photoreceptors, for example, is enriched by efforts to understand the circuitry in the eye.” Object motion sensors are one of about 30 different types of retinal ganglion cells (RGCs) in the retina, the light-sensitive tissue at the back of the eye. These cells are unique because they fire only when the timing of a small object’s movement differs from that of the background; they are silent when the object and the background move in sync. Researchers believe this is critical to our ability to see small objects moving against a backdrop of complex motion. The cells in the retina are wired up like an electrical circuit. Vision begins with photoreceptors, cells that detect light entering the eye and convert it into electrical signals. Middleman neurons, called interneurons, shuttle signals from photoreceptors to the RGCs. And each RGC sends the output visual information deeper into the brain for processing. This all takes place within fractions of a second, so the scientists were surprised to discover that before it reaches object motion sensors, visual information about object motion takes a detour through a unique type of interneuron. Their results represent an ongoing effort by scientists to map out complex circuits of the nervous system.

Keyword: Vision
Link ID: 21368 - Posted: 09.01.2015

By SINDYA N. BHANOO The human eye has a blind spot, though few of us realize it. Now, a new study suggests that it is possible to reduce the spot with training. The optic nerve, which carries visual signals to the brain, passes through the retina, a light-sensitive layer of tissue. There are no photoreceptors, the eye's light-detecting cells, at the point where the optic nerve intersects the retina. The right eye generally compensates for the left eye’s blind spot and vice versa, so the spot is hardly noticed. Researchers trained 10 people using a computer monitor and an eye patch. The participants were shown a waveform in the visual field of their blind spot day after day. After 20 days of this repeated stimulation, the blind spot shrank by about 10 percent. The researchers believe that neurons at the periphery of the blind spot became more responsive, effectively reducing the extent of functional blindness. The findings add to a growing body of research suggesting that the human eye can be trained, said Paul Miller, a psychologist at the University of Queensland in Australia and an author of the study, which appeared in the journal Current Biology. This kind of training may help researchers develop better treatments for visual impairments like macular degeneration. “This is the leading cause of blindness in the western world,” Mr. Miller said. © 2015 The New York Times Company

Keyword: Vision; Learning & Memory
Link ID: 21367 - Posted: 09.01.2015

By EREZ YOELI and DAVID RAND Recently, three young American men and a British businessman thwarted a gunman’s attack on a French passenger train, acting within seconds and at enormous personal risk. When interviewed afterward, they stressed the unthinking nature of their actions. “It was just gut instinct,” said one, in a characteristic remark. “It wasn’t really a conscious decision.” This turns out to be typical of heroes. Last year, one of us, Professor Rand, together with his colleague Ziv Epstein, conducted an analysis of recipients of the Carnegie Medal for heroism, which is awarded to those who risk their lives for others. After collecting interviews given by 51 recipients and evaluating the transcripts, we found that the heroes overwhelmingly described their actions as fast and intuitive, and virtually never as carefully reasoned. This was true even in cases where the heroes had sufficient time to stop and think. Christine Marty, a college student who rescued a 69-year-old woman trapped in a car during a flash flood, said she was grateful that she didn’t take the time to reflect: “I’m thankful I was able to act and not think about it.” We found almost no examples of heroes whose first impulse was for self-preservation but who overcame that impulse with a conscious, rational decision to help. It is striking that our brute instincts, rather than our celebrated higher cognitive faculties, are what lead to such moral acts. But why would anyone ever develop such potentially fatal instincts? One possible explanation is that in most everyday situations, helping others pays off in the long run. You buy lunch for a friend or pitch in to help a colleague meet a tight deadline, and you find yourself repaid in kind, or even more, down the road. So it’s beneficial to develop a reflex to help — especially because the cost to you is usually quite small. © 2015 The New York Times Company

Keyword: Emotions
Link ID: 21365 - Posted: 08.31.2015

Depressed people who display "risky behaviour", agitation and impulsivity are at least 50% more likely to attempt suicide, a study has found. Research by the European College of Neuropsychopharmacology (ECNP) concluded that the behaviour patterns "precede many suicide attempts". The study said effective prevention measures were "urgently needed". The World Health Organisation estimates that there were more than 800,000 suicides worldwide in 2012. The ECNP study evaluated 2,811 patients suffering from depression, of whom 628 had previously attempted suicide. Researchers "looked especially at the characteristics and behaviours of those who had attempted suicide", and found that "certain patterns recur" before attempts. They said the risk of an attempt was "at least 50% higher" if a depressed patient displayed: "risky behaviour", such as reckless driving or promiscuous behaviour; "psychomotor agitation", such as pacing around rooms or wringing their hands; or impulsivity, defined by the researchers as acting with "little or no forethought, reflection, or consideration of the consequences". Dr Dina Popovic, one of the report's authors, added: "We found that 'depressive mixed states' often preceded suicide attempts. "A depressive mixed state is where a patient is depressed, but also has symptoms of 'excitation', or mania." © 2015 BBC.

Keyword: Depression
Link ID: 21364 - Posted: 08.31.2015

By Roni Caryn Rabin Q: Is it harmful to go on and off antidepressants a few times a year? I seem to respond quickly and quite well to S.S.R.I.'s. I don't desire to be on them long-term, but would like to use them occasionally, to get through a rough patch like a stressful quarter at work. Is it harmful to go on and off of S.S.R.I.'s a few times a year? Yes, it may be harmful. You should always start and stop medication “under a physician’s supervision. Don’t do it on your own,” said Dr. Renee Binder, president of the American Psychiatric Association. It usually takes at least four weeks for an antidepressant to take effect, and patients should give themselves several weeks to taper off a drug when ending treatment. Starting and quitting abruptly expose you to the risks of initiation and withdrawal. Also, you may not get sustained relief from your depression. Antidepressants “don’t work right away,” Dr. Binder said. “It’s the kind of medication that you have to take every single day, and it takes a while to build up in your body before it starts working.” When starting antidepressants, patients may experience anxiety and agitation and develop other transient side effects like headaches and nausea. Teenagers need close monitoring because they may be at a higher risk of suicide when starting treatment, some studies suggest. It also may take time for your doctor to find the antidepressant and dose that’s right for you. Withdrawal can trigger troubling symptoms like nausea, dizziness and “brain zaps,” a sensation that feels like electric shocks to the head. It can also trigger psychological problems like anxiety, irritability, moodiness and changes in appetite and sleep that mimic depression or may signal a recurrence. Some patients may become paranoid or suicidal. © 2015 The New York Times Company

Keyword: Depression
Link ID: 21363 - Posted: 08.31.2015

By Lily Hay Newman Mental health issues manifest in a number of ways, and they're not all behavioral. Increasingly, scientists are using speech analysis software to pick up subtle changes in voice acoustics and speech patterns in order to detect, or even predict, potentially problematic conditions. A study published Wednesday in NPG-Schizophrenia by researchers at Columbia University Medical Center, the New York State Psychiatric Institute, and IBM's T. J. Watson Research Center found that digital speech analysis correctly predicted whether 34 youths at risk for mental illness (11 female, 23 male) would develop psychosis within 2.5 years. The system, which evaluated the study participants quarterly, correctly predicted all of their outcomes; five became psychotic. The algorithm evaluated transcripts for predictive "semantic and syntactic features" like coherence and phrase length. "These speech features predicted later psychosis development with 100% accuracy, outperforming classification from clinical interviews," the researchers wrote. Clinicians are able to accurately categorize patients as "at-risk," but within that subpopulation it is difficult to determine who will actually experience psychosis and potentially develop schizophrenia. If voice recognition software can help identify these individuals, they may be able to receive more effective care. "Computerized analysis of complex human behaviors such as speech may present an opportunity to move psychiatry beyond reliance on self-report and clinical observation toward more objective measures of health and illness in the individual patient," the researchers wrote. © 2015 The Slate Group LLC.

Keyword: Schizophrenia
Link ID: 21362 - Posted: 08.31.2015

By GREGORY COWLES Oliver Sacks, the neurologist and acclaimed author who explored some of the brain’s strangest pathways in best-selling case histories like “The Man Who Mistook His Wife for a Hat,” using his patients’ disorders as starting points for eloquent meditations on consciousness and the human condition, died on Sunday at his home in Manhattan. He was 82. The cause was cancer, said Kate Edgar, his longtime personal assistant. Dr. Sacks announced in February, in an Op-Ed essay in The New York Times, that an earlier melanoma in his eye had spread to his liver and that he was in the late stages of terminal cancer. As a medical doctor and a writer, Dr. Sacks achieved a level of popular renown rare among scientists. More than a million copies of his books are in print in the United States, his work was adapted for film and stage, and he received about 10,000 letters a year. (“I invariably reply to people under 10, over 90 or in prison,” he once said.) Dr. Sacks variously described his books and essays as case histories, pathographies, clinical tales or “neurological novels.” His subjects included Madeleine J., a blind woman who perceived her hands only as useless “lumps of dough”; Jimmie G., a submarine radio operator whose amnesia stranded him for more than three decades in 1945; and Dr. P. — the man who mistook his wife for a hat — whose brain lost the ability to decipher what his eyes were seeing. Describing his patients’ struggles and sometimes uncanny gifts, Dr. Sacks helped introduce syndromes like Tourette’s or Asperger’s to a general audience. But he illuminated their characters as much as their conditions; he humanized and demystified them. © 2015 The New York Times Company

Keyword: Miscellaneous
Link ID: 21361 - Posted: 08.31.2015

An experimental gene therapy reduces the rate at which nerve cells in the brains of Alzheimer’s patients degenerate and die, according to new results from a small clinical trial, published in the current issue of the journal JAMA Neurology. Targeted injection of the nerve growth factor gene into the patients’ brains rescued dying cells around the injection site, enhancing their growth and inducing them to sprout new fibres. In some cases, these beneficial effects persisted for 10 years after the therapy was first delivered. Alzheimer’s is the world’s leading form of dementia, affecting an estimated 47 million people worldwide. This figure is predicted to almost double every 20 years, with much of this increase likely to be in the developing world. And despite the huge amounts of time, effort, and money devoted to developing an effective cure, the vast majority of new drugs have failed in clinical trials. The new results are preliminary findings from the very first human trials designed to test the potential benefits of nerve growth factor (NGF) gene therapy for Alzheimer’s patients. NGF was discovered in the 1940s by Rita Levi-Montalcini, who convincingly demonstrated that the small protein promotes the survival of certain sub-types of sensory neurons during development of the nervous system. Since then, others have shown that it also promotes the survival of acetylcholine-producing cells in the basal forebrain, which die off in Alzheimer’s. © 2015 Guardian News and Media Limited

Keyword: Alzheimers; Trophic Factors
Link ID: 21360 - Posted: 08.29.2015

We all have days when we feel like our brain is going at a snail’s pace, when our neurons forgot to get out of bed. And psychologists have shown that IQ can fluctuate day to day. So if we’re in good health and don’t have a sleep deficit from last night’s shenanigans to blame, what’s the explanation? Sophie von Stumm, a psychologist at Goldsmiths, University of London, set about finding out. In particular, she wanted to know whether mood might explain the brain’s dimmer switch. Although it seems intuitively obvious that feeling low could compromise intellectual performance, von Stumm says research to date has been inconclusive, with some studies finding an effect and others not. “On bad mood days, we tend to feel that our brains are lame and work or study is particularly challenging. But scientists still don’t really know if our brains work better when we are happy compared to when we are sad.” To see if she could pin down mood’s effect on IQ more convincingly, von Stumm recruited 98 participants. Over five consecutive days they completed questionnaires to assess their mood, as well as tests to measure cognitive functions, such as short-term memory, working memory and processing speed. Surprisingly, being in a bad mood didn’t translate into worse cognitive performance. However, when people reported feeling positive, von Stumm saw a modest boost in their processing speed. © Copyright Reed Business Information Ltd.

Keyword: Depression; Intelligence
Link ID: 21359 - Posted: 08.29.2015

By Claire Asher If you stuck to Aesop’s fables, you might think of all ants as the ancient storyteller described them—industrious, hard-working, and always preparing for a rainy day. But not every ant has the same personality, according to a new study. Some colonies are full of adventurous risk-takers, whereas others are less aggressive about foraging for food and exploring the great outdoors. Researchers say that these group “personality types” are linked to food-collecting strategies, and they could alter our understanding of how social insects behave. Personality—consistent patterns of individual behavior—was once considered a uniquely human trait. But studies since the 1990s have shown that animals from great tits to octopuses exhibit “personality.” Even insects have personalities. Groups of cockroaches have consistently shy and bold members, whereas damselflies have shown differences in risk tolerance that stay the same from grubhood to adulthood. To determine how group behavior might vary between ant colonies, a team of researchers led by Raphaël Boulay, an entomologist at the University of Tours in France, tested the insects in a controlled laboratory environment. They collected 27 colonies of the funnel ant (Aphaenogaster senilis) and had queens rear new workers in the lab. This meant that all ants in the experiment were young and inexperienced—a clean slate to test for personality. The researchers then observed how each colony foraged for food and explored new environments. They counted the number of ants foraging, exploring, or hiding during set periods of time, and then compared the numbers to measure the boldness, adventurousness, and foraging efforts of each group. © 2015 American Association for the Advancement of Science

Keyword: Emotions
Link ID: 21358 - Posted: 08.29.2015

They are rather diminutive to be kings of the jungle, but two species of mirid bug make sounds similar to the roars of big cats. These calls have never before been heard in insects, and we’re not sure why, or how, the insects produce the eerie calls. The roars are too weak to be heard by humans without a bit of help. But Valerio Mazzoni of the Edmund Mach Foundation in Italy and his team made them audible by amplifying them using a device called a laser vibrometer. The device detects the minute vibrations that the bugs produce on the leaves on which they live. “When you listen to these sounds through headphones you’d think you were next to a tiger or lion,” says Mazzoni. The team found that when two males were introduced on the same leaf, they seemed to compete in roaring duets. When one insect heard a roar, it always sounded its own, apparently in response. This suggests that, as in big cats, the roars might serve to establish dominance or attract females. Female mirids don’t seem to roar. But unlike the roars of big cats, the sounds produced by bugs are transmitted through the solid material beneath their feet, usually a leaf, rather than by the vibration of air molecules. Thousands of insect species communicate through such vibration, but these roars are unlike any other known insect noise. © Copyright Reed Business Information Ltd.

Keyword: Sexual Behavior; Animal Communication
Link ID: 21357 - Posted: 08.29.2015

Raiding the fridge or downing glasses of water after a night of heavy drinking won't improve your sore head the next day, Dutch research suggests. Instead, a study concluded, the only way to prevent a hangover is to drink less alcohol. More than 800 students were asked how they tried to relieve hangover symptoms, but neither food nor water was found to have any positive effect. The findings are being presented at a conference in Amsterdam. A team of international researchers from the Netherlands and Canada surveyed students' drinking habits to find out whether hangovers could be eased or if some people were immune to them. Among 826 Dutch students, 54% ate food after drinking alcohol, including fatty food and heavy breakfasts, in the hope of staving off a hangover. With the same aim, more than two-thirds drank water while drinking alcohol and more than half drank water before going to bed. Although these groups showed a slight improvement in how they felt compared with those who hadn't drunk water, there was no real difference in the severity of their hangovers. Previous research suggests that about 25% of drinkers claim never to get hangovers. So the researchers questioned 789 Canadian students about their drinking in the previous month and the hangovers they experienced, finding that those who didn't get a hangover simply consumed "too little alcohol to develop a hangover in the first place". Of those students who drank heavily, with an estimated blood alcohol concentration of more than 0.2%, almost no-one was immune to hangovers. © 2015 BBC.

Keyword: Drug Abuse
Link ID: 21356 - Posted: 08.29.2015