Most Recent Links



Links 61 - 80 of 20165

By Esther Landhuis As we age, we seem to get worse at ignoring irrelevant stimuli. It's what makes restaurant conversations challenging—having to converse while also shutting out surrounding chatter. New research bears out the aging brain's distractibility but also suggests that training may help us tune out interference. Scientists at Brown University recruited seniors and twentysomethings for a visual experiment. Presented with a sequence of letters and numbers, participants were asked to report back only the numbers—all the while disregarding a series of meaningless dots. Sometimes the dots moved randomly, but other times they traveled in a clear direction, making them harder to ignore. Older participants ended up accidentally learning the dots' patterns, based on the accuracy of their answers when asked which way the dots were moving, whereas young adults seemed able to suppress that information and focus on the numbers, the researchers reported last November in Current Biology. In a separate study published in Neuron, scientists at the University of California, San Francisco, showed they could train aging brains to become less distractible. Their regimen helped aging rats as well as older people. The researchers played three different sounds and rewarded trainees for identifying a target tone while ignoring distracter frequencies. As the subjects improved, the task grew more challenging—the distracting tone became harder to discriminate from the target. © 2015 Scientific American

Keyword: Attention; Alzheimers
Link ID: 20681 - Posted: 03.12.2015

By Gretchen Reynolds An easy, two-minute vision test administered on the sidelines after a young athlete has hit his or her head can help to reliably determine whether the athlete has sustained a concussion, according to a new study of student athletes, some as young as 5. The test is so simple and inexpensive that any coach or parent potentially could administer it, the study’s authors believe, and any league could afford to provide it as a way to help evaluate and safeguard players. Those of us who coach or care for young athletes know by now that an athlete who falls or collides with something during play, or who seems dazed or dizzy, loses consciousness or complains of head pain, should be tested for a concussion, which occurs when the brain is physically jostled within the skull. But most of us are clueless about how to test young athletes. The most commonly recommended sideline test is the Standardized Assessment of Concussion, a multipart examination during which athletes are asked to name the date, describe how they feel, memorize and recall lists of words, and do jumping jacks and other tests of coordination. Ideally, this assessment should be administered and evaluated by a medical professional. But while the sidelines of college and professional games are crowded with doctors and certified athletic trainers, few high schools and youth leagues have those resources. Most of the time, concussion testing in youth sports falls to volunteer coaches or parents with little if any medical experience. That situation prompted researchers at New York University’s Langone Concussion Center to begin wondering recently whether there might be other, easier diagnostic tools to check young players for concussions. Their thoughts soon turned to vision. “About 50 percent of the brain’s pathways are tied in some way to vision and visual processing,” said Dr. Steven Galetta, chairman of neurology at N.Y.U. Langone Medical Center and senior author of the study, which was published in The Journal of Neuro-Ophthalmology. © 2015 The New York Times Company

Keyword: Brain Injury/Concussion
Link ID: 20680 - Posted: 03.12.2015

By Douglas Starr In 1906, Hugo Münsterberg, the chair of the psychology laboratory at Harvard University and the president of the American Psychological Association, wrote in the Times Magazine about a case of false confession. A woman had been found dead in Chicago, garroted with a copper wire and left in a barnyard, and the simpleminded farmer’s son who had discovered her body stood accused. The young man had an alibi, but after questioning by police he admitted to the murder. He did not simply confess, Münsterberg wrote; “he was quite willing to repeat his confession again and again. Each time it became richer in detail.” The young man’s account, he continued, was “absurd and contradictory,” a clear instance of “the involuntary elaboration of a suggestion” from his interrogators. Münsterberg cited the Salem witch trials, in which similarly vulnerable people were coerced into self-incrimination. He shared his opinion in a letter to a Chicago nerve specialist, which made the local press. A week later, the farmer’s son was hanged. Münsterberg was ahead of his time. It would be decades before the legal and psychological communities began to understand how powerfully suggestion can shape memory and, in turn, the course of justice. In the early nineteen-nineties, American society was recuperating from another panic over occult influence; Satanists had replaced witches. One case, the McMartin Preschool trial, hinged on nine young victims’ memories of molestation and ritual abuse—memories that they had supposedly forgotten and then, after being interviewed, recovered. The case fell apart, in 1990, because the prosecution could produce no persuasive evidence of the victims’ claims. A cognitive psychologist named Elizabeth Loftus, who had consulted on the case, wondered whether the children’s memories might have been fabricated—in Münsterberg’s formulation, involuntarily elaborated—rather than actually recovered.

Keyword: Learning & Memory
Link ID: 20679 - Posted: 03.12.2015

By Anne Skomorowsky On a Saturday night last month, 12 students at Wesleyan University in Connecticut were poisoned by “Molly,” a hallucinogenic drug they had taken to enhance a campus party. Ambulances and helicopters transported the stricken to nearby hospitals, some in critical condition. Molly—the street name for the amphetamine MDMA—can cause extremely high fevers, liver failure, muscle breakdown, and cardiac arrest. Given the risks associated with Molly, why would anybody take it? The obvious answer—to get high—is only partly true. Like many drugs of abuse, Molly causes euphoria. But Molly is remarkable for its “prosocial” effects. Molly makes users feel friendly, loving, and strongly connected to one another. Molly is most commonly used in settings where communion with others is highly valued, such as raves, music festivals, and college parties. Recently, psychiatrists have taken an interest in its potential to enhance psychotherapy; this has led to new research into the mechanisms by which MDMA makes people feel closer. It appears that MDMA works by shifting the user’s attention towards positive experiences while minimizing the impact of negative feelings. To investigate this, a 2012 study by Cedric Hysek and colleagues used the Reading the Mind in the Eyes Test (RMET), which was developed to evaluate people with autism. In the RMET, participants are shown 36 pictures of the eye region of faces. Their task is to describe what the person in the picture is feeling. Volunteers taking MDMA, under carefully controlled conditions, improved in their recognition of positive emotions; but their performance in recognizing negative emotions declined. In other words, they incorrectly attributed positive or neutral feelings to images that were actually negative in emotional tone. They mistook negative and threat-related images for friendly ones. © 2015 Scientific American

Keyword: Drug Abuse
Link ID: 20678 - Posted: 03.12.2015

Mo Costandi Neuroscientists in France have implanted false memories into the brains of sleeping mice. Using electrodes to directly stimulate and record the activity of nerve cells, they created artificial associative memories that persisted while the animals snoozed and then influenced their behaviour when they awoke. Manipulating memories by tinkering with brain cells is becoming routine in neuroscience labs. Last year, one team of researchers used a technique called optogenetics to label the cells encoding fearful memories in the mouse brain and to switch the memories on and off, and another used it to identify the cells encoding positive and negative emotional memories, so that they could convert positive memories into negative ones, and vice versa. The new work, published today in the journal Nature Neuroscience, shows for the first time that artificial memories can be implanted into the brains of sleeping animals. It also provides more details about how populations of nerve cells encode spatial memories, and about the important role that sleep plays in making such memories stronger. Karim Benchenane of the French National Centre for Scientific Research (CNRS) in Paris and his colleagues implanted electrodes into the brains of 40 mice, targeting the medial forebrain bundle (MFB), a component of the reward circuitry, and the CA1 region of the hippocampus, which contains at least three different cell types that encode the memories needed for spatial navigation. © 2015 Guardian News and Media Limited

Keyword: Learning & Memory; Sleep
Link ID: 20677 - Posted: 03.10.2015

By Nicholas Bakalar People sometimes take Valium or Ativan to relieve anxiety before surgery, but a new study suggests that these benzodiazepine drugs have little beneficial effect and may even delay recovery. Researchers studied 1,062 patients admitted to French hospitals for surgery requiring general anesthesia. A third took 2.5 milligrams of lorazepam (brand name Ativan), a third received a placebo, and a third were given no premedication. Patients completed questionnaires assessing anxiety, pain levels and quality of sleep before and a day after their operations, while researchers recorded how long it took for ventilation tubes to be removed and for patients to regain full wakefulness. The study was published in JAMA. Lorazepam was associated with more postsurgery amnesia and a longer time to recover cognitive abilities. Quality of sleep was impaired in the lorazepam group, but not in the others. And ventilation tubes were kept in significantly longer in the lorazepam group. Pain scores did not differ between the lorazepam and the no-medication groups, but there was more pain in the group given the placebo. The lead author, Dr. Axel Maurice-Szamburski, an anesthesiologist at Timone Hospital in Marseille, cited recent surveys showing that benzodiazepines are widely prescribed before surgery. “But until now,” he added, “sedatives have not been evaluated from the patient’s point of view. It’s the patient who should be happy, not the doctor.” © 2015 The New York Times Company

Keyword: Emotions
Link ID: 20676 - Posted: 03.10.2015

Jon Hamilton Alzheimer's, Parkinson's and amyotrophic lateral sclerosis ravage the brain in very different ways. But they have at least one thing in common, says Corinne Lasmezas, a neuroscientist and professor at Scripps Research Institute, in Jupiter, Fla. Each spreads from brain cell to brain cell like an infection. "So if we could block this [process], that might prevent the diseases," Lasmezas says. It's an idea that's being embraced by a growing number of researchers these days, including Nobel laureate Dr. Stanley Prusiner, who first recognized in the 1980s the infectious nature of brain proteins that came to be called prions. But the idea that mad cow prions could cause disease in people has its origins in an epidemic of mad cow disease that occurred in Europe and the U.K. some 15 years ago. Back then, Lasmezas was a young researcher in France studying how mad cow, formally known as bovine spongiform encephalopathy, was transmitted. "At that time, nobody knew if this new disease in cows was actually transmissible to humans," she says. In 1996, Lasmezas published a study strongly suggesting that it was. "So that was my first great research discovery," she says. Prions, it turns out, become toxic to brain cells when folded into an abnormal shape. "This misfolded protein basically kills the neurons," Lasmezas says. © 2015 NPR

Keyword: Prions; Parkinsons
Link ID: 20675 - Posted: 03.10.2015

By CELIA WATSON SEUPEL Every year, nearly 40,000 Americans kill themselves. The majority are men, and most of them use guns. In fact, more than half of all gun deaths in the United States are suicides. Experts and laymen have long assumed that people intent on suicide will ultimately do it even if temporarily deterred. “People think if you’re really intent on dying, you’ll find a way,” said Cathy Barber, the director of the Means Matters campaign at Harvard Injury Control Research Center. Prevention, it follows, depends largely on identifying those likely to harm themselves and getting them into treatment. But a growing body of evidence challenges this view. Suicide can be a very impulsive act, especially among the young, and therefore difficult to predict. Its deadliness depends more upon the means than the determination of the suicide victim. Now many experts are calling for a reconsideration of suicide-prevention strategies. While mental health and substance abuse treatment must always be important components in treating suicidality, researchers like Ms. Barber are stressing another avenue: “means restriction.” Instead of treating individual risk, means restriction entails modifying the environment by removing the means by which people usually die by suicide. The world cannot be made suicide-proof, of course. But, these researchers argue, if the walkway over a bridge is fenced off, a struggling college freshman cannot throw herself over the side. If parents leave guns in a locked safe, a teenage son cannot shoot himself if he suddenly decides life is hopeless. With the focus on who dies by suicide, these experts say, not enough attention has been paid to restricting the means to do it — particularly access to guns. © 2015 The New York Times Company

Keyword: Depression
Link ID: 20674 - Posted: 03.10.2015

If you missed the great dress debate of 2015 you were probably living under a rock. Staffrooms across the globe threatened to come to a standstill as teachers addressed the all-important question – was the dress white and gold or blue and black? This is just one example of how our brains interpret things differently. So, with the 20th anniversary of Brain Awareness Week from 16 to 22 March, this week we bring you a collection of ideas and resources to get students’ synapses firing. The brain is one of our most interesting organs, and advances in technology and medicine mean we now know more about it than ever before. Brain Awareness Week is a global campaign to raise awareness of the progress and benefits of brain research. The organisers, the Dana Foundation, have put together an assortment of teaching materials for primary and secondary students. For children aged five to nine, the Mindboggling Workbook is a good place to start. It includes information on how the brain works, what it does and how to take care of it. There’s also a section on the nervous system, which you could turn into a fun group activity. Ask one student to lie down on a large sheet of paper while others trace around them. Add a drawing of the brain and the spinal cord. Use different coloured crayons to illustrate how neurons send messages around your body when you a) touch something hot, b) get stung on the leg by a wasp, and c) wriggle your toes after stepping in sand. Can students explain why the brain is described as being more powerful than a computer? © 2015 Guardian News and Media Limited

Keyword: Miscellaneous
Link ID: 20673 - Posted: 03.10.2015

Robin Tricoles The first time it happened, I was 8. I was tucked in bed reading my favorite book when my tongue swelled up to the size of a cow’s, like the giant tongues I had seen in the glass display case at the neighborhood deli. At the same time, the far wall of my bedroom began to recede, becoming a tiny white rectangle floating somewhere in the distance. In the book I was holding, the typeface grew vast on the page. I was intrigued, I remember, but not afraid. Over the next six years, the same thing happened to me dozens of times. Forty years later, while working as a science writer, I stumbled on a scientific paper describing almost exactly what I had experienced. The paper attributed those otherworldly sensations to something called Alice in Wonderland syndrome, or its close cousin, Alice in Wonderland-like syndrome. People with Alice in Wonderland syndrome (AWS) perceive parts of their body to be changing size. For example, their feet may suddenly appear smaller and more distant, or their hands larger than they had been moments before. Those with the closely related Alice in Wonderland-like syndrome (AWLS) misperceive the size and distance of objects, seeing them as startlingly larger, smaller, fatter, or thinner than their natural state. People who experience both sensations, like I did, are classified as having AWLS. The syndrome’s name is commonly attributed to English psychiatrist John Todd, who in 1955 described his adult patients’ illusions of corporal and objective distortions in a paper in the Canadian Medical Association Journal. © 2015 by The Atlantic Monthly Group.

Keyword: Attention
Link ID: 20672 - Posted: 03.10.2015

By Rachel Rabkin Peachman Many women with a history of depression who take antidepressants assume that once they get pregnant, they should try to wean themselves off their meds to avoid negative side effects for the baby. A new large study published in the journal Pediatrics challenges one reason behind that assumption. The research found that taking selective serotonin reuptake inhibitors (the antidepressants also known as S.S.R.I.s) while pregnant does not increase the risk of asthma in the resulting babies. What is associated with an increased risk of asthma? According to this study and other research, untreated prenatal depression. “The mechanisms underlying the association of prenatal depression and asthma are unknown,” said Dr. Xiaoqin Liu, the lead author of the Pediatrics study and an epidemiologist at Aarhus University in Denmark. An association between prenatal depression and asthma does not mean that prenatal depression causes asthma. There could be other reasons for the correlation, genetic or environmental, or both. For example, people who live in dense, polluted urban areas could be at an increased risk of both asthma and depression. The researchers used Denmark’s national registries to evaluate all singleton babies born from 1996 to 2007, and identify the mothers who had a diagnosis of depression or had used antidepressants, or both, during pregnancy or one year beforehand. Using a statistical model, the study authors found that prenatal depression — with or without the use of antidepressants — was associated with a 25 percent increased risk of asthma in children as compared with children whose mothers did not have a record of depression. © 2015 The New York Times Company

Keyword: Depression; Development of the Brain
Link ID: 20671 - Posted: 03.10.2015

Alison Abbott Mediators appointed to analyse the rifts within Europe’s ambitious €1-billion (US$1.1-billion) Human Brain Project (HBP) have called for far-reaching changes both in its governance and its scientific programmes. Most significantly, the report recommends that systems neuroscience and cognitive neuroscience should be reinstated into the HBP. The mediation committee, led by engineer Wolfgang Marquardt, director of Germany’s national Jülich Research Centre, sent its final report to the HBP board of directors on 9 March, and issued a press release summarizing its findings. (The full report will not be published until after the board, a 22-strong team of scientists, discusses its contents at a meeting on 17–18 March). The European Commission flagship project, which launched in October 2013, is intended to boost supercomputing through neuroscience, with the aim of simulating the brain in a computer. But the project has been racked by dissent from the outset. In early 2014, a three-person committee of scientists who ran the HBP’s scientific direction revealed that they planned to eliminate cognitive neuroscience from the initiative, which precipitated a mass protest. More than 150 of Europe’s leading neuroscientists signed a letter to the European Commission, complaining about the project’s management and charging that the HBP plan to simulate the brain using only ‘bottom-up’ data on the behaviour of neurons was doomed to failure if it did not include the top-down constraints provided by systems and cognitive neuroscience. © 2015 Nature Publishing Group

Keyword: Brain imaging
Link ID: 20670 - Posted: 03.10.2015

By TIMOTHY WILLIAMS In January 1972, Cecil Clayton was cutting wood at his family’s sawmill in southeastern Missouri when a piece of lumber flew off the circular saw blade and struck him in the forehead. The impact caved in part of Mr. Clayton’s skull, driving bone fragments into his brain. Doctors saved his life, but in doing so had to remove 20 percent of his frontal lobe, which psychiatrists say led Mr. Clayton to be tormented for years by violent impulses, schizophrenia and extreme paranoia. In 1996, his lawyers say, those impulses drove Mr. Clayton to kill a law enforcement officer. Today, as Mr. Clayton, 74, sits on death row, his lawyers have returned to that 1972 sawmill accident in a last-ditch effort to save his life, arguing that Missouri’s death penalty law prohibits the execution of severely brain-damaged people. Lawyers for Mr. Clayton, who has an I.Q. of 71, say he should be spared because his injury has made it impossible for him to grasp the significance of his execution, scheduled for March 17. “There was a profound change in him that he doesn’t understand, and neither did his family,” said Elizabeth Unger Carlyle, one of Mr. Clayton’s lawyers. While several rulings by the United States Supreme Court in recent years have narrowed the criteria for executing people who have a mental illness, states continue to hold wide sway in establishing who is mentally ill. The debate surrounding Mr. Clayton involves just how profoundly his impairment has affected his ability to understand what is happening to him. Mr. Clayton is missing about 7.7 percent of his brain. © 2015 The New York Times Company

Keyword: Aggression; Attention
Link ID: 20669 - Posted: 03.09.2015

By James Gallagher, Health editor, BBC News website, San Diego A dog has been used to sniff out thyroid cancer in people who had not yet been diagnosed, US researchers say. Tests on 34 patients showed an 88% success rate in finding tumours. The team, presenting their findings at the annual meeting of the Endocrine Society, said the animal had an "unbelievable" sense of smell. Cancer Research UK said using dogs would be impractical, but discovering the chemicals the dogs can smell could lead to new tests. The thyroid is a gland in the neck that produces hormones to regulate metabolism. Thyroid tumours are relatively rare and are normally diagnosed by testing hormone levels in the blood and by using a needle to extract cells for testing. Cancers are defective, out-of-control cells. They have their own unique chemistry and release "volatile organic compounds" into the body. The canine approach relies on dogs having 10 times the number of smell receptors as people and being able to pick out the unique smells being released by cancers. The man's best friend approach has already produced promising results in patients with bowel and lung cancers. A team at the University of Arkansas for Medical Sciences (UAMS) had previously shown that a dog could be trained to smell the difference between urine samples of patients with and without thyroid cancer. The next step was to see if it could be used as a diagnostic test. Frankie the German Shepherd was trained to lie down when he could smell thyroid cancer in a sample and turn away if the urine was clean; he gave the correct diagnosis in 30 out of 34 cases.

Keyword: Chemical Senses (Smell & Taste)
Link ID: 20668 - Posted: 03.09.2015

By Lily Hay Newman When I was growing up, I had a lazy eye. I had to wear a patch over my stronger eye for many years so that good-for-nothing, freeloading, lazy eye could learn some responsibility and toughen up. Wearing a patch was really lousy, though, because people would ask me about it all the time and say things like, "What's wrong with you?" Always fun to hear. I would have much preferred to treat my condition, which is also called amblyopia, by playing video games. Who wouldn't? And it seems like that dream may become a possibility. On Tuesday, developer Ubisoft announced Dig Rush, a game that uses stereoscopic glasses and blue and red figures in varying contrasts to attempt to treat amblyopia. Working in collaboration with McGill University and the eye treatment startup Amblyotech, Ubisoft created a world where controlling a mole character to mine precious metals is really training patients' brains to coordinate their eyes. When patients wear a patch, they may force their lazy eye to toughen up, but they aren't doing anything to teach their eyes how to work together. This lack of coordination, called strabismus, is another important factor that the game makers hope can be addressed better by Dig Rush than by "patching" alone. Amblyotech CEO Joseph Koziak said in a statement, “[This] electronic therapy has been tested clinically to significantly increase the visual acuity of both children and adults who suffer from this condition without the use of an eye patch.” One advantage of Dig Rush, he noted, is that it's easier to measure compliance with video games.

Keyword: Vision
Link ID: 20667 - Posted: 03.09.2015

By RICHARD A. FRIEDMAN CHANCES are that everyone on this planet has experienced anxiety, that distinct sense of unease and foreboding. Most of us probably assume that anxiety always has a psychological trigger. Yet clinicians have long known that there are plenty of people who experience anxiety in the absence of any danger or stress and haven’t a clue why they feel distressed. Despite years of psychotherapy, many experience little or no relief. It’s as if they suffer from a mental state that has no psychological origin or meaning, a notion that would seem heretical to many therapists, particularly psychoanalysts. Recent neuroscience research explains why, in part, this may be the case. For the first time, scientists have demonstrated that a genetic variation in the brain makes some people inherently less anxious, and more able to forget fearful and unpleasant experiences. This lucky genetic mutation produces higher levels of anandamide — the so-called bliss molecule and our own natural marijuana — in our brains. In short, some people are prone to be less anxious simply because they won the genetic sweepstakes and randomly got a genetic mutation that has nothing at all to do with strength of character. About 20 percent of adult Americans have this mutation. Those who do may also be less likely to become addicted to marijuana and, possibly, other drugs — presumably because they don’t need the calming effects that marijuana provides. One patient of mine, a man in his late 40s, came to see me because he was depressed and lethargic. He told me at our first meeting that he had been using cannabis almost daily for at least the past 15 years. “It became a way of life,” he explained. “Things are more interesting, and I can tolerate disappointments without getting too upset.” © 2015 The New York Times Company

Keyword: Drug Abuse; Stress
Link ID: 20666 - Posted: 03.09.2015

by Penny Sarchet For some of us, it might have been behind the bikeshed. Not so the African cotton leafworm moth (Spodoptera littoralis), which can choose any one of a vast number of plant species to mate on. But these moths remember their first time, returning to the same species in search of other mates. In the wild, this moth feeds and mates on species from as many as 40 different plant families. That much choice means there's usually something available to eat, but selecting and remembering the best plants is tricky. So, recalling what you ate as a larva, or where you first copulated, may help narrow down which plants provide better quality food or are more likely to attract other potential mates. Magali Proffit and David Carrasco of the Swedish University of Agricultural Sciences in Alnarp and their colleagues have discovered that this moth's first mating experience shapes its future preferences. These moths have an innate preference for cotton plants over cabbage. But when the researchers made them mate for the first time on cabbage, the moths later showed an increased preference for mating or laying eggs on this plant. Further experiments revealed that moths didn't just favour plants they were familiar with, even in combination with a sex pheromone – mating had to be involved. © Copyright Reed Business Information Ltd.

Keyword: Sexual Behavior; Learning & Memory
Link ID: 20665 - Posted: 03.09.2015

By Neuroskeptic There is a popular view that all of the natural sciences can be arranged in a chain or ladder according to the complexity of their subjects. On this view, physics forms the base of the ladder because it deals with the simplest building-blocks of matter, atoms and subatomic particles. Chemistry is next up because it studies interacting atoms, i.e. molecules. Biology studies complex collections of molecules, i.e. cells. Then comes neuroscience, which deals with a complex collection of interacting cells – the brain. Psychology, perhaps, can be seen as the next level above neuroscience, because psychology studies brains interacting with each other and with the environment. So, on this model, we have a kind of Great Chain of Science: physics at the base, then chemistry, biology and neuroscience, with psychology at the top. This is an appealing model. But is biology really basic to neuroscience (and psychology)? At first glance it seems like biology – most importantly cell and molecular biology – surely is basic to neuroscience. After all, brains are comprised of cells. All of the functions of brain cells, like synaptic transmission and plasticity, are products of biological machinery, i.e. proteins and ultimately genes. This doesn’t imply that neuroscience could be ‘reduced to’ biology, any more than biology will ever be reduced to pure chemistry, but it does seem to imply that biology is the foundation for neuroscience.

Keyword: Miscellaneous
Link ID: 20664 - Posted: 03.09.2015

by Sarah Zielinski Before they grow wings and fly, young praying mantises have to rely on leaps to move around. But these little mantises are really good at jumping. Unlike most insects, which tend to spin uncontrollably and sometimes crash land, juvenile praying mantises make precision leaps with perfect landings. But how do they do that? To find out, Malcolm Burrows of the University of Cambridge in England and colleagues filmed 58 juvenile Stagmomantis theophila praying mantises making 381 targeted jumps. The results of their study appear March 5 in Current Biology. For each test leap, the researchers put a young insect on a ledge with a black rod placed one to two body lengths away. A jump to the rod was fast — only 80 milliseconds, faster than a blink of an eye — but high-speed video captured every move at 1,000 frames per second. That let the scientists see what was happening: First, the insect shook its head from side to side, scanning its path. Then it rocked backwards and curled up its abdomen, readying itself to take a leap. With a push of its legs, the mantis was off. In the air, it rotated its abdomen, hind legs and front legs, but its body stayed level until it hit the target and landed on all four limbs. “The abdomen, front legs and hind legs performed a series of clockwise and anticlockwise rotations during which they exchanged angular momentum at different times and in different combinations,” the researchers write. “The net result … was that the trunk of the mantis spun by 50˚ relative to the horizontal with a near-constant angular momentum, aligning itself perfectly for landing with the front and hind legs ready to grasp the target.” © Society for Science & the Public 2000 - 2015

Keyword: Vision
Link ID: 20663 - Posted: 03.07.2015

By Nicholas Weiler Killer whales wouldn’t get far without their old ladies. A 9-year study of orcas summering off the southern tip of Vancouver Island in the Pacific Northwest finds that menopausal females usually lead their families to find salmon, particularly when the fish are scarce. Older females’ years of foraging experience may help their clans survive in years of famine, an evolutionary benefit that could explain why—like humans—female orcas live for decades past their reproductive prime. “Menopause is a really bizarre trait. Evolutionarily it doesn’t make sense,” says biologist Lauren Brent of the University of Exeter in the United Kingdom, who led the new study. Most animals keep having babies until they drop, part of the evolutionary drive to spread their genes as widely as possible. Only female humans, pilot whales, and killer whales are known to go through menopause: At a certain age, they stop reproducing, but continue to lead long, productive lives. Like humans, female killer whales stop giving birth by about 40, but can live into their 90s. Anthropologists have proposed a controversial explanation for menopause in humans: that grandmothers contribute to their genetic legacies by helping their children and grandchildren survive and reproduce. In hunter-gatherer and other societies, elders find extra food, babysit, and remember tribal lore about how to live through floods, famines, and other hardships. According to the “grandmother hypothesis,” this contribution is so valuable that it helped spur the evolution of women’s long postreproductive lives. Orcas too depend on their elders: Adult killer whales’ mortality rates skyrocket after their elderly mothers die. But how the menopausal whales might help their children survive was not clear, Brent says. © 2015 American Association for the Advancement of Science.

Keyword: Hormones & Behavior; Sexual Behavior
Link ID: 20662 - Posted: 03.07.2015