Most Recent Links

Links 1 - 20 of 19293

By PETER ANDREY SMITH Sweet, salty, sour and bitter — every schoolchild knows these are the building blocks of taste. Our delight in every scrumptious bonbon, every sizzling hot dog, derives in part from the tongue’s ability to recognize and signal just four types of taste. But are there really just four? Over the last decade, research challenging the notion has been piling up. Today, savory, also called umami, is widely recognized as a basic taste, the fifth. And now other candidates, perhaps as many as 10 or 20, are jockeying for entry into this exclusive club. “What started off as a challenge to the pantheon of basic tastes has now opened up, so that the whole question is whether taste is even limited to a very small number of primaries,” said Richard D. Mattes, a professor of nutrition science at Purdue University. Taste plays an intrinsic role as a chemical-sensing system for helping us find what is nutritious (stimulatory) and as a defense against what is poison (aversive). When we put food in our mouths, chemicals slip over taste buds planted into the tongue and palate. As they respond, we are thrilled or repulsed by what we’re eating. But the body’s reaction may not always be a conscious one. In the late 1980s, in a windowless laboratory at Brooklyn College, the psychologist Anthony Sclafani was investigating the attractive power of sweets. His lab rats loved Polycose, a maltodextrin powder, even preferring it to sugar. That was puzzling for two reasons: Maltodextrin is rarely found in plants that rats might feed on naturally, and when human subjects tried it, the stuff had no obvious taste. More than a decade later, a team of exercise scientists discovered that maltodextrin improved athletic performance — even when the tasteless additive was swished around in the mouth and spit back out. Our tongues report nothing; our brains, it seems, sense the incoming energy. © 2014 The New York Times Company

Keyword: Chemical Senses (Smell & Taste)
Link ID: 19867 - Posted: 07.22.2014

Sarah C. P. Williams The wheezing, coughing, and gasping for breath that come with a sudden asthma attack aren’t just the fault of an overactive immune system. A particularly sensitive bundle of neurons stretching from the brain to the lungs might be to blame as well, researchers have found. Drugs that alter these neurons could provide a new way to treat some types of asthma. “This is an exciting confirmation of an idea that’s been around for decades,” says Allison Fryer, a pulmonary pharmacology researcher at Oregon Health & Science University in Portland, who was not involved in the new study. An asthma attack can be brought on by a variety of triggers, including exercise, cold temperatures, pollen, and dust. During an attack, a person’s airways become inflamed, mucus clogs their lungs, and the muscles surrounding their airways tighten. Asthma is often considered a disease of the immune system because immune cells go into overdrive when they sense a trigger and cause inflammation. But a bundle of nerves that snakes through the neck and chest, the vagus nerve, has long been suspected to play a role; the cells it contains, after all, control the airway muscles. Studying which cell types and molecular pathways within the thick nerve bundle are involved, though, has been tough—the vagus contains a multitude of different cells that are physically intertwined. Working together at the Howard Hughes Medical Institute’s Janelia Farm Research Campus in Ashburn, Virginia, neurobiologists Dimitri Tränkner, now at the University of Utah in Salt Lake City, and Charles Zuker of Columbia University turned to genetics to work out the players. They selectively shut off different sets of the neurons in mice based on which genes each neuron expressed, rather than their physical location. Then, through a series of injections, they gave the animals an egg white allergy that causes asthmalike symptoms. © 2014 American Association for the Advancement of Science

Keyword: Stress
Link ID: 19866 - Posted: 07.22.2014

By DONALD G. MCNEIL Where was I? Sorry — must have nodded off for a decade. Ten years ago, I spent two nights in a sleep lab at SUNY Downstate Medical Center, taking the test for sleep apnea, and wrote about it for Science Times. Back then, “sleep technicians” wired me up like the Bride of Frankenstein: 15 sensors glued or clamped to my scalp, lip, eye sockets, jaw, index finger, chest and legs, two belts around my torso, and a “snore mike” on my neck. As I slept, an infrared camera watched over me. And I ended up spending 23 hours in that hospital bed because the test wasn’t over until you could lie in a dark room for 20 minutes without dozing off. I had such a sleep deficit that I kept conking out, not just all night, but all the next day. So this year, when a company called NovaSom offered to let me try out a new home sleep-test kit that promises to streamline the process, I said yes. In the decade since my ordeal, the pendulum has swung sharply in the direction of the home test, said Dr. M. Safwan Badr, past president of the American Academy of Sleep Medicine, which first recognized home testing for apnea in 2007. Insurers prefer it because it costs only about $300, about one-tenth that of a hospital test, and many patients like it, too. “Lots of people are reluctant to let a stranger watch them sleep,” said Dr. Michael Coppola, a former president of the American Sleep Apnea Association who is now the chief medical officer at NovaSom. Doctors estimate that 18 million Americans have moderate to severe apnea and 75 percent of them do not know it. Home testing is not recommended for those with heart failure, emphysema, seizures and a few other conditions. And because it does not record brain waves as a hospital lab does, a home test can be fooled by someone who just lies awake all night staring at the ceiling. But it’s useful for many people who exhibit the warning signs of apnea, such as waking up exhausted after a full night’s sleep or dozing off at the wheel in bright daylight. And severe apnea can be lethal: starving the brain of oxygen all night quadruples the risk of stroke. © 2014 The New York Times Company

Keyword: Sleep
Link ID: 19865 - Posted: 07.22.2014

Sara Reardon Broad population studies are shedding light on the genetic causes of mental disorders. Researchers seeking to unpick the complex genetic basis of mental disorders such as schizophrenia have taken a huge step towards their goal. A paper published in Nature this week ties 108 genetic locations to schizophrenia — most for the first time. The encouraging results come on the same day as a US$650-million donation to expand research into psychiatric conditions. Philanthropist Ted Stanley gave the money to the Stanley Center for Psychiatric Research at the Broad Institute in Cambridge, Massachusetts. The institute describes the gift as the largest-ever donation for psychiatric research. “The assurance of a very long life of the centre allows us to take on ambitious long-term projects and intellectual risks,” says its director, Steven Hyman. The centre will use the money to fund genetic studies as well as investigations into the biological pathways involved in conditions such as schizophrenia, autism and bipolar disorder. The research effort will also seek better animal and cell models for mental disorders, and will investigate chemicals that might be developed into drugs. The Nature paper was produced by the Psychiatric Genomics Consortium (PGC) — a collaboration of more than 80 institutions, including the Broad Institute. Hundreds of researchers from the PGC pooled samples from more than 150,000 people, of whom 36,989 had been diagnosed with schizophrenia. This enormous sample size enabled them to spot 108 genetic locations, or loci, where the DNA sequence in people with schizophrenia tends to differ from the sequence in people without the disease. “This paper is in some ways proof that genomics can succeed,” Hyman says. © 2014 Nature Publishing Group

Keyword: Schizophrenia; Aggression
Link ID: 19864 - Posted: 07.22.2014

Most of the genetic risk for autism comes from versions of genes that are common in the population rather than from rare variants or spontaneous glitches, researchers funded by the National Institutes of Health have found. Heritability also outweighed other risk factors in this largest study of its kind to date. About 52 percent of the risk for autism was traced to common and rare inherited variation, with spontaneous mutations contributing a modest 2.6 percent of the total risk. “Genetic variation likely accounts for roughly 60 percent of the liability for autism, with common variants comprising the bulk of its genetic architecture,” explained Joseph Buxbaum, Ph.D., of the Icahn School of Medicine at Mount Sinai (ISMMS), New York City. “Although each exerts just a tiny effect individually, these common variations in the genetic code add up to substantial impact, taken together.” Buxbaum and colleagues of the Population-Based Autism Genetics and Environment Study (PAGES) Consortium report on their findings in a unique Swedish sample in the journal Nature Genetics, July 20, 2014. “Thanks to the boost in statistical power that comes with ample sample size, autism geneticists can now detect common as well as rare genetic variation associated with risk,” said Thomas R. Insel, M.D., director of the NIH’s National Institute of Mental Health (NIMH). “Knowing the nature of the genetic risk will reveal clues to the molecular roots of the disorder. Common variation may be more important than we thought.”

Keyword: Autism; Aggression
Link ID: 19863 - Posted: 07.22.2014

By JAN HOFFMAN As it has for decades, the Centers for Disease Control and Prevention last week released its annual National Health Interview Survey on the health of Americans. But this year, there was a difference: For the first time, the respondents were asked about their sexual orientation. Of 34,557 adults ages 18 and older, the survey reported, 1.6 percent said they were gay or lesbian. Some critics say the numbers are low, but they fall in the range of other surveys. In the new survey, however, only 0.7 percent of respondents described themselves as bisexual; other studies have reported higher numbers. Adults who identified themselves as gay, lesbian or bisexual reported some different behaviors and concerns — for example, more alcohol consumption and cigarette smoking — than those who said they were straight. But it can be difficult to elicit information that many people consider private. The New York Times spoke about such challenges with Gary J. Gates, a demographer at the Williams Institute at the U.C.L.A. School of Law, which focuses on law and policy issues related to sexual orientation and gender identity. Some of Dr. Gates’s findings were echoed in the new survey. This interview was edited and condensed. Q. How was this survey conducted? A. Survey takers had a computer that guided them through questions which they asked the respondent in person, and they used flash cards to show them potential answers. Q. Why do you think the figure for bisexuality was lower than in other surveys? A. There is evidence that bisexuals perceive more stigma and discrimination than gay and lesbian people. They are much less likely to tell important people around them that they are bisexual. © 2014 The New York Times Company

Keyword: Sexual Behavior
Link ID: 19862 - Posted: 07.22.2014

By HENRY L. ROEDIGER III TESTS have a bad reputation in education circles these days: They take time, the critics say, put students under pressure and, in the case of standardized testing, crowd out other educational priorities. But the truth is that, used properly, testing as part of an educational routine provides an important tool not just to measure learning, but to promote it. In one study I published with Jeffrey D. Karpicke, a psychologist at Purdue, we assessed how well students remembered material they had read. After an initial reading, students were tested on some passages by being given a blank sheet of paper and asked to recall as much as possible. They recalled about 70 percent of the ideas. Other passages were not tested but were reread, and thus 100 percent of the ideas were re-exposed. In final tests given either two days or a week later, the passages that had been tested just after reading were remembered much better than those that had been reread. What’s at work here? When students are tested, they are required to retrieve knowledge from memory. Much educational activity, such as lectures and textbook readings, is aimed at helping students acquire and store knowledge. Various kinds of testing, though, when used appropriately, encourage students to practice the valuable skill of retrieving and using knowledge. The fact of improved retention after a quiz — called the testing effect or the retrieval practice effect — makes the learning stronger and embeds it more securely in memory. This is vital, because many studies reveal that much of what we learn is quickly forgotten. Thus a central challenge to learning is finding a way to stem forgetting. © 2014 The New York Times Company

Keyword: Learning & Memory
Link ID: 19861 - Posted: 07.21.2014

By Nathan Collins Time, space and social relationships share a common language of distance: we speak of faraway places, close friends and the remote past. Maybe that is because all three share common patterns of brain activity, according to a January study in the Journal of Neuroscience. Curious to understand why the distance metaphor works across conceptual domains, Dartmouth College psychologists used functional MRI scans to analyze the brains of 15 people as they viewed pictures of household objects taken at near or far distances, looked at photographs of friends or acquaintances, and read phrases such as “in a few seconds” or “a year from now.” Patterns of activity in the right inferior parietal lobule, a region thought to handle distance information, robustly predicted whether a participant was thinking about near versus far in any of the categories—indicating that certain aspects of time, space and relationships are all processed in a similar way in the brain. The results, the researchers say, suggest that higher-order brain functions are organized more around computations such as near versus far than conceptual domains such as time or social relationships. © 2014 Scientific American
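
For readers curious about the method, the analysis described above is a form of multivoxel pattern classification: a decoder is trained on activity patterns from a region (here, the right inferior parietal lobule) and tested on whether it can tell "near" trials from "far" trials. The sketch below illustrates the general idea with scikit-learn and simulated data; the trial counts, voxel counts, and signal strength are hypothetical stand-ins, not the Dartmouth dataset or the authors' actual pipeline.

    # Minimal sketch of multivoxel pattern classification on simulated data
    # (hypothetical numbers; not the study's data or code).
    import numpy as np
    from sklearn.svm import LinearSVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_trials, n_voxels = 120, 300               # hypothetical trial and voxel counts
    labels = rng.integers(0, 2, n_trials)       # 0 = "near", 1 = "far"

    # Simulated voxel patterns: a small subset of voxels carries distance information.
    patterns = rng.normal(size=(n_trials, n_voxels))
    patterns[:, :30] += 0.8 * labels[:, None]

    # Cross-validated decoding accuracy; reliably above 0.5 means the patterns
    # distinguish near from far, which is the logic behind the study's claim.
    accuracy = cross_val_score(LinearSVC(dual=False), patterns, labels, cv=5).mean()
    print(f"mean decoding accuracy: {accuracy:.2f}")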

Keyword: Attention
Link ID: 19860 - Posted: 07.21.2014

By Kelly Clancy In one important way, the recipient of a heart transplant ignores its new organ: Its nervous system usually doesn’t rewire to communicate with it. The 40,000 neurons controlling a heart operate so perfectly, and are so self-contained, that a heart can be cut out of one body, placed into another, and continue to function perfectly, even in the absence of external control, for a decade or more. This seems necessary: The parts of our nervous system managing our most essential functions behave like a Swiss watch, precisely timed and impervious to perturbations. Chaotic behavior has been throttled out. Or has it? Two simple pendulums that swing with perfect regularity can, when yoked together, move in a chaotic trajectory. Given that the billions of neurons in our brain are each like a pendulum, oscillating back and forth between resting and firing, and connected to 10,000 other neurons, isn’t chaos in our nervous system unavoidable? The prospect is terrifying to imagine. Chaos is extremely sensitive to initial conditions—just think of the butterfly effect. What if the wrong perturbation plunged us into irrevocable madness? Among many scientists, too, there is a great deal of resistance to the idea that chaos is at work in biological systems. Many intentionally preclude it from their models. It subverts computationalism, which is the idea that the brain is nothing more than a complicated, but fundamentally rule-based, computer. Chaos seems unqualified as a mechanism of biological information processing, as it allows noise to propagate without bounds, corrupting information transmission and storage. © 2014 Nautilus

Keyword: Biological Rhythms
Link ID: 19859 - Posted: 07.21.2014

By Meeri Kim Babies start with simple vowel sounds — oohs and aahs. Mere months later, the cooing turns into babbling — “bababa” — showing off a newfound grasp of consonants. A new study has found that a key part of the brain involved in forming speech is firing away in babies as they listen to voices around them. This may represent a sort of mental rehearsal leading up to the true milestone that occurs after only a year of life: baby’s first words. Any parent knows how fast babies learn how to comprehend and use language. The skill develops so rapidly and seemingly without much effort, but how do they do it? Researchers at the University of Washington are a step closer to unraveling the mystery of how babies learn how to speak. They had a group of 7- and 11-month-old infants listen to a series of syllables while sitting in a brain scanner. Not only did the auditory areas of their brains light up as expected but so did a region crucial to forming higher-level speech, called Broca’s area. [Photo caption: A year-old baby sits in a brain scanner, called magnetoencephalography, a noninvasive approach to measuring brain activity. The baby listens to speech sounds like "da" and "ta" played over headphones while researchers record her brain responses. (Institute for Learning and Brain Sciences, University of Washington)] These findings may suggest that even before babies utter their first words, they may be mentally exercising the pivotal parts of their brains in preparation. Study author and neuroscientist Patricia Kuhl says that her results reinforce the belief that talking and reading to babies from birth is beneficial for their language development, along with exaggerated speech and mouth movements (“Hiii cuuutie! How are youuuuu?”). © 1996-2014 The Washington Post

Keyword: Language; Aggression
Link ID: 19858 - Posted: 07.21.2014

By ANN SANNER Associated Press COLUMBUS, Ohio (AP) — A few weeks before their prom king’s death, students at an Ohio high school had attended an assembly on narcotics that warned about the dangers of heroin and prescription painkillers. But it was one of the world’s most widely accepted drugs that killed Logan Stiner — a powdered form of caffeine so potent that as little as a single teaspoon can be fatal. The teen’s sudden death in May has focused attention on the unregulated powder and drawn a warning from federal health authorities urging consumers to avoid it. “I don’t think any of us really knew that this stuff was out there,” said Jay Arbaugh, superintendent of the Keystone Local Schools. The federal Food and Drug Administration said Friday that it’s investigating caffeine powder and will consider taking regulatory action. The agency cautioned parents that young people could be drawn to it. An autopsy found that Stiner had a lethal amount of caffeine in his system when he died May 27 at his home in LaGrange, Ohio, southwest of Cleveland. Stiner, a wrestler, had more than 70 micrograms of caffeine per milliliter of blood in his system, as much as 23 times the amount found in a typical coffee or soda drinker, according to the county coroner. His mother has said she was unaware her son took caffeine powder. He was just days away from graduation and had planned to study at the University of Toledo. Caffeine powder is sold as a dietary supplement, so it’s not subject to the same federal regulations as certain caffeinated foods. Users add it to drinks for a pick-me-up before workouts or to control weight gain. A mere 1/16th of a teaspoon can contain about 200 milligrams of caffeine, roughly the equivalent of two large cups of coffee. That means a heaping teaspoon could kill, said Dr. Robert Glatter, an emergency physician at Lenox Hill Hospital in New York.
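
For scale, here is the back-of-the-envelope arithmetic implied by the figures quoted in the piece (a rough sketch using only those numbers; actual powder potency varies by product):

    # Arithmetic using only the figures quoted above (illustrative, not medical guidance).
    mg_per_sixteenth_teaspoon = 200                # "1/16th of a teaspoon ... about 200 milligrams"
    level_teaspoon_mg = 16 * mg_per_sixteenth_teaspoon
    print(level_teaspoon_mg)                       # 3200 mg -- roughly 32 large cups of coffee

    # The coroner's figure: more than 70 micrograms of caffeine per milliliter of blood,
    # described as up to 23 times a typical coffee or soda drinker's level.
    implied_typical_ug_per_ml = 70 / 23
    print(round(implied_typical_ug_per_ml, 1))     # about 3.0 micrograms/mL for a typical drinker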

Keyword: Drug Abuse
Link ID: 19857 - Posted: 07.21.2014

Tania Browne As a teenager, I lost my grandfather. But he wasn't dead. He still had his favourite music, he still loved to walk in the woods and name the flowers and plants, and he loved his soap operas. He was alive, but gone. A dignified man, a former aircraft engineer and oil company salesman, reduced to the status of a bewildered toddler lost in a shopping centre. When he died, our family felt an odd mix of relief, then guilt at the relief. The man we loved had left his body years before the body gave out. This was 30 years ago. But while a cure is still far away, two new techniques may at least be able to forewarn us of dementia, and allow us to plan treatment for ourselves or loved ones before any outward symptoms are apparent. According to Alzheimer's Research UK, my experience is currently shared by 24m relatives and close friends of the 800,000 diagnosed dementia sufferers in the UK. In December last year, a G8 summit was told by Alzheimer's Disease International that the worldwide figure was 44m and set to treble by 2050, as the life expectancy of people in middle and lower income countries soars – precisely the countries that have either depleted or non-existent healthcare systems. Dementia is a serious time bomb. “Dementia” covers about 100 conditions, all resulting from large-scale brain cell death. People often think that when they're diagnosed they're in the early stages. Yet cell death can be occurring for 10-15 years or more before any outward symptoms occur, and by the time they're diagnosed many dementia patients have already lost one fifth of their memory cells. © 2014 Guardian News and Media Limited

Keyword: Alzheimers
Link ID: 19856 - Posted: 07.21.2014

By Roni Jacobson Last week, nine-year-old Hally Yust died after contracting a rare brain-eating amoeba infection while swimming near her family’s home in Kansas. The organism responsible, Naegleria fowleri, dwells in warm freshwater lakes and rivers and usually targets children and young adults. Once in the brain it causes a swelling called primary meningoencephalitis. The infection is almost universally fatal: it kills more than 97 percent of its victims within days. Although deadly, infections are exceedingly uncommon—there were only 34 reported in the U.S. during the past 10 years—but evidence suggests they may be increasing. Prior to 2010 more than half of cases came from Florida, Texas and other southern states. Since then, however, infections have popped up as far north as Minnesota. “We’re seeing it in states where we hadn’t seen cases before,” says Jennifer Cope, an epidemiologist and expert in amoeba infections at the U.S. Centers for Disease Control and Prevention. The expanding range of Naegleria infections could potentially be related to climate change, she adds, as the organism thrives in warmer temperatures. “It’s something we’re definitely keeping an eye on.” Still, “when it comes to Naegleria there’s a lot we don’t know,” Cope says—including why it chooses its victims. The amoeba has strategies to evade the immune system, and treatment options are meager partly because of how fast the infection progresses. But research suggests that the infection can be stopped if it is caught soon enough. So what happens during an N. fowleri infection? © 2014 Scientific American

Keyword: Chemical Senses (Smell & Taste)
Link ID: 19855 - Posted: 07.21.2014

Emily A. Holmes, Michelle G. Craske & Ann M. Graybiel How does one human talking to another, as occurs in psychological therapy, bring about changes in brain activity and cure or ease mental disorders? We don't really know. We need to. Mental-health conditions, such as post-traumatic stress disorder (PTSD), obsessive–compulsive disorder (OCD), eating disorders, schizophrenia and depression, affect one in four people worldwide. Depression is the third leading contributor to the global burden of disease, according to the World Health Organization. Psychological treatments have been subjected to hundreds of randomized clinical trials and hold the strongest evidence base for addressing many such conditions. These activities, techniques or strategies target behavioural, cognitive, social, emotional or environmental factors to improve mental or physical health or related functioning. Despite the time and effort involved, they are the treatment of choice for most people (see ‘Treating trauma with talk therapy’). For example, eating disorders were previously considered intractable within our lifetime. They can now be addressed with a specific form of cognitive behavioural therapy (CBT) that targets attitudes to body shape and disturbances in eating habits. For depression, CBT can be as effective as antidepressant medication and provide benefits that are longer lasting. There is also evidence that interpersonal psychotherapy (IPT) is effective for treating depression. Ian was filling his car with petrol and was caught in the cross-fire of an armed robbery. His daughter was severely injured. For the following decade Ian suffered nightmares, intrusive memories, flashbacks of the trauma and was reluctant to drive — symptoms of post-traumatic stress disorder (PTSD). © 2014 Nature Publishing Group

Keyword: Depression; Aggression
Link ID: 19854 - Posted: 07.19.2014

By Emily Anthes The women who come to see Deane Aikins, a clinical psychologist at Wayne State University, in Detroit, are searching for a way to leave their traumas behind them. Veterans in their late 20s and 30s, they served in Iraq and Afghanistan. Technically, they’d been in non-combat positions, but that didn’t eliminate the dangers of warfare. Mortars and rockets were an ever-present threat on their bases, and they learned to sleep lightly so as not to miss alarms signaling late-night attacks. Some of the women drove convoys of supplies across the desert. It was a job that involved worrying about whether a bump in the road was an improvised explosive device, or if civilians in their path were strategic human roadblocks. On top of all that, some of the women had been sexually assaulted by their military colleagues. After one woman was raped, she helped her drunk assailant sneak back into his barracks because she worried that if they were caught, she’d be disciplined or lose her job. These traumas followed the women home. Today, far from the battlefield, they find themselves struggling with vivid flashbacks and nightmares, tucking their guns under their pillows at night. Some have turned to alcohol to manage their symptoms; others have developed exhausting routines to avoid any people or places that might trigger painful memories and cause them to re-live their experiences in excruciating detail. © 2014 Nautilus

Keyword: Learning & Memory; Aggression
Link ID: 19853 - Posted: 07.19.2014

Fearful memories can be dampened by imagining past traumas in a safe setting. The "extinction" of fear is fragile, however, and surprising or unexpected events can cause fear memories to return. Inactivating brain areas that detect novelty prevents relapse of unwanted fear memories. Traumatic and emotional experiences often lead to debilitating mental health disorders, including post-traumatic stress disorder (PTSD). In the clinic, it is typical to use behavioral therapies such as exposure therapy to help reduce fear in patients suffering from traumatic memories. Using these approaches, patients are asked to remember the circumstances and stimuli surrounding their traumatic memory in a safe setting in order to "extinguish" their fear response to those events. While effective in many cases, the loss of fear and anxiety achieved by these therapies is often short-lived—fear returns or relapses under a variety of conditions. Many years ago, the famous Russian physiologist Ivan Pavlov noted that simply exposing animals to novel or unexpected events could cause extinguished responses (such as salivary responses to sounds) to return. Might exposure to novelty also cause extinguished fear responses to return? In a recent study (Maren, 2014), rats first learned that an innocuous tone predicted an aversive (but mild) electric shock to their feet. The subsequent fear response to the tone was then extinguished by presenting the stimulus to the animals many times without the shock. After the fear response to the tone was reduced with the extinction procedure, they were then presented with the tone in either a new location (a novel test box) or in a familiar location, but in the presence of an unexpected sound (a noise burst). In both cases, fear to the tone returned as Pavlov predicted: the unexpected places and sounds led to a disinhibition of fear—in other words, fear relapsed. © 2014 Publiscize

Keyword: Learning & Memory; Aggression
Link ID: 19852 - Posted: 07.19.2014

Ewen Callaway One could be forgiven for mistaking anomalocaridids for creatures from another world. The spade-shaped predators, which lived in the seas during the Cambrian — the geological era stretching from 541 million to 485 million years ago — had eyes that protruded from stalks and a pair of giant appendages on the sides of their mouths. But three stunningly well-preserved fossils found in China now show that the anomalocaridid brain was wired much like that of modern creatures called velvet worms, or onychophorans. Both anomalocaridids and onychophorans belong to the arthropods, the group of invertebrates that includes spiders and insects and whose brain structures come in three main types. Two of those were already known to be very ancient, and the new fossils, described today in Nature, suggest that the third type — the neural architecture found in onychophorans — also has changed little over more than half a billion years of evolution. Named Lyrarapax unguispinus, the three fossils reveal creatures that — at 8 centimetres long — are on the small side for anomalocaridids, some of which are thought to have been as long as 2 to 3 metres. But the fossils’ segmented bodies and frontal appendages are pure anomalocaridid, says Nicholas Strausfeld, a neuroscientist at the University of Arizona in Tucson, who co-led the study. What really grabbed Strausfeld’s attention was the creature’s brain, preserved flattened like a pressed flower: “I said, ‘Holy shit, that’s an onychophoran brain!’” he recalls. The animal’s frontal appendages are connected to nerve bundles, or ganglia, in front of optic nerves. Both the ganglia and the optic nerves lead to a segmented brain. The layout is an uncanny match to the wiring of the velvet worm’s brain, Strausfeld says: “It’s completely unlike anything else in any other arthropod.” © 2014 Nature Publishing Group

Keyword: Evolution
Link ID: 19851 - Posted: 07.19.2014

Obese women may have a "food learning impairment" that could explain their attitude to food, research from Yale School of Medicine suggests. Tests on groups of obese and healthy-weight people found that the obese women performed worst when asked to remember a sequence of food picture cards. Writing in Current Biology, Yale researchers tested 135 men and women. The findings could lead to new ways to tackle obesity, the study says. Study author Ifat Levy, assistant professor at Yale School of Medicine, said the difference in the performance of the obese women compared with the other groups was "really striking" and "significant". The tests looked at an individual's ability to learn and predict the appearance of pictures of food or money on coloured cards. The participants were told they would be given whatever appeared on these "reward" cards. In the first phase, the reward cards always followed a particular coloured card in a sequence. Later, the order was changed and the reward cards appeared following a different coloured card. During this time, participants were asked to predict the likelihood of a reward card appearing as the cards were shown one by one. The results showed that obese women performed worst because they overestimated how often the pictures of food, including pretzels or chocolate, appeared. Even after researchers had accounted for other factors, there was still a large difference in their learning performance. Prof Levy said: "This is not a general learning impairment, as obese women had no problem learning when the reward was money rather than food." BBC © 2014

Keyword: Obesity; Aggression
Link ID: 19850 - Posted: 07.19.2014

Kelly Servick If you’re a bird enthusiast, you can pick out the “chick-a-DEE-dee” song of the Carolina chickadee with just a little practice. But if you’re an environmental scientist faced with parsing thousands of hours of recordings of birdsongs in the lab, you might want to enlist some help from your computer. A new approach to automatic classification of birdsong borrows techniques from human voice recognition software to sort through the sounds of hundreds of species and decides on its own which features make each one unique. Collectors of animal sounds are facing a data deluge. Thanks to cheap digital recording devices that can capture sound for days in the field, “it’s really, really easy to collect sound, but it’s really difficult to analyze it,” says Aaron Rice, a bioacoustics researcher at Cornell University, who was not involved in the new work. His lab has collected 6 million hours of underwater recordings, from which they hope to pick out the signature sounds of various marine mammals. Knowing where and when a certain species is vocalizing might help scientists understand habitat preferences, track their movements or population changes, and recognize when a species is disrupted by human development. But to keep these detailed records, researchers rely on software that can reliably sort through the cacophony they capture in the field. Typically, scientists build one computer program to recognize one species, and then start all over for another species, Rice says. Training a computer to recognize lots of species in one pass is “a challenge that we’re all facing.” © 2014 American Association for the Advancement of Science.
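
The general recipe behind this kind of classifier (compute speech-recognition-style acoustic features from each recording and let a model learn which feature patterns separate species) can be sketched in a few lines. The example below is an illustration using the librosa and scikit-learn libraries with hypothetical file names and labels; it is not the authors' pipeline, which learns its own features rather than relying on hand-picked MFCCs.

    # A rough sketch of feature-based birdsong classification (illustrative only;
    # file names, labels, and the MFCC features are assumptions, not the paper's method).
    import librosa
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def song_features(path):
        """Summarize a recording as the mean and spread of its MFCC coefficients."""
        audio, rate = librosa.load(path, sr=None)
        mfcc = librosa.feature.mfcc(y=audio, sr=rate, n_mfcc=20)
        return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

    # Hypothetical labeled clips: (file path, species)
    clips = [("carolina_chickadee_01.wav", "Carolina chickadee"),
             ("wood_thrush_01.wav", "wood thrush")]

    features = np.array([song_features(path) for path, _ in clips])
    species = np.array([label for _, label in clips])

    model = RandomForestClassifier(n_estimators=200, random_state=0).fit(features, species)
    print(model.predict([song_features("unlabeled_clip.wav")]))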

Keyword: Learning & Memory
Link ID: 19849 - Posted: 07.19.2014

Sam McDougle By now, perhaps you’ve seen the trailer for the new sci-fi thriller Lucy. It starts with a flurry of stylized special effects and Scarlett Johansson serving up a barrage of bad-guy beatings. Then comes Morgan Freeman, playing a professorial neuroscientist with the obligatory brown blazer, to deliver the film’s familiar premise to a full lecture hall: “It is estimated most human beings only use 10 percent of the brain’s capacity. Imagine if we could access 100 percent. Interesting things begin to happen.” Johansson as Lucy, who has been kidnapped and implanted with mysterious drugs, becomes a test case for those interesting things, which seem to include even more impressive beatings and apparently some kind of Matrix-esque time-warping skills. Of course, the idea that “you only use 10 percent of your brain” is, indeed, 100 percent bogus. Why has this myth persisted for so long, and when is it finally going to die? Unfortunately, not any time soon. A survey last year by The Michael J. Fox Foundation for Parkinson's Research found that 65 percent of Americans believe the myth is true, 5 percent more than those who believe in evolution. Even Mythbusters, which declared the statistic a myth a few years ago, further muddied the waters: The show merely increased the erroneous 10 percent figure and implied, incorrectly, that people use 35 percent of their brains. The idea that swaths of the brain are stagnant pudding while one section does all the work is silly. Like most legends, the origin of this fiction is unclear, though there are some clues. © 2014 by The Atlantic Monthly Group

Keyword: Brain imaging
Link ID: 19848 - Posted: 07.17.2014