Most Recent Links



Links 101 - 120 of 19303

Philip Ball Lead guitarists usually get to play the flashy solos while the bass player gets only to plod to the beat. But this seeming injustice could have been determined by the physiology of hearing. Research published today in the Proceedings of the National Academy of Sciences suggests that people’s perception of timing in music is more acute for lower-pitched notes. Psychologist Laurel Trainor of McMaster University in Hamilton, Canada, and her colleagues say that their findings explain why in the music of many cultures the rhythm is carried by low-pitched instruments while the melody tends to be taken by the highest pitched. This is as true for the low-pitched percussive rhythms of Indian classical music and Indonesian gamelan as it is for the walking double bass of a jazz ensemble or the left-hand part of a Mozart piano sonata. Earlier studies have shown that people have better pitch discrimination for higher notes — a reason, perhaps, that saxophonists and lead guitarists often have solos at a squealing register. It now seems that rhythm works best at the other end of the scale. Trainor and colleagues used the technique of electroencephalography (EEG) — electrical sensors placed on the scalp — to monitor the brain signals of people listening to streams of two simultaneous piano notes, one high-pitched and the other low-pitched, at equally spaced time intervals. Occasionally, one of the two notes was played slightly earlier, by just 50 milliseconds. The researchers studied the EEG recordings for signs that the listeners had noticed. © 2014 Nature Publishing Group
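To make the design concrete, here is a rough Python sketch (ours, not the researchers' code) of a stimulus schedule like the one described above: pairs of simultaneous high and low notes at a fixed interval, with one note of an occasional pair shifted 50 milliseconds early. The interval and the deviant probability are assumed values for illustration only.

```python
import random

ISI_MS = 600           # assumed onset-to-onset interval between note pairs
DEVIANT_SHIFT_MS = 50  # the early shift reported in the study
P_DEVIANT = 0.1        # assumed probability that a pair contains a deviant

def make_schedule(n_pairs, seed=0):
    """Return (time_ms, voice, is_deviant) onset events for n_pairs of notes."""
    rng = random.Random(seed)
    events = []
    for i in range(n_pairs):
        t = i * ISI_MS
        if rng.random() < P_DEVIANT:
            early = rng.choice(["high", "low"])  # which voice comes early
            for voice in ("high", "low"):
                shift = DEVIANT_SHIFT_MS if voice == early else 0
                events.append((t - shift, voice, voice == early))
        else:
            events.append((t, "high", False))
            events.append((t, "low", False))
    return events

for event in make_schedule(20)[:10]:
    print(event)
```

The researchers' question was then whether the EEG response to the shifted onset differs when the deviant is in the low voice versus the high voice.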

Keyword: Hearing
Link ID: 19776 - Posted: 07.01.2014

By RICHARD A. FRIEDMAN ADOLESCENCE is practically synonymous in our culture with risk taking, emotional drama and all forms of outlandish behavior. Until very recently, the widely accepted explanation for adolescent angst has been psychological. Developmentally, teenagers face a number of social and emotional challenges, like starting to separate from their parents, getting accepted into a peer group and figuring out who they really are. It doesn’t take a psychoanalyst to realize that these are anxiety-provoking transitions. But there is a darker side to adolescence that, until now, was poorly understood: a surge during teenage years in anxiety and fearfulness. Largely because of a quirk of brain development, adolescents, on average, experience more anxiety and fear and have a harder time learning how not to be afraid than either children or adults. Different regions and circuits of the brain mature at very different rates. It turns out that the brain circuit for processing fear — the amygdala — is precocious and develops way ahead of the prefrontal cortex, the seat of reasoning and executive control. This means that adolescents have a brain that is wired with an enhanced capacity for fear and anxiety, but is relatively underdeveloped when it comes to calm reasoning. You may wonder why, if adolescents have such enhanced capacity for anxiety, they are such novelty seekers and risk takers. It would seem that the two traits are at odds. The answer, in part, is that the brain’s reward center, just like its fear circuit, matures earlier than the prefrontal cortex. That reward center drives much of teenagers’ risky behavior. This behavioral paradox also helps explain why adolescents are particularly prone to injury and trauma. The top three killers of teenagers are accidents, homicide and suicide. The brain-development lag has huge implications for how we think about anxiety and how we treat it. It suggests that anxious adolescents may not be very responsive to psychotherapy that attempts to teach them to be unafraid, like cognitive behavior therapy, which is zealously prescribed for teenagers. © 2014 The New York Times Company

Keyword: Development of the Brain
Link ID: 19775 - Posted: 07.01.2014

A toxic caffeine level was found in the system of a high school student who died unexpectedly, says a U.S. coroner who warns that young people need to be educated about the dangers of taking the potent powder that is sold online. Logan Stiner, 18, was found dead at his family’s home in May. Stiner was an excellent student and a healthy young man who didn’t do drugs, Dr. Stephen Evans, a coroner in Lorain County, Ohio, said Monday. "We sent his blood out for levels, and [when] it came back it was a toxic level. Caffeine toxicity will do exactly what happened to him. It'll lead to things like cardiac arrhythmias and seizures," Evans said in an interview. Use of caffeine from coffee, tea and other beverages is so widespread that it is considered innocuous, but that’s not the case when it’s taken in an overdose amount. Powdered caffeine is sold in bulk over the internet. Problems can arise because adding a teaspoon of the caffeine powder to water is the equivalent of 30 cups of coffee. About one-sixteenth of a teaspoon of the powder is equal to about two cups of coffee. Evans said he recognizes that weightlifters will say Stiner should’ve taken the correct amount. "One-sixteenth of a teaspoon. You expect a kid to figure that out?" He suggested that regulators reconsider internet sales of a pound of powdered caffeine to young people. When Evans and his staff reviewed the pathology literature, they found 18 other cases of deaths in the U.S. from caffeine overdoses. Some were suicides and others were accidental, but he suspects the deaths are underreported since few pathologists investigating deaths from seizure and cardiac arrhythmia check caffeine levels. © CBC 2014
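The dose equivalences quoted above can be sanity-checked with a little arithmetic. In this sketch the caffeine content of a cup of coffee (~100 mg) is an assumed round figure; the teaspoon-to-cups ratio comes from the article itself.

```python
CUPS_PER_TSP = 30   # per the article: one teaspoon ~ 30 cups of coffee
MG_PER_CUP = 100    # assumed round figure for caffeine in one cup

mg_per_tsp = CUPS_PER_TSP * MG_PER_CUP
print(f"1 tsp    ~ {mg_per_tsp} mg caffeine ({mg_per_tsp / 1000:.1f} g)")

sixteenth = mg_per_tsp / 16
print(f"1/16 tsp ~ {sixteenth:.0f} mg ~ {sixteenth / MG_PER_CUP:.1f} cups")
# Output: 1 tsp ~ 3000 mg (3.0 g); 1/16 tsp ~ 188 mg ~ 1.9 cups,
# consistent with the article's "about two cups of coffee".
```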

Keyword: Neurotoxins; Aggression
Link ID: 19774 - Posted: 07.01.2014

Emotional and behavioral problems show up even with low exposure to lead, and as blood lead levels increase in children, so do the problems, according to research funded by the National Institute of Environmental Health Sciences (NIEHS), part of the National Institutes of Health. The results were published online June 30 in the journal JAMA Pediatrics. “This research focused on lower blood lead levels than most other studies and adds more evidence that there is no safe lead level,” explained NIEHS Health Scientist Administrator Kimberly Gray, Ph.D. “It is important to continue to study lead exposure in children around the world, and to fully understand short-term and long-term behavioral changes across developmental milestones. It is well-documented that lead exposure lowers the IQ of children.” Blood lead concentrations measured in more than 1,300 preschool children in China were associated with increased risk of behavioral and emotional problems, such as being anxious, depressed, or aggressive. The average blood lead level in the children was 6.4 micrograms per deciliter. While many studies to date have examined health effects at or above 10 micrograms per deciliter, this study focused on lower levels. The CDC now uses a reference level of 5 micrograms per deciliter to identify children with blood lead levels that are much higher than normal, and recommends educating parents on reducing sources of lead in their environment and continued monitoring of blood lead levels.

Keyword: Neurotoxins; Aggression
Link ID: 19773 - Posted: 07.01.2014

by Bethany Brookshire One day when I came in to the office, my air conditioning unit was making a weird rattling sound. At first, I was slightly annoyed, but then I chose to ignore it and get to work. In another 30 minutes, I was completely oblivious to the noise. It wasn’t until my cubicle neighbor Meghan Rosen came in and asked about the racket that I realized the rattle was still there. My brain had habituated to the sound. Habituation, the ability to stop noticing or responding to an irrelevant signal, is one of the simplest forms of learning. But it turns out that at the level of a brain cell, it’s a far more complex process than scientists previously thought. In the June 18 Neuron, Mani Ramaswami of Trinity College Dublin proposes a new framework to describe how habituation might occur in our brains. The paper not only offers a new mechanism to help us understand one of our most basic behaviors, it also demonstrates how taking the time to integrate new findings into a novel framework can help push a field forward. Our ability to ignore the irrelevant and familiar has been a long-known feature of human learning. It’s so simple, even a sea slug can do it. Because the ability to habituate is so simple, scientists hypothesized that the mechanism behind it must also be simple. The previous framework for habituation has been synaptic depression, a decrease in chemical release. When one brain cell sends a signal to another, it releases chemical messengers into a synapse, the small gap between neurons. Receptors on the other side pick up this excitatory signal and send the message onward. But in habituation, neurons would release fewer chemicals, making the signal less likely to hit the other side. Fewer chemicals, fewer signals, and you’ve habituated. Simple. But, as David Glanzman, a neurobiologist at the University of California, Los Angeles, points out, there are problems with this idea. © Society for Science & the Public 2000 - 2013
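For readers who want the old framework in concrete terms, here is a minimal simulation (illustrative only, not from the Neuron paper) of synaptic depression: each repeated stimulus releases a fraction of a depleting pool of transmitter, which recovers only slowly, so the postsynaptic signal shrinks with repetition. All parameter values are assumptions.

```python
def depressing_synapse(n_spikes, use_fraction=0.4, recovery=0.05):
    """Postsynaptic response to each of n_spikes identical stimuli."""
    resources = 1.0  # fraction of transmitter currently available
    responses = []
    for _ in range(n_spikes):
        release = use_fraction * resources  # transmitter released this spike
        responses.append(release)
        resources -= release                        # pool is depleted...
        resources += recovery * (1.0 - resources)   # ...and partially recovers
    return responses

print([round(r, 3) for r in depressing_synapse(8)])
# The response shrinks with each repetition of the same stimulus --
# the synapse "habituates" under this simple release-depression account.
```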

Keyword: Learning & Memory
Link ID: 19772 - Posted: 06.25.2014

By Lisa Marshall Is Alzheimer's disease an acquired form of Down syndrome? When neurobiologist Huntington Potter first posed the question in 1991, Alzheimer's researchers were skeptical. They were just beginning to explore the causes of the memory-robbing neurological disease. Scientists already knew that by age 40, nearly 100 percent of patients with Down syndrome, who have an extra copy of chromosome 21, had brains full of beta-amyloid peptide—the neuron-strangling plaque that is a hallmark of Alzheimer's. They also knew that the gene that codes for that protein lives on chromosome 21, suggesting that people acquire more plaque because they get an extra dose of the peptide. Potter, though, suggested that if people with Down syndrome develop Alzheimer's because of an extra chromosome 21, healthy people may develop Alzheimer's for the same reason. A quarter of a century later, mounting evidence supports the idea. “What we hypothesized in the 1990s and have begun to prove is that people with Alzheimer's begin to make molecular mistakes and generate cells with three copies of chromosome 21,” says Potter, who was recently appointed director of Alzheimer's disease research at the University of Colorado School of Medicine, with the express purpose of studying Alzheimer's through the lens of Down syndrome. He is no longer the only one exploring the link. In recent years dozens of studies have shown Alzheimer's patients possess an inordinate number of Down syndrome–like cells. One 2009 study by Russian researchers found that up to 15 percent of the neurons in the brains of Alzheimer's patients contained an extra copy of chromosome 21. Others have shown Alzheimer's patients have 1.5 to two times as many skin and blood cells with the extra copy as healthy controls. Potter's own research in mice suggests a vicious cycle: when normal cells are exposed to the beta-amyloid peptide, they tend to make mistakes when dividing, producing more trisomy 21 cells, which, in turn, produce more plaque. In August, Potter and his team published a paper in the journal Neurobiology of Aging describing why those mistakes may occur: the inhibition of a specific enzyme. © 2014 Scientific American

Keyword: Alzheimers
Link ID: 19771 - Posted: 06.25.2014

By Jim Tankersley COLUMBUS, Ohio — First they screwed the end of the gray cord into the metal silo rising out of Ian Burkhart’s skull. Later they laid his right forearm across two foam cylinders, and they wrapped it with thin strips that looked like film from an old home movie camera. They ran him through some practice drills, and then it was time for him to try. If he succeeded at this next task, it would be science fiction come true: His thoughts would bypass his broken spinal cord. With the help of an algorithm and some electrodes, he would move his once-dead limb again — a scientific first. “Ready?” the young engineer, Nick Annetta, asked from the computer to his left. “Three. Two. One.” Burkhart, 23, marshaled every neuron he could muster, and he thought about his hand. The last time the hand obeyed him, it was 2010 and Burkhart was running into the Atlantic Ocean. The hand had gripped the steering wheel as he drove the van from Ohio University to North Carolina’s Outer Banks, where he and friends were celebrating the end of freshman year. The hand unclenched to drop his towel on the sand. Burkhart splashed into the waves, the hand flying above his head, the ocean warm around his feet, the sun roasting his arms, and he dived. In an instant, he felt nothing. Not his hand. Not his legs. Only the breeze drying the saltwater on his face.

Keyword: Robotics
Link ID: 19770 - Posted: 06.25.2014

By Gary Stix Tony Zador: The human brain has 100 billion neurons; a mouse brain has maybe 100 million. What we’d really like to understand is how we go from a bunch of neurons to thought, feelings, behavior. We think that the key is to understand how the different neurons are connected to one another. So traditionally there have been a lot of techniques for studying connectivity but at a fairly crude level. We can, for instance, tell that a bunch of neurons here tend to be connected to a bunch of neurons there. There are also techniques for looking at how single neurons are connected but only for individual links between those neurons. What we would love to be able to do is to tell how every single neuron in the brain is connected to every single other neuron in the brain. So if you wanted to navigate through the United States, one of the most useful things you could have is a roadmap. It wouldn’t tell you everything about the United States, but it would be very hard to get around without a complete roadmap of the country. We need something like that for the brain. Zador: Traditionally the way people study connectivity is as a branch of microscopy. Typically what people do is they use one method or another to label a neuron and then they observe that neuron at some level of resolution. But the challenge that’s at the core of all the microscopy techniques is that neurons can extend long distances. That might be millimeters in a mouse brain; in a giraffe, there are neurons that go all the way from the brain to the foot, a distance that can be over 15 feet. Brain cells are connected with one another at structures called synapses, which are below the resolution of light microscopy. That means that if you really want to understand how one neuron is connected to another, you need to resolve the synapse, which requires electron microscopy. You have to take incredibly thin sections of brain and then image them. © 2014 Scientific American
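Some rough arithmetic (ours, not Zador's) shows why a whole-brain "roadmap" is such a bookkeeping challenge at the scale he quotes. The figure of roughly a thousand synaptic partners per neuron is an assumed order-of-magnitude value, not from the interview.

```python
MOUSE_NEURONS = 100_000_000      # "maybe 100 million", per the interview
PARTNERS_PER_NEURON = 1_000      # assumed order-of-magnitude partner count

dense_entries = MOUSE_NEURONS ** 2                    # full all-to-all matrix
sparse_entries = MOUSE_NEURONS * PARTNERS_PER_NEURON  # list of actual links

print(f"dense matrix: {dense_entries:.1e} entries")   # 1.0e+16
print(f"sparse list:  {sparse_entries:.1e} entries")  # 1.0e+11
# At even one byte per entry, the dense matrix alone is ~10 petabytes,
# which is why resolving only the links that actually exist matters so much.
```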

Keyword: Autism; Aggression
Link ID: 19769 - Posted: 06.25.2014

Helen Shen As US science agencies firm up plans for a national ten-year neuroscience initiative, California is launching an ambitious project of its own. On 20 June, governor Jerry Brown signed into law a state budget that allocates US$2 million to establish the California Blueprint for Research to Advance Innovations in Neuroscience (Cal-BRAIN) project. Cal-BRAIN is the first state-wide programme to piggyback on the national Brain Research through Advancing Innovative Neurotechnologies (BRAIN) initiative announced by US President Barack Obama in April 2013 (see Nature 503, 26–28; 2013). The national project is backed this year by $110 million in public funding from the National Institutes of Health (NIH), the Defense Advanced Research Projects Agency (DARPA) and the National Science Foundation (NSF). California researchers and lawmakers hope that the state’s relatively modest one-time outlay will pave the way for a larger multiyear endeavour that gives its scientists an edge in securing grants from the national initiative. “It’s a drop in the bucket, but it’s an important start,” says Zack Lynch, executive director of the Neurotechnology Industry Organization, an advocacy group in San Francisco, California. Cal-BRAIN sets itself apart from the national effort by explicitly seeking industry involvement. The proposal emphasizes the potential economic benefits of neuroscience research and calls for the formation of a programme to facilitate the translation of any discoveries into commercial applications. © 2014 Nature Publishing Group,

Keyword: Brain imaging
Link ID: 19768 - Posted: 06.25.2014

by Sarah Zielinski Would you recognize a stop sign if it was a different shape, though still red and white? Probably, though there might be a bit of a delay. After all, your brain has long been trained to expect a red-and-white octagon to mean “stop.” The animal and plant world also uses colorful signals. And it would make sense if a species always used the same pattern to signal the same thing — like how we can identify western black widows by the distinctive red hourglass found on the adult spiders’ backs. But that doesn’t always happen. Even with really important signals, such as the ones that tell a predator, “Don’t eat me — I’m poisonous.” Consider the dyeing dart frog (Dendrobates tinctorius), which is found in lowland forests of the Guianas and Brazil. The backs of the 5-centimeter-long frogs are covered with a yellow-and-black pattern, which warns of their poisonous nature. But that pattern isn’t the same from frog to frog. Some are decorated with an elongated pattern; others have more complex, sometimes interrupted patterns. The difference in patterns should make it harder for predators to recognize the warning signal. So why is there such variety? Because the patterns aren’t always viewed on a static frog, and the different ways that the frogs move affect how predators see the amphibians, according to a study published June 18 in Biology Letters. Bibiana Rojas of Deakin University in Geelong, Australia, and colleagues studied the frogs in a nature reserve in French Guiana from February to July 2011. They found 25 female and 14 male frogs, following each for two hours from about 2.5 meters away, where the frog wouldn’t notice a scientist. As a frog moved, a researcher would follow, recording how far it went and in what direction. Each frog was then photographed. © Society for Science & the Public 2000 - 2013.

Keyword: Vision; Aggression
Link ID: 19767 - Posted: 06.25.2014

By HELENE STAPINSKI A few months ago, my 10-year-old daughter, Paulina, was suffering from a bad headache right before bedtime. She went to lie down and I sat beside her, stroking her head. After a few minutes, she looked up at me and said, “Everything in the room looks really small.” And I suddenly remembered: When I was young, I too would “see things far away,” as I once described it to my mother — as if everything in the room were at the wrong end of a telescope. The episodes could last anywhere from a few minutes to an hour, but they eventually faded as I grew older. I asked Paulina if this was the first time she had experienced such a thing. She shook her head and said it happened every now and then. When I was a little girl, I told her, it would happen to me when I had a fever or was nervous. I told her not to worry and that it would go away on its own. Soon she fell asleep, and I ran straight to my computer. Within minutes, I discovered that there was an actual name for what turns out to be a very rare affliction — Alice in Wonderland Syndrome. Episodes usually include micropsia (objects appear small) or macropsia (objects appear large). Some sufferers perceive their own body parts to be larger or smaller. For me, and Paulina, furniture a few feet away seemed small enough to fit inside a dollhouse. Dr. John Todd, a British psychiatrist, gave the disorder its name in a 1955 paper, noting that the misperceptions resemble Lewis Carroll’s descriptions of what happened to Alice. It’s also known as Todd’s Syndrome. Alice in Wonderland Syndrome is not an optical problem or a hallucination. Instead, it is most likely caused by a change in a portion of the brain, likely the parietal lobe, that processes perceptions of the environment. Some specialists consider it a type of aura, a sensory warning preceding a migraine. And the doctors confirmed that it usually goes away by adulthood. © 2014 The New York Times Company

Keyword: Vision; Aggression
Link ID: 19766 - Posted: 06.24.2014

By Tanya Lewis and Live Science They say laughter is the best medicine. But what if laughter is the disease? For a 6-year-old girl in Bolivia who suffered from uncontrollable and inappropriate bouts of giggles, laughter was a symptom of a serious brain problem. But doctors initially diagnosed the child with “misbehavior.” “She was considered spoiled, crazy — even devil-possessed,” José Liders Burgos Zuleta of the Advanced Medical Image Centre in La Paz said in a statement. But Burgos Zuleta discovered that the true cause of the girl’s laughing seizures, medically called gelastic seizures, was a brain tumor. After the girl underwent a brain scan, the doctors discovered a hamartoma, a small, benign tumor that was pressing against her brain’s temporal lobe. Surgeons removed the tumor, the doctors said. She stopped having the uncontrollable attacks of laughter and now laughs only normally, they said. Gelastic seizures are a relatively rare form of epilepsy, said Solomon Moshé, a pediatric neurologist at Albert Einstein College of Medicine in New York. “It’s not necessarily ‘ha-ha-ha’ laughing,” Moshé said. “There’s no happiness in this. Some of the kids may be very scared,” he added. The seizures are most often caused by tumors in the hypothalamus, although they can also come from tumors in other parts of the brain, Moshé said. Although laughter is the main symptom, patients may also have outbursts of crying.

Keyword: Emotions; Aggression
Link ID: 19765 - Posted: 06.24.2014

By Lindsey Konkel and Environmental Health News Babies whose moms lived within a mile of crops treated with widely used pesticides were more likely to develop autism, according to new research. The study of 970 children, born in farm-rich areas of Northern California, is part of the largest project to date that is exploring links between autism and environmental exposures. The University of California, Davis research – which used women’s addresses to determine their proximity to insecticide-treated fields – is the third project to link prenatal pesticide exposures to autism and related disorders. “The weight of evidence is beginning to suggest that mothers’ exposures during pregnancy may play a role in the development of autism spectrum disorders,” said Kim Harley, an environmental health researcher at the University of California, Berkeley who was not involved in the new study. One in every 68 U.S. children has been identified with an autism spectrum disorder—a group of neurodevelopmental disorders characterized by difficulties with social interactions, according to the Centers for Disease Control and Prevention. “This study does not show that pesticides are likely to cause autism, though it suggests that exposure to farming chemicals during pregnancy is probably not a good thing,” said Dr. Bennett Leventhal, a child psychiatrist at University of California, San Francisco who studies autistic children. He did not participate in the new study. The biggest known contributor to autism risk is having a family member with it. Siblings of a child with autism are 35 times more likely to develop it than those without an autistic brother or sister, according to the National Institutes of Health. © 2014 Scientific American

Keyword: Autism; Aggression
Link ID: 19764 - Posted: 06.24.2014

By DOUGLAS QUENQUA When it comes to forming memories that involve recalling a personal experience, neuroscientists are of two minds. Some say that each memory is stored in a single neuron in a region of the brain called the hippocampus. But a new study is lending weight to the theory of neuroscientists who believe that every memory is spread out, or distributed, across many neurons in that part of the brain. By watching patients with electrodes in their brains play a memory game, researchers found that each such memory is committed to cells distributed across the hippocampus. Though the proportion of cells responsible for each memory is small (about 2 percent of the hippocampus), the absolute number is in the millions. So the loss of any one cell should not have a noticeable effect on memory or mental acuity, said Peter N. Steinmetz, a research neurologist at the Dignity Health Barrow Neurological Institute in Phoenix and senior author of the study. “The significance of losing one cell is substantially reduced because you’ve got this whole population that’s turning on” when you access a memory, he said. The findings also suggest that memory researchers “need to use techniques that allow us to look at the whole population of neurons” rather than focus on individual cells. The patients in the study, which is published in Proceedings of the National Academy of Sciences, first memorized a list of words on a computer screen, then viewed a second list that included those words and others. When asked to identify words they had seen earlier, the patients displayed cell-firing activity consistent with the distributed model of memory. © 2014 The New York Times Company
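Working backward from the article's own numbers gives a feel for the claim. This back-of-envelope sketch assumes "in the millions" means roughly two million cells per memory; that reading, and the resulting population estimate, are our inference, not figures from the paper.

```python
FRACTION_PER_MEMORY = 0.02     # "about 2 percent" of the hippocampus, per the article
CELLS_PER_MEMORY = 2_000_000   # assumed reading of "in the millions"

implied_population = CELLS_PER_MEMORY / FRACTION_PER_MEMORY
print(f"implied hippocampal population: ~{implied_population:.0e} neurons")
print(f"share of a trace lost per cell death: {1 / CELLS_PER_MEMORY:.1e}")
# ~1e+08 neurons implied; losing any single cell removes a vanishingly
# small share (~5e-07) of a distributed memory trace.
```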

Keyword: Learning & Memory
Link ID: 19763 - Posted: 06.24.2014

By Tori Rodriguez One of the most devastating aspects of Alzheimer's is its effect on patients' ability to recall life events. Several studies have found that music helps to strengthen these individuals' autobiographical memories, and a paper in the November 2013 Journal of Neurolinguistics builds on these findings by exploring the linguistic quality of those recollections. Researchers instructed 18 patients with Alzheimer's and 18 healthy control subjects to tell stories from their lives in a silent room or while listening to the music of their choice. Among the Alzheimer's patients, the music-cued stories contained a greater number of meaningful words, were more grammatically complex and conveyed more information per number of words. Music may enhance narrative memories because “music and language processing share a common neural basis,” explains study co-author Mohamad El Haj of Lille University in France. © 2014 Scientific American

Keyword: Alzheimers
Link ID: 19762 - Posted: 06.24.2014

Sarah C. P. Williams There’s a reason people say “Calm down or you’re going to have a heart attack.” Chronic stress—such as that brought on by job, money, or relationship troubles—is suspected to increase the risk of a heart attack. Now, researchers studying harried medical residents and harassed rodents have offered an explanation for how, at a physiological level, long-term stress can endanger the cardiovascular system. It revolves around immune cells that circulate in the blood, they propose. The new finding is “surprising,” says physician and atherosclerosis researcher Alan Tall of Columbia University, who was not involved in the new study. “The idea has been out there that chronic psychosocial stress is associated with increased cardiovascular disease in humans, but what’s been lacking is a mechanism,” he notes. Epidemiological studies have shown that people who face many stressors—from those who survive natural disasters to those who work long hours—are more likely to develop atherosclerosis, the accumulation of fatty plaques inside blood vessels. In addition to fats and cholesterols, the plaques contain monocytes and neutrophils, immune cells that cause inflammation in the walls of blood vessels. And when the plaques break loose from the walls where they’re lodged, they can cause more extreme blockages elsewhere—leading to a stroke or heart attack. Studying the effect of stressful intensive care unit (ICU) shifts on medical residents, biologist Matthias Nahrendorf of Harvard Medical School in Boston recently found that blood samples taken when the doctors were most stressed out had the highest levels of neutrophils and monocytes. To probe whether these white blood cells, or leukocytes, are the missing link between stress and atherosclerosis, he and his colleagues turned to experiments on mice. © 2014 American Association for the Advancement of Science

Keyword: Stress
Link ID: 19761 - Posted: 06.23.2014

By Adam Carter, CBC News Women who take antidepressants when they’re pregnant could unknowingly predispose their kids to type 2 diabetes and obesity later on in life, new research out of McMaster University suggests. The study, conducted by associate professor of obstetrics and gynecology Alison Holloway and PhD student Nicole De Long, found a link between the antidepressant fluoxetine and increased risk of obesity and diabetes in children. Holloway cautions that this is not a warning for all pregnant women to stop taking antidepressants, but rather to start a conversation about prenatal care and what works best on an individual basis. “There are a lot of women who really need antidepressants to treat depression. This is what they need,” Holloway told CBC. “We’re not saying you should necessarily take patients off antidepressants because of this — but women should have this discussion with their caregiver.” “Obesity and Type 2 diabetes in children is on the rise and there is the argument that it is related to lifestyle and availability of high calorie foods and reduced physical activity, but our study has found that maternal antidepressant use may also be a contributing factor to the obesity and diabetes epidemic.” According to a study out of Memorial University in St. John's, obesity rates in Canada have tripled between 1985 and 2011. Canada also ranks poorly when it comes to its overall number of cases of diabetes, according to an international report from the Organization for Economic Co-operation and Development, released last year. © CBC 2014

Keyword: Depression; Aggression
Link ID: 19760 - Posted: 06.23.2014

Nicola Davis The old adage that we eat with our eyes appears to be correct, according to research that suggests diners rate an artistically arranged meal as more tasty – and are prepared to pay more for it. The team at Oxford University tested the idea by gauging the reactions of diners to food presented in different ways. Inspired by Wassily Kandinsky's "Painting Number 201", Franco-Colombian chef and one of the authors of the study, Charles Michel, designed a salad resembling the abstract artwork to explore how the presentation of food affects the dining experience. "A number of chefs now are realising that they are being judged by how their foods photograph – be it in the fancy cookbooks [or], more often than not, when diners Instagram their friends," explains Professor Charles Spence, experimental psychologist at the University of Oxford and a co-author of the study. Thirty men and 30 women were each presented with one of three salads containing identical ingredients, arranged to resemble the Kandinsky painting, a regular tossed salad, or a "neat" formation where each component was spaced away from the others. Seated alone at a table mimicking a restaurant setting, and unaware that other versions of the salad were on offer, each participant was given two questionnaires asking them to rate various aspects of the dish on a 10-point scale, before and after tucking into the salad. Before participants sampled their plateful, the Kandinsky-inspired dish was rated higher for complexity, artistic presentation and general liking. Participants were prepared to pay twice as much for the meal as for either the regular or "neat" arrangements. © 2014 Guardian News and Media Limited

Keyword: Chemical Senses (Smell & Taste); Aggression
Link ID: 19759 - Posted: 06.23.2014

By ANDREW POLLACK It is a tantalizingly simple idea for losing weight: Before meals, swallow a capsule that temporarily swells up in the stomach, making you feel full. Now, some early results for such a pill are in. And they are only partly fulfilling. People who took the capsule lost 6.1 percent of their weight after 12 weeks, compared with 4.1 percent for those taking a placebo, according to results presented Sunday at an endocrinology meeting in Chicago. Gelesis, the company developing the capsule, declared the results a triumph and said it would start a larger study next year aimed at winning approval for the product, called Gelesis100. “I’m definitely impressed, absolutely,” Dr. Arne V. Astrup, head of the department of nutrition, exercise and sports at the University of Copenhagen in Denmark and the lead investigator in the study, said in an interview. He said the physical mode of action could make the product safer than many existing diet drugs, which act chemically on the brain to influence appetite. But Dr. Daniel H. Bessesen, an endocrinologist at the University of Colorado who was not involved in the study, said weight loss of 2 percent beyond that provided by a placebo was “very modest.” “It doesn’t look like a game changer,” he said. Gelesis, a privately held company based in Boston, is one of many trying to come up with a product that can provide significant weight loss without bariatric surgery. Two new drugs — Qsymia from Vivus, and Belviq from Arena Pharmaceuticals and Eisai — have had disappointing sales since their approvals in 2012. Reasons include modest effectiveness, safety concerns, lack of insurance reimbursement and a belief among some doctors and overweight people that obesity is not a disease. © 2014 The New York Times Company

Keyword: Obesity
Link ID: 19758 - Posted: 06.23.2014

by Frank Swain WHEN it comes to personal electronics, it's difficult to imagine iPhones and hearing aids in the same sentence. I use both and know that hearing aids have a well-deserved reputation as deeply uncool lumps of beige plastic worn mainly by the elderly. Apple, on the other hand, is the epitome of cool consumer electronics. But the two are getting a lot closer. The first "Made for iPhone" hearing aids have arrived, allowing users to stream audio and data between smartphones and the device. It means hearing aids might soon be desirable, even to those who don't need them. A Bluetooth wireless protocol developed by Apple last year lets the prostheses connect directly to Apple devices, streaming audio and data while using a fraction of the power consumption of conventional Bluetooth. LiNX, made by ReSound, and Halo hearing aids made by Starkey – both international firms – use the iPhone as a platform to offer users new features and added control over their hearing aids. "The main advantage of Bluetooth is that the devices are talking to each other, it's not just one way," says David Nygren, UK general manager of ReSound. This is useful as hearing aids have long suffered from a restricted user interface – there's not much room for buttons on a device the size of a kidney bean. This is a major challenge for hearing-aid users, because different environments require different audio settings. Some devices come with preset programmes, while others adjust automatically to what their programming suggests is the best configuration. This is difficult to get right, and often devices calibrated in the audiologist's clinic fall short in the real world. © Copyright Reed Business Information Ltd.

Keyword: Hearing
Link ID: 19757 - Posted: 06.23.2014