Chapter 16




By GRETCHEN REYNOLDS

Exercise may help to keep the brain robust in people who have an increased risk of developing Alzheimer’s disease, according to an inspiring new study. The findings suggest that even moderate amounts of physical activity may help to slow the progression of one of the most dreaded diseases of aging.

For the new study, which was published in May in Frontiers in Aging Neuroscience, researchers at the Cleveland Clinic in Ohio recruited almost 100 older men and women, aged 65 to 89, many of whom had a family history of Alzheimer’s disease. Alzheimer’s disease, characterized by a gradual and then quickening loss of memory and cognitive functioning, can strike anyone. But scientists have discovered in recent years that people who harbor a specific variant of a gene, known as the APOE epsilon4 allele or the e4 gene for short, have a substantially increased risk of developing the disease. Genetic testing among the volunteers in the new study determined that about half of the group carried the e4 gene, although, at the start of the study, none showed signs of memory loss beyond what would be normal for their age.

Then the scientists set out to more closely examine their volunteers’ brains. For some time, researchers have suspected that Alzheimer’s disease begins altering the structure and function of the brain years or even decades before the first symptoms appear. In particular, it’s been thought that the disease silently accelerates the atrophy of the hippocampus, a portion of the brain critical for memory processing. Brain scans of people who have Alzheimer’s show that their hippocampi are considerably more shrunken than those of people of the same age without the disease.

There’s been less study, though, of possible shrinkage in the brains of cognitively normal people at risk for Alzheimer’s. One reason is that, until recently, few interventions, including drugs, had shown much promise in slowing or preventing the disease’s progression, so researchers – and patients – have been reluctant to identify markers of its potential onset.

© 2014 The New York Times Company

Keyword: Alzheimers
Link ID: 19783 - Posted: 07.02.2014

by Laura Sanders

At the playground yesterday, Baby V commando-crawled through a tunnel with holes on the side. Every so often, I stuck my face in there with a loud “peekaboo.” She reached up longingly toward the bouncy duck. I picked her up and steadied her as she lurched back and forth. She scrambled up some low stairs and launched down a slide. I lurked near the bottom, ready to assist and yell “yay” when she didn’t face-plant.

The one thing I didn’t do was sit back and leave her to her own devices, free from my helicopter-mom tendencies. But since I have the most ridiculous crush on that girl, it’s hard for me to leave her be. As a parent who works outside of the home, I treasure our time together. But as she becomes more capable and independent, I realize that I need to be more thoughtful about letting her carve out some space for herself.

A recent research paper emphasized this point. The study, published June 17 in Frontiers in Psychology, finds that children who spend more time in unstructured activities may better master some important life skills. Researchers sorted kids’ activities into unstructured activities, which included child-initiated activities such as playing alone or with friends, singing, riding bikes and camping, and structured activities, including soccer practice, piano lessons, chores and homework. Six- and seven-year-olds who had more unstructured time scored higher on a measure of executive function, complex cognitive abilities such as seamlessly switching between tasks, resisting impulses and paying attention — all things that help people get along in this world.

© Society for Science & the Public 2000 - 2013.

Keyword: Development of the Brain
Link ID: 19780 - Posted: 07.02.2014

James Gorman

All moving animals do their best to avoid running into things. And most living things follow a tried and true strategy — Watch where you’re going!

Flying and swimming animals both have to cope with some complications that walkers, jumpers and gallopers don’t confront. Not only do they have to navigate in three dimensions, but they also cope with varying air and water flow. Beyond that, they often do so without the same reference points and landmarks we have on the ground.

Christine Scholtyssek of Lund University in Sweden and colleagues decided to compare how two species in different mediums, air and water, which pose similar problems, reacted to apparent obstacles as they were moving. What they found, and reported in Biology Letters in May, was that the two species they examined — bumblebees and zebra fish — have very different strategies.

It was known that the bees’ navigation depended on optic flow, which is something like the sensation of watching telephone poles speed past from a seat on a moving train. They tend to fly away from apparent obstacles as they approach them. The question was whether fish would do something similar. So, in order to give both animals the same test, Dr. Scholtyssek and her colleagues devised an apparatus that could contain air or water. When one wall had vertical stripes and the other horizontal, the bees, not surprisingly, flew away from the vertical stripes, which would have appeared as one emerging obstacle after another as the bees flew past. Horizontal stripes don’t change as a creature moves past, so they provide no reference for speed or progress.

The fish, however, swam closer to the vertical stripes, which wasn’t expected. “It is surprising that although fish and bees have the same challenge, moving with or against streams, they do not use the same mechanisms,” Dr. Scholtyssek said.

© 2014 The New York Times Company

Keyword: Animal Migration
Link ID: 19778 - Posted: 07.01.2014

Philip Ball

Lead guitarists usually get to play the flashy solos while the bass player gets only to plod to the beat. But this seeming injustice could have been determined by the physiology of hearing. Research published today in the Proceedings of the National Academy of Sciences suggests that people’s perception of timing in music is more acute for lower-pitched notes.

Psychologist Laurel Trainor of McMaster University in Hamilton, Canada, and her colleagues say that their findings explain why in the music of many cultures the rhythm is carried by low-pitched instruments while the melody tends to be taken by the highest pitched. This is as true for the low-pitched percussive rhythms of Indian classical music and Indonesian gamelan as it is for the walking double bass of a jazz ensemble or the left-hand part of a Mozart piano sonata.

Earlier studies have shown that people have better pitch discrimination for higher notes — a reason, perhaps, that saxophonists and lead guitarists often have solos at a squealing register. It now seems that rhythm works best at the other end of the scale.

Trainor and colleagues used the technique of electroencephalography (EEG) — electrical sensors placed on the scalp — to monitor the brain signals of people listening to streams of two simultaneous piano notes, one high-pitched and the other low-pitched, at equally spaced time intervals. Occasionally, one of the two notes was played slightly earlier, by just 50 milliseconds. The researchers studied the EEG recordings for signs that the listeners had noticed.

© 2014 Nature Publishing Group,

Keyword: Hearing
Link ID: 19776 - Posted: 07.01.2014

By RICHARD A. FRIEDMAN

ADOLESCENCE is practically synonymous in our culture with risk taking, emotional drama and all forms of outlandish behavior. Until very recently, the widely accepted explanation for adolescent angst has been psychological. Developmentally, teenagers face a number of social and emotional challenges, like starting to separate from their parents, getting accepted into a peer group and figuring out who they really are. It doesn’t take a psychoanalyst to realize that these are anxiety-provoking transitions.

But there is a darker side to adolescence that, until now, was poorly understood: a surge during teenage years in anxiety and fearfulness. Largely because of a quirk of brain development, adolescents, on average, experience more anxiety and fear and have a harder time learning how not to be afraid than either children or adults.

Different regions and circuits of the brain mature at very different rates. It turns out that the brain circuit for processing fear — the amygdala — is precocious and develops way ahead of the prefrontal cortex, the seat of reasoning and executive control. This means that adolescents have a brain that is wired with an enhanced capacity for fear and anxiety, but is relatively underdeveloped when it comes to calm reasoning.

You may wonder why, if adolescents have such enhanced capacity for anxiety, they are such novelty seekers and risk takers. It would seem that the two traits are at odds. The answer, in part, is that the brain’s reward center, just like its fear circuit, matures earlier than the prefrontal cortex. That reward center drives much of teenagers’ risky behavior. This behavioral paradox also helps explain why adolescents are particularly prone to injury and trauma. The top three killers of teenagers are accidents, homicide and suicide.

The brain-development lag has huge implications for how we think about anxiety and how we treat it. It suggests that anxious adolescents may not be very responsive to psychotherapy that attempts to teach them to be unafraid, like cognitive behavior therapy, which is zealously prescribed for teenagers.

© 2014 The New York Times Company

Keyword: Development of the Brain
Link ID: 19775 - Posted: 07.01.2014

by Bethany Brookshire

One day when I came in to the office, my air conditioning unit was making a weird rattling sound. At first, I was slightly annoyed, but then I chose to ignore it and get to work. In another 30 minutes, I was completely oblivious to the noise. It wasn’t until my cubicle neighbor Meghan Rosen came in and asked about the racket that I realized the rattle was still there. My brain had habituated to the sound.

Habituation, the ability to stop noticing or responding to an irrelevant signal, is one of the simplest forms of learning. But it turns out that at the level of a brain cell, it’s a far more complex process than scientists previously thought. In the June 18 Neuron, Mani Ramaswami of Trinity College Dublin proposes a new framework to describe how habituation might occur in our brains. The paper not only offers a new mechanism to help us understand one of our most basic behaviors, it also demonstrates how taking the time to integrate new findings into a novel framework can help push a field forward.

Our ability to ignore the irrelevant and familiar has been a long-known feature of human learning. It’s so simple, even a sea slug can do it. Because the ability to habituate is so simple, scientists hypothesized that the mechanism behind it must also be simple. The previous framework for habituation has been synaptic depression, a decrease in chemical release. When one brain cell sends a signal to another, it releases chemical messengers into a synapse, the small gap between neurons. Receptors on the other side pick up this excitatory signal and send the message onward. But in habituation, neurons would release fewer chemicals, making the signal less likely to hit the other side. Fewer chemicals, fewer signals, and you’ve habituated. Simple. But, as David Glanzman, a neurobiologist at the University of California, Los Angeles, points out, there are problems with this idea.

© Society for Science & the Public 2000 - 2013
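As a rough illustration of that older synaptic-depression account, here is a minimal Python sketch (not from the Neuron paper; the function name and all numbers are invented for illustration). Each repeated stimulus releases a fraction of a depleting transmitter pool, so the simulated response shrinks with repetition, which is the simple habituation curve the excerpt describes; the article's point is that real habituation likely involves more than this.

# Minimal illustrative sketch of habituation modeled as synaptic depression.
# Not from the Neuron paper; the parameter values below are arbitrary assumptions.

def habituate(n_stimuli=10, release_fraction=0.3, recovery=0.05):
    """Simulate the postsynaptic response to a train of identical stimuli."""
    pool = 1.0                 # fraction of releasable transmitter, starts full
    responses = []
    for _ in range(n_stimuli):
        released = release_fraction * pool           # transmitter released this stimulus
        responses.append(released)                   # response scales with amount released
        pool = min(1.0, pool - released + recovery)  # depletion, then partial recovery
    return responses

if __name__ == "__main__":
    for i, r in enumerate(habituate(), start=1):
        print(f"stimulus {i:2d}: simulated response {r:.3f}")

Running the sketch prints a steadily shrinking response to the same repeated stimulus, which is all the "fewer chemicals, fewer signals" view predicts.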

Keyword: Learning & Memory
Link ID: 19772 - Posted: 06.25.2014

By Lisa Marshall

Is Alzheimer's disease an acquired form of Down syndrome? When neurobiologist Huntington Potter first posed the question in 1991, Alzheimer's researchers were skeptical. They were just beginning to explore the causes of the memory-robbing neurological disease. Scientists already knew that by age 40, nearly 100 percent of patients with Down syndrome, who have an extra copy of chromosome 21, had brains full of beta-amyloid peptide—the neuron-strangling plaque that is a hallmark of Alzheimer's. They also knew that the gene that codes for that protein lives on chromosome 21, suggesting that people acquire more plaque because they get an extra dose of the peptide. Potter, though, suggested that if people with Down syndrome develop Alzheimer's because of an extra chromosome 21, healthy people may develop Alzheimer's for the same reason.

A quarter of a century later mounting evidence supports the idea. “What we hypothesized in the 1990s and have begun to prove is that people with Alzheimer's begin to make molecular mistakes and generate cells with three copies of chromosome 21,” says Potter, who was recently appointed director of Alzheimer's disease research at the University of Colorado School of Medicine, with the express purpose of studying Alzheimer's through the lens of Down syndrome.

He is no longer the only one exploring the link. In recent years dozens of studies have shown Alzheimer's patients possess an inordinate number of Down syndrome–like cells. One 2009 study by Russian researchers found that up to 15 percent of the neurons in the brains of Alzheimer's patients contained an extra copy of chromosome 21. Others have shown Alzheimer's patients have 1.5 to two times as many skin and blood cells with the extra copy as healthy controls. Potter's own research in mice suggests a vicious cycle: when normal cells are exposed to the beta-amyloid peptide, they tend to make mistakes when dividing, producing more trisomy 21 cells, which, in turn, produce more plaque. In August, Potter and his team published a paper in the journal Neurobiology of Aging describing why those mistakes may occur: the inhibition of a specific enzyme.

© 2014 Scientific American

Keyword: Alzheimers
Link ID: 19771 - Posted: 06.25.2014

By Jim Tankersley

COLUMBUS, Ohio — First they screwed the end of the gray cord into the metal silo rising out of Ian Burkhart’s skull. Later they laid his right forearm across two foam cylinders, and they wrapped it with thin strips that looked like film from an old home movie camera. They ran him through some practice drills, and then it was time for him to try.

If he succeeded at this next task, it would be science fiction come true: His thoughts would bypass his broken spinal cord. With the help of an algorithm and some electrodes, he would move his once-dead limb again — a scientific first.

“Ready?” the young engineer, Nick Annetta, asked from the computer to his left. “Three. Two. One.”

Burkhart, 23, marshaled every neuron he could muster, and he thought about his hand.

The last time the hand obeyed him, it was 2010 and Burkhart was running into the Atlantic Ocean. The hand had gripped the steering wheel as he drove the van from Ohio University to North Carolina’s Outer Banks, where he and friends were celebrating the end of freshman year. The hand unclenched to drop his towel on the sand. Burkhart splashed into the waves, the hand flying above his head, the ocean warm around his feet, the sun roasting his arms, and he dived.

In an instant, he felt nothing. Not his hand. Not his legs. Only the breeze drying the saltwater on his face.

Keyword: Robotics
Link ID: 19770 - Posted: 06.25.2014

Helen Shen

As US science agencies firm up plans for a national ten-year neuroscience initiative, California is launching an ambitious project of its own. On 20 June, governor Jerry Brown signed into law a state budget that allocates US$2 million to establish the California Blueprint for Research to Advance Innovations in Neuroscience (Cal-BRAIN) project.

Cal-BRAIN is the first state-wide programme to piggyback on the national Brain Research through Advancing Innovative Neurotechnologies (BRAIN) initiative announced by US President Barack Obama in April 2013 (see Nature 503, 26–28; 2013). The national project is backed this year by $110 million in public funding from the National Institutes of Health (NIH), the Defense Advanced Research Projects Agency (DARPA) and the National Science Foundation (NSF).

California researchers and lawmakers hope that the state’s relatively modest one-time outlay will pave the way for a larger multiyear endeavour that gives its scientists an edge in securing grants from the national initiative. “It’s a drop in the bucket, but it’s an important start,” says Zack Lynch, executive director of the Neurotechnology Industry Organization, an advocacy group in San Francisco, California.

Cal-BRAIN sets itself apart from the national effort by explicitly seeking industry involvement. The proposal emphasizes the potential economic benefits of neuroscience research and calls for the formation of a programme to facilitate the translation of any discoveries into commercial applications.

© 2014 Nature Publishing Group,

Keyword: Brain imaging
Link ID: 19768 - Posted: 06.25.2014

by Sarah Zielinski

Would you recognize a stop sign if it was a different shape, though still red and white? Probably, though there might be a bit of a delay. After all, your brain has long been trained to expect a red-and-white octagon to mean “stop.”

The animal and plant world also uses colorful signals. And it would make sense if a species always used the same pattern to signal the same thing — like how we can identify western black widows by the distinctive red hourglass found on the adult spiders’ backs. But that doesn’t always happen. Even with really important signals, such as the ones that tell a predator, “Don’t eat me — I’m poisonous.”

Consider the dyeing dart frog (Dendrobates tinctorius), which is found in lowland forests of the Guianas and Brazil. The backs of the 5-centimeter-long frogs are covered with a yellow-and-black pattern, which warns of their poisonous nature. But that pattern isn’t the same from frog to frog. Some are decorated with an elongated pattern; others have more complex, sometimes interrupted patterns. The difference in patterns should make it harder for predators to recognize the warning signal. So why is there such variety? Because the patterns aren’t always viewed on a static frog, and the different ways that the frogs move affect how predators see the amphibians, according to a study published June 18 in Biology Letters.

Bibiana Rojas of Deakin University in Geelong, Australia, and colleagues studied the frogs in a nature reserve in French Guiana from February to July 2011. They found 25 female and 14 male frogs, following each for two hours from about 2.5 meters away, where the frog wouldn’t notice a scientist. As a frog moved, a researcher would follow, recording how far it went and in what direction. Each frog was then photographed.

© Society for Science & the Public 2000 - 2013.

Keyword: Vision; Aggression
Link ID: 19767 - Posted: 06.25.2014

By DOUGLAS QUENQUA

When it comes to forming memories that involve recalling a personal experience, neuroscientists are of two minds. Some say that each memory is stored in a single neuron in a region of the brain called the hippocampus. But a new study is lending weight to the theory of neuroscientists who believe that every memory is spread out, or distributed, across many neurons in that part of the brain.

By watching patients with electrodes in their brains play a memory game, researchers found that each such memory is committed to cells distributed across the hippocampus. Though the proportion of cells responsible for each memory is small (about 2 percent of the hippocampus), the absolute number is in the millions. So the loss of any one cell should not have a noticeable effect on memory or mental acuity, said Peter N. Steinmetz, a research neurologist at the Dignity Health Barrow Neurological Institute in Phoenix and senior author of the study.

“The significance of losing one cell is substantially reduced because you’ve got this whole population that’s turning on” when you access a memory, he said. The findings also suggest that memory researchers “need to use techniques that allow us to look at the whole population of neurons” rather than focus on individual cells.

The patients in the study, which is published in Proceedings of the National Academy of Sciences, first memorized a list of words on a computer screen, then viewed a second list that included those words and others. When asked to identify words they had seen earlier, the patients displayed cell-firing activity consistent with the distributed model of memory.

© 2014 The New York Times Company
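To see why a distributed code is so robust to the loss of single cells, here is a small Python sketch (illustrative only; the population size, number of memories and random seed are arbitrary assumptions, while the 2 percent figure is taken from the article). Each simulated memory is assigned a random 2 percent of the cells; deleting any one cell leaves virtually all of every memory's population intact.

# Illustrative sketch of the distributed-memory argument.
# Not from the PNAS study; the cell count below is an arbitrary assumption.
# Each simulated memory engages ~2% of the population (figure quoted in the article),
# so deleting any single cell barely touches any memory's representation.
import random

N_CELLS = 100_000                          # hypothetical cell population
CELLS_PER_MEMORY = int(0.02 * N_CELLS)     # ~2% of cells per memory

random.seed(0)
memories = [set(random.sample(range(N_CELLS), CELLS_PER_MEMORY)) for _ in range(5)]

lost = random.randrange(N_CELLS)           # a single cell dies
for i, cells in enumerate(memories, start=1):
    surviving = len(cells - {lost}) / len(cells)
    print(f"memory {i}: {surviving:.4%} of its cells remain")

Each memory in the sketch keeps at least 99.95 percent of its cells after the loss, which is the intuition behind Steinmetz's quote about the whole population "turning on."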

Keyword: Learning & Memory
Link ID: 19763 - Posted: 06.24.2014

By Tori Rodriguez

One of the most devastating aspects of Alzheimer's is its effect on patients' ability to recall life events. Several studies have found that music helps to strengthen these individuals' autobiographical memories, and a paper in the November 2013 Journal of Neurolinguistics builds on these findings by exploring the linguistic quality of those recollections.

Researchers instructed 18 patients with Alzheimer's and 18 healthy control subjects to tell stories from their lives in a silent room or while listening to the music of their choice. Among the Alzheimer's patients, the music-cued stories contained a greater number of meaningful words, were more grammatically complex and conveyed more information per number of words. Music may enhance narrative memories because “music and language processing share a common neural basis,” explains study co-author Mohamad El Haj of Lille University in France.

© 2014 Scientific American

Keyword: Alzheimers
Link ID: 19762 - Posted: 06.24.2014

Sarah C. P. Williams

There’s a reason people say “Calm down or you’re going to have a heart attack.” Chronic stress—such as that brought on by job, money, or relationship troubles—is suspected to increase the risk of a heart attack. Now, researchers studying harried medical residents and harassed rodents have offered an explanation for how, at a physiological level, long-term stress can endanger the cardiovascular system. It revolves around immune cells that circulate in the blood, they propose.

The new finding is “surprising,” says physician and atherosclerosis researcher Alan Tall of Columbia University, who was not involved in the new study. “The idea has been out there that chronic psychosocial stress is associated with increased cardiovascular disease in humans, but what’s been lacking is a mechanism,” he notes.

Epidemiological studies have shown that people who face many stressors—from those who survive natural disasters to those who work long hours—are more likely to develop atherosclerosis, the accumulation of fatty plaques inside blood vessels. In addition to fats and cholesterols, the plaques contain monocytes and neutrophils, immune cells that cause inflammation in the walls of blood vessels. And when the plaques break loose from the walls where they’re lodged, they can cause more extreme blockages elsewhere—leading to a stroke or heart attack.

Studying the effect of stressful intensive care unit (ICU) shifts on medical residents, biologist Matthias Nahrendorf of Harvard Medical School in Boston recently found that blood samples taken when the doctors were most stressed out had the highest levels of neutrophils and monocytes. To probe whether these white blood cells, or leukocytes, are the missing link between stress and atherosclerosis, he and his colleagues turned to experiments on mice.

© 2014 American Association for the Advancement of Science

Keyword: Stress
Link ID: 19761 - Posted: 06.23.2014

By ANDREW POLLACK

It is a tantalizingly simple idea for losing weight: Before meals, swallow a capsule that temporarily swells up in the stomach, making you feel full. Now, some early results for such a pill are in. And they are only partly fulfilling.

People who took the capsule lost 6.1 percent of their weight after 12 weeks, compared with 4.1 percent for those taking a placebo, according to results presented Sunday at an endocrinology meeting in Chicago. Gelesis, the company developing the capsule, declared the results a triumph and said it would start a larger study next year aimed at winning approval for the product, called Gelesis100.

“I’m definitely impressed, absolutely,” Dr. Arne V. Astrup, head of the department of nutrition, exercise and sports at the University of Copenhagen in Denmark and the lead investigator in the study, said in an interview. He said the physical mode of action could make the product safer than many existing diet drugs, which act chemically on the brain to influence appetite.

But Dr. Daniel H. Bessesen, an endocrinologist at the University of Colorado who was not involved in the study, said weight loss of 2 percent beyond that provided by a placebo was “very modest.” “It doesn’t look like a game changer,” he said.

Gelesis, a privately held company based in Boston, is one of many trying to come up with a product that can provide significant weight loss without bariatric surgery. Two new drugs — Qsymia from Vivus, and Belviq from Arena Pharmaceuticals and Eisai — have had disappointing sales since their approvals in 2012. Reasons include modest effectiveness, safety concerns, lack of insurance reimbursement and a belief among some doctors and overweight people that obesity is not a disease.

© 2014 The New York Times Company

Keyword: Obesity
Link ID: 19758 - Posted: 06.23.2014

by Frank Swain

WHEN it comes to personal electronics, it's difficult to imagine iPhones and hearing aids in the same sentence. I use both and know that hearing aids have a well-deserved reputation as deeply uncool lumps of beige plastic worn mainly by the elderly. Apple, on the other hand, is the epitome of cool consumer electronics. But the two are getting a lot closer. The first "Made for iPhone" hearing aids have arrived, allowing users to stream audio and data between smartphones and the device. It means hearing aids might soon be desirable, even to those who don't need them.

A Bluetooth wireless protocol developed by Apple last year lets the prostheses connect directly to Apple devices, streaming audio and data while using a fraction of the power consumption of conventional Bluetooth. LiNX, made by ReSound, and Halo hearing aids made by Starkey – both international firms – use the iPhone as a platform to offer users new features and added control over their hearing aids. "The main advantage of Bluetooth is that the devices are talking to each other, it's not just one way," says David Nygren, UK general manager of ReSound.

This is useful as hearing aids have long suffered from a restricted user interface – there's not much room for buttons on a device the size of a kidney bean. This is a major challenge for hearing-aid users, because different environments require different audio settings. Some devices come with preset programmes, while others adjust automatically to what their programming suggests is the best configuration. This is difficult to get right, and often devices calibrated in the audiologist's clinic fall short in the real world.

© Copyright Reed Business Information Ltd.

Keyword: Hearing
Link ID: 19757 - Posted: 06.23.2014

Karen Ravn

To the west, the skies belong to the carrion crow. To the east, the hooded crow rules the roost. In between, in a narrow strip running roughly north to south through central Europe, the twain have met, and mated, for perhaps as long as 10,000 years. But although the crows still look very different — carrion crows are solid black, whereas hooded crows are grey — researchers have found that they are almost identical genetically.

The taxonomic status of carrion crows (Corvus corone) and hooded crows (Corvus cornix) has been debated ever since Carl Linnaeus, the founding father of taxonomy, declared them to be separate species in 1758. A century later, Darwin called any such classification impossible until the term 'species' had been defined in a generally accepted way. But the definition is still contentious, and many believe it always will be. The crows are known to cross-breed and produce viable offspring, so lack the reproductive barriers that some biologists consider essential to the distinction of a species, leading to proposals that they are two subspecies of carrion crow.

In fact, evolutionary biologist Jochen Wolf from Uppsala University in Sweden and his collaborators have now found that the populations living in the cross-breeding zone are so similar genetically that the carrion crows there are more closely related to hooded crows than to the carrion crows farther west. Only a small part of the genome — less than 0.28% — differs between the populations, the team reports in this week's Science. This section is located on chromosome 18, in an area associated with pigmentation, visual perception and hormonal regulation. It is no coincidence, the researchers suggest, that the main differences between carrion and hooded crows are in colouring, mating preferences (both choose mates whose colouring matches theirs), and hormone-influenced social behaviours (carrion crows lord it over hooded ones).

© 2014 Nature Publishing Group,

Keyword: Sexual Behavior; Aggression
Link ID: 19755 - Posted: 06.21.2014

By Indre Viskontas

He might be fictional. But the gigantic Hodor, a character in the blockbuster Game of Thrones series, nonetheless sheds light on something very much in the realm of fact: how our ability to speak emerges from a complex ball of neurons, and how certain brain-damaged patients can lose very specific aspects of that ability.

According to George R.R. Martin, who wrote the epic books that inspired the HBO show, the 7-foot-tall Hodor could only say one word—"Hodor"—and everyone therefore tended to assume that was his name. Here's one passage about Hodor from the first novel in Martin's series:

Theon Greyjoy had once commented that Hodor did not know much, but no one could doubt that he knew his name. Old Nan had cackled like a hen when Bran told her that, and confessed that Hodor's real name was Walder. No one knew where "Hodor" had come from, she said, but when he started saying it, they started calling him by it. It was the only word he had.

Yet it's clear that Hodor can understand much more than he can say; he's able to follow instructions, anticipate who needed help, and behave in socially appropriate ways (mostly). Moreover, he says this one word in many different ways, implying very different meanings.

So what might be going on in Hodor's brain? Hodor's combination of impoverished speech production with relatively normal comprehension is a classic, albeit particularly severe, presentation of expressive aphasia, a neurological condition usually caused by a localized stroke in the front of the brain, on the left side. Some patients, however, have damage to that part of the brain from other causes, such as a tumor, or a blow to the head.

©2014 Mother Jones

Keyword: Language
Link ID: 19753 - Posted: 06.21.2014

Heidi Ledford

If shown to be possible in humans, addiction to the Sun could help explain why some tanners continue to seek out sunlight despite being well aware of the risks.

The lure of a sunny day at the beach may be more than merely the promise of fun and relaxation. A study published today reports that mice exposed to ultraviolet (UV) rays exhibit behaviours akin to addiction. The researchers found that mice exposed repeatedly to UV light produced an opioid called β-endorphin, which numbs pain and is associated with addiction to drugs. When they were given a drug that blocks the effect of opioids, the mice also showed signs of withdrawal — including shaky paws and chattering teeth.

If the results hold true in humans, they would suggest an explanation for why many tanners continue to seek out sunlight, despite the risks — and, in some cases, even after being diagnosed with skin cancer. “This offers a clear potential mechanism for how UV radiation can be rewarding and, in turn, potentially addictive,” says Bryon Adinoff, an addiction psychiatrist at the University of Texas Southwestern Medical Center in Dallas, who was not involved with the study. “That’s a big deal.”

Oncologist David Fisher of the Massachusetts General Hospital in Boston and his colleagues became interested in sunlight addiction after studying the molecular mechanisms of pigment production in the skin after UV light exposure. In the new study published today in Cell, they show that in mice, some skin cells also synthesize β-endorphin in response to chronic, low doses of UV light.

© 2014 Nature Publishing Group

Keyword: Drug Abuse
Link ID: 19752 - Posted: 06.21.2014

By Robert Dudley

When we think about the origins of agriculture and crop domestication, alcohol isn’t necessarily the first thing that comes to mind. But our forebears may well have been intentionally fermenting fruits and grains in parallel with the first Neolithic experiments in plant cultivation.

Ethyl alcohol, the product of fermentation, is an attractive and psychoactively powerful inebriant, but fermentation is also a useful means of preserving food and of enhancing its digestibility. The presence of alcohol prolongs the edibility window of fruits and gruels, and can thus serve as a means of short-term storage for various starchy products. And if the right kinds of bacteria are also present, fermentation will stabilize certain foodstuffs (think cheese, yogurt, sauerkraut, and kimchi, for example). Whoever first came up with the idea of controlling the natural yeast-based process of fermentation was clearly on to a good thing. Using spectroscopic analysis of chemical residues found in ceramic vessels unearthed by archaeologists, scientists know that the earliest evidence for intentional fermentation dates to about 7000 BCE.

But if we look deeper into our evolutionary past, alcohol was a component of our ancestral primate diet for millions of years. In my new book, The Drunken Monkey, I suggest that alcohol vapors and the flavors produced by fermentation stimulate modern humans because of our ancient tendencies to seek out and consume ripe, sugar-rich, and alcohol-containing fruits. Alcohol is present because of particular strains of yeasts that ferment sugars, and this process is most common in the tropics where fruit-eating primates originated and today remain most diverse.

© 1986-2014 The Scientist

Keyword: Drug Abuse; Aggression
Link ID: 19751 - Posted: 06.21.2014

by Colin Barras

The Neanderthals knew how to make an entrance: teeth first. Our sister species' distinctive teeth were among the first unique aspects of their anatomy to evolve, according to a study of their ancestors. These early Neanderthals may have used their teeth as a third hand, gripping objects that they then cut with tools.

The claim comes from a study of fossils from Sima de los Huesos in northern Spain. This "pit of bones" may be an early burial site, and 28 near-complete skeletons have been pulled from it, along with a large hand-axe that might be a funeral gift. The hominins in the pit look like Neanderthals, but are far too old. That suggests they are forerunners of the Neanderthals, and if that is the case they can tell us how the species evolved.

To find out, Juan Luis Arsuaga Ferreras at the UCM-ISCIII Joint Centre for Research into Human Evolution and Behaviour in Madrid, Spain, and colleagues studied 17 of the skulls. They found that the brain case was still the same shape as in older species. But the skulls' protruding faces and small molar teeth were much more Neanderthal-like.

This suggests the earliest Neanderthals used their jaws in a specialised way. It's not clear how, but it probably wasn't about food, says Ferreras. "There are no indications of any dietary specialisation in the Neanderthals and their ancestors. They were basically carnivores."

© Copyright Reed Business Information Ltd.

Keyword: Evolution
Link ID: 19750 - Posted: 06.21.2014