Most Recent Links



Links 101 - 120 of 20566

Boer Deng A genetic variant protected some practitioners of cannibalism from prion disease. Scientists who study a rare brain disease that once devastated entire communities in Papua New Guinea have described a genetic variant that appears to stop misfolded proteins known as prions from propagating in the brain. Kuru was first observed in the mid-twentieth century among the Fore people of Papua New Guinea. At its peak in the late 1950s, the disease killed up to 2% of the group's population each year. Scientists later traced the illness to ritual cannibalism, in which tribe members ate the brains and nervous systems of their dead. The outbreak probably began when a Fore person consumed body parts from someone who had sporadic Creutzfeldt-Jakob disease (CJD), a prion disease that spontaneously strikes about one person in a million each year. Scientists have noted previously that some people seem less susceptible to prion diseases if they have an amino-acid substitution in a particular region of the prion protein — codon 129. And in 2009, a team led by John Collinge — a prion researcher at University College London who is also the lead author of the most recent analysis — found another protective mutation among the Fore, in codon 127. The group's latest work, reported on 10 June in Nature, shows that the amino-acid change that occurs at this codon, replacing a glycine with a valine, has a different and more powerful effect than the substitution at codon 129. The codon 129 variant confers some protection against prion disease only when it is present on one of the two copies of the gene that encodes the protein. But transgenic mice with the codon-127 mutation were completely resistant to kuru and CJD regardless of whether they bore one or two copies of it. The researchers say that the mutation in codon 127 appears to confer protection by preventing prion proteins from becoming misshapen. © 2015 Nature Publishing Group

Keyword: Prions
Link ID: 21043 - Posted: 06.13.2015

Joe Palca Scientists found a molecule crucial to perceiving the sensation of itching. It affects how the brain responds to serotonin, and may explain why anti-depressants that boost serotonin make some people itch. JOE PALCA, BYLINE: How do you go about discovering what makes us itch? Well, if you're Diana Bautista at the University of California, Berkeley, you ask what molecules are involved. DIANA BAUTISTA: We say OK, what are the possible molecular players out there that might be contributing to itch or touch? PALCA: Bautista says it turns out itch and touch, and even pain, all seem to be related - at least in the way our brains make sense of these sensations. But how to tell which molecules are key players? Bautista says basically you try everything you can. BAUTISTA: We test a lot of candidates. And if we're really lucky, one of our candidates - we can prove that it plays a really important role. PALCA: And now she thinks she's found one. Working with colleagues at the Buck Institute for Research on Aging, she's found a molecule that's made by a gene called HTR7. When there's less of this molecule, animals with itchy skin conditions, like eczema, do less scratching. When there's more of it, itching gets worse. The way this molecule works is kind of interesting. It changes how sensitive brain cells are to a chemical called serotonin. Now, serotonin is a chemical that's related to depression. So Bautista's research might explain why certain antidepressant drugs that boost serotonin have a peculiar side effect. For some people, the drugs make them itch. Bautista says the new research is certainly not the end of the story when it comes to understanding itch. © 2015 NPR

Keyword: Pain & Touch
Link ID: 21042 - Posted: 06.13.2015

by Penny Sarchet Simon Sponberg of Georgia Institute of Technology in Atlanta and his team have figured out the secret to hawkmoths' night vision by testing them with robotic artificial flowers. By varying the speed of a fake flower's horizontal motion and changing brightness levels, the team tested the moths' abilities under different conditions. It has been theorised that moths' brains slow down, allowing their visual systems to collect light for longer, a bit like lengthening a camera's exposure. But the strategy might also introduce blur, making it hard to detect fast movement. If the moths were using this brain-slowing tactic, they would be expected to react to fast flower movements more slowly in darker conditions. The team found that there was indeed a lag. It helped the moths see motion in the dark while still allowing them to keep up with flowers swaying at normal speeds. The size of the lags matched the expected behaviour of a slowed nervous system, providing evidence that moths could be slowing down the action of neurons in their visual systems. Previously, placing hawkmoths in a virtual obstacle course revealed that they vary their navigation strategies depending on visibility conditions. Journal reference: Science, DOI: 10.1126/science.aaa3042 © Copyright Reed Business Information Ltd
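The exposure-time trade-off described above can be illustrated with a simple low-pass filter: pooling a signal over a longer window (a rough stand-in for slowed visual neurons, not the study's actual model) smooths the response but delays its peak, the kind of lag the team measured. The signal shape and filter constants below are illustrative assumptions.

```python
import math

def smoothed_response(signal, alpha):
    # Exponential smoothing: each output mixes the new sample with the
    # previous state. Smaller alpha = longer "exposure" = more smoothing.
    out, state = [], 0.0
    for s in signal:
        state += alpha * (s - state)
        out.append(state)
    return out

def lag_of_peak(signal, response):
    # How many samples after the stimulus peak does the response peak?
    return response.index(max(response)) - signal.index(max(signal))

# A flower sways once: a single smooth bump in position over time.
flower = [math.sin(math.pi * t / 20) for t in range(21)] + [0.0] * 20

fast = smoothed_response(flower, alpha=0.9)  # bright: short integration
slow = smoothed_response(flower, alpha=0.1)  # dark: long integration
```

Here the slow filter's peak trails the stimulus by several samples while the fast filter's barely lags at all, which is the signature of a slowed visual system the researchers looked for in the moths' tracking behaviour.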

Keyword: Vision
Link ID: 21041 - Posted: 06.13.2015

By David Grimm In 2013, the Nonhuman Rights Project filed a series of lawsuits asking courts to recognize four New York chimpanzees as legal persons and free them from captivity. The animal rights group, which hopes to set a precedent for research chimps everywhere, has yet to succeed, but in April a judge ordered Stony Brook University to defend its possession of two of these animals, Hercules and Leo. Last month, the group and the university squared off in court, and the judge is expected to issue a decision soon. But the scientist working with the chimps, anatomist Susan Larson, has remained largely silent until now. In an exclusive interview, Larson talks about her work with these animals and the impact the litigation is having on her studies—and research animals in general. This interview has been edited for clarity and brevity. Q: Where did Hercules and Leo come from? A: They were born 8 years ago at the New Iberia Research Center in Louisiana. They were among the last juveniles New Iberia had. We've had them on loan for 6 years. Q: What kind of work do you do with them? A: We're interested in learning about the evolution of bipedalism by actually looking at what real animals do. Over the past 30 years, we've looked at 17 different species of primates, including 11 chimpanzees. Chimpanzees are the best model because they are so close to us. When we compare how they walk to how we walk, we can feed those data into computer models that may help us understand how early hominids like Lucy moved around. The work we're doing with Hercules and Leo is the most important work we've done. © 2015 American Association for the Advancement of Science

Keyword: Animal Rights
Link ID: 21040 - Posted: 06.13.2015

Dogs do not like people who are mean to their owners and will refuse food offered by people who have snubbed their master, Japanese researchers have said. The findings reveal that canines have the capacity to cooperate socially – a characteristic found in a relatively small number of species, including humans and some other primates. Researchers led by Kazuo Fujita, a professor of comparative cognition at Kyoto University, tested three groups of 18 dogs using role plays in which their owners needed to open a box. In all three groups, the owner was accompanied by two people whom the dog did not know. In the first group, the owner sought assistance from one of the other people, who actively refused to help. In the second group, the owner asked for, and received, help from one person. In both groups, the third person was neutral and not involved in either helping or refusing to help. Neither person interacted with the dog’s owner in the control – third – group. After watching the box-opening scene, the dog was offered food by the two unfamiliar people in the room. Dogs that saw their owner being rebuffed were far more likely to choose food from the neutral observer, and to ignore the offer from the person who had refused to help, Fujita said on Friday. Dogs whose owners were helped and dogs whose owners did not interact with either person showed no marked preference for accepting snacks from the strangers. “We discovered for the first time that dogs make social and emotional evaluations of people regardless of their direct interest,” Fujita said. If the dogs were acting solely out of self-interest, there would be no differences among the groups, and a roughly equal number of animals would have accepted food from each person. © 2015 Guardian News and Media Limited
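The self-interest argument at the end is a standard chance-baseline comparison: if each dog ignored the role play and simply picked between the two strangers at random, the counts would follow a fair binomial split of roughly nine per person in a group of 18. A minimal sketch of that null model, where the 15-of-18 skew is an invented illustration, not a figure reported by the study:

```python
import math

def binom_pmf(n, k, p=0.5):
    # Probability of exactly k "successes" in n independent fair choices.
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def p_at_least(n, k, p=0.5):
    # Probability that k or more dogs pick the same stranger by chance.
    return sum(binom_pmf(n, i, p) for i in range(k, n + 1))

# Under pure self-interest, each of the 18 dogs in a group chooses one of
# the two strangers at random, so about 9 per person is expected. A strong
# skew -- say 15 of 18 favouring the neutral observer -- would be very
# unlikely to arise by chance alone.
chance_of_skew = p_at_least(18, 15)
```

With these hypothetical numbers the chance probability comes out well under 1%, which is why a marked preference for the neutral observer counts as evidence against the pure self-interest explanation.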

Keyword: Emotions; Evolution
Link ID: 21039 - Posted: 06.13.2015

Owning a cat as a kid could put you at risk for schizophrenia and bipolar disorder later on because of a parasite found in feline feces, new research says. Previous studies have linked the parasite Toxoplasma gondii (T. gondii) to the development of mental disorders, and two more research papers published recently provide further evidence. Researchers from the Academic Medical Centre in Amsterdam looked at more than 50 studies and found that a person infected with the parasite is nearly twice as likely to develop schizophrenia. The other study, led by Dr. Robert H. Yolken of Johns Hopkins University School of Medicine in Baltimore, confirmed the results of a 1982 questionnaire: about half of people diagnosed with mental illnesses later in life had grown up with a cat, compared with 42% of those who were not diagnosed. "Cat ownership in childhood has now been reported in three studies to be significantly more common in families in which the child is later diagnosed with schizophrenia or another serious mental illness," the authors said in a press release. The findings were published in Schizophrenia Research and Acta Psychiatrica Scandinavica. T. gondii, which causes the disease toxoplasmosis, is especially risky for pregnant women and people with weak immune systems. The parasite can also be found in undercooked meat and unwashed fruits and vegetables.

Keyword: Schizophrenia; Neurotoxins
Link ID: 21038 - Posted: 06.10.2015

By Michael Balter Alcoholic beverages are imbibed in nearly every human society across the world—sometimes, alas, to excess. Although recent evidence suggests that tippling might have deep roots in our primate past, nonhuman primates are only rarely spotted in the act of indulgence. A new study of chimpanzees with easy access to palm wine shows that some drink it enthusiastically, fashioning leaves into makeshift cups with which to lap it up. The findings could provide new insights into why humans evolved a craving for alcohol, with all its pleasures and pains. Scientists first hypothesized an evolutionary advantage to humans' taste for ethanol about 15 years ago, when Robert Dudley, a biologist at the University of California, Berkeley, proposed what has come to be called the "drunken monkey hypothesis." He argued that our primate ancestors got an evolutionary benefit from being able to eat previously unpalatable fruit that had fallen to the ground and started to undergo fermentation. The hypothesis received a boost last year, when a team led by Matthew Carrigan—a biologist at Santa Fe College in Gainesville, Florida—found that the key enzyme that helps us metabolize ethanol underwent an important mutation about 10 million years ago. This genetic change, which occurred in the common ancestor of humans, chimps, and gorillas, made ethanol metabolism some 40 times faster than in other primates—such as monkeys—that lack it. According to the hypothesis, the mutation allowed apes to consume fermented fruit without immediately getting drunk or, worse, succumbing to alcohol poisoning. Nevertheless, researchers had turned up little evidence that primates in the wild regularly eat windfall fruit or are attracted to the ethanol that such fruit contains.
Now, a team led by Kimberley Hockings, a primatologist at the Center for Research in Anthropology in Lisbon, concludes from a 17-year study of chimps in West Africa that primates can tolerate significant levels of ethanol and may actually crave it, as humans do. © 2015 American Association for the Advancement of Science

Keyword: Drug Abuse; Evolution
Link ID: 21037 - Posted: 06.10.2015

By Gretchen Reynolds Treadmill desks are popular, even aspirational, in many offices today since they can help those of us who are deskbound move more, burn extra calories and generally improve our health. But an interesting new study raises some practical concerns about the effects of walking at your workspace and suggests that there may be unacknowledged downsides to using treadmill desks if you need to type or think at the office. The drumbeat of scientific evidence about the health benefits of sitting less and moving more during the day continues to intensify. One study presented last month at the 2015 annual meeting of the American College of Sports Medicine in San Diego found that previously sedentary office workers who walked slowly at a treadmill desk for two hours each workday for two months significantly improved their blood pressure and slept better at night. But as attractive as the desks are for health reasons, they must be integrated into a work setting, so it seems sensible to test their effects on productivity. Surprisingly little research has examined whether treadmill desks affect someone's ability to get work done. So for the new study, which was published in April in PLOS One, researchers at Brigham Young University in Provo, Utah, recruited 75 healthy young men and women and randomly assigned them to workspaces outfitted with a computer and either a chair or a treadmill desk. The treadmill desk was set to move at a speed of 1.5 miles per hour with zero incline. None of the participants had used a treadmill desk before, so they received a few minutes of instruction and practice. Those assigned a chair were assumed to be familiar with its use. © 2015 The New York Times Company

Keyword: Obesity; Attention
Link ID: 21036 - Posted: 06.10.2015

By David Noonan Every night, before he goes to sleep, Al Pierce, whose thunderous snoring used to drive his wife out of their bedroom, uses a small remote control to turn on an electronic sensor implanted in his chest. The sensor detects small changes in his breathing pattern—early signs that Pierce's airway is beginning to collapse on itself. When the device senses these changes, it triggers a mild jolt of electricity that travels through a wire going up his neck. The wire ends at a tiny electrode wrapped around a nerve that controls muscles in his tongue. The nerve, stimulated by the charge, activates muscles that thrust Pierce's tongue forward in his mouth, which pulls his airway open. Throughout the night the 65-year-old plumber in Florence, S.C., gets hundreds of little jolts, yet he sleeps quietly. In the morning, rested and refreshed, Pierce uses the remote to turn off the device. This new technology, called upper-airway electronic stimulation and approved by the U.S. Food and Drug Administration last summer, offers much more than relief from an annoying noise. Pierce's loud snoring was the most obvious symptom of obstructive sleep apnea, a drastically underdiagnosed disorder shared by an estimated 25 million Americans. It can lead to high blood pressure, heart disease, diabetes, depression and an impaired ability to think clearly. Overall, people with severe sleep apnea have triple the risk of death from all causes as compared with those without the disorder. © 2015 Scientific American

Keyword: Sleep
Link ID: 21035 - Posted: 06.10.2015

Elizabeth Gibney A simple injection is now all it takes to wire up a brain. A diverse team of physicists, neuroscientists and chemists has implanted mouse brains with a rolled-up, silky mesh studded with tiny electronic devices, and shown that it unfurls to spy on and stimulate individual neurons. The implant has the potential to unravel the workings of the mammalian brain in unprecedented detail. "I think it's great, a very creative new approach to the problem of recording from large numbers of neurons in the brain," says Rafael Yuste, director of the Neurotechnology Center at Columbia University in New York, who was not involved in the work. If eventually shown to be safe, the soft mesh might even be used in humans to treat conditions such as Parkinson's disease, says Charles Lieber, a chemist at Harvard University in Cambridge, Massachusetts, who led the team. The work was published in Nature Nanotechnology on 8 June. Neuroscientists still do not understand how the activities of individual brain cells translate to higher cognitive powers such as perception and emotion. The problem has spurred a hunt for technologies that will allow scientists to study thousands, or ideally millions, of neurons at once, but the use of brain implants is currently limited by several disadvantages. So far, even the best technologies have been composed of relatively rigid electronics that act like sandpaper on delicate neurons. They also struggle to track the same neuron over a long period, because individual cells move when an animal breathes or its heart beats. © 2015 Nature Publishing Group

Keyword: Brain imaging
Link ID: 21034 - Posted: 06.09.2015

by Hal Hodson Electricity is the brain's language, and now we can speak to it without wires or implants. Nanoparticles can be used to stimulate regions of the brain electrically, opening up new ways to treat brain diseases. It may even one day allow the routine exchange of data between computers and the brain. A material discovered in 2004 makes this possible. When "magnetoelectric" nanoparticles (MENs) are stimulated by an external magnetic field, they produce an electric field. If such nanoparticles are placed next to neurons, this electric field should allow them to communicate. To find out, Sakhrat Khizroev of Florida International University in Miami and his team inserted 20 billion of these nanoparticles into the brains of mice. They then switched on a magnetic field, aiming it at the clump of nanoparticles to induce an electric field. An electroencephalogram showed that the region surrounded by nanoparticles lit up, stimulated by this electric field that had been generated. "When MENs are exposed to even an extremely low frequency magnetic field, they generate their own local electric field at the same frequency," says Khizroev. "In turn, the electric field can directly couple to the electric circuitry of the neural network." Khizroev's goal is to build a system that can both image brain activity and precisely target medical treatments at the same time. Since the nanoparticles respond differently to different frequencies of magnetic field, they can be tuned to release drugs. © Copyright Reed Business Information Ltd

Keyword: Brain imaging
Link ID: 21033 - Posted: 06.09.2015

Angus Chen The genetic underpinnings of psychosis are elusive and diffuse. There are hundreds of common genetic mutations scattered throughout the human genome that each bump up by just a tiny bit the risk of developing a mental illness like schizophrenia. Many people carry some set of those genes, but most don't end up with a psychotic disorder. Instead, a study suggests, they might be getting a small creative boost. Meghan, 23, began experiencing hallucinations at 19. "Driving home, cars' headlights turned into eyes. The grills on the cars turned into mouths and none of them looked happy. It would scare the crap out of me," Meghan says. Those genetic changes may persist in human DNA because they confer benefits, according to Dr. Kári Stefánsson, a neurologist and CEO of a biological research company called deCODE Genetics, which conducted the study published in Nature Neuroscience Monday. "They are found in most of us, and they're common because they either confer or in the past conferred some reproductive advantage," he says. The advantage of having a more creative mind, he suggests, might help explain why these genes persist, even as they increase the risk of developing debilitating disorders, such as schizophrenia. It's an idea from the ancients. The philosopher Aristotle famously opined that genius and madness go hand in hand. Psychiatric studies have to some degree supported the adage. Studies of more than 1 million Swedish people in 2011 and 2013 found that people who had close relatives with schizophrenia or bipolar disorder were much more likely to become creative professionals. (The patients with mental illness were not themselves more creative, with the exception of some who had bipolar disorder.) What's more, studies that looked at healthy people who carry genetic markers associated with a psychotic disorder found their brains work slightly differently than others who lack those genetic markers. © 2015 NPR

Keyword: Schizophrenia; Genes & Behavior
Link ID: 21032 - Posted: 06.09.2015

Austin Frakt One weekend afternoon a couple of years ago, while turning a page of the book I was reading to my daughters, I fell asleep. That’s when I knew it was time to do something about my insomnia. Data, not pills, was my path to relief. Insomnia is common. About 30 percent of adults report some symptoms of it, though less than half that figure have all symptoms. Not all insomniacs are severely debilitated zombies. Consistent sleeplessness that causes some daytime problems is all it takes to be considered an insomniac. Most function quite well, and the vast majority go untreated. I was one of the high-functioning insomniacs. In fact, part of my problem was that I relished the extra time awake to work. My résumé is full of accomplishments I owe, in part, to my insomnia. But it took a toll on my mood, as well as my ability to make it through a children’s book. Insomnia is worth curing. Though causality is hard to assess, chronic insomnia is associated with greater risk of anxiety, depression, hypertension, diabetes, accidents and pain. Not surprisingly, and my own experience notwithstanding, it is also associated with lower productivity at work. Patients who are successfully treated experience improved mood, and they feel healthier, function better and have fewer symptoms of depression. Which remedy would be best for me? Lunesta, Ambien, Restoril and other drugs are promised by a barrage of ads to deliver sleep to minds that resist it. Before I reached for the pills, I looked at the data. © 2015 The New York Times Company

Keyword: Sleep
Link ID: 21031 - Posted: 06.09.2015

The virtual reality arm appears to move faster and more accurately than the real arm Virtual reality could help stroke patients recover by "tricking" them into thinking their affected limb is more accurate than it really is. Researchers in Spain found that making the affected limb appear more effective on screen increased the chance the patient would use it in real life. This is important because stroke victims often underuse their affected limbs, making them even weaker. A stroke charity welcomed the study and called for more research. In the study of 20 stroke patients, researchers sometimes enhanced the virtual representation of the patient's affected limb, making it seem faster and more accurate, but without the patient's knowledge. After the episodes in which the limbs were made to seem more effective, the patients then went on to use them more, according to lead researcher Belen Rubio. "Surprisingly, only 10 minutes of enhancement was enough to induce significant changes in the amount of spontaneous use of the affected limb," said Mrs Rubio from the Laboratory of Synthetic, Perceptive, Emotive and Cognitive Systems at Pompeu Fabra University in Spain. "This therapy could create a virtuous circle of recovery, in which positive feedback, spontaneous arm use and motor performance can reinforce each other. Engaging patients in this ongoing cycle of spontaneous arm use, training and learning could produce a remarkable impact on their recovery process," she said. © 2015 BBC

Keyword: Stroke
Link ID: 21030 - Posted: 06.09.2015

By Sandra G. Boodman The test had become something of an annual ritual. Every year beginning when he turned 45, Thomas Clark Semmes, an IT consultant for the federal government, would visit his internist for a physical. In a standard test of the sensory system that is often part of a physical, the Baltimore doctor would prick the soles of Semmes’s feet with a pin. “He’d look at me and say, ‘Tell me when you feel it,’ and I’d say ‘I will when I can,’ ” Semmes, now 56, recalled of the pinprick test. Because he never felt anything, he said nothing. “It never really concerned me very much,” he recalled. His doctor would then dutifully jot something in his chart, never exploring it further. But in 2013, nearly a decade after that first test, a quick evaluation by a podiatrist revealed the reason for his unfeeling feet and provided an explanation for an anatomical oddity in one of Semmes’s close relatives. In retrospect, Semmes wishes he had asked his internist about the lack of sensation, but he assumed it wasn’t important — otherwise, the doctor would have said something. And as Semmes would later learn, not knowing what was wrong had cost him valuable time. “I definitely wish I’d been diagnosed sooner,” he said. “There are things that could have been done to lessen the impact.” Before 2013, Semmes never had much reason to think about his feet. He knew he had hammertoes — toes that bend downward at the middle joint as a result of heredity or trauma — as well as extremely high arches, but neither condition was painful or limiting. At least, he thought, he did not have bird legs like his father, whose limbs were so storklike that they were a running family joke. “I had big, muscular legs,” Semmes said.

Keyword: Movement Disorders; Genes & Behavior
Link ID: 21029 - Posted: 06.09.2015

Fergus Walsh Medical correspondent Scientists in Austria have created an artificial leg which allows the amputee to feel lifelike sensations from their foot. The recipient, Wolfgang Rangger, who lost his right leg in 2007, said: "It feels like I have a foot again. It's like a second lease of life." Prof Hubert Egger of the University of Linz said sensors fitted to the sole of the artificial foot stimulated nerves at the base of the stump. He added it was the first time that a leg amputee had been fitted with a sensory-enhanced prosthesis. How it works: Surgeons first rewired nerve endings in the patient's stump to place them close to the skin surface. Six sensors were fitted to the base of the foot, to measure the pressure of heel, toe and foot movement. These signals were relayed to a micro-controller which relayed them to stimulators inside the shaft where it touched the base of the stump. These vibrated, stimulating the nerve endings under the skin, which relayed the signals to the brain. Prof Egger said: "The sensors tell the brain there is a foot and the wearer has the impression that it rolls off the ground when he walks." Wolfgang Rangger, a former teacher who lost his leg after a blood clot caused by a stroke, has been testing the device for six months, both in the lab and at home. He says it has given him a new lease of life. He said: "I no longer slip on ice and I can tell whether I walk on gravel, concrete, grass or sand. I can even feel small stones." © 2015 BBC.

Keyword: Pain & Touch; Robotics
Link ID: 21028 - Posted: 06.09.2015

James Gorman When researchers found a group of brain cells in the fruit fly that function like a compass, they were very satisfied. They had found what they were looking for. But, said Vivek Jayaraman, when he and Johannes D. Seelig realized that the cells were actually arranged in a physical circle in the brain, so they looked just like a compass, they were taken aback. “It’s kind of like a cosmic joke that they are arranged like that,” he said. Dr. Jayaraman was investigating a kind of navigation called dead reckoning, or, in technical terms, angular path integration. It is the most basic way a moving creature knows where it is and where it is going. In dead reckoning, animals use visual cues, like landmarks, and also a sense of where their bodies are pointed. It is very different from other ways animals navigate, such as the use of polarized light from the sun or sensitivity to the earth’s magnetic field. The researchers published their findings in Nature last month. Dr. Jayaraman had narrowed down the likely location of directional tracking based on other research. So he expected to find activity in the ellipsoid body, a very small region of a very small brain. Dr. Jayaraman and Mr. Seelig, at the Janelia Research Campus of the Howard Hughes Medical Institute in Virginia, engineered neurons there to light up when they were active, and they recorded the activity with a microscopic technique called two-photon calcium imaging that gives a real-time visual picture of the brain in action in a living animal. © 2015 The New York Times Company
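Dead reckoning itself is simple to state computationally: keep a running heading, update it with every turn, and accumulate displacement along that heading. The toy sketch below illustrates the idea only; the step format and units are assumptions, not the authors' model of the fly's ellipsoid body.

```python
import math

def dead_reckon(steps):
    # Angular path integration: 'steps' is a list of (turn, distance)
    # pairs, with turns in radians. Returns the final (x, y, heading).
    heading = 0.0  # radians; 0 means facing along +x
    x, y = 0.0, 0.0
    for turn, dist in steps:
        heading += turn          # update the internal compass
        x += dist * math.cos(heading)
        y += dist * math.sin(heading)
    return x, y, heading

# Walk a square: four 1-unit legs, turning 90 degrees left between legs.
# Integrating the turns brings the walker back to its starting point.
pos = dead_reckon([(0.0, 1.0), (math.pi / 2, 1.0),
                   (math.pi / 2, 1.0), (math.pi / 2, 1.0)])
```

Because the walker only ever tracks its own turns and step lengths, it can still report where it is relative to home, which is the kind of computation a ring of compass-like neurons could support.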

Keyword: Learning & Memory
Link ID: 21027 - Posted: 06.08.2015

By SABRINA TAVERNISE WASHINGTON — The global diabetes rate has risen by nearly half over the past two decades, according to a new study, as obesity and the health problems it spawns have taken hold across the developing world. The prevalence of diabetes has been rising in rich countries for several decades, largely driven by increases in the rate of obesity. More recently, poorer countries have begun to follow the trend, with major increases in countries like China, Mexico and India. The study, published Monday in the British medical journal The Lancet, reported a 45 percent rise in the prevalence of diabetes worldwide from 1990 to 2013. Nearly all the rise was in Type 2, which is usually related to obesity and is the most common form of the disease. A major shift is underway in the developing world, in which deaths from communicable diseases like malaria and tuberculosis have declined sharply, and chronic diseases like cancer and diabetes are on the rise. The pattern is linked to economic improvement and more people living longer, but it has left governments in developing countries scrambling to deal with new and often more expensive ways to treat illnesses. The study, led by the Institute for Health Metrics and Evaluation, a research group, was funded by the Bill and Melinda Gates Foundation. It is the largest analysis of global disability data to date, drawing on more than 35,000 data sources in 188 countries. © 2015 The New York Times Company

Keyword: Obesity
Link ID: 21026 - Posted: 06.08.2015

By Sue Bailey, The Canadian Press Scientific studies increasingly suggest marijuana may not be the risk-free high that teens — and sometimes their parents — think it is, researchers say. Yet pot is still widely perceived by young smokers as relatively harmless, said Dr. Romina Mizrahi, director of the Focus on Youth Psychosis Prevention clinic and research program at the Centre for Addiction and Mental Health. She cites a growing body of research that warns of significantly higher incidence of hallucinations, paranoia and the triggering of psychotic illness in adolescent users who are most predisposed. "When you look at the studies in general, you can safely say that in those that are vulnerable, it doubles the risk." Such fallout is increasingly evident in the 19-bed crisis monitoring unit at the Children's Hospital of Eastern Ontario in Ottawa. "I see more and more cases of substance-induced psychosis," said Dr. Sinthu Suntharalingam, a child and adolescent psychiatrist. "The most common substance that's abused is cannabis." One or two cases a week are now arriving on average. "They will present with active hallucinations," Suntharalingam said. "Parents will be very scared. They don't know what's going on. They'll be seeing things, hearing things, sometimes they will try to self-harm or go after other people." Potential effects need to be better understood. Suntharalingam and Mizrahi, an associate professor of psychiatry at the University of Toronto, are among the front-line professionals who say more must be done to help kids understand potential effects. "They know the hard drugs, what they can do," Suntharalingam said. "Acid, they'll tell us it can cause all these things so they stay away from it. But marijuana? They'll be: 'Oh, everybody does it.'" Mizrahi said the message isn't getting through. ©2015 CBC/Radio-Canada.

Keyword: Drug Abuse; Development of the Brain
Link ID: 21025 - Posted: 06.08.2015

By Brian Handwerk When it comes to mating, female mice must follow their noses. For the first time, scientists have shown that hormones in mice hijack smell receptors in the nose to drive behavior, while leaving the brain completely out of the loop. According to the study, appearing this week in Cell, female mice can smell attractant male pheromones during their reproductive periods. But during periods of diestrus, when the animals are unable to reproduce, the hormone progesterone prompts nasal sensory cells to block male pheromone signals so that they don't reach a female's brain. During this time, female mice display indifference or even hostility toward males. The same sensors functioned normally with regard to other smells, like cat urine, showing they are selective for male pheromones. When ovulation begins, progesterone levels drop, enabling the females to once more smell male pheromones. In short, the system "blinds" female mice to potential mates when the animals are not in estrus. The finding that the olfactory system usurped the brain's role shocked the research team, says lead author Lisa Stowers of the Scripps Research Institute. “The sensory systems are just supposed to sort of suck up everything they can in the environment and pass it all on to the brain. The result just seems wacky to us,” Stowers says. “Imagine this occurring in your visual system," she adds. "If you just ate a big hamburger and then saw a buffet, you might see things like the table and some people and maybe some fruit—but you simply wouldn't see the hamburgers anymore. That's kind of what happens here. Based on this female's internal-state change, she's missing an entire subset of the cues being passed on to her brain.”

Keyword: Chemical Senses (Smell & Taste); Sexual Behavior
Link ID: 21024 - Posted: 06.06.2015