Most Recent Links




If you missed the great dress debate of 2015, you were probably living under a rock. Staffrooms across the globe threatened to come to a standstill as teachers addressed the all-important question – was the dress white and gold or blue and black? This is just one example of how our brains interpret things differently. So, with the 20th anniversary of Brain Awareness Week from 16 to 22 March, this week we bring you a collection of ideas and resources to get students’ synapses firing. The brain is one of our most interesting organs, and advances in technology and medicine mean we now know more about it than ever before. Brain Awareness Week is a global campaign to raise awareness of the progress and benefits of brain research. The organisers, the Dana Foundation, have put together an assortment of teaching materials for primary and secondary students. For children aged five to nine, the Mindboggling Workbook is a good place to start. It includes information on how the brain works, what it does and how to take care of it. There’s also a section on the nervous system, which you could turn into a fun group activity. Ask one student to lie down on a large sheet of paper while others trace around them. Add a drawing of the brain and the spinal cord. Use different coloured crayons to illustrate how neurons send messages around your body when you a) touch something hot, b) get stung on the leg by a wasp, and c) wriggle your toes after stepping in sand. Can students explain why the brain is described as being more powerful than a computer? © 2015 Guardian News and Media Limited

Keyword: Miscellaneous
Link ID: 20673 - Posted: 03.10.2015

Robin Tricoles The first time it happened, I was 8. I was tucked in bed reading my favorite book when my tongue swelled up to the size of a cow’s, like the giant tongues I had seen in the glass display case at the neighborhood deli. At the same time, the far wall of my bedroom began to recede, becoming a tiny white rectangle floating somewhere in the distance. In the book I was holding, the typeface grew vast on the page. I was intrigued, I remember, but not afraid. Over the next six years, the same thing happened to me dozens of times. Forty years later, while working as a science writer, I stumbled on a scientific paper describing almost exactly what I had experienced. The paper attributed those otherworldly sensations to something called Alice in Wonderland syndrome, or its close cousin, Alice in Wonderland-like syndrome. People with Alice in Wonderland syndrome (AWS) perceive parts of their body to be changing size. For example, their feet may suddenly appear smaller and more distant, or their hands larger than they had been moments before. Those with the closely related Alice in Wonderland-like syndrome (AWLS) misperceive the size and distance of objects, seeing them as startlingly larger, smaller, fatter, or thinner than their natural state. People who experience both sensations, as I did, are classified as having AWLS. The syndrome’s name is commonly attributed to English psychiatrist John Todd, who in 1955 described his adult patients’ illusions of corporeal and objective distortions in a paper in the Canadian Medical Association Journal. © 2015 by The Atlantic Monthly Group.

Keyword: Attention
Link ID: 20672 - Posted: 03.10.2015

By Rachel Rabkin Peachman Many women with a history of depression who take antidepressants assume that once they get pregnant, they should try to wean themselves off their meds to avoid negative side effects for the baby. A large new study published in the journal Pediatrics challenges one reason behind that assumption. The research found that taking selective serotonin reuptake inhibitors (the antidepressants also known as S.S.R.I.s) while pregnant does not increase the risk of asthma in the resulting babies. What is associated with an increased risk of asthma? According to this study and other research, untreated prenatal depression. “The mechanisms underlying the association of prenatal depression and asthma are unknown,” said Dr. Xiaoqin Liu, the lead author of the Pediatrics study and an epidemiologist at Aarhus University in Denmark. An association between prenatal depression and asthma does not mean that prenatal depression causes asthma. There could be other reasons for the correlation, genetic or environmental, or both. For example, people who live in dense, polluted urban areas could be at an increased risk of both asthma and depression. The researchers used Denmark’s national registries to evaluate all singleton babies born from 1996 to 2007, and to identify the mothers who had a diagnosis of depression or had used antidepressants, or both, during pregnancy or one year beforehand. Using a statistical model, the study authors found that prenatal depression — with or without the use of antidepressants — was associated with a 25 percent increased risk of asthma in children as compared with children whose mothers did not have a record of depression. © 2015 The New York Times Company

Keyword: Depression; Development of the Brain
Link ID: 20671 - Posted: 03.10.2015

Alison Abbott Mediators appointed to analyse the rifts within Europe’s ambitious €1-billion (US$1.1-billion) Human Brain Project (HBP) have called for far-reaching changes both in its governance and its scientific programmes. Most significantly, the report recommends that systems neuroscience and cognitive neuroscience should be reinstated into the HBP. The mediation committee, led by engineer Wolfgang Marquardt, director of Germany’s national Jülich Research Centre, sent its final report to the HBP board of directors on 9 March, and issued a press release summarizing its findings. (The full report will not be published until after the board, a 22-strong team of scientists, discusses its contents at a meeting on 17–18 March). The European Commission flagship project, which launched in October 2013, is intended to boost supercomputing through neuroscience, with the aim of simulating the brain in a computer. But the project has been racked by dissent from the outset. In early 2014, a three-person committee of scientists who ran the HBP’s scientific direction revealed that they planned to eliminate cognitive neuroscience from the initiative, which precipitated a mass protest. More than 150 of Europe’s leading neuroscientists signed a letter to the European Commission, complaining about the project’s management and charging that the HBP plan to simulate the brain using only ‘bottom-up’ data on the behaviour of neurons was doomed to failure if it did not include the top-down constraints provided by systems and cognitive neuroscience. © 2015 Nature Publishing Group

Keyword: Brain imaging
Link ID: 20670 - Posted: 03.10.2015

By TIMOTHY WILLIAMS In January 1972, Cecil Clayton was cutting wood at his family’s sawmill in southeastern Missouri when a piece of lumber flew off the circular saw blade and struck him in the forehead. The impact caved in part of Mr. Clayton’s skull, driving bone fragments into his brain. Doctors saved his life, but in doing so had to remove 20 percent of his frontal lobe, which psychiatrists say led Mr. Clayton to be tormented for years by violent impulses, schizophrenia and extreme paranoia. In 1996, his lawyers say, those impulses drove Mr. Clayton to kill a law enforcement officer. Today, as Mr. Clayton, 74, sits on death row, his lawyers have returned to that 1972 sawmill accident in a last-ditch effort to save his life, arguing that Missouri’s death penalty law prohibits the execution of severely brain-damaged people. Lawyers for Mr. Clayton, who has an I.Q. of 71, say he should be spared because his injury has made it impossible for him to grasp the significance of his execution, scheduled for March 17. “There was a profound change in him that he doesn’t understand, and neither did his family,” said Elizabeth Unger Carlyle, one of Mr. Clayton’s lawyers. While several rulings by the United States Supreme Court in recent years have narrowed the criteria for executing people who have a mental illness, states continue to hold wide sway in establishing who is mentally ill. The debate surrounding Mr. Clayton involves just how profoundly his impairment has affected his ability to understand what is happening to him. Mr. Clayton is missing about 7.7 percent of his brain. © 2015 The New York Times Company

Keyword: Aggression; Attention
Link ID: 20669 - Posted: 03.09.2015

By James Gallagher, health editor, BBC News website, San Diego A dog has been used to sniff out thyroid cancer in people who had not yet been diagnosed, US researchers say. Tests on 34 patients showed an 88% success rate in finding tumours. The team, presenting their findings at the annual meeting of the Endocrine Society, said the animal had an "unbelievable" sense of smell. Cancer Research UK said using dogs would be impractical, but discovering the chemicals the dogs can smell could lead to new tests. The thyroid is a gland in the neck that produces hormones to regulate metabolism. Thyroid tumours are relatively rare and are normally diagnosed by testing hormone levels in the blood and by using a needle to extract cells for testing. Cancers are defective, out-of-control cells. They have their own unique chemistry and release "volatile organic compounds" into the body. The canine approach relies on dogs having 10 times the number of smell receptors as people and being able to pick out the unique smells being released by cancers. The man's-best-friend approach has already produced promising results in patients with bowel and lung cancers. A team at the University of Arkansas for Medical Sciences (UAMS) had previously shown that a dog could be trained to smell the difference between urine samples of patients with and without thyroid cancer. The next step was to see if this could be used as a diagnostic test. Frankie the German Shepherd was trained to lie down when he could smell thyroid cancer in a sample and turn away if the urine was clean. Frankie gave the correct diagnosis in 30 out of 34 cases.

Keyword: Chemical Senses (Smell & Taste)
Link ID: 20668 - Posted: 03.09.2015

By Lily Hay Newman When I was growing up, I had a lazy eye. I had to wear a patch over my stronger eye for many years so that good-for-nothing, freeloading, lazy eye could learn some responsibility and toughen up. Wearing a patch was really lousy, though, because people would ask me about it all the time and say things like, "What's wrong with you?" Always fun to hear. I would have much preferred to treat my condition, which is also called amblyopia, by playing video games. Who wouldn't? And it seems like that dream may become a possibility. On Tuesday, developer Ubisoft announced Dig Rush, a game that uses stereoscopic glasses and blue and red figures in varying contrasts to attempt to treat amblyopia. Working in collaboration with McGill University and the eye treatment startup Amblyotech, Ubisoft created a world where controlling a mole character to mine precious metals is really training patients' brains to coordinate their eyes. When patients wear a patch, they may force their lazy eye to toughen up, but they aren't doing anything to teach their eyes how to work together. This lack of coordination, called strabismus, is another important factor that the game makers hope can be addressed better by Dig Rush than by "patching" alone. Amblyotech CEO Joseph Koziak said in a statement, “[This] electronic therapy has been tested clinically to significantly increase the visual acuity of both children and adults who suffer from this condition without the use of an eye patch.” One advantage of Dig Rush, he noted, is that it's easier to measure compliance with video games.

Keyword: Vision
Link ID: 20667 - Posted: 03.09.2015

By RICHARD A. FRIEDMAN Chances are that everyone on this planet has experienced anxiety, that distinct sense of unease and foreboding. Most of us probably assume that anxiety always has a psychological trigger. Yet clinicians have long known that there are plenty of people who experience anxiety in the absence of any danger or stress and haven’t a clue why they feel distressed. Despite years of psychotherapy, many experience little or no relief. It’s as if they suffer from a mental state that has no psychological origin or meaning, a notion that would seem heretical to many therapists, particularly psychoanalysts. Recent neuroscience research explains why, in part, this may be the case. For the first time, scientists have demonstrated that a genetic variation in the brain makes some people inherently less anxious, and more able to forget fearful and unpleasant experiences. This lucky genetic mutation produces higher levels of anandamide — the so-called bliss molecule and our own natural marijuana — in our brains. In short, some people are prone to be less anxious simply because they won the genetic sweepstakes and randomly got a genetic mutation that has nothing at all to do with strength of character. About 20 percent of adult Americans have this mutation. Those who do may also be less likely to become addicted to marijuana and, possibly, other drugs — presumably because they don’t need the calming effects that marijuana provides. One patient of mine, a man in his late 40s, came to see me because he was depressed and lethargic. He told me at our first meeting that he had been using cannabis almost daily for at least the past 15 years. “It became a way of life,” he explained. “Things are more interesting, and I can tolerate disappointments without getting too upset.” © 2015 The New York Times Company

Keyword: Drug Abuse; Stress
Link ID: 20666 - Posted: 03.09.2015

by Penny Sarchet For some of us, it might have been behind the bikeshed. Not so the African cotton leafworm moth (Spodoptera littoralis), which can choose any one of a vast number of plant species to mate on. But these moths remember their first time, returning to the same species in search of other mates. In the wild, this moth feeds and mates on species from as many as 40 different plant families. That much choice means there's usually something available to eat, but selecting and remembering the best plants is tricky. So, recalling what you ate as a larva, or where you first copulated, may help narrow down which plants provide better quality food or are more likely to attract other potential mates. Magali Proffit and David Carrasco of the Swedish University of Agricultural Sciences in Alnarp and their colleagues have discovered that this moth's first mating experience shapes its future preferences. These moths have an innate preference for cotton plants over cabbage. But when the researchers made them mate for the first time on cabbage, the moths later showed an increased preference for mating or laying eggs on this plant. Further experiments revealed that moths didn't just favour plants they were familiar with, even in combination with a sex pheromone – mating had to be involved. © Copyright Reed Business Information Ltd.

Keyword: Sexual Behavior; Learning & Memory
Link ID: 20665 - Posted: 03.09.2015

By Neuroskeptic There is a popular view that all of the natural sciences can be arranged in a chain or ladder according to the complexity of their subjects. On this view, physics forms the base of the ladder because it deals with the simplest building-blocks of matter, atoms and subatomic particles. Chemistry is next up because it studies interacting atoms i.e. molecules. Biology studies complex collections of molecules, i.e. cells. Then comes neuroscience, which deals with a complex collection of interacting cells – the brain. Psychology, perhaps, can be seen as the next level above neuroscience, because psychology studies brains interacting with each other and with the environment. So, on this model, we have a kind of Great Chain of Science: physics → chemistry → biology → neuroscience → psychology. This is an appealing model. But is biology really basic to neuroscience (and psychology)? At first glance it seems like biology – most importantly cell and molecular biology – surely is basic to neuroscience. After all, brains are composed of cells. All of the functions of brain cells, like synaptic transmission and plasticity, are products of biological machinery, i.e. proteins and ultimately genes. This doesn’t imply that neuroscience could be ‘reduced to’ biology, any more than biology will ever be reduced to pure chemistry, but it does seem to imply that biology is the foundation for neuroscience.

Keyword: Miscellaneous
Link ID: 20664 - Posted: 03.09.2015

by Sarah Zielinski Before they grow wings and fly, young praying mantises have to rely on leaps to move around. But these little mantises are really good at jumping. Unlike most insects, which tend to spin uncontrollably and sometimes crash land, juvenile praying mantises make precision leaps with perfect landings. But how do they do that? To find out, Malcolm Burrows of the University of Cambridge in England and colleagues filmed 58 juvenile Stagmomantis theophila praying mantises making 381 targeted jumps. The results of their study appear March 5 in Current Biology. For each test leap, the researchers put a young insect on a ledge with a black rod placed one to two body lengths away. A jump to the rod was fast — only 80 milliseconds, faster than a blink of an eye — but high-speed video captured every move at 1,000 frames per second. That let the scientists see what was happening: First, the insect shook its head from side to side, scanning its path. Then it rocked backwards and curled up its abdomen, readying itself to take a leap. With a push of its legs, the mantis was off. In the air, it rotated its abdomen, hind legs and front legs, but its body stayed level until it hit the target and landed on all four limbs. “The abdomen, front legs and hind legs performed a series of clockwise and anticlockwise rotations during which they exchanged angular momentum at different times and in different combinations,” the researchers write. “The net result … was that the trunk of the mantis spun by 50° relative to the horizontal with a near-constant angular momentum, aligning itself perfectly for landing with the front and hind legs ready to grasp the target.” © Society for Science & the Public 2000 - 2015

Keyword: Vision
Link ID: 20663 - Posted: 03.07.2015

By Nicholas Weiler Killer whales wouldn’t get far without their old ladies. A 9-year study of orcas summering off the southern tip of Vancouver Island in the Pacific Northwest finds that menopausal females usually lead their families to find salmon, particularly when the fish are scarce. Older females’ years of foraging experience may help their clans survive in years of famine, an evolutionary benefit that could explain why—like humans—female orcas live for decades past their reproductive prime. “Menopause is a really bizarre trait. Evolutionarily it doesn’t make sense,” says biologist Lauren Brent of the University of Exeter in the United Kingdom, who led the new study. Most animals keep having babies until they drop, part of the evolutionary drive to spread their genes as widely as possible. Only female humans, pilot whales, and killer whales are known to go through menopause: At a certain age, they stop reproducing, but continue to lead long, productive lives. Like humans, female killer whales stop giving birth by about 40, but can live into their 90s. Anthropologists have proposed a controversial explanation for menopause in humans: that grandmothers contribute to their genetic legacies by helping their children and grandchildren survive and reproduce. In hunter-gatherer and other societies, elders find extra food, babysit, and remember tribal lore about how to live through floods, famines, and other hardships. According to the “grandmother hypothesis,” this contribution is so valuable that it helped spur the evolution of women’s long postreproductive lives. Orcas too depend on their elders: Adult killer whales’ mortality rates skyrocket after their elderly mothers die. But how the menopausal whales might help their children survive was not clear, Brent says. © 2015 American Association for the Advancement of Science.

Keyword: Hormones & Behavior; Sexual Behavior
Link ID: 20662 - Posted: 03.07.2015

By Jonathan Webb, science reporter, BBC News, San Antonio Physicists have pinned down precisely how pipe-shaped cells in our retina filter the incoming colours. These cells, which sit in front of the ones that actually sense light, play a major role in our colour vision — one that was only recently confirmed. They funnel crucial red and green light into cone cells, leaving blue to spill over and be sensed by rod cells - which are responsible for our night vision. Key to this process, researchers now say, is the exact shape of the pipes. The long, thin cells are known as "Muller glia" and they were originally thought to play more of a supporting role in the retina. They clear debris, store energy and generally keep the conditions right for other cells - like the rods and cones behind them - to turn light into electrical signals for the brain. But a study published last year confirmed the idea, proposed in earlier simulations, that Muller cells also function rather like optical fibres. 3D scans revealed the pipe-like structure of the Muller cells sitting above the photoreceptor cells. And more than just piping light to the back of the retina, where the rods and cones sit, they selectively send red and green light - the most important for human colour vision - to the cone cells, which handle colour. Meanwhile, they leave 85% of blue light to spill over and reach nearby rod cells, which specialise in those wavelengths and give us the mostly black-and-white vision that gets us by in dim conditions. © 2015 BBC.

Keyword: Vision; Glia
Link ID: 20661 - Posted: 03.07.2015

By Dina Fine Maron Obesity stems primarily from the overconsumption of food paired with insufficient exercise. But this elementary formula cannot explain how quickly the obesity epidemic has spread globally in the past several decades nor why more than one third of adults in the U.S. are now obese. Many researchers believe that a more complex mix of environmental exposures, lifestyle, genetics and the microbiome’s makeup helps explain that phenomenon. And a growing body of work suggests that exposure to certain chemicals—found in nature as well as industry—may play an essential role by driving the body to produce and store surplus fat in its tissues. Evidence of that cause-and-effect relationship in humans is still limited, but in laboratory animals and in petri dishes data linking the chemicals to problematic weight gain are mounting. Moreover, the effects in animals appear to be passed on not just to immediate offspring but also grandchildren and great-grandchildren—potentially accounting for some multigenerational obesity. The murkier picture for humans may become clearer in the next five years, says Jerry Heindel, a health science administrator at the National Institute of Environmental Health Sciences. His agency is now funding 57 grants related to obesity and diabetes, he said on March 2 at a meeting of the Institute of Medicine (IOM). The studies look at how chemicals, including those that appear to alter hormone regulation (such as the plasticizer bisphenol A and the antibacterial chemical triclosan), affect weight gain or insulin resistance. Thirty-two of the ongoing studies are in humans. And 20 of those will help assess the longer-term risks to children by tracking the youngsters' chemical levels in utero or as newborns and beyond. © 2015 Scientific American

Keyword: Obesity
Link ID: 20660 - Posted: 03.07.2015

By Dr. Lisa Sanders On Thursday we challenged Well readers to solve the case of a middle-aged woman with arthritis who developed a wasting illness after what looked like a simple cold. Her rheumatologist was worried that the immune-suppressing medications the patient took to treat her joint disease had caused the new illness. More than 300 of you took on the challenge, and 17 of you correctly identified this rarity. The correct diagnosis is … Whipple’s disease The first reader to make the diagnosis was Mike Natter, a second-year medical student at the Sidney Kimmel Medical College at Thomas Jefferson University in Philadelphia. Mike said it was an easy case for him because he had been studying for an exam the next day and had just read about the disease. He is a frequent contributor to this column and says that he has made the correct diagnosis twice before, but this was the first time he got his answer in first. Well done, Mike! The Diagnosis Whipple’s disease was first identified in 1907 by Dr. George Whipple, who was caring for a fellow physician who had “gradual loss of weight and strength, stools consisting chiefly of neutral fat and fatty acids, indefinite abdominal signs, and a peculiar multiple arthritis.” The patient eventually died. Dr. Whipple suspected an infectious cause because he found bacteria in many of the patient’s affected tissues, but the organism itself wasn’t identified for nearly 80 years. The bug, Tropheryma whipplei, is common and found mostly in soil. And yet the infection is rare. There have been only about 1,000 reported cases of Whipple’s disease in the more than one hundred years since it was first described. Over two-thirds of those were in middle-aged white men. Many of them were farmers or others who had occupational exposure to soil. © 2015 The New York Times Company

Keyword: Obesity
Link ID: 20659 - Posted: 03.07.2015

By David Masci Potential Republican presidential candidate Dr. Ben Carson made news earlier this week when he said that being gay is a “choice,” but when it comes to public opinion, polls show that Americans remain divided over whether “nature” or “nurture” is ultimately responsible for sexual orientation. Four-in-ten Americans (42%) said that being gay or lesbian is “just the way some choose to live,” while a similar share (41%) said that “people are born gay or lesbian,” according to the most recent Pew Research Center poll on the issue, conducted in 2013. Fewer U.S. adults (8%) said that people are gay or lesbian due to their upbringing, while another one-in-ten (9%) said they didn’t know or declined to give a response. People with the most education are the most likely to say that gays and lesbians were born that way. Indeed, 58% of Americans with a postgraduate degree say that people are born gay or lesbian, compared with just 35% of those with a high school diploma or less. The percentage of all Americans who believe that people are born gay or lesbian has roughly doubled (from 20% to 41%) since 1985, when the question was asked in a Los Angeles Times survey. More than three decades of Gallup polls also show a considerable rise in the view that being gay or lesbian is a product of “nature” rather than “nurture.” But the most recent survey, in 2014, still finds that the nation remains split in its feelings on the origins of sexual orientation. Copyright 2015 Pew Research Center

Keyword: Sexual Behavior
Link ID: 20658 - Posted: 03.07.2015

In archaeology it is very rare to find any soft tissue remains: no skin, no flesh, no hair and definitely no brains. However, in 2009, archaeologists from York Archaeological Trust found something very surprising at a site in Heslington, York. During the excavation of an Iron Age landscape at the University of York, a skull, with the jaw and two vertebrae still attached, was discovered face down in a pit, without any evidence of what had happened to the rest of its body. At first it looked like a normal skull, but it was not until it was being cleaned that Collection Projects Officer Rachel Cubitt discovered something loose inside. “I peered through the hole at the base of the skull to investigate and to my surprise saw a quantity of bright yellow spongy material. It was unlike anything I had seen before,” says Rachel. Sonia O’Connor, from Archaeological Sciences, University of Bradford, was able to confirm that this was brain. With the help of York Hospital’s mortuary, they were able to remove the top of the skull in order to get their first look at this astonishingly well-preserved human brain. Since the discovery, a team of 34 specialists have been working on this brain to study and conserve it as much as possible. Radiocarbon dating of a sample of jaw bone determined that this person probably lived in the 6th century BC, which makes this brain about 2,600 years old. Judging by the teeth and the shape of the skull, it is likely this person was a man between 26 and 45 years old. An examination of the vertebrae in the neck tells us that he was first hit hard on the neck, and then the neck was severed with a small sharp knife, for reasons we can only guess. © Copyright York Archaeological Trust 2013-2015.

Keyword: Brain imaging
Link ID: 20657 - Posted: 03.07.2015

Hannah Devlin, science correspondent Psychedelic drugs could prove to be highly effective treatments for depression and alcoholism, according to a study which has obtained the first brain scans of people under the influence of LSD. Early results from the trial, involving 20 people, are said to be “very promising” and add to existing evidence that psychoactive drugs could help reverse entrenched patterns of addictive or negative thinking. However, Prof David Nutt, who led the study, warned that patients are missing out on the potential benefits of such treatments due to prohibitive regulations on research into recreational drugs. Speaking at a briefing in London, the government’s former chief drugs adviser said the restrictions amounted to “the worst censorship in the history of science”. After failing to secure conventional funding to complete the analysis of the latest study on LSD, Nutt and colleagues at Imperial College London are now attempting to raise £25,000 through the crowd-funding site Walacea.com. “These drugs offer the greatest opportunity we have in mental health,” he said. “There’s little else on the horizon.” There has been a resurgence of medical interest in LSD and psilocybin, the active ingredient in magic mushrooms, after several recent trials produced encouraging results for conditions ranging from depression in cancer patients to post-traumatic stress disorder. © 2015 Guardian News and Media Limited

Keyword: Depression; Drug Abuse
Link ID: 20656 - Posted: 03.05.2015

Zoe Cormier Data from population surveys in the United States challenge public fears that psychedelic drugs such as LSD can lead to psychosis and other mental-health conditions and to increased risk of suicide, two studies have found [1, 2]. In the first study, clinical psychologists Pål-Ørjan Johansen and Teri Suzanne Krebs, both at the Norwegian University of Science and Technology in Trondheim, scoured data from the US National Survey on Drug Use and Health (NSDUH), an annual random sample of the general population, and analysed answers from more than 135,000 people who took part in surveys from 2008 to 2011. Of those, 14% described themselves as having used at any point in their lives any of the three ‘classic’ psychedelics: LSD, psilocybin (the active ingredient in so-called magic mushrooms) and mescaline (found in the peyote and San Pedro cacti). The researchers found that individuals in this group were not at increased risk of developing 11 indicators of mental-health problems such as schizophrenia, psychosis, depression, anxiety disorders and suicide attempts. Their paper appears in the March issue of the Journal of Psychopharmacology [1]. The findings are likely to raise eyebrows. Fears that psychedelics can lead to psychosis date to the 1960s, with widespread reports of “acid casualties” in the mainstream news. But Krebs says that because psychotic disorders are relatively prevalent, affecting about one in 50 people, correlations can often be mistaken for causations. “Psychedelics are psychologically intense, and many people will blame anything that happens for the rest of their lives on a psychedelic experience.” © 2015 Nature Publishing Group

Keyword: Drug Abuse; Schizophrenia
Link ID: 20655 - Posted: 03.05.2015

by Jan Piotrowski It's not the most charismatic fossil ever found, but it may reveal secrets of our earliest evolution. Unearthed in Ethiopia, the broken jaw with greying teeth suggests that the Homo lineage – of which modern humans are the only surviving member – existed up to 400,000 years earlier than previously thought. The fragment dates from around 2.8 million years ago, and is by far the most ancient specimen to bear the Homo signature. The earliest such fossil known previously was thought to be up to 2.4 million years old. Showing a mixture of traits, the new find pinpoints the time when humans began their transition from primitive, apelike Australopithecus to the big-brained conqueror of the world, says Brian Villmoare from the University of Nevada, Las Vegas, whose student made the find. Geological evidence from the same area, also reported this week in a study led by Erin DiMaggio from Pennsylvania State University, shows that the jaw's owner lived just after a major climate shift in the region: forests and waterways rapidly gave way to arid savannah, leaving only the occasional crocodile-filled lake. Except for the sabre-toothed big cat that once roamed these parts, the environment ended up looking much like it does today. It was probably the pressure to adapt to this new world that jump-started our evolution into what we see looking back at us in the mirror today, according to Villmoare. © Copyright Reed Business Information Ltd.

Keyword: Evolution
Link ID: 20654 - Posted: 03.05.2015