Most Recent Links
One afternoon in October 2005, neuroscientist James Fallon was looking at brain scans of serial killers. As part of a research project at UC Irvine, he was sifting through thousands of PET scans to find anatomical patterns in the brain that correlated with psychopathic tendencies in the real world. “I was looking at many scans, scans of murderers mixed in with schizophrenics, depressives and other, normal brains,” he says. “Out of serendipity, I was also doing a study on Alzheimer’s and as part of that, had brain scans from me and everyone in my family right on my desk.” “I got to the bottom of the stack, and saw this scan that was obviously pathological,” he says, noting that it showed low activity in certain areas of the frontal and temporal lobes linked to empathy, morality and self-control. Knowing that it belonged to a member of his family, Fallon checked his lab’s PET machine for an error (it was working fine) and then decided he simply had to break the blinding that prevented him from knowing whose brain was pictured. When he looked up the code, he was greeted by an unsettling revelation: the psychopathic brain pictured in the scan was his own. Many of us would hide this discovery and never tell a soul, out of fear or embarrassment at being labeled a psychopath. Perhaps because boldness and disinhibition are noted psychopathic tendencies, Fallon has gone all in in the opposite direction, telling the world about his finding in a TED Talk, an NPR interview and now a new book, The Psychopath Inside, published last month. In it, Fallon seeks to reconcile how he—a happily married family man—could demonstrate the same anatomical patterns that marked the minds of serial killers. “I’ve never killed anybody, or raped anyone,” he says. “So the first thing I thought was that maybe my hypothesis was wrong, and that these brain areas are not reflective of psychopathy or murderous behavior.”
Keyword: Aggression; Genes & Behavior
Link ID: 18964 - Posted: 11.25.2013
Robert N. McLay, author of At War with PTSD: Battling Post Traumatic Stress Disorder with Virtual Reality, responds: post-traumatic stress disorder (PTSD) can appear after someone has survived a horrific experience, such as war or sexual assault. A person with PTSD often experiences ongoing nightmares, edginess and extreme emotional changes and may view anything that evokes the traumatic situation as a threat. Although medications and talk therapy can help calm the symptoms of PTSD, the most effective therapies often require confronting the trauma, as with virtual-reality-based treatments. These computer programs, similar to a video game, allow people to feel as if they are in the traumatic scenario. Just as a pilot in a flight simulator might use virtual reality to learn how to safely land a plane without the risk of crashing, a patient with PTSD can learn how to confront painful reminders of trauma without facing any real danger. Virtual-reality programs have been built to simulate driving, the World Trade Center attacks, and combat scenarios in Vietnam and Iraq. The level of the technology varies considerably, from a simple headset that displays rather cartoonish images to Hollywood-quality special effects. A therapist typically observes what patients are seeing while they navigate the virtual experience. They can coach a patient to take on increasingly difficult challenges while making sure that the person does not become overwhelmed. To do so, some therapists may connect the subject to physiological monitoring devices; others may use virtual reality along with talk therapy. In the latter scenario, the patient recites the story of the trauma and reflects on it while passing through the simulation. The idea is to desensitize patients to their trauma and train them not to panic, all in a controlled environment. © 2013 Scientific American
Keyword: Stress; Learning & Memory
Link ID: 18963 - Posted: 11.25.2013
Medical marijuana can alleviate pain and nausea, but it can also cause decreased attention span and memory loss. A new study in mice finds that taking an over-the-counter pain medication like ibuprofen may help curb these side effects. "This is what we call a seminal paper," says Giovanni Marsicano, a neuroscientist at the University of Bordeaux in France who was not involved in the work. If the results hold true in humans, they "could broaden the medical use of marijuana," he says. "Many people in clinical trials are dropping out from treatments, because they say, ‘I cannot work anymore. I am stoned all the time.’ ” People have used marijuana for hundreds of years to treat conditions such as chronic pain, multiple sclerosis, and epilepsy. Studies in mice have shown that it can reduce some of the neural damage seen in Alzheimer's disease. The main psychoactive ingredient, tetrahydrocannabinol (THC), is approved by the Food and Drug Administration to treat anorexia in AIDS patients and the nausea triggered by chemotherapy. Although recreational drug users usually smoke marijuana, patients prescribed THC take it as capsules. Many people find the side effects hard to bear, however. The exact cause of these side effects is unclear. In the brain, THC binds to receptors called CB1 and CB2, which are involved in neural development as well as pain perception and appetite. The receptors are normally activated by similar compounds, called endocannabinoids, that are produced by the human body. When one of these compounds binds to CB1, it suppresses the activity of an enzyme called cyclooxygenase-2 (COX-2). The enzyme has many functions. For instance, painkillers such as ibuprofen and aspirin work by blocking COX-2. Researchers have hypothesized that the suppression of COX-2 could be the cause of THC's side effects, such as memory problems. © 2013 American Association for the Advancement of Science
Keyword: Drug Abuse; Learning & Memory
Link ID: 18962 - Posted: 11.23.2013
By R. Douglas Fields San Diego—Would we have Poe’s Raven today if the tormented author had taken lithium to suppress his bipolar illness? Not likely, considering the high frequency of psychiatric illness among writers and artists, concluded psychiatrist Kay Jamison of Johns Hopkins Medical School, speaking last week at the Society for Neuroscience annual meeting in San Diego. Madness electrifies the creative process, Jamison argued, but this dilemma over drugs and creativity raises an even more provocative question: Would we have Lucy in the Sky with Diamonds had the Beatles not taken LSD? Lord Tennyson, Virginia Woolf and Vincent Van Gogh are familiar examples of artists and writers who suffered serious mental illnesses, and Jamison explained that psychiatric illness was the cruel engine of their creativity. Tracing their family pedigrees, she showed that many of these artists’ siblings, parents and descendants were institutionalized in mental hospitals, committed suicide, or endured lifelong struggles with mania, despair, schizophrenia or other mental disorders. The genetic backbone of mental illness is strong. Ernest Hemingway and his supermodel granddaughter Margaux Hemingway both killed themselves. Separated from one another in environment and experience by a generation, their fates were inevitably tethered by their DNA. In all, seven members of the Hemingway family died at their own hand. This raises the question of why the genes behind such devastating brain dysfunctions should persist in the human gene pool. Statistics show that among all categories of creative artists, writers suffer by far the highest incidence of bipolar disorder. Why? Jamison concludes that the manic phase of bipolar disorder infuses the writer with furious energy and limitless stamina: the author forgoes sleep, is driven to take daring risks, and embraces expansive, grandiose thinking. © 2013 Scientific American
Keyword: Drug Abuse; Schizophrenia
Link ID: 18961 - Posted: 11.23.2013
By BENEDICT CAREY Grading college students on quizzes given at the beginning of every class, rather than on midterms or a final exam, increases both attendance and overall performance, scientists reported Wednesday. The findings — from an experiment in which 901 students in a popular introduction to psychology course at the University of Texas took their laptops to class and were quizzed online — demonstrate that the computers can act as an aid to teaching, not just a distraction. Moreover, the study is the latest to show how tests can be used to enhance learning as well as measure it. The report, appearing in the journal PLoS One, found that this “testing effect” was particularly strong in students from lower-income households. Psychologists have known for almost a century that altering the timing of tests can affect performance. In the past decade, they have shown that taking a test — say, writing down all you can remember from a studied prose passage — can deepen the memory of that passage better than further study. The new findings stand as a large-scale prototype for how such testing effects can be exploited in the digital era, experts said, though they cautioned that it was not yet clear how widely they could be applied. “This study is important because it introduces a new method to implement frequent quizzing with feedback in large classrooms, which can be difficult to do,” said Jeffrey D. Karpicke, a professor of psychology at Purdue, who was not involved in the study. He added, “This is the first large study to show that classroom quizzing can help reduce achievement gaps” due to socioeconomic background. © 2013 The New York Times Company
Keyword: Learning & Memory
Link ID: 18960 - Posted: 11.23.2013
by Simon Makin "The only thing we have to fear is fear itself," said Franklin D. Roosevelt. He might have been onto something: research suggests that the anticipation of pain is actually worse than the pain itself. In other words, people are happy to endure a bit more pain, if it means they spend less time waiting for it. Classical theories of decision-making suppose that people bring rewards forward and postpone punishments, because we give far-off events less weight. This is called "temporal discounting". But this theory seems to go out the window when it comes to pain. One explanation for this is that the anticipation of pain is itself unpleasant, a phenomenon that researchers have appropriately termed "dread". To investigate how dread varies with time, Giles Story at University College London, and his colleagues, hooked up 33 volunteers to a device that gave them mild electric shocks. The researchers also presented people with a series of choices between more or less mildly painful shocks, sooner or later. During every "episode" there was a minimum of two shocks, which could rise to a maximum of 14, but before they were given them, people had to make a choice such as nine extra shocks now or six extra shocks five episodes from now. The number of shocks they received each time was determined by these past choices. Although a few people always chose to experience the minimum pain, 70 per cent of the time, on average, participants chose to receive the extra shocks sooner rather than a smaller number later. By varying the number of shocks and when they occurred, the team was able to figure out that the dread of pain increased exponentially as pain approached in time. Similar results occurred in a test using hypothetical dental appointments. © Copyright Reed Business Information Ltd.
Keyword: Pain & Touch; Emotions
Link ID: 18959 - Posted: 11.23.2013
By Victoria Gill Science reporter, BBC News Great tits use different alarm calls for different predators, according to a scientist in Japan. The researcher analysed the birds' calls and found they made "jar" sounds for snakes and "chicka" sounds for crows and martens. This, he says, is the first demonstration that birds can communicate vocally about the type of predator threatening them. The findings are published in the journal Animal Behaviour. From his previous observations, the researcher, Dr Toshitaka Suzuki, from the Graduate University for Advanced Studies in Kanagawa, found great tits appeared to be able to discriminate between different predators. To test whether they could also communicate this information, he placed models of three different animals that prey on nestlings - snakes, crows and martens - close to the birds' nest boxes. He then recorded and analysed the birds' responses. "Parents usually make alarm calls when they approach and mob the nest predators," said Dr Suzuki. "They produced specific 'jar' alarm calls for the snakes and the same 'chicka' alarm call in response to both the crows and martens," he said. But a closer analysis of the sounds showed the birds had used different "note combinations" in their crow alarm calls from those they had used for the martens. Dr Suzuki thinks the birds might have evolved what he called a "combinatorial communication system" - combining different notes to produce calls with different meanings. Since snakes are able to slither into nest boxes, they pose a much greater threat to great tit nestlings than other birds or mammals, so Dr Suzuki says it makes sense that the birds would have a specific snake alarm call. BBC © 2013
Keyword: Animal Communication; Evolution
Link ID: 18958 - Posted: 11.23.2013
By Neuroskeptic I am sitting reading a book. After a while, I get up and make a cup of coffee. I’ve been thinking about this scenario lately as I’ve pondered ‘what remains to be discovered’ in our understanding the brain. By this I mean, what (if anything) prevents neuroscience from at least sketching out an explanation for all of human behaviour? A complete explanation of any given behaviour – such as my reading a particular book – would be impossible, as it would require detailed knowledge of all my brain activity. But neuroscience could sketch an account of some stages of the reading. We have models for how my motor cortex and cerebellum might coordinate my fingers to turn the pages of my book. Other models try to make sense of the recognition of the letters by my visual cortex. This is what I mean by ‘beginning to account for’. We have theories that are not wholly speculative. While we don’t yet have the whole story of motor control or visual perception, we have made a start. Yet I’m not sure that we can even begin to explain: why did I stop what I was doing, get up, and make coffee at that particular time? The puzzle, it seems, does not lie in my actual choice to make some coffee (as opposed to not making it.) We could sketch an explanation for how, once the mental image (memory) of coffee ‘crossed my mind’, that image set off dopamine firing (i.e. I like coffee), and this dopamine, acting on corticostriatal circuits, selected the action of making coffee over the less promising alternatives. But why did that mental image of coffee cross my mind in the first place? And why did it do so just then, not thirty seconds before or afterwards?
Keyword: Consciousness
Link ID: 18957 - Posted: 11.23.2013
Erika Check Hayden Researchers have shown that just two genes from the Y chromosome — that genetic emblem of masculinity in most mammals — are all that is needed for a male mouse to conceive healthy offspring using assisted reproduction. The same team had previously reported that male mice whose Y chromosomes had been pared down to just seven genes could father healthy babies. The study brings researchers one step closer to creating mice that can be fathers without any contribution from the Y chromosome at all. The findings also have implications for human infertility, because the work suggests that the assisted-reproduction technique used in the mice might be safer for human use than is currently thought. “To me it is a further demonstration that there isn't much left on the poor old Y chromosome that is essential. Who needs a Y?” says Jennifer Marshall Graves, a geneticist at the La Trobe Institute of Molecular Science in Melbourne, Australia, who was not involved in the research. An embryo without a Y chromosome normally develops into a female, but biologists have long questioned whether the entire chromosome is necessary to produce a healthy male. A single gene from the Y chromosome, called Sry, is known to be sufficient to create an anatomically male mouse — albeit one that will be infertile because it will lack some of the genes involved in producing sperm — as researchers have shown by removing the Y chromosome and inserting Sry into other chromosomes. © 2013 Nature Publishing Group
Keyword: Sexual Behavior
Link ID: 18956 - Posted: 11.23.2013
By Gary Stix The emerging academic discipline of neuroethics has been driven, in part, by the recognition that introducing brain scans as legal evidence is fraught with peril. Most neuroscientists think that a brain scan is unable to provide an accurate representation of the state of mind of a defendant or determine whether his frontal lobes predispose to some wanton action. The consensus view holds that studying spots on the wrinkled cerebral cortex that are bigger or smaller in some criminal offenders may hint at overarching insights into the roots of violence, but lack the requisite specificity to be used as evidence in any individual case. “I believe that our behavior is a production of activity in our brain circuits,” Steven E. Hyman of the Broad Institute of Harvard and MIT told a session at the American Association for the Advancement of Science’s annual meeting earlier this year. “But I would never tell a parole board to decide whether to release somebody or hold on to somebody, based on their brain scan as an individual, because I can’t tell what are the causal factors in that individual.” It doesn’t seem to really matter, though, what academic experts believe about the advisability of brain scans as Exhibit One at trial. The entry of neuroscience in the courtroom has already begun, big time. The introduction of a brain scan in a legal case was once enough to generate local headlines. No more. Hundreds of legal opinions each year have begun to invoke the science of mind and brain to bolster legal arguments—references not only to brain scans, but a range of studies that show that the amygdala is implicated in this or the anterior cingulate cortex is at fault for that. The legal establishment, in short, has begun a love affair with all things brain. © 2013 Scientific American
Keyword: Consciousness; Aggression
Link ID: 18955 - Posted: 11.21.2013
by Anil Ananthaswamy Can you tickle yourself if you are fooled into thinking that someone else is tickling you? A new experiment says no, challenging a widely accepted theory about how our brains work. It is well known that we can't tickle ourselves. In 2000, Sarah-Jayne Blakemore of University College London (UCL) and colleagues came up with a possible explanation. When we intend to move, the brain sends commands to the muscles, but also predicts the sensory consequences of the impending movement. When the prediction matches the actual sensations that arise, the brain dampens down its response to those sensations. This prevents us from tickling ourselves (NeuroReport, DOI: 10.1097/00001756-200008030-00002). Jakob Hohwy of Monash University in Clayton, Australia, and colleagues decided to do a tickle test while simultaneously subjecting people to a body swap illusion. In this illusion, the volunteer and experimenter sat facing each other. The subject wore goggles that displayed the feed from a head-mounted camera. In some cases the camera was mounted on the subject's head, so that they saw things from their own perspective, while in others it was mounted on the experimenter's head, providing the subject with the experimenter's perspective. Using their right hands, both the subject and the experimenter held on to opposite ends of a wooden rod, which had a piece of foam attached to each end. The subject and experimenter placed their left palms against the foam at their end. Next, the subject or the experimenter took turns to move the rod with their right hand, causing the piece of foam to tickle both of their left palms. © Copyright Reed Business Information Ltd.
Keyword: Attention
Link ID: 18954 - Posted: 11.21.2013
By Evelyn Boychuk, Ever since Toronto Mayor Rob Ford admitted to having smoked crack cocaine, various city councillors and media observers have publicly advised him to seek drug counselling. But in a CNN interview that aired Nov. 18, Ford continued to stand by his message: “I’m not an addict.” The ongoing saga of the mayor’s crack use has raised unanswered questions about how addictive the drug really is. It’s been commonly accepted that crack is more addictive than other drugs, but addictions researchers and drug counsellors say it’s hard to compare the addictiveness of specific substances because drug-taking is a highly individual experience. Robin Haslam, director of operations and procedures for Addiction Canada, says that he has never met someone who can “just casually smoke crack.” However, people have different thresholds of addiction. “I know people who have used crack once, and never touched it again. I also know people who smoked marijuana once, and became very impaired,” he says. Carl Hart, author of High Price: A Neuroscientist's Journey of Self-Discovery That Challenges Everything You Know About Drugs and Society, told CBC Radio’s Day 6 that crack “is not uniquely addictive, or it’s not something that is special, as we have all been taught.” Hart said that the percentage of people that become addicted to crack is lower than most think. “For example, 10 to 20 per cent of people will become addicted — that means that 80 to 90 per cent of people won’t become addicted.” © CBC 2013
Keyword: Drug Abuse
Link ID: 18953 - Posted: 11.21.2013
By Jason Tetro For millennia, the human race has sought to combat psychological disorders through the intervention of natural – and eventually synthetic – chemicals. Originally, the sources for these psychoactive substances were various fruits and flowers, including the Areca tree (betel nut), the poppy (opium), and the coca plant (cocaine). But in the 20th Century, new active compounds were being created in the lab, thanks in part to the discovery of lysergic acid diethylamide, better known as LSD, in 1938. By the middle of the 1950s, the psychiatric community was fascinated by the idea that mental health could be restored through the direct use of drugs or in combination with traditional psychotherapy. The idea took off in the 1960s as research continued to elucidate the biology of psychiatry. It essentially created a new avenue for psychiatric treatment: psychopharmacology. This inevitably led to the synthesis of a new compound, 3-(p-trifluoromethylphenoxy)-N-methyl-3-phenylpropylamine, which eventually became known as fluoxetine, and then, as we have all come to know it, Prozac. By the late 1980s, it was known by another name: the wonder drug. Today, pharmacologic compounds for psychiatric treatment are numerous, and up to 20% of all Americans are taking some type of psychotropic medication, at a cost of some $34 billion annually. While there have been calls for a reduction in the use of these chemicals, primarily because many are ineffective, there is constant pressure from the public to have all their problems solved by a pill.
Keyword: Depression; Schizophrenia
Link ID: 18952 - Posted: 11.21.2013
By James Gallagher Health and science reporter, BBC News The damage caused by concussion can be detected months after the injury and long after patients feel like they have recovered, brain scans show. Concussion has become highly controversial in sport, with concerns raised that players are putting their brains at risk. Researchers at the University of New Mexico said athletes may be being returned to action too quickly, while UK doctors said the attitude to head injury in sport was "too relaxed". Debate over concussion and head injury has led to resignations over new rules in rugby, controversy in football after a player was kept on the field after being knocked out, and has been a long-standing issue in American football. Concussion is an abnormal brain function that results from an external blast, jolt or impact to the head. Even if the knock does not result in a skull fracture, the brain can still experience a violent rattling that leads to injury. Because the brain is a soft gelatinous material surrounded by a rigid bony skull, such traumatic injuries can cause changes in brain function, such as bleeding, neuron damage and swelling. Research shows that repetitive concussions increase the risk of sustained memory loss, worsened concentration or prolonged headaches. The US study, published in the journal Neurology, compared the brains of 50 people who had mild concussion with those of 50 healthy people. BBC © 2013
Keyword: Brain Injury/Concussion
Link ID: 18951 - Posted: 11.21.2013
By JOYCE COHEN Earlier this fall, Seattle Seahawks fans at CenturyLink Field broke the world record for loudest stadium crowd with a skull-splitting 136.6 decibels. That volume, as the Seahawks’ website boasts, hits the scale somewhere between “serious hearing damage” and “eardrum rupture.” Just weeks later, Kansas City Chiefs fans at Arrowhead Stadium topped that number with 137.5 screaming decibels of their own. The measuring method used for the Guinness World Record has an edge of gimmickry. That A-weighted peak measurement, reached for a split second near the measuring device, displays the highest possible readout. For a vulnerable ear, however, game-day noise isn’t just harmless fun. With peaks and troughs, the decibel level of noise reaching a typical spectator averages in the mid-90s, but for a longer time. Such noise is enough to cause permanent damage and to increase the likelihood of future damage. “The extent to which hearing-related issues get so little attention is amazing and troubling,” said M. Charles Liberman, a professor of otology at Harvard Medical School and director of a hearing research lab at the Massachusetts Eye and Ear Infirmary. “Many people are damaging their ears with repeated noise exposure such that their hearing abilities will significantly diminish as they age, much more so than if they were more careful,” he said. Ears are deceptive. Even if they seem to recover from the muffling, ringing and fullness after a rousing game, they don’t really recover. It’s not just the tiny sensory cells in the cochlea that are damaged by noise, Dr. Liberman said, but also the nerve fibers between the ears and the brain that degrade over time. Copyright 2013 The New York Times Company
Keyword: Hearing
Link ID: 18950 - Posted: 11.21.2013
Jessica Wright A tiny fiber-optic probe inserted into the reward center of the mouse brain monitors how the mouse feels about meeting a peer — or a golf ball. The unpublished technique was presented last week at the 2013 Society for Neuroscience annual meeting in San Diego. Mice feel the most satisfaction when sniffing another mouse’s rear and when walking away from a golf ball, the study found. The new technique is one of only a few ways to read the electrical activity of neurons in freely moving mice and is the least invasive, making it ideal for monitoring social interactions. The method takes advantage of a fluorescent molecule that lights up only in the presence of calcium, which rushes into the cell when neurons fire. The researchers used mice engineered to express this molecule only in neurons that make dopamine — the chemical messenger that mediates a sense of reward — in the ventral tegmental area (VTA). The researchers placed the cable in the VTA, the source of most of the brain’s dopamine neurons. The fiber-optic cable is 400 micrometers in diameter, and could probably be half that size, says Lisa Gunaydin, who developed the method as a graduate student in Karl Deisseroth’s lab at Stanford University in California. When neurons expressing the fluorescent molecule fire, the cable picks these up as a series of spikes. In the study, the researchers gave thirsty mice sweet water and, as expected, their dopamine activity in the VTA spiked each time they drank. When the mice interact with a new mouse, or a golf ball, the dopamine neurons fire more on the first encounter but dull with repeated visits, suggesting that the mice are most excited by novelty. © Copyright 2013 Simons Foundation
Keyword: Drug Abuse
Link ID: 18949 - Posted: 11.21.2013
By Helen Briggs BBC News A condition where people experience a mixing of the senses, such as tasting words, has been linked with autism. Research suggests synaesthesia is nearly three times as common in adults with autism spectrum disorder than in the general population. The two conditions may share common features such as unusual wiring of the brain, say UK scientists. The study helps understanding of how people with autism experience life, says the National Autistic Society. Synaesthesia is a condition where one sense automatically triggers another. Some people experience tastes when they read or hear words, some perceive numbers as shapes, others see colours when they hear music. People with synaesthesia might say: "The letter q is dark brown," or: "The word 'hello' tastes like coffee," for example. Following anecdotal evidence of links between synaesthesia and Asperger's syndrome, researchers at the Autism Research Centre at Cambridge University set out to test the idea. More than 200 study participants - 164 adults diagnosed with high-functioning autism or Asperger's syndrome, and 97 adults without autism - were asked to fill in questionnaires to measure synaesthesia and autism traits. The study found one in five adults with autism spectrum conditions - a range of related developmental disorders, including autism and Asperger's syndrome - had synaesthesia compared with about 7% of people with no signs of the disorders. Prof Simon Baron-Cohen, who led the research, told BBC News: "Synaesthesia involves a mixing of the senses and it's a very subjective private experience, so the only way we know it's happening is if you ask people to report on their experiences. BBC © 2013
Keyword: Autism
Link ID: 18948 - Posted: 11.20.2013
By Rahul K. Parikh, The message showed up on my desk one day while I was seeing a patient. Its choppy shorthand read: “Admits to injecting testosterone. Now decreased libido. Call back to discuss.” The caller was a 15-year-old lacrosse player who hadn’t been part of my practice long. Like many boys in his age group, he rarely came to the office. When I responded to his message later that afternoon, the young man carried his end of the conversation with the typical terseness of a teenager. “Where did you get the steroids?” I asked. “On the Internet.” “How long did you use them?” “A few months.” “And what are you experiencing now?” He told me his nipples were sore and swollen. “I’ve been more tired and moody as well.” My patient was experiencing classic side effects of steroid use. About 6 percent of teenagers admit to using performance-enhancing drugs, according to a recent survey, though it’s easy to assume that that number is low. How many teens would admit to using such drugs, even anonymously to a researcher? And yet here was one teen, forced by the drug’s side effect, having to make an embarrassing confession to me and his family. (Details of this case have been altered to protect patient privacy.) Despite my patient’s fear, I was confident that a young, healthy teenager who briefly used steroids would bounce back, though it might take some time — and patience — for his symptoms to dissipate. When I explained this to my patient, he told me that he wanted his testosterone level tested, to make sure there wasn’t something more seriously wrong. I got the sense that he thought there was some way I could magically undo the harm he had caused himself. I paused and considered his request, which came across more like an order. © 1996-2013 The Washington Post
Keyword: Hormones & Behavior; Sexual Behavior
Link ID: 18947 - Posted: 11.20.2013
Ewen Callaway New genome sequences from two extinct human relatives suggest that these ‘archaic’ groups bred with humans and with each other more extensively than was previously known. The ancient genomes, one from a Neanderthal and one from a different archaic human group, the Denisovans, were presented on 18 November at a meeting at the Royal Society in London. They suggest that interbreeding went on between the members of several ancient human-like groups living in Europe and Asia more than 30,000 years ago, including an as-yet unknown human ancestor from Asia. “What it begins to suggest is that we’re looking at a ‘Lord of the Rings’-type world — that there were many hominid populations,” says Mark Thomas, an evolutionary geneticist at University College London who was at the meeting but was not involved in the work. The first Neanderthal and Denisovan genome sequences revolutionized the study of ancient human history, not least because they showed that these groups interbred with anatomically modern humans, contributing to the genetic diversity of many people alive today. All humans whose ancestry originates outside of Africa owe about 2% of their genome to Neanderthals; and certain populations living in Oceania, such as Papua New Guineans and Australian Aboriginals, got about 4% of their DNA from interbreeding between their ancestors and Denisovans, who are named after the cave in Siberia’s Altai Mountains where they were discovered. The cave contains remains deposited there between 30,000 and 50,000 years ago. © 2013 Nature Publishing Group
Keyword: Evolution
Link ID: 18946 - Posted: 11.20.2013
By BENEDICT CAREY Curing insomnia in people with depression could double their chance of a full recovery, scientists are reporting. The findings, based on an insomnia treatment that uses talk therapy rather than drugs, are the first to emerge from a series of closely watched studies of sleep and depression to be released in the coming year. [Photo caption: A student demonstrating equipment at Colleen Carney’s sleep lab at Ryerson University. Dr. Carney is the lead author of the new report on the effects of insomnia treatment on depression.] The new report affirms the results of a smaller pilot study, giving scientists confidence that the effects of the insomnia treatment are real. If the figures continue to hold up, the advance will be the most significant in the treatment of depression since the introduction of Prozac in 1987. Depression is the most common mental disorder, affecting some 18 million Americans in any given year, according to government figures, and more than half of them also have insomnia. Experts familiar with the new report said that the results were plausible and that if supported by other studies, they should lead to major changes in treatment. “It would be an absolute boon to the field,” said Dr. Nada L. Stotland, professor of psychiatry at Rush Medical College in Chicago, who was not connected with the latest research. “It makes good common sense clinically,” she continued. “If you have a depression, you’re often awake all night, it’s extremely lonely, it’s dark, you’re aware every moment that the world around you is sleeping, every concern you have is magnified.” The study is the first of four on sleep and depression nearing completion, all financed by the National Institute of Mental Health. They are evaluating a type of talk therapy for insomnia that is cheap, relatively brief and usually effective, but not currently a part of standard treatment. © 2013 The New York Times Company
Keyword: Depression; Sleep
Link ID: 18945 - Posted: 11.19.2013



