Most Recent Links
By MARY LOU JEPSEN In my early 30s, for a few months, I altered my body chemistry and hormones so that I was closer to a man in his early 20s. I was blown away by how dramatically my thoughts changed. I was angry almost all the time, thought about sex constantly, and assumed I was the smartest person in the entire world. Over the years I had met guys rather like this. I was not experimenting with hormone levels out of idle curiosity or in some kind of quirky science experiment. I was on hormone treatments because I’d had a tumor removed along with part of my pituitary gland, which makes key hormones the body needs to function. This long journey may have started as early as 1978, when I was 13. I spent a summer in intensive care with an unknown disease. After that summer, I never thought I would live a long life. So I wanted to live, to do interesting, fascinating work in the limited time I thought I had left. I took on the math-intensive art form of holography, and in my early 20s traveled the world, living on university fellowships to pursue this esoteric craft. I didn’t date much, really — perhaps because I didn’t have many hormones, though I didn’t know that at the time. I worked as an artist, played in a band, met Andy Warhol, Christo, Lou Reed and David Byrne. I had fun. But the gravity of my illness grew in the 1990s. The growth that shut down my pituitary gland’s ability to produce hormones did so insidiously over many years. By my early 20s it was, I suspect in retrospect, causing misdiagnosis of symptoms that were most likely caused by lack of hormones like cortisol. No diagnosis was found, despite the efforts of many doctors. I was a doctoral student in electrical engineering at an Ivy League school, but was growing progressively worse. I routinely slept about 20 hours a day, lived with a constant blistering headache and frequent vomiting, and was periodically wheelchair-bound.
Large sections of my skin cycled through a rainbow of colors and sores; half of my face wouldn’t move, as if Novocain had been applied. I drooled. Worse: I felt stupid. I couldn’t subtract anymore. I couldn’t make a to-do list, let alone accomplish items on one. I recognized that I wasn’t capable of continuing in graduate school. Utterly defeated, I filled out the paperwork to drop out. © 2013 The New York Times Company
By NATASHA SINGER One afternoon a few months ago, a 45-year-old sales representative named Mike called “The Dr. Harry Fisch Show,” a weekly men’s health program on the Howard Stern channel on Sirius XM Radio, where no male medical or sexual issue goes unexplored. “I feel like a 70-year-old man in a 45-year-old body,” Mike, from Vancouver, British Columbia, told Dr. Fisch on the live broadcast. “I want to feel good. I don’t want to feel tired all day.” A regular listener, Mike had heard Dr. Fisch, a Park Avenue urologist and fertility specialist, talk about a phenomenon called “low testosterone” or “low T.” Dr. Fisch likes to say that a man’s testosterone level is “the dipstick” of his health; he regularly appears on programs like “CBS This Morning” to talk about the malaise that may coincide with low testosterone. He is also the medical expert featured on IsItLowT.com, an informational website sponsored by AbbVie, the drug maker behind AndroGel, the best-selling prescription testosterone gel. Like many men who have seen that site or commercials or online quizzes about “low T,” Mike suspected that diminished testosterone was the cause of his lethargy. And he hoped, as the marketing campaigns seem to suggest, that taking a prescription testosterone drug would make him feel more energetic. “I took your advice and I went and got my testosterone checked,” Mike told Dr. Fisch. Mike’s own physician, he related, told him that his testosterone “was a little low” and prescribed a testosterone medication. Mike also said he had diabetes and high blood pressure and was 40 pounds overweight. Dr. Fisch explained that conditions like obesity might be accompanied by decreased testosterone and energy, and he urged Mike to exercise more and to lose weight. But if Mike had trouble overhauling his diet and exercise habits, Dr. Fisch said, taking testosterone might give him the boost he needed to do so. “If it gives you more energy to exercise,” Dr. Fisch said of the testosterone drug, “I’m all for it.” © 2013 The New York Times Company
By Janet Davison, CBC News If headlines in the past few weeks are to be believed, a "Flesh-eating 'zombie' drug" that could devour users "from the inside out" is finding its way onto American streets. Then came reports suggesting that "krokodil," a cheap and highly addictive homemade substitute for heroin that surfaced first in Russia about 10 years ago, had appeared in Ontario's Niagara region. But so far, neither the U.S. Drug Enforcement Administration nor Health Canada has identified krokodil, also known as desomorphine, in any samples they've analyzed since the DEA found two instances of it in 2004. And police in Niagara are now saying the reported cases of the drug — an ugly concoction of codeine mixed with common products such as gasoline, lighter fluid, paint thinner or industrial cleaning oil — haven't been medically confirmed. Krokodil is named for the Russian word for crocodile and its tendency to turn users' skin rough and scaly. The injectable opioid can cause brain damage and severe tissue damage, sometimes leading to gangrene, amputations and even death. It has also been linked to pneumonia, blood poisoning, meningitis, liver and kidney problems, rotting gums and bone infections. The horrific health problems the drug has caused among the well over 100,000 users in Russia and Ukraine have been well documented by researchers in publications such as the International Journal of Drug Policy. But so far there is no solid, official proof that krokodil has reached Canada. The recent news reports about the drug coupled with the lack of hard evidence to back them up underline how difficult it is for health and law enforcement officials to keep up with the evolving mix of street drugs. © CBC 2013
Keyword: Drug Abuse
Link ID: 18967 - Posted: 11.25.2013
by Erika Engelhaupt If you had to have a prosthetic hand, would you want it to look like a real hand? Or would you prefer a gleaming metallic number, something that doesn’t even try to look human? A new study looks at one of the issues that prosthetic designers and wearers face in making this decision: the creepy factor. People tend to get creeped out by robots or prosthetic devices that look almost, but not quite, human. So Ellen Poliakoff and colleagues at the University of Manchester in England had people rate the eeriness of various prosthetic hands. Forty-three volunteers looked at photographs of prosthetic and real hands. They rated both how humanlike (realistic) the hands were and how eerie they were, defined as “mysterious, strange, or unexpected as to send a chill up the spine.” Real human hands were rated both the most humanlike and the least eerie (a good thing for humans). Metal hands that were clearly mechanical were rated the least humanlike, but less eerie overall than prosthetic hands made to look like real hands, the team reports in the latest issue of Perception. The realistic prosthetics, like the rubber hand shown above, fell into what's known as the uncanny valley. That term, coined by roboticist Masahiro Mori in 1970, describes how robots become unnerving as they come to look more humanlike. The superrealistic Geminoid DK robot and the animated characters in the movie The Polar Express suffer from this problem. They look almost human, but not quite, and this mismatch between expectation and reality is one of the proposed explanations for the uncanny valley. In particular, if something looks like a human but doesn’t quite move like one, it’s often considered eerie. © Society for Science & the Public 2000 - 2013
Barn owl nestlings recognise their siblings' calls, according to researchers. Instead of competing aggressively for food, young barn owls are known to negotiate by calling out. A team of scientists in Switzerland discovered that the owlets have remarkably individual calls. They suggest this is to communicate each bird's needs and identity in the nest. The findings were announced in the Journal of Evolutionary Biology by Dr Amelie Dreiss and colleagues at the University of Lausanne, Switzerland. Barn owls (Tyto alba) are considered one of the most widespread species of bird and are found on every continent except Antarctica. An average clutch size ranges between four and six eggs but some have been known to contain up to 12. Previous studies have highlighted how barn owl nestlings, known as owlets, negotiate with their siblings for food instead of fighting. While their parents search for food the owlets advertise their hunger to their brothers and sisters by calling out. "These vocal signals deter siblings from vocalizing and from competing for the prey at parental return," explained Dr Dreiss. "If there is a disagreement, they can escalate signal intensity little by little, always without physical aggression, until less hungry siblings finally withdraw from the contest." BBC © 2013
One afternoon in October 2005, neuroscientist James Fallon was looking at brain scans of serial killers. As part of a research project at UC Irvine, he was sifting through thousands of PET scans to find anatomical patterns in the brain that correlated with psychopathic tendencies in the real world. “I was looking at many scans, scans of murderers mixed in with schizophrenics, depressives and other, normal brains,” he says. “Out of serendipity, I was also doing a study on Alzheimer’s and as part of that, had brain scans from me and everyone in my family right on my desk.” “I got to the bottom of the stack, and saw this scan that was obviously pathological,” he says, noting that it showed low activity in certain areas of the frontal and temporal lobes linked to empathy, morality and self-control. Knowing that it belonged to a member of his family, Fallon checked his lab’s PET machine for an error (it was working perfectly fine) and then decided he simply had to break the blinding that prevented him from knowing whose brain was pictured. When he looked up the code, he was greeted by an unsettling revelation: the psychopathic brain pictured in the scan was his own. Many of us would hide this discovery and never tell a soul, out of fear or embarrassment of being labeled a psychopath. Perhaps because boldness and disinhibition are noted psychopathic tendencies, Fallon has gone all in, in the opposite direction, telling the world about his finding in a TED Talk, an NPR interview and now a new book published last month, The Psychopath Inside. In it, Fallon seeks to reconcile how he—a happily married family man—could demonstrate the same anatomical patterns that marked the minds of serial killers. “I’ve never killed anybody, or raped anyone,” he says. “So the first thing I thought was that maybe my hypothesis was wrong, and that these brain areas are not reflective of psychopathy or murderous behavior.”
Robert N. McLay, author of At War with PTSD: Battling Post Traumatic Stress Disorder with Virtual Reality, responds: Post-traumatic stress disorder (PTSD) can appear after someone has survived a horrific experience, such as war or sexual assault. A person with PTSD often experiences ongoing nightmares, edginess and extreme emotional changes and may view anything that evokes the traumatic situation as a threat. Although medications and talk therapy can help calm the symptoms of PTSD, the most effective therapies often require confronting the trauma, as with virtual-reality-based treatments. These computer programs, similar to a video game, allow people to feel as if they are in the traumatic scenario. Just as a pilot in a flight simulator might use virtual reality to learn how to safely land a plane without the risk of crashing, a patient with PTSD can learn how to confront painful reminders of trauma without facing any real danger. Virtual-reality programs have been built to simulate driving, the World Trade Center attacks, and combat scenarios in Vietnam and Iraq. The level of the technology varies considerably, from a simple headset that displays rather cartoonish images to Hollywood-quality special effects. A therapist typically observes what patients are seeing while they navigate the virtual experience. They can coach a patient to take on increasingly difficult challenges while making sure that the person does not become overwhelmed. To do so, some therapists may connect the subject to physiological monitoring devices; others may use virtual reality along with talk therapy. In the latter scenario, the patient recites the story of the trauma and reflects on it while passing through the simulation. The idea is to desensitize patients to their trauma and train them not to panic, all in a controlled environment. © 2013 Scientific American
Medical marijuana can alleviate pain and nausea, but it can also cause decreased attention span and memory loss. A new study in mice finds that taking an over-the-counter pain medication like ibuprofen may help curb these side effects. "This is what we call a seminal paper," says Giovanni Marsicano, a neuroscientist at the University of Bordeaux in France who was not involved in the work. If the results hold true in humans, they "could broaden the medical use of marijuana," he says. "Many people in clinical trials are dropping out from treatments, because they say, ‘I cannot work anymore. I am stoned all the time.’ ” People have used marijuana for hundreds of years to treat conditions such as chronic pain, multiple sclerosis, and epilepsy. Studies in mice have shown that it can reduce some of the neural damage seen in Alzheimer's disease. The main psychoactive ingredient, tetrahydrocannabinol (THC), is approved by the Food and Drug Administration to treat anorexia in AIDS patients and the nausea triggered by chemotherapy. Although recreational drug users usually smoke marijuana, patients prescribed THC take it as capsules. Many people find the side effects hard to bear, however. The exact cause of these side effects is unclear. In the brain, THC binds to receptors called CB1 and CB2, which are involved in neural development as well as pain perception and appetite. The receptors are normally activated by similar compounds, called endocannabinoids, that are produced by the human body. When one of these compounds binds to CB1, it suppresses the activity of an enzyme called cyclooxygenase-2 (COX-2). The enzyme has many functions. For instance, painkillers such as ibuprofen and aspirin work by blocking COX-2. Researchers have hypothesized that the suppression of COX-2 could be the cause of THC's side effects, such as memory problems. © 2013 American Association for the Advancement of Science
By R. Douglas Fields San Diego—Would we have Poe’s Raven today if the tormented author had taken lithium to suppress his bipolar illness? Not likely, considering the high frequency of psychiatric illnesses among writers and artists, concluded psychiatrist Kay Jamison of Johns Hopkins Medical School speaking last week at the Society for Neuroscience annual meeting in San Diego. Madness electrifies the creative process, Jamison concluded, but this difficult drug-use dilemma raises an even more provocative question: Would we have Lucy in the Sky with Diamonds had the Beatles not taken LSD? Lord Tennyson, Virginia Woolf and Vincent Van Gogh are familiar examples of artists and writers who suffered serious mental illnesses, but Jamison explained that psychiatric illness was the cruel engine of their creativity. Tracing their family pedigrees, she showed that many of these artists’ siblings, parents and descendants were institutionalized in mental hospitals, committed suicide, or endured life-long struggles with mania, despair, schizophrenia or other mental disorders. The genetic backbone to mental illness is strong. Ernest Hemingway and his supermodel granddaughter Margaux Hemingway both killed themselves. Separated from one another in environment and experience by a generation, their fates were inevitably tethered by their DNA. In all, seven members of the Hemingway family died at their own hand. This raises the question of why the genes of such devastating brain dysfunctions should persist in the human gene pool. Statistics show that writers suffer by far the highest incidence of bipolar disorder of any category of creative artist. Why? Jamison concludes that the manic phase of bipolar disorder infuses the writer with furious energy and limitless stamina. The author forgoes sleep, is driven to take daring risks, and embraces expansive imagination and grandiose thinking. © 2013 Scientific American
By BENEDICT CAREY Grading college students on quizzes given at the beginning of every class, rather than on midterms or a final exam, increases both attendance and overall performance, scientists reported Wednesday. The findings — from an experiment in which 901 students in a popular introduction to psychology course at the University of Texas took their laptops to class and were quizzed online — demonstrate that the computers can act as an aid to teaching, not just a distraction. Moreover, the study is the latest to show how tests can be used to enhance learning as well as measure it. The report, appearing in the journal PLoS One, found that this “testing effect” was particularly strong in students from lower-income households. Psychologists have known for almost a century that altering the timing of tests can affect performance. In the past decade, they have shown that taking a test — say, writing down all you can remember from a studied prose passage — can deepen the memory of that passage better than further study. The new findings stand as a large-scale prototype for how such testing effects can be exploited in the digital era, experts said, though they cautioned that it was not yet clear how widely they could be applied. “This study is important because it introduces a new method to implement frequent quizzing with feedback in large classrooms, which can be difficult to do,” said Jeffrey D. Karpicke, a professor of psychology at Purdue, who was not involved in the study. He added, “This is the first large study to show that classroom quizzing can help reduce achievement gaps” due to socioeconomic background. © 2013 The New York Times Company
Keyword: Learning & Memory
Link ID: 18960 - Posted: 11.23.2013
by Simon Makin "The only thing we have to fear is fear itself," said Franklin D. Roosevelt. He might have been onto something: research suggests that the anticipation of pain is actually worse than the pain itself. In other words, people are happy to endure a bit more pain, if it means they spend less time waiting for it. Classical theories of decision-making suppose that people bring rewards forward and postpone punishments, because we give far-off events less weight. This is called "temporal discounting". But this theory seems to go out the window when it comes to pain. One explanation for this is that the anticipation of pain is itself unpleasant, a phenomenon that researchers have appropriately termed "dread". To investigate how dread varies with time, Giles Story at University College London, and his colleagues, hooked up 33 volunteers to a device that gave them mild electric shocks. The researchers also presented people with a series of choices between more or less mildly painful shocks, sooner or later. During every "episode" there was a minimum of two shocks, which could rise to a maximum of 14, but before they were given them, people had to make a choice such as nine extra shocks now or six extra shocks five episodes from now. The number of shocks they received each time was determined by these past choices. Although a few people always chose to experience the minimum pain, 70 per cent of the time, on average, participants chose to receive the extra shocks sooner rather than a smaller number later. By varying the number of shocks and when they occurred, the team was able to figure out that the dread of pain increased exponentially as pain approached in time. Similar results occurred in a test using hypothetical dental appointments. © Copyright Reed Business Information Ltd.
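The finding that dread grows exponentially as pain approaches can be illustrated with a toy calculation. This sketch is my own illustration, not the authors' fitted model: it simply assumes that each episode spent waiting for n shocks carries a dread cost proportional to n that grows exponentially as the shocks get closer, with made-up parameters c and a.

```python
import math

def total_disutility(shocks: int, delay: int, c: float = 0.5, a: float = 0.5) -> float:
    """Pain of the shocks plus dread accumulated over `delay` episodes of waiting.

    Dread felt when the shocks are still j episodes away is modelled as
    c * shocks * exp(-a * j): it grows exponentially as j shrinks toward zero.
    """
    dread = sum(c * shocks * math.exp(-a * j) for j in range(1, delay + 1))
    return shocks + dread

# The article's example choice: nine extra shocks now vs six shocks five
# episodes from now. Under these hypothetical parameters the accumulated
# dread makes the smaller-but-later option costlier overall, matching the
# majority preference for "get it over with".
sooner = total_disutility(9, 0)  # no wait, so no dread: just the 9 shocks
later = total_disutility(6, 5)
print(sooner < later)
```

With these parameters the model reproduces the qualitative result; different values of c and a would shift where the sooner/later trade-off tips, which is what varying the number and timing of shocks let the researchers estimate.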
By Victoria Gill Science reporter, BBC News Great tits use different alarm calls for different predators, according to a scientist in Japan. The researcher analysed the birds' calls and found they made "jar" sounds for snakes and "chicka" sounds for crows and martens. This, he says, is the first demonstration that birds can communicate vocally about the type of predator threatening them. The findings are published in the journal Animal Behaviour. From his previous observations, the researcher, Dr Toshitaka Suzuki, from the Graduate University for Advanced Studies in Kanagawa, found great tits appeared to be able to discriminate between different predators. To test whether they could also communicate this information, he placed models of three different animals that prey on nestlings - snakes, crows and martens - close to the birds' nest boxes. He then recorded and analysed the birds' responses. "Parents usually make alarm calls when they approach and mob the nest predators," said Dr Suzuki. "They produced specific 'jar' alarm calls for the snakes and the same 'chicka' alarm call in response to both the crows and martens," he said. But a closer analysis of the sounds showed the birds had used different "note combinations" in their crow alarm calls from those they had used for the martens. Dr Suzuki thinks the birds might have evolved what he called a "combinatorial communication system" - combining different notes to produce calls with different meanings. Since snakes are able to slither into nest boxes, they pose a much greater threat to great tit nestlings than other birds or mammals, so Dr Suzuki says it makes sense that the birds would have a specific snake alarm call. BBC © 2013
By Neuroskeptic I am sitting reading a book. After a while, I get up and make a cup of coffee. I’ve been thinking about this scenario lately as I’ve pondered ‘what remains to be discovered’ in our understanding of the brain. By this I mean, what (if anything) prevents neuroscience from at least sketching out an explanation for all of human behaviour? A complete explanation of any given behaviour – such as my reading a particular book – would be impossible, as it would require detailed knowledge of all my brain activity. But neuroscience could sketch an account of some stages of the reading. We have models for how my motor cortex and cerebellum might coordinate my fingers to turn the pages of my book. Other models try to make sense of the recognition of the letters by my visual cortex. This is what I mean by ‘beginning to account for’. We have theories that are not wholly speculative. While we don’t yet have the whole story of motor control or visual perception, we have made a start. Yet I’m not sure that we can even begin to explain: why did I stop what I was doing, get up, and make coffee at that particular time? The puzzle, it seems, does not lie in my actual choice to make some coffee (as opposed to not making it). We could sketch an explanation for how, once the mental image (memory) of coffee ‘crossed my mind’, that image set off dopamine firing (i.e. I like coffee), and this dopamine, acting on corticostriatal circuits, selected the action of making coffee over the less promising alternatives. But why did that mental image of coffee cross my mind in the first place? And why did it do so just then, not thirty seconds before or afterwards?
Link ID: 18957 - Posted: 11.23.2013
Erika Check Hayden Researchers have shown that just two genes from the Y chromosome — that genetic emblem of masculinity in most mammals — are all that is needed for a male mouse to conceive healthy offspring using assisted reproduction. The same team had previously reported that male mice missing only seven genes from their Y chromosomes could father healthy babies. The study brings researchers one step closer to creating mice that can be fathers without any contribution from the Y chromosome at all. The findings also have implications for human infertility, because the work suggests that the assisted-reproduction technique used in the mice might be safer for human use than is currently thought. “To me it is a further demonstration that there isn't much left on the poor old Y chromosome that is essential. Who needs a Y?” says Jennifer Marshall Graves, a geneticist at the La Trobe Institute for Molecular Science in Melbourne, Australia, who was not involved in the research. An embryo without a Y chromosome normally develops into a female, but biologists have long questioned whether the entire chromosome is necessary to produce a healthy male. A single gene from the Y chromosome, called Sry, is known to be sufficient to create an anatomically male mouse — albeit one that will be infertile because it will lack some of the genes involved in producing sperm — as researchers have shown by removing the Y chromosome and inserting Sry into other chromosomes. © 2013 Nature Publishing Group
Keyword: Sexual Behavior
Link ID: 18956 - Posted: 11.23.2013
By Gary Stix The emerging academic discipline of neuroethics has been driven, in part, by the recognition that introducing brain scans as legal evidence is fraught with peril. Most neuroscientists think that a brain scan is unable to provide an accurate representation of the state of mind of a defendant or determine whether his frontal lobes predispose to some wanton action. The consensus view holds that studying spots on the wrinkled cerebral cortex that are bigger or smaller in some criminal offenders may hint at overarching insights into the roots of violence, but lack the requisite specificity to be used as evidence in any individual case. “I believe that our behavior is a production of activity in our brain circuits,” Steven E. Hyman of the Broad Institute of Harvard and MIT told a session at the American Association for the Advancement of Science’s annual meeting earlier this year. “But I would never tell a parole board to decide whether to release somebody or hold on to somebody, based on their brain scan as an individual, because I can’t tell what are the causal factors in that individual.” It doesn’t seem to really matter, though, what academic experts believe about the advisability of brain scans as Exhibit One at trial. The entry of neuroscience in the courtroom has already begun, big time. The introduction of a brain scan in a legal case was once enough to generate local headlines. No more. Hundreds of legal opinions each year have begun to invoke the science of mind and brain to bolster legal arguments—references not only to brain scans, but a range of studies that show that the amygdala is implicated in this or the anterior cingulate cortex is at fault for that. The legal establishment, in short, has begun a love affair with all things brain. © 2013 Scientific American
by Anil Ananthaswamy Can you tickle yourself if you are fooled into thinking that someone else is tickling you? A new experiment says no, challenging a widely accepted theory about how our brains work. It is well known that we can't tickle ourselves. In 2000, Sarah-Jayne Blakemore of University College London (UCL) and colleagues came up with a possible explanation. When we intend to move, the brain sends commands to the muscles, but also predicts the sensory consequences of the impending movement. When the prediction matches the actual sensations that arise, the brain dampens down its response to those sensations. This prevents us from tickling ourselves (NeuroReport, DOI: 10.1097/00001756-200008030-00002). Jakob Hohwy of Monash University in Clayton, Australia, and colleagues decided to do a tickle test while simultaneously subjecting people to a body swap illusion. In this illusion, the volunteer and experimenter sat facing each other. The subject wore goggles that displayed the feed from a head-mounted camera. In some cases the camera was mounted on the subject's head, so that they saw things from their own perspective, while in others it was mounted on the experimenter's head, providing the subject with the experimenter's perspective. Using their right hands, both the subject and the experimenter held on to opposite ends of a wooden rod, which had a piece of foam attached to each end. The subject and experimenter placed their left palms against the foam at their end. Next, the subject or the experimenter took turns to move the rod with their right hand, causing the piece of foam to tickle both of their left palms. © Copyright Reed Business Information Ltd.
Link ID: 18954 - Posted: 11.21.2013
By Evelyn Boychuk Ever since Toronto Mayor Rob Ford admitted to having smoked crack cocaine, various city councillors and media observers have publicly advised him to seek drug counselling. But in a CNN interview that aired Nov. 18, Ford continued to stand by his message: “I’m not an addict.” The ongoing saga of the mayor’s crack use has raised unanswered questions about how addictive the drug really is. It’s been commonly accepted that crack is more addictive than other drugs, but addictions researchers and drug counsellors say it’s hard to compare the addictiveness of specific substances because drug-taking is a highly individual experience. Robin Haslam, director of operations and procedures for Addiction Canada, says that he has never met someone who can “just casually smoke crack.” However, people have different thresholds of addiction. “I know people who have used crack once, and never touched it again. I also know people who smoked marijuana once, and became very impaired,” he says. Carl Hart, author of High Price: A Neuroscientist's Journey of Self-Discovery That Challenges Everything You Know About Drugs and Society, told CBC Radio’s Day 6 that crack “is not uniquely addictive, or it’s not something that is special, as we have all been taught.” Hart said that the percentage of people that become addicted to crack is lower than most think. “For example, 10 to 20 per cent of people will become addicted — that means that 80 to 90 per cent of people won’t become addicted.” © CBC 2013
Keyword: Drug Abuse
Link ID: 18953 - Posted: 11.21.2013
By Jason Tetro For millennia, the human race has sought to combat psychological disorders through the intervention of natural – and eventually synthetic – chemicals. Originally, the sources for these psychoactive substances were the various fruits and flowers, including the Areca tree (betel nut), the poppy (opium), and the coca plant (cocaine). But in the 20th Century, new actives were being created in the lab thanks in part to the synthesis of lysergic acid diethylamide, better known as LSD, in 1938. By the middle of the 1950s, the psychiatric community was fascinated by the idea that mental health could be restored through the direct use of drugs or in combination with traditional psychotherapy. The idea took off in the 1960s as research continued to elucidate the biology of psychiatry. It essentially created a new avenue for psychiatric treatment: psychopharmacology. This inevitably led to the synthesis of a new compound, 3-(p-trifluoromethylphenoxy)-N-methyl-3-phenylpropylamine, which eventually became known as fluoxetine, and then, as we have all come to know it, Prozac. By the late 1980s, it was known by another name: the wonder drug. Today, pharmacologic compounds for psychiatric treatment are numerous, and up to 20% of all Americans are taking some type of psychotropic medication, at a cost of some $34 billion annually. While there have been calls for a reduction in use of these chemicals, primarily because many are ineffective, there is constant pressure from the public to have all their problems solved by a pill.
By James Gallagher Health and science reporter, BBC News The damage caused by concussion can be detected months after the injury and long after patients feel like they have recovered, brain scans show. Concussion has become highly controversial in sport, with concerns raised that players are putting their brains at risk. Researchers at the University of New Mexico said athletes may be being returned to action too quickly, while UK doctors said the attitude to head injury in sport was "too relaxed". Debate over concussion and head injury has led to resignations over new rules in rugby, controversy in football after a player was kept on the field after being knocked out, and has been a long-standing issue in American football. Concussion is an abnormal brain function that results from an external blast, jolt or impact to the head. Even if the knock does not result in a skull fracture, the brain can still experience a violent rattling that leads to injury. Because the brain is a soft gelatinous material surrounded by a rigid bony skull, such traumatic injuries can cause changes in brain function, such as bleeding, neuron damage and swelling. Research shows that repetitive concussions increase the risk of sustained memory loss, worsened concentration or prolonged headaches. The US study, published in the journal Neurology, compared the brains of 50 people who had mild concussion with 50 healthy people. BBC © 2013
Keyword: Brain Injury/Concussion
Link ID: 18951 - Posted: 11.21.2013
By JOYCE COHEN Earlier this fall, Seattle Seahawks fans at CenturyLink Field broke the world record for loudest stadium crowd with a skull-splitting 136.6 decibels. That volume, as the Seahawks’ website boasts, hits the scale somewhere between “serious hearing damage” and “eardrum rupture.” Just weeks later, Kansas City Chiefs fans at Arrowhead Stadium topped that number with 137.5 screaming decibels of their own. The measuring method used for the Guinness World Record has an edge of gimmickry. That A-weighted peak measurement, reached for a split second near the measuring device, displays the highest possible readout. For a vulnerable ear, however, game-day noise isn’t just harmless fun. With peaks and troughs, the decibel level of noise reaching a typical spectator averages in the mid-90s, and it persists for hours rather than a split second. Such noise is enough to cause permanent damage and to increase the likelihood of future damage. “The extent to which hearing-related issues get so little attention is amazing and troubling,” said M. Charles Liberman, a professor of otology at Harvard Medical School and director of a hearing research lab at the Massachusetts Eye and Ear Infirmary. “Many people are damaging their ears with repeated noise exposure such that their hearing abilities will significantly diminish as they age, much more so than if they were more careful,” he said. Ears are deceptive. Even if they seem to recover from the muffling, ringing and fullness after a rousing game, they don’t really recover. It’s not just the tiny sensory cells in the cochlea that are damaged by noise, Dr. Liberman said, but also the nerve fibers between the ears and the brain that degrade over time. Copyright 2013 The New York Times Company
Link ID: 18950 - Posted: 11.21.2013
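The article's numbers can be put in rough context with a standard occupational rule of thumb that the piece itself does not cite: NIOSH's recommended exposure limit of 85 dBA averaged over 8 hours, with a 3-dB exchange rate (every 3 dB of extra level halves the permissible daily exposure time). A minimal sketch of that arithmetic, not a clinical guideline:

```python
# NIOSH-style exposure arithmetic: 85 dBA is the 8-hour reference level,
# and each additional 3 dB halves the permissible daily exposure time.

def permissible_minutes(level_dba: float) -> float:
    """Permissible daily exposure in minutes at a given A-weighted sound level."""
    return 8 * 60 / 2 ** ((level_dba - 85) / 3)

# At the mid-90s average the article describes, the daily allowance is
# used up in under an hour, far less than the length of a football game.
print(permissible_minutes(85))            # 8 hours, the reference point
print(round(permissible_minutes(95), 1))  # roughly three quarters of an hour
```

By this rule, the record-setting 130-plus dB peaks are another matter entirely: the permissible time at such levels is measured in seconds, which is why the averaged mid-90s figure, sustained for hours, is the more meaningful number for spectators.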