Chapter 16.



Links 21 - 40 of 1883

By Nicholas Bakalar A new study suggests that early to bed and early to rise makes a man healthy — although not necessarily wealthy or wise. Korean researchers recruited 1,620 men and women, ages 47 to 59, and administered a questionnaire to establish whether they were morning people or night owls. They found 480 morning types, 95 night owls, and 1,045 who fit into neither group. The scientists measured all for glucose tolerance, body composition and waist size, and gathered information on other health and behavioral characteristics. The study is online in The Journal of Clinical Endocrinology & Metabolism. After controlling for an array of variables, they found that compared with morning people, men who were night owls were significantly more likely to have diabetes, and women night owls were more than twice as likely to have metabolic syndrome — high blood sugar levels, excess body fat around the waist, and abnormal lipid readings. The reasons for the effect are unclear, but the scientists suggest that consuming more calories after 8 p.m. and exposure to artificial light at night can both affect metabolic regulation. Can a night owl become a morning person? “Yes,” said the lead author, Dr. Nan Hee Kim, an endocrinologist at the Korea University College of Medicine. “It can be modified by external cues such as light, activity and eating behavior. But it isn’t known if this would improve the metabolic outcomes.” © 2015 The New York Times Company

Keyword: Sleep
Link ID: 20781 - Posted: 04.10.2015

Jon Hamilton Researchers have discovered the exact structure of the receptor that makes our sensory nerves tingle when we eat sushi garnished with wasabi. And because the "wasabi receptor" is also involved in pain perception, knowing its shape should help pharmaceutical companies develop new drugs to fight pain. The receptor, which scientists call TRPA1, is "an important molecule in the pain pathway," says David Julius, a professor of physiology at the University of California, San Francisco, and an author of a paper published in this week's Nature. "A dream of mine is that some of the work we do will translate into medicines people can take for chronic pain." Julius led a team that discovered the receptor about a decade ago. Since then, researchers have shown that TRPA1 receptors begin sending distress signals to the brain whenever they encounter pungent chemical irritants, including not only wasabi but also tear gas and air pollution from cars or wood fires. The receptors also become activated in response to chemicals released by the body itself when tissue becomes inflamed from an injury or a disease like rheumatoid arthritis. © 2015 NPR

Keyword: Pain & Touch
Link ID: 20780 - Posted: 04.10.2015

By Rachel Feltman If you give a mouse an eating disorder, you might just figure out how to treat the disease in humans. In a new study published Thursday in Cell Press, researchers created mice that lacked a gene associated with disordered eating in humans. Without it, the mice showed behaviors not unlike those seen in humans with eating disorders: They tended to be obsessive-compulsive and have trouble socializing, and they were less interested in eating high-fat food than the control mice. The findings could lead to novel drug treatments for some of the 24 million Americans estimated to suffer from eating disorders. In a 2013 study, the same researchers went looking for genes that might contribute to the risk of an eating disorder. Anorexia nervosa and bulimia nervosa aren't straightforwardly inherited -- there's definitely more to an eating disorder than your genes -- but it does seem like some families might have higher risks than others. Sure enough, the study of two large families, each with several members who had eating disorders, yielded mutations in two interacting genes. In one family, the estrogen-related receptor α (ESRRA) gene was mutated. The other family had a mutation on another gene that seemed to affect how well ESRRA could do its job. So in the latest study, they created mice that didn't have ESRRA in the parts of the brain associated with eating disorders. "You can't go testing this kind of gene expression in a human," lead author and University of Iowa neuroscientist Michael Lutter said. "But in mice, you can manipulate the expression of the gene and then look at how it changes their behavior."

Keyword: Anorexia & Bulimia; Genes & Behavior
Link ID: 20778 - Posted: 04.10.2015

Mo Costandi In 2009, researchers at the University of California, Santa Barbara performed a curious experiment. In many ways, it was routine — they placed a subject in the brain scanner, displayed some images, and monitored how the subject's brain responded. The measured brain activity showed up on the scans as red hot spots, as in many other neuroimaging studies. Except that this time, the subject was an Atlantic salmon, and it was dead. Dead fish do not normally exhibit any kind of brain activity, of course. The study was a tongue-in-cheek reminder of the problems with brain scanning studies. Those colorful images of the human brain found in virtually all news media may have captivated the imagination of the public, but they have also been the subject of controversy among scientists over the past decade or so. In fact, neuro-imagers are now debating how reliable brain scanning studies actually are, and are still mostly in the dark about exactly what it means when they see some part of the brain "light up." Functional magnetic resonance imaging (fMRI) measures brain activity indirectly by detecting changes in the flow of oxygen-rich blood, or the blood oxygen-level dependent (BOLD) signal, with its powerful magnets. The assumption is that areas receiving an extra supply of blood during a task have become more active. Typically, researchers would home in on one or a few "regions of interest," using 'voxels,' tiny cube-shaped chunks of brain tissue containing several million neurons, as their units of measurement.
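The salmon result traces back to a statistical pitfall rather than to fishy neurobiology: a whole-brain scan contains tens of thousands of voxels, and if each one is tested separately against a noisy baseline, a handful will cross any conventional significance threshold purely by chance. The toy simulation below is only an illustration of that multiple-comparisons problem, not an analysis of real fMRI data; the voxel count and the thresholds are assumed round numbers.

```python
import random

random.seed(1)

N_VOXELS = 50_000       # assumed order of magnitude for a whole-brain scan
Z_UNCORRECTED = 3.09    # one-sided z cutoff for a per-voxel p < 0.001
Z_CORRECTED = 4.75      # approximate cutoff after a Bonferroni-style correction
                        # (family-wise 0.05 spread over 50,000 tests)

# Pure noise: no voxel carries any real signal at all.
z_scores = [random.gauss(0.0, 1.0) for _ in range(N_VOXELS)]

uncorrected_hits = sum(z > Z_UNCORRECTED for z in z_scores)
corrected_hits = sum(z > Z_CORRECTED for z in z_scores)

print(f"voxels flagged with an uncorrected threshold: {uncorrected_hits}")  # around 50
print(f"voxels flagged after correction:              {corrected_hits}")    # almost always 0
```

Under these assumptions, tightening the per-voxel threshold to account for the number of tests is what makes the spurious "hot spots" disappear.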

Keyword: Brain imaging
Link ID: 20775 - Posted: 04.10.2015

Jordan Gaines Lewis Hodor hodor hodor. Hodor hodor? Hodor. Hodor-hodor. Hodor! Oh, um, excuse me. Did you catch what I said? Fans of the hit HBO show Game of Thrones, the fifth season of which premieres this Sunday, know what I’m referencing, anyway. Hodor is the brawny, simple-minded stableboy of the Stark family in Winterfell. His defining characteristic, of course, is that he only speaks a single word: “Hodor.” But those who read the A Song of Ice and Fire book series by George R R Martin may know something that the TV fans don’t: his name isn’t actually Hodor. According to his great-grandmother Old Nan, his real name is Walder. “No one knew where ‘Hodor’ had come from,” she says, “but when he started saying it, they started calling him by it. It was the only word he had.” Whether he intended it or not, Martin created a character who is a textbook example of someone with a neurological condition called expressive aphasia. In 1861, French physician Paul Broca was introduced to a man named Louis-Victor Leborgne. While his comprehension and mental functioning remained relatively normal, Leborgne progressively lost the ability to produce meaningful speech over a period of 20 years. Like Hodor, the man was nicknamed Tan because he only spoke a single word: “Tan.”

Keyword: Language
Link ID: 20773 - Posted: 04.10.2015

|By Gareth Cook The wait has been long, but the discipline of neuroscience has finally delivered a full-length treatment of the zombie phenomenon. In their book, Do Zombies Dream of Undead Sheep?, scientists Timothy Verstynen and Bradley Voytek cover just about everything you might want to know about the brains of the undead. It's all good fun, and if you learn some serious neuroscience along the way, well, that's fine with them, too. Voytek answered questions from contributing editor Gareth Cook. How is it that you and your co-author came to write a book about zombies? Clearly, it is an urgent public health threat, but I would not have expected a book from neuroscientists on the topic. Indeed! You think you're prepared for the zombie apocalypse and then—BAM!—it happens, and only then do you realize how poorly prepared you really were. Truly the global concern of our time. Anyway, this whole silly thing started when Tim and I would get together to watch zombie movies with our wives and friends. Turns out when you get some neuroscientists together to watch zombie movies, after a few beers they start to diagnose them and mentally dissect their brains. Back in the summer of 2010 zombie enthusiast and author—and head of the Zombie Research Society—Matt Mogk got in touch with me to see if we were interested in doing something at the intersection of zombies and neuroscience. © 2015 Scientific American

Keyword: Miscellaneous
Link ID: 20772 - Posted: 04.10.2015

By Jonathan Webb Science reporter, BBC News Living in total darkness, the animals' eyes have disappeared over millions of years. A study of blind crustaceans living in deep, dark caves has revealed that evolution is rapidly withering the visual parts of their brain. The findings catch evolution in the act of making this adjustment: none of the critters have eyes, but some of them still have stumpy eye-stalks. Three different species were studied, each representing a different subgroup within the same class of crustaceans. The research is published in the journal BMC Neuroscience. The class of "malacostracans" also includes much better-known animals like lobsters, shrimps and wood lice, but this study focussed on three tiny and obscure examples that were only discovered in the 20th Century. It is the first investigation of these mysterious animals' brains. "We studied three species. All of them live in caves, and all of them are very rare or hardly accessible," said lead author Dr Martin Stegner, from the University of Rostock in Germany. Specifically, his colleagues retrieved the specimens from the coast of Bermuda, from Table Mountain in South Africa, and from Monte Argentario in Italy. The animals were preserved rather than living, so the team could not observe their tiny brains in action. But by looking at the physical shape of the brain, and making comparisons with what we know about how the brain works in their evolutionary relatives, the researchers were able to assign jobs to the various lobes, lumps and spindly structures they could see under the microscope. © 2015 BBC.

Keyword: Evolution; Vision
Link ID: 20769 - Posted: 04.08.2015

Tom Bawden Scientists have deciphered the secrets of gibbon “speech” – discovering that the apes are sophisticated communicators employing a range of more than 450 different calls to talk to their companions. The research is so significant that it could provide clues on the evolution of human speech and also suggests that other animal species could speak a more precise language than has been previously thought, according to lead author Dr Esther Clarke of Durham University. Her study found that gibbons produce different categories of “hoo” calls – relatively quiet sounds that are distinct from their more melodic “song” calls. These categories of call allow the animals to distinguish when their fellow gibbons are foraging for food, alerting them to distant noises or warning others about the presence of predators. In addition, Dr Clarke found that each category of “hoo” call can be broken down further, allowing gibbons to be even more specific in their communication. A warning about lurking raptor birds, for example, sounds different to one about pythons or clouded leopards – being pitched at a particularly low frequency to ensure it is too deep for the birds of prey to hear. The warning call denoting the presence of tigers and leopards is the same because they belong to the same class of big cats, the research found. © independent.co.uk

Keyword: Language; Evolution
Link ID: 20768 - Posted: 04.08.2015

Do Alcoholics Anonymous participants do better at abstinence than nonparticipants because they are more motivated? Or is it because of something inherent in the A.A. program? How researchers answered these questions in a recent study offers insight into the challenges of evidence-based medicine and evidence-informed policy. The study, published in the journal Alcoholism: Clinical and Experimental Research, teased apart a treatment effect (improvement due to A.A. itself) and a selection effect (driven by the type of people who seek help). The investigators found that there is a genuine A.A. treatment effect. Going to an additional two A.A. meetings per week produced at least three more days of alcohol abstinence per month. Separating treatment from selection effects is a longstanding problem in social and medical science. Their entanglement is one of the fundamental ways in which evidence of correlation fails to be a sign of causation. For many years, researchers and clinicians have debated whether the association of A.A. with greater abstinence was caused by treatment or was a correlation that arises from the type of people who seek it. Such confounding is often addressed with an experiment in which individuals are randomly assigned to either a treatment or a nontreatment (or control) group in order to remove the possibility of self-selection. The treatment effect is calculated by comparing outcomes obtained by participants in each group. Several studies of A.A. have applied this approach. For instance, Kimberly Walitzer, Kurt Dermen and Christopher Barrick randomly assigned alcoholics either to a treatment that strongly encouraged and supported A.A. participation or to a control group. Those in the treatment group exhibited a greater degree of abstinence. In an ideal randomized controlled trial (R.C.T.), everyone selected for treatment receives it and no one in the control group does. The difference in outcomes is the treatment effect, free of bias from selection. That’s the ideal. However, in practice, randomized controlled trials can still suffer selection problems. © 2015 The New York Times Company
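To see why separating the two effects matters, here is a small simulation sketch with entirely made-up numbers, not figures from the study: motivation drives both meeting attendance and abstinence, so a naive comparison of attenders with non-attenders folds the selection effect into the estimate, while random assignment isolates the treatment effect.

```python
import random

random.seed(0)

TRUE_EFFECT = 3.0  # extra abstinent days per month caused by attending (made-up)
N = 100_000        # simulated drinkers per scenario

def abstinent_days(motivation, attends):
    # Motivation raises abstinence on its own (the selection channel);
    # attendance adds the genuine treatment effect on top, plus noise.
    return 5 + 10 * motivation + (TRUE_EFFECT if attends else 0) + random.gauss(0, 2)

def mean(xs):
    return sum(xs) / len(xs)

# Observational scenario: more motivated people are more likely to choose A.A.
observational = []
for _ in range(N):
    motivation = random.random()
    attends = random.random() < motivation
    observational.append((attends, abstinent_days(motivation, attends)))
naive = (mean([d for a, d in observational if a])
         - mean([d for a, d in observational if not a]))

# Idealized trial: a coin flip decides attendance, so motivation is balanced
# across groups and the comparison isolates the treatment effect.
trial = []
for _ in range(N):
    motivation = random.random()
    attends = random.random() < 0.5
    trial.append((attends, abstinent_days(motivation, attends)))
rct_estimate = (mean([d for a, d in trial if a])
                - mean([d for a, d in trial if not a]))

print(f"naive attender vs non-attender gap: {naive:.1f} days")        # ~6: treatment + selection
print(f"randomized comparison:              {rct_estimate:.1f} days") # ~3: treatment only
```

With these assumed numbers the naive comparison roughly doubles the true three-day effect, which is exactly the kind of bias the study's authors had to disentangle.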

Keyword: Drug Abuse
Link ID: 20767 - Posted: 04.08.2015

by Hal Hodson For a few days last summer, a handful of students walked through a park behind the University of Hannover in Germany. Each walked solo, but followed the same route as the others: made the same turns, walked the same distance. This was odd, because none of them knew where they were going. Instead, their steps were steered from a phone 10 paces behind them, which sent signals via Bluetooth to electrodes attached to their legs. These stimulated the students' muscles, guiding their steps without any conscious effort. Max Pfeiffer of the University of Hannover was the driver. His project directs electrical current into the students' sartorius, the longest muscle in the human body, which runs from the inside of the knee to the top of the outer thigh. When it contracts, it pulls the leg out and away from the body. To steer his test subjects left, Pfeiffer would zap their left sartorius, opening their gait and guiding them in that direction. Pfeiffer hopes his system will free people's minds up for other things as they navigate the world, allowing them to focus on their conversation or enjoy their surroundings. Tourists could keep their eyes on the sights while being imperceptibly guided around the city. Acceptance may be the biggest problem, although it is possible that the rise of wearable computing might help. Pfeiffer says the electrode's current causes a tingling sensation that diminishes the more someone uses the system. Volunteers said they were comfortable with the system taking control of their leg muscles, but only if they felt they could take control back. © Copyright Reed Business Information Ltd

Keyword: Robotics
Link ID: 20761 - Posted: 04.06.2015

Drawing on the widest survey of sexual behaviour since the Kinsey Report, David Spiegelhalter, in his book Sex By Numbers, answers key questions about our private lives. Here he reveals how Kinsey’s contested claim that 10% of us are gay is actually close to the mark For a single statistic to be the primary propaganda weapon for a radical political movement is unusual. Back in 1977, the US National Gay Task Force (NGTF) was invited into the White House to meet President Jimmy Carter’s representatives – a first for gay and lesbian groups. The NGTF’s most prominent campaigning slogan was “we are everywhere”, backed up by the memorable statistical claim that one in 10 of the US population was gay – this figure was deeply and passionately contested. So where did Bruce Voeller, a scientist who was a founder and first director of the NGTF, get this nice round 10% from? To find out, we have to delve back into Alfred Kinsey’s surveys in 1940s America, which were groundbreaking at the time but are now seen as archaic in their methods: he sought out respondents in prisons and the gay underworld, made friends with them and, over a cigarette, noted down their behaviours using an obscure code. Kinsey did not believe that sexual identity was fixed and simply categorised, and perhaps his most lasting contribution was his scale, still used today, in which individuals are rated from exclusively heterosexual to exclusively homosexual on a scale of 0 to 6. Kinsey’s headline finding was that “at least 37% of the male population has some homosexual experience between the beginning of adolescence and old age”, meaning physical contact to the point of orgasm. © 2015 Guardian News and Media Limited

Keyword: Sexual Behavior
Link ID: 20760 - Posted: 04.06.2015

by Bethany Brookshire A new round of dietary do’s and don’ts accompanied last month’s scientific report on the latest food research, summarizing everything from aspartame to saturated fats. The report puts eggs back on the menu. High dietary cholesterol is no longer linked to blood cholesterol in most healthy people. But what grabbed the headlines? Coffee, of course. Many of us are happy to raise a mug to our legal stimulant of choice, especially with the report’s suggestion that three to five cups of joe get a pass. But where do these numbers come from? What science do nutrition experts take into account to determine whether coffee is harmful or safe? And — perhaps the most important question — what does “three to five cups” really mean? The good news for coffee comes from the 2015 Dietary Guidelines Advisory Committee, a group of experts in nutrition and health appointed by the Department of Health and Human Services and the U.S. Department of Agriculture to review the science behind what Americans should eat. The report, released February 19, is not the be-all-end-all of what should be on our plates and in our cups. Instead, it’s a scientific report intended to help the HHS and USDA make policy decisions for the next edition of the Dietary Guidelines for Americans, due out later this year. This is the first time the U.S. Dietary Guidelines have addressed coffee at all. But now, there is enough science on coffee to make a closer look worthwhile, says Tom Brenna, a food scientist at Cornell University and a member of the Committee. “There was so much evidence out there,” he says. “Instead of just five or six papers on the subject, there’s a huge number.” © Society for Science & the Public 2000 - 2015

Keyword: Drug Abuse
Link ID: 20758 - Posted: 04.06.2015

Cory Turner To survive, we humans need to be able to do a handful of things: breathe, of course. And drink and eat. Those are obvious. We're going to focus now on a less obvious — but no less vital — human function: learning. Because new research out today in the journal Science sheds light on the very building blocks of learning. Imagine an 11-month-old sitting in a high chair opposite a small stage where you might expect, say, a puppet show. Except this is a lab at Johns Hopkins University. Instead of a puppeteer, a researcher is rolling a red and blue striped ball down a ramp, toward a little wall at the bottom. Even babies seem to know the ball can't go through that wall, though not necessarily because they learned it. It's what some scientists call core knowledge — something, they say, we're born with. "Some pieces of knowledge are so fundamental in guiding regular, everyday interactions with the environment, navigating through space, reaching out and picking up an object, avoiding an oncoming object — those things are so fundamental to survival that they're really selected for by evolution," says Lisa Feigenson, a professor of psychological and brain sciences at Hopkins and one of the researchers behind this study. Which explains why the baby seems genuinely surprised when the ball rolls down the ramp and does go through the wall — thanks to some sleight of hand by the researchers: © 2015 NPR

Keyword: Development of the Brain
Link ID: 20756 - Posted: 04.04.2015

By Matt McFarland The individuals who have founded some of the most successful tech companies are decidedly weird. Examine the founder of a truly innovative company and you’ll find a rebel without the usual regard for social customs. This raises the question: why? Why aren’t more “normal” people with refined social graces building tech companies that change the world? Why are only those on the periphery reaching great heights? If you ask tech investor Peter Thiel, the problem is a social environment that’s both powerful and destructive. Only individuals with traits reminiscent of Asperger’s Syndrome, which frees them from an attachment to social conventions, have the strength to create innovative businesses amid a culture that discourages daring entrepreneurship. “Many of the more successful entrepreneurs seem to be suffering from a mild form of Asperger’s where it’s like you’re missing the imitation, socialization gene,” Thiel said Tuesday at George Mason University. “We need to ask what is it about our society where those of us who do not suffer from Asperger’s are at some massive disadvantage because we will be talked out of our interesting, original, creative ideas before they’re even fully formed. Oh that’s a little bit too weird, that’s a little bit too strange and maybe I’ll just go ahead and open the restaurant that I’ve been talking about that everyone else can understand and agree with, or do something extremely safe and conventional.” An individual with Asperger’s Syndrome — a form of autism — has limited social skills, a willingness to obsess and an interest in systems. Those diagnosed with Asperger’s Syndrome tend to be unemployed or underemployed at rates that far exceed the general population. Fitting into the world is difficult.

Keyword: Autism
Link ID: 20755 - Posted: 04.04.2015

Emily Hodgkin As a nation we think we understand autism. Since the first discovery of the condition just over 70 years ago, awareness of autism has continued to grow. Despite this, 87 per cent of people affected by autism think the general public has a poor understanding of the condition. Many of the common myths surrounding autism have been debunked - including the perception that people with autism can’t hold a job. But only 15 per cent of adults in the UK with autism are in full-time employment, while 61 per cent of people with autism currently not in employment say they want to work. Research suggests that employers are missing out on abilities that people on the autism spectrum have in greater abundance – such as heightened abilities in pattern recognition and logical reasoning, as well as a greater attention to detail. Mark Lever, chief executive of the National Autistic Society (NAS), said: "It's remarkable that awareness has increased so much since the NAS was set up over 50 years ago, a time when people with the condition were often written off and hidden from society. But, as our supporters frequently tell us and the poll confirms, there is still a long way to go before autism is fully understood and people with the condition are able to participate fully in their community. All too often we still hear stories of families experiencing judgemental attitudes or individuals facing isolation or unemployment due to misunderstandings or myths around autism.” There are around 700,000 autistic people in the UK – more than 1 in 100. So, given that it's more common than many might expect, what other myths still exist? © independent.co.uk

Keyword: Autism
Link ID: 20754 - Posted: 04.04.2015

By Amy Ellis Nutt and Brady Dennis For people with amyotrophic lateral sclerosis, which attacks the body’s motor neurons and renders a person unable to move, swallow or breathe, the search for an effective treatment has been a crushing disappointment. The only drug available for the disease, approved two decades ago, typically extends life just a few months. Then in the fall, a small California biotech company named Genervon began extolling the benefits of GM604, its new ALS drug. In an early-stage trial with 12 patients, the results were “statistically significant,” “very robust” and “dramatic,” the company said in news releases. Such enthusiastic pronouncements are unusual for such a small trial. In February, Genervon took an even bolder step: It applied to the Food and Drug Administration for “accelerated approval,” which allows promising treatments for serious or life-threatening diseases to bypass costly, large-scale efficacy trials and go directly to market. ALS patients responded by pleading with the FDA, in emotional videos and e-mails, to grant broad access to the experimental drug. Online forums lit up, and a Change.org petition calling for rapid approval attracted more than a half-million signatures. “Why would anyone oppose it?” asked ALS patient David Huntley in a letter read aloud in the past week at a rally on Capitol Hill. Huntley, a former triathlete, can no longer speak or travel, so his wife, Linda Clark, flew from San Diego to speak for him.

Keyword: ALS-Lou Gehrig's Disease
Link ID: 20752 - Posted: 04.04.2015

Davide Castelvecchi Boots rigged with a simple spring-and-ratchet mechanism are the first devices that do not require power aids such as batteries to make walking more energy efficient. People walking in the boots expend 7% less energy than they do walking in normal shoes, the devices’ inventors report on 1 April in Nature. That may not sound like much, but the mechanics of the human body have been shaped by millions of years of evolution, and some experts had doubted that there was room for further improvement in human locomotion, short of skating along on wheels. “It is the first paper of which I’m aware that demonstrates that a passive system can reduce energy expenditure during walking,” says Michael Goldfarb, a mechanical engineer at Vanderbilt University in Nashville, Tennessee, who develops exoskeletons for aiding people with disabilities. As early as the 1890s, inventors tried to boost the efficiency of walking by using devices such as rubber bands, says study co-author Gregory Sawicki, a biomedical engineer and locomotion physiologist at North Carolina State University in Raleigh. More recently, engineers have built unpowered exoskeletons that enable people to do tasks such as lifting heavier weights — but do not cut down the energy they expend. (Biomechanists still debate whether the running ‘blades’ made famous by South African sprinter Oscar Pistorius are more energetically efficient than human feet.) For their device, Sawicki and his colleagues built a mechanism that parallels human physiology. When a person swings a leg forward to walk, elastic energy is stored mostly in the Achilles tendon of their standing leg. That energy is released when the standing leg's foot pushes into the ground and the heel lifts off, propelling the body forwards. “There is basically a catapult in our ankle,” Sawicki says. © 2015 Nature Publishing Group

Keyword: Robotics
Link ID: 20750 - Posted: 04.02.2015

By Catherine Saint Louis Joni Mitchell, 71, was taken to a hospital in Los Angeles on Tuesday after she was found unconscious at her Los Angeles home. In recent years, the singer has complained of a number of health problems, including one particularly unusual ailment: Morgellons disease. People who believe they have the condition report lesions that don’t heal, “fibers” extruding from their skin and uncomfortable sensations like pins-and-needles tingling or stinging. Sufferers may also report fatigue and problems with short-term memory and concentration. But Morgellons is not a medically accepted diagnosis. Scientists have struggled for nearly a decade to find a cause and have come up mostly empty-handed. Researchers at the Centers for Disease Control and Prevention studied 115 people who said they had the condition. In a report published in 2012, they said they were unable to identify an infectious source for the patients’ “unexplained dermopathy.” There was no evidence of an environmental link, and the “fibers” from patients resembled those from clothing that had gotten trapped in a scab or crusty skin. The investigators cast doubt on Morgellons as a distinct condition and said that it might be something doctors were already familiar with: delusional infestation, a psychiatric condition characterized by an unshakable but erroneous belief that one’s skin is infested with bugs or parasites. Drug use can contribute to such delusions, and the investigators noted evidence of drug use — prescription or illicit — in half of the people they examined. Of the 36 participants who completed neuropsychological testing, 11 percent had high scores for depression, and 63 percent, unsurprisingly, were preoccupied with health issues. © 2015 The New York Times Company

Keyword: Pain & Touch
Link ID: 20749 - Posted: 04.02.2015

The commonly prescribed drug acetaminophen, or paracetamol, does nothing to help low back pain, and may affect the liver when used regularly, a large new international study has confirmed. Reporting in today's issue of the British Medical Journal, researchers also say the benefits of the drug are unlikely to be worth the risks when it comes to treating osteoarthritis in the hip or knee. "Paracetamol has been widely recommended as being a safe medication, but what we are saying now is that paracetamol doesn't bring any benefit for patients with back pain, and it brings only trivial benefits to those with osteoarthritis," Gustavo Machado of The George Institute for Global Health and the University of Sydney tells the Australian Broadcasting Corporation. "In addition to that it might bring harm to those patients." Most international clinical guidelines recommend acetaminophen as the "first choice" of treatment for low back pain and osteoarthritis of the hip and knee. However, despite a trial last year questioning the use of acetaminophen to treat low back pain, there has never been a systematic review of the evidence for this. Machado and colleagues analyzed three clinical trials and confirmed that acetaminophen is no better than placebo at treating low back pain. An analysis of 10 other clinical trials by the researchers quantified for the first time the effect acetaminophen has on reducing pain from osteoarthritis in the knee and hip. "We concluded that it is too small to be clinically worthwhile," says Machado. He says the effects of acetaminophen on the human body are not well understood and just because it can stop headaches, it doesn't mean the drug will work in all circumstances. ©2015 CBC/Radio-Canada.

Keyword: Pain & Touch
Link ID: 20748 - Posted: 04.02.2015

Alison Abbott Historian of psychology Douwe Draaisma knows well how to weave science, history and literature into irresistible tales. Forgetting, his latest collection of essays around the theme of memory, is — like his successful Nostalgia Factory (Yale University Press, 2013) — hard to put down. His vivid tour through the history of memory-repression theories brings home how dangerous and wrong, yet persistent, were the ideas of Sigmund Freud and his intellectual heirs. Freud thought that traumatic memories and shameful thoughts could be driven from the consciousness, but not forgotten. They would simmer in the unconscious, influencing behaviour. He maintained that forcing them out with psychoanalysis, and confronting patients with them, would be curative. Draaisma relates the case of an 18-year-old whom Freud dubbed Dora, diagnosed in 1900 with 'hysteria'. Dora's family refused to believe that the husband of her father's mistress had made sexual advances to her. Among other absurdities, Freud told Dora that her nervous cough reflected her repressed desire to fellate the man. Dora broke off the therapy, which Freud saw as proof of his theory. He thought that patients will naturally resist reawakening painful thoughts. What Dora did not buy, plenty of others did. Psychoanalysis boomed, becoming lucrative. Its principles were adopted in the 1990s by an unlikely alliance of lawyers and some feminists, who argued that repressed memories of childhood abuse could be recovered with techniques such as hypnosis, and used as evidence in court. Many judges went along with it; the rush of claims cast a shadow over genuine cases of abuse, Draaisma points out. We now know from studies of post-traumatic stress disorder that traumatic memories are impossible to repress. They flood into the conscious mind in horrifying flashbacks. © 2015 Macmillan Publishers Limited

Keyword: Learning & Memory
Link ID: 20747 - Posted: 04.02.2015