Chapter 16
By CATHERINE SAINT LOUIS

Nearly 55 percent of infants nationwide are put to bed with soft blankets or covered by a comforter, even though such bedding raises the chances of suffocation or sudden infant death syndrome, federal researchers reported Monday. Their study, published in the journal Pediatrics, is the first to estimate how many infants sleep with potentially hazardous quilts, bean bags, blankets or pillows. Despite recommendations to avoid putting anything but a baby in a crib, two-thirds of black and Latino parents still use bedding that is both unnecessary and unsafe, the study also found.

“I was startled a little bit by the number of people still using bedding in the sleep area,” said Dr. Michael Goodstein, a neonatologist in York, Pa., who serves on a task force on sleep-related infant deaths at the American Academy of Pediatrics. “Sleeping face down on soft bedding increases the risks of SIDS 21-fold.” Among the risk factors for SIDS, “bedding has fallen through the cracks,” said Dr. Thomas G. Keens, the chairman of the California SIDS Advisory Council. “This article is a wake-up call.”

The new analysis looked at data gathered from 1993 to 2010 in the National Infant Sleep Position Study, which surveyed a random sample of nearly 19,000 parents by telephone. Use of infant bedding declined roughly 23 percent annually from 1993 to 2000. In recent years, however, the declines have slowed or stalled entirely. From 2001 to 2010, use of inappropriate bedding for white and Hispanic infants declined just 5 to 7 percent annually. There was no decline in the use of such bedding for black infants.

Parents in the new study were not asked their reasons for using bedding. Previous research has found that they worry infants will be cold, or that the crib mattress is too hard.

© 2014 The New York Times Company
By BILL PENNINGTON

It happens dozens of times in every N.F.L. game. There is a fierce collision, or perhaps a running back is slammed to the ground. Most of the time, all the players rise to their feet uneventfully. Other times, as the pileup unravels, a player gets up slowly. His gait may be unsteady.

For decades in the N.F.L., the operative term for the situation was that someone “got dinged.” It was a cute, almost harmless-sounding description of what was often a concussion or a worrying subconcussive blow to the head. But with the N.F.L. agreeing to pay hundreds of millions of dollars to settle a lawsuit brought by about 5,000 former players who said the league hid from them the dangers of repeated hits to the head, a backpedaling league has corrected its lingo and hastily amended its methodology.

The N.F.L. now has a concussion management protocol, outlined in an inches-thick document that commands teams to institute a specific, detailed game-day and postconcussion course of action. Once, the treatment of players with head injuries varied from team to team and could be haphazard. Beginning last season, all players suspected of having a head injury — should they lose consciousness from a collision or experience symptoms like a headache, dizziness or disorientation — were required to go through the concussion protocol system. It features a broad cast: a head-injury spotter in the press box, athletic trainers on the bench, doctors and neuro-trauma specialists on the sideline and experts in neuro-cognitive testing in the locker room.

The system is far from foolproof — players with serious symptoms remain in games. But as the N.F.L. grapples with a sobering threat to the welfare of its work force, not to mention a public-relations nightmare, the new concussion protocol is meant to establish a systemic, itemized policy on how potential brain injuries should be handled.

© 2014 The New York Times Company
Keyword: Brain Injury/Concussion
Link ID: 20372 - Posted: 12.01.2014
By John Edward Terrell

We will certainly hear it said many times between now and the 2016 elections that the country’s two main political parties have “fundamental philosophical differences.” But what exactly does that mean?

At least part of the schism between Republicans and Democrats is based in differing conceptions of the role of the individual. We find these differences expressed in the frequent heated arguments about crucial issues like health care and immigration. In a broad sense, Democrats, particularly the more liberal among them, are more likely to embrace the communal nature of individual lives and to strive for policies that emphasize that understanding. Republicans, especially libertarians and Tea Party members on the ideological fringe, however, often trace their ideas about freedom and liberty back to Enlightenment thinkers of the 17th and 18th centuries, who argued that the individual is the true measure of human value, and each of us is naturally entitled to act in our own best interests free of interference by others. Self-described libertarians generally also pride themselves on their high valuation of logic and reasoning over emotion.

Philosophers from Aristotle to Hegel have emphasized that human beings are essentially social creatures, that the idea of an isolated individual is a misleading abstraction. So it is not just ironic but instructive that modern evolutionary research, anthropology, cognitive psychology and neuroscience have come down on the side of the philosophers who have argued that the basic unit of human social life is not and never has been the selfish, self-serving individual. Contrary to libertarian and Tea Party rhetoric, evolution has made us a powerfully social species, so much so that the essential precondition of human survival is and always has been the individual plus his or her relationships with others.
© 2014 The New York Times Company
Link ID: 20371 - Posted: 12.01.2014
Daniel Freeman and Jason Freeman

“Although it is a waste of time to argue with a paranoid patient about his delusions, he may still be persuaded to keep them to himself, to repress them as far as possible and to forgo the aggressive action they might suggest, in general to conduct his life as if they did not exist.”

This quote from Clinical Psychiatry, a hugely influential textbook in the 1950s and 1960s, epitomises the way in which unusual mental states were generally understood for much of the 20th century. Delusions (such as paranoid thoughts) and hallucinations (hearing voices, for example) were of interest purely as symptoms of psychosis, or what used to be called madness. Apart from their utility in diagnosis, they were deemed to be meaningless: the incomprehensible effusions of a diseased brain. Or in the jargon: “empty speech acts, whose informational content refers to neither world nor self”. There’s a certain irony here, of course, in experts supposedly dedicated to understanding the way the mind works dismissing certain thoughts as unworthy of attention or explanation.

The medical response to these phenomena, which were considered to be an essentially biological problem, was to eradicate them with powerful antipsychotic drugs. This is not to say that other strategies weren’t attempted: in one revealing experiment in the 1970s, patients in a ward for “paranoid schizophrenics” in Vermont, US, were rewarded with tokens for avoiding “delusional talk”. These tokens could be exchanged for items including “meals, extra dessert, visits to the canteen, cigarettes, time off the ward, time in the TV and game room, time in bedroom between 8am and 9pm, visitors, books and magazines, recreation, dances on other wards.” (It didn’t work: most patients modified their behaviour temporarily, but “changes in a patient’s delusional system and general mental status could not be detected by a psychiatrist”.)

© 2014 Guardian News and Media Limited
Link ID: 20369 - Posted: 11.29.2014
By Jason G. Goldman

A sharp cry pierces the air. Soon a worried mother deer approaches the source of the sound, expecting to find her fawn. But the sound is coming from a speaker system, and the call isn’t that of a baby deer at all. It’s an infant fur seal’s.

Because deer and seals do not live in the same habitats, mother deer should not know how baby seal screams sound, reasoned biologists Susan Lingle of the University of Winnipeg and Tobias Riede of Midwestern University, who were running the acoustic experiment. So why did a mother deer react with concern?

Over two summers, the researchers treated herds of mule deer and white-tailed deer on a Canadian farm to modified recordings of the cries of a wide variety of infant mammals—elands, marmots, bats, fur seals, sea lions, domestic cats, dogs and humans. By observing how mother deer responded, Lingle and Riede discovered that as long as the fundamental frequency was similar to that of their own infants’ calls, those mothers approached the speaker as if they were looking for their offspring. Such a reaction suggests deep commonalities among the cries of most young mammals. (The mother deer did not show concern for white noise, birdcalls or coyote barks.) Lingle and Riede published their findings in October in the American Naturalist.

Researchers had previously proposed that sounds made by different animals during similar experiences—when they were in pain, for example—would share acoustic traits. “As humans, we often ‘feel’ for the cry of young animals,” Lingle says. That empathy may arise because emotions are expressed in vocally similar ways among mammals.

© 2014 Scientific American
By Virginia Morell

When we listen to someone talking, we hear some sounds that combine to make words and other sounds that convey such things as the speaker’s emotions and gender. The left hemisphere of our brain manages the first task, while the right hemisphere specializes in the second. Dogs also have this kind of hemispheric bias when listening to the sounds of other dogs. But do they have it with human sounds?

To find out, two scientists had dogs sit facing two speakers. The researchers then played a recorded short sentence—“Come on, then”—and watched which way the dogs turned. When the animals heard recordings in which individual words were strongly emphasized, they turned to the right—indicating that their left hemispheres were engaged. But when they listened to recordings that had exaggerated intonations, they turned to the left—a sign that the right hemisphere was responding.

Thus, dogs seem to process the elements of speech very similarly to the way humans do, the scientists report online today in Current Biology. According to the researchers, the findings support the idea that our canine pals are indeed paying close attention not only to who we are and how we say things, but also to what we say.

© 2014 American Association for the Advancement of Science.
By BENEDICT CAREY

Quick: Which American president served before slavery ended, John Tyler or Rutherford B. Hayes? If you need Google to get the answer, you are not alone. (It is Tyler.)

Collective cultural memory — for presidents, for example — works according to the same laws as the individual kind, at least when it comes to recalling historical names and remembering them in a given order, researchers reported on Thursday. The findings suggest that leaders who are well known today, like the elder President George Bush and President Bill Clinton, will be all but lost to public memory in just a few decades.

The particulars from the new study, which tested Americans’ ability to recollect the names of past presidents, are hardly jaw-dropping: People tend to recall best the presidents who served recently, as well as the first few in the country’s history. They also remember those who navigated historic events, like the ending of slavery (Abraham Lincoln) and World War II (Franklin D. Roosevelt).

But the broader significance of the report — the first to measure forgetfulness over a 40-year period, using a constant list — is that societies collectively forget according to the same formula as, say, a student who has studied a list of words. Culture imitates biology, even though the two systems work in vastly different ways. The new paper was published in the journal Science.

“It’s an exciting study, because it mixes history and psychology and finds this one-on-one correspondence” in the way memory functions, said David C. Rubin, a psychologist at Duke University who was not involved in the research.

The report is based on four surveys by psychologists now at Washington University in St. Louis, conducted from 1974 to 2014. In the first three, in 1974, 1991 and 2009, Henry L. Roediger III gave college students five minutes to write down as many presidents as they could remember, in order.

© 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 20364 - Posted: 11.29.2014
Ewen Callaway

Nerve cells that transmit pain, itch and other sensations to the brain have been made in the lab for the first time. Researchers say that the cells will be useful for developing new painkillers and anti-itch remedies, as well as understanding why some people experience unexplained extreme pain and itching.

“The short take-home message would be ‘pain and itch in a dish’, and we think that’s very important,” says Kristin Baldwin, a stem-cell scientist at the Scripps Research Institute in La Jolla, California, whose team converted mouse and human cells called fibroblasts into neurons that detect sensations such as pain, itch or temperature [1]. In a second paper [2], a separate team took a similar approach to making pain-sensing cells. Both efforts were published on 24 November in Nature Neuroscience.

Peripheral sensory neurons, as these cells are called, produce specialized ‘receptor’ proteins that detect chemical and physical stimuli and convey them to the brain. The receptor that a cell makes determines its properties — some pain-sensing cells respond to chilli oil, for example, and others respond to different pain-causing chemicals. Mutations in the genes encoding these receptors can cause some people to experience chronic pain or, in rare cases, to become impervious to pain.

To create these cells in the lab, independent teams led by Baldwin and by Clifford Woolf, a neuroscientist at Boston Children’s Hospital in Massachusetts, identified combinations of proteins that — when expressed in fibroblasts — transformed them into sensory neurons after several days. Baldwin’s team identified neurons that make receptors that detect sensations including pain, itch, and temperature, whereas Woolf’s team looked only at pain-detecting cells. Both teams generated cells that resembled neurons in shape and fired in response to capsaicin, which gives chilli peppers their kick, and mustard oil.

© 2014 Nature Publishing Group
Keyword: Pain & Touch
Link ID: 20362 - Posted: 11.26.2014
By Linda Searing

THE QUESTION Keeping your brain active by working is widely believed to protect memory and thinking skills as we age. Does the type of work matter?

THIS STUDY involved 1,066 people who, at an average age of 70, took a battery of tests to measure memory, processing speed and cognitive ability. The jobs they had held were rated by the complexity of dealings with people, data and things. Those whose main jobs required complex work, especially in dealings with people — such as social workers, teachers, managers, graphic designers and musicians — had higher cognitive scores than those who had held jobs requiring less-complex dealings, such as construction workers, food servers and painters. Overall, more-complex occupations were tied to higher cognitive scores, regardless of someone’s IQ, education or environment.

WHO MAY BE AFFECTED? Older adults. Cognitive abilities change with age, so it can take longer to recall information or remember where you placed your keys. That is normal and not the same thing as dementia, which involves severe memory loss as well as declining ability to function day to day. Commonly suggested ways to maintain memory and thinking skills include staying socially active, eating healthfully and getting adequate sleep as well as such things as doing crossword puzzles, learning to play a musical instrument and taking varied routes to common destinations when driving.
Keyword: Learning & Memory
Link ID: 20359 - Posted: 11.26.2014
By Sandra G. Boodman

“That’s it — I’m done,” Rachel Miller proclaimed, the sting of the neurologist’s judgment fresh as she recounted the just-concluded appointment to her husband. Whatever was wrong with her, Miller decided after that 2009 encounter, she was not willing to risk additional humiliation by seeing another doctor who might dismiss her problems as psychosomatic.

The Baltimore marketing executive had spent the previous two years trying to figure out what was causing her bizarre symptoms, some of which she knew made her sound delusional. Her eyes felt “weird,” although her vision was 20/20. Normal sounds seemed hugely amplified: at night when she lay in bed, her breathing and heartbeat were deafening. Water pounding on her back in the shower sounded like a roar. She was plagued by dizziness.

“I had started to feel like a person in one of those stories where someone has been committed to a mental hospital by mistake or malice and they desperately try to appear sane,” recalled Miller, now 53. She began to wonder if she really was crazy; numerous tests had ruled out a host of possible causes, including a brain tumor. Continuing to look for answers seemed futile, since all the doctors she had seen had failed to come up with anything conclusive. “My attitude was: If it’s something progressive like MS [multiple sclerosis] or ALS [amyotrophic lateral sclerosis], it’ll get bad enough that someone will eventually figure it out.”

Figuring it out would take nearly three more years and was partly the result of an oddity that Miller mentioned to another neurologist, after she lifted her moratorium on seeing doctors.
Link ID: 20353 - Posted: 11.25.2014
By Victoria Colliver

Marianne Austin watched her mother go blind from age-related macular degeneration, an eye disease that affects about 10 million older Americans. Now that Austin has been diagnosed with the same condition, she wants to avoid her mother’s experience.

“I’ve seen what can happen and the devastation it can cause,” said Austin, 67, of Atherton, who found out she had the disease last year. “I call it having seen the movie. I don’t like that ending, I want to change the movie, and I don’t want to wait 10 years until something is proven in research.”

About 10 percent of patients diagnosed with age-related macular degeneration will develop the form of the disease that causes permanent blindness. It’s unclear just how much genetics plays a role, so there’s no definitive way to predict who will progress to that stage or when that would happen. But a team of Stanford doctors think they may have found a way.

In a study, published this month in the medical journal Investigative Ophthalmology and Visual Science, researchers analyzed data from 2,146 retinal scans from 244 macular degeneration patients at Stanford from 2008 to 2013. They then created an algorithm that predicted whether a particular patient would be likely to develop the form of the disease that causes blindness within less than a year, three years or up to five years.

For those with macular degeneration to go blind, the disease has to advance from what is known as the “dry” form to the “wet” form. The sooner a doctor can notice changes, the better chance there is to save a patient’s vision.
By Christof Koch

Point to any one organ in the body, and doctors can tell you something about what it does and what happens if that organ is injured by accident or disease or is removed by surgery—whether it be the pituitary gland, the kidney or the inner ear. Yet like the blank spots on maps of Central Africa from the mid-19th century, there are structures whose functions remain unknown despite whole-brain imaging, electroencephalographic recordings that monitor the brain’s cacophony of electrical signals and other advanced tools of the 21st century.

Consider the claustrum. It is a thin, irregular sheet of cells, tucked below the neocortex, the gray matter that allows us to see, hear, reason, think and remember. It is surrounded on all sides by white matter—the tracts, or wire bundles, that interconnect cortical regions with one another and with other brain regions. The claustra—for there are two of them, one on the left side of the brain and one on the right—lie below the general region of the insular cortex, underneath the temples, just above the ears. They assume a long, thin wisp of a shape that is easily overlooked when inspecting the topography of a brain image.

Advanced brain-imaging techniques that look at the white matter fibers coursing to and from the claustrum reveal that it is a neural Grand Central Station. Almost every region of the cortex sends fibers to the claustrum. These connections are reciprocated by other fibers that extend back from the claustrum to the originating cortical region. Neuroanatomical studies in mice and rats reveal a unique asymmetry—each claustrum receives input from both cortical hemispheres but only projects back to the overlying cortex on the same side. Whether or not this is true in people is not known. Curiouser and curiouser, as Alice would have said.

© 2014 Scientific American
Link ID: 20350 - Posted: 11.24.2014
By Amy Ellis Nutt

[Photo caption: Debbie Hall undergoes external brain stimulation at Ohio State’s Wexner Medical Center. Hall was partially paralyzed on her left side after a stroke. Doctors are conducting a study to see if a device known as NexStim can “prep” a stroke victim’s brain immediately prior to physical therapy so that the therapy will be more effective. (The Ohio State University Wexner Medical Center)]

Using non-invasive transcranial magnetic stimulation, or TMS, researchers at Ohio State Wexner Medical Center may have found a way to help prep a stroke victim’s brain prior to physical therapy to aid a more complete recovery.

When one side of the brain is damaged by a stroke, the corresponding healthy part goes into overdrive in order to compensate, said Dr. Marcie Bockbrader, principal investigator of the study. She believes the hyperactivity in the healthy side may actually slow recovery in the injured side.

The technology, called NexStim, employs TMS to prepare a stroke patient’s brain for physical therapy by sending low-frequency magnetic pulses painlessly through a victim’s scalp to suppress activity in the healthy part of the motor cortex. This allows the injured side to make use of more energy during physical therapy, which immediately follows the transcranial magnetic stimulation.

“This device targets the overactive side, quieting it down enough, so that through therapies the injured side can learn to express itself again,” said Bockbrader, an assistant professor of physical medicine and rehabilitation, in a news release.
Link ID: 20349 - Posted: 11.24.2014
by Hal Hodson

Yet another smartwatch launched this week. Called Embrace, it is rather different from the latest offerings from Apple, Samsung and Motorola: it can spot the warning signs of an epileptic seizure.

Embrace was developed by Matteo Lai and his team at a firm called Empatica, with the help of Rosalind Picard at the Massachusetts Institute of Technology. It measures the skin’s electrical activity as a proxy for changes deep in the brain, and uses a model built on years of clinical data to tell which changes portend a seizure. It also gathers the usual temperature and motion data that smartwatches collect, allowing the wearer to measure physical activity and sleep quality.

Empatica launched a crowdfunding campaign on Indiegogo on Tuesday and has already raised more than $120,000. Backers who pledge $169 will receive an Embrace watch.

The idea for the wristband came when Picard and her colleagues were running a study on the emotional states of children with autism, measuring skin conductance at the wrist as part of the study. Picard noticed that one of the children had registered a spike in electrical activity that turned out to have happened 20 minutes before they noticed the symptoms of a seizure. “It shocked me when I realised these things were showing up on the wrist,” says Picard.

The whole point of Embrace is to prevent sudden unexplained death in epilepsy (SUDEP). Its causes are not fully understood, but Picard says they understand enough to know how to reduce the chances of dying after an epileptic seizure.

© Copyright Reed Business Information Ltd.
Kate Szell

“I once asked Clara who she was. It was so embarrassing, but she’d had a haircut, so how was I to know?” That’s Rachel, she’s 14 and counts Clara as one of her oldest and best friends. There’s nothing wrong with Rachel’s sight, yet she struggles to recognise others. Why? Rachel is face blind.

Most of us take for granted the fact that we recognise someone after a quick glance at their face. We don’t realise we’re doing something very different when we look at a face compared with when we look at anything else. To get a feeling of how peculiar facial recognition is, try recognising people by looking at their hands, instead of their faces. Tricky? That’s exactly how Rachel feels – only she’s not looking at hands, she’s looking straight into someone’s eyes.

Specific areas of the brain process facial information. Damage to those areas gives rise to prosopagnosia or “face blindness”: an inability or difficulty with recognising faces. While brain damage-induced prosopagnosia is rare, prosopagnosia itself is not. Studies suggest around 2% of the population could have some form of prosopagnosia. These “developmental” prosopagnosics seem to be born without the ability to recognise faces and don’t acquire it, relying instead on all manner of cues, from gait to hairstyles, to tell people apart.

Kirsten Dalrymple from the University of Minnesota is one of a handful of researchers looking into developmental prosopagnosia. Her particular interest is in prosopagnosic children. “Some seem to cope without much of a problem but, for others, it’s a totally different story,” she says. “They can become very socially withdrawn and can also be at risk of walking off with strangers.”

© 2014 Guardian News and Media Limited
Link ID: 20347 - Posted: 11.24.2014
By CLYDE HABERMAN

The notion that a person might embody several personalities, each of them distinct, is hardly new. The ancient Romans had a sense of this and came up with Janus, a two-faced god. In the 1880s, Robert Louis Stevenson wrote “Strange Case of Dr. Jekyll and Mr. Hyde,” a novella that provided us with an enduring metaphor for good and evil corporeally bound. Modern comic books are awash in divided personalities like the Hulk and Two-Face in the Batman series. Even heroic Superman has his alternating personas.

But few instances of the phenomenon captured Americans’ collective imagination quite like “Sybil,” the study of a woman said to have had not two, not three (like the troubled figure in the 1950s’ “Three Faces of Eve”), but 16 different personalities. Alters, psychiatrists call them, short for alternates. As a mass-market book published in 1973, “Sybil” sold in the millions. Tens of millions watched a 1976 television movie version. The story had enough juice left in it for still another television film in 2007.

Sybil Dorsett, a pseudonym, became the paradigm of a psychiatric diagnosis once known as multiple personality disorder. These days, it goes by a more anodyne label: dissociative identity disorder. Either way, the strange case of the woman whose real name was Shirley Ardell Mason made itself felt in psychiatrists’ offices across the country. Pre-“Sybil,” the diagnosis was rare, with only about 100 cases ever having been reported in medical journals. Less than a decade after “Sybil” made its appearance, in 1980, the American Psychiatric Association formally recognized the disorder, and the numbers soared into the thousands.

People went on television to tell the likes of Jerry Springer and Leeza Gibbons about their many alters. One woman insisted that she had more than 300 identities within her (enough, if you will, to fill the rosters of a dozen major-league baseball teams). Even “Eve,” whose real name is Chris Costner Sizemore, said in the mid-1970s that those famous three faces were surely an undercount. It was more like 22, she said.

© 2014 The New York Times Company
Link ID: 20346 - Posted: 11.24.2014
Christopher Stringer

Indeed, skeletal evidence from every inhabited continent suggests that our brains have become smaller in the past 10,000 to 20,000 years. How can we account for this seemingly scary statistic?

Some of the shrinkage is very likely related to the decline in humans’ average body size during the past 10,000 years. Brain size is scaled to body size because a larger body requires a larger nervous system to service it. As bodies became smaller, so did brains. A smaller body also suggests a smaller pelvic size in females, so selection would have favored the delivery of smaller-headed babies.

What explains our shrinking body size, though? This decline is possibly related to warmer conditions on the earth in the 10,000 years after the last ice age ended. Colder conditions favor bulkier bodies because they conserve heat better. As we have acclimated to warmer temperatures, the way we live has also generally become less physically demanding, which overall serves to drive down body weights.

Another likely reason for this decline is that brains are energetically expensive and will not be maintained at larger sizes unless it is necessary. The fact that we increasingly store and process information externally—in books, computers and online—means that many of us can probably get by with smaller brains. Some anthropologists have also proposed that larger brains may be less efficient at certain tasks, such as rapid computation, because of longer connection pathways.

© 2014 Scientific American
Link ID: 20345 - Posted: 11.24.2014
by Linda Geddes

A tapeworm that usually infects dogs, frogs and cats has made its home inside a man’s brain. Sequencing its genome showed that it contains around 10 times more DNA than any other tapeworm sequenced so far, which could explain its ability to invade many different species.

When a 50-year-old Chinese man was admitted to a UK hospital complaining of headaches, seizures, an altered sense of smell and memory flashbacks, his doctors were stumped. Tests for tuberculosis, syphilis, HIV and Lyme disease were negative, and although an MRI scan showed an abnormal region in the right side of his brain, a biopsy found inflammation, but no tumour.

Over the next four years, further MRIs recorded the abnormal region moving across the man’s brain (see animation), until finally his doctors decided to operate. To their immense surprise, they pulled out a 1 centimetre-long ribbon-shaped worm. It looked like a tapeworm, but was unlike any seen before in the UK, so a sample of its tissue was sent to Hayley Bennett and her colleagues at the Wellcome Trust Sanger Institute in Cambridge, UK.

Genetic sequencing identified it as Spirometra erinaceieuropaei, a rare species of tapeworm found in China, South Korea, Japan and Thailand. Just 300 human infections have been reported since 1953, and not all of them in the brain.

© Copyright Reed Business Information Ltd.
Keyword: Brain imaging
Link ID: 20344 - Posted: 11.21.2014
By Tara Parker-Pope

Most people who drink to get drunk are not alcoholics, suggesting that more can be done to help heavy drinkers cut back, a new government report concludes.

The finding, from a government survey of 138,100 adults, counters the conventional wisdom that every “falling-down drunk” must be addicted to alcohol. Instead, the results from the National Survey on Drug Use and Health show that nine out of 10 people who drink too much are not addicts, and can change their behavior with a little — or perhaps a lot of — prompting.

“Many people tend to equate excessive drinking with alcohol dependence,” said Dr. Robert Brewer, who leads the alcohol program at the Centers for Disease Control and Prevention. “We need to think about other strategies to address these people who are drinking too much but who are not addicted to alcohol.”

Excessive drinking is viewed as a major public health problem that results in 88,000 deaths a year, from causes that include alcohol poisoning and liver disease, to car accidents and other accidental deaths. Excessive drinking is defined as drinking too much at one time or over the course of a week. For men, it’s having five or more drinks in one sitting or 15 drinks or more during a week. For women, it’s four drinks on one occasion or eight drinks over the course of a week. Underage drinkers and women who drink any amount while pregnant also are defined as “excessive drinkers.”

Surprisingly, about 29 percent of the population meets the definition for excessive drinking, but 90 percent of them do not meet the definition of alcoholism. That’s good news because it means excessive drinking may be an easier problem to solve than previously believed.

© 2014 The New York Times Company
Keyword: Drug Abuse
Link ID: 20342 - Posted: 11.21.2014
By Jyoti Madhusoodanan

Eurasian jays are tricky thieves. They eavesdrop on the noises that other birds make while hiding food in order to steal the stash later, new research shows.

Scientists trying to figure out if the jays (Garrulus glandarius) could remember sounds and make use of the information placed trays of two materials—either sand or gravel—in a spot hidden from a listening jay’s view. Other avian participants of the same species, which were given a nut, cached the treat in one of the two trays. Fifteen minutes later, the listening bird was permitted to hunt up the stash (video). When food lay buried in a less noisy material such as sand, jays searched randomly. But if they heard gravel being tossed around as treats were hidden, they headed to the pebbles to pilfer the goods.

Previous studies have shown that jays—like crows, ravens, and other bird burglars that belong to the corvid family—can remember where they saw food being hidden and return to the spot to look for the cache. But these new results, published in Animal Cognition this month, provide the first evidence that these corvids can also recollect sounds to locate and steal stashes of food. In their forest homes, where birds are heard more often than they are seen, this sneaky strategy might give eavesdropping jays a better chance at finding hidden feasts.
Link ID: 20339 - Posted: 11.21.2014