Chapter 16.




by Viviane Callier It's a fresh problem. People who smoke menthol cigarettes often smoke more frequently and can be less likely to quit – and it could be because fresh-tasting menthol is changing their brains to be more sensitive to nicotine. How menthol enhances nicotine addiction has been something of a mystery. Now, Brandon Henderson at the California Institute of Technology in Pasadena and his colleagues have shown that exposing mice to menthol alone causes them to develop more nicotinic receptors, the parts of the brain that are targeted by nicotine. Menthol can be used medically to relieve minor throat irritations, and menthol-flavoured cigarettes were first introduced in the 1920s. But smokers of menthol cigarettes can be less likely to quit. In one study of giving up smoking, 50 per cent of unflavoured-cigarette smokers were able to quit, while menthol smokers showed quitting rates as low as 23 per cent, depending on ethnicity. Over time, smokers of both menthol and unflavoured cigarettes acquire more receptors for nicotine, particularly in neurons involved in the body's neural pathways for reward and motivation. And research last year showed that smokers of menthol cigarettes develop even more of these receptors than smokers of unflavoured cigarettes. To understand how menthol may be altering the brain, Henderson's team exposed mice to either menthol with nicotine, or menthol alone. They found that, even without nicotine, menthol increased the number of brain nicotinic receptors. They saw a 78 per cent increase in one particular brain region – the ventral tegmental area – which is involved in the dopamine signalling pathway that mediates addiction. © Copyright Reed Business Information Ltd.

Keyword: Drug Abuse
Link ID: 20395 - Posted: 12.06.2014

Injections of a new drug may partially relieve paralyzing spinal cord injuries, based on indications from a study in rats, which was partly funded by the National Institutes of Health. The results demonstrate how fundamental laboratory research may lead to new therapies. “We’re very excited at the possibility that millions of people could, one day, regain movements lost during spinal cord injuries,” said Jerry Silver, Ph.D., professor of neurosciences, Case Western Reserve University School of Medicine, Cleveland, and a senior investigator of the study published in Nature. Every year, tens of thousands of people are paralyzed by spinal cord injuries. The injuries crush and sever the long axons of spinal cord nerve cells, blocking communication between the brain and the body and resulting in paralysis below the injury. On a hunch, Bradley Lang, Ph.D., the lead author of the study and a graduate student in Dr. Silver’s lab, came up with the idea of designing a drug that would help axons regenerate without having to touch the healing spinal cord, as current treatments may require. “Originally this was just a side project we brainstormed in the lab,” said Dr. Lang. After spinal cord injury, axons try to cross the injury site and reconnect with other cells but are stymied by scarring that forms after the injury. Previous studies suggested their movements are blocked when the protein tyrosine phosphatase sigma (PTP sigma), an enzyme found in axons, interacts with chondroitin sulfate proteoglycans, a class of sugary proteins that fill the scars.

Keyword: Regeneration
Link ID: 20394 - Posted: 12.04.2014

Ewen Callaway A shell found on Java in the late 1800s was recently found to bear markings that seem to have been carved intentionally half a million years ago. The photograph is about 15 millimetres wide. A zigzag engraving on a shell from Indonesia is the oldest abstract marking ever found. But what is most surprising about the half-a-million-year-old doodle is its likely creator — the human ancestor Homo erectus. "This is a truly spectacular find and has the potential to overturn the way we look at early Homo," says Nick Barton, an archaeologist at the University of Oxford, UK, who was not involved in the discovery, which is described in a paper published online in Nature on 3 December [1]. By 40,000 years ago, and probably much earlier, anatomically modern humans — Homo sapiens — were painting on cave walls in places as far apart as Europe [2] and Indonesia [3]. Simpler ochre engravings found in South Africa date to 100,000 years ago [4]. Earlier this year, researchers reported a 'hashtag' engraving in a Gibraltar cave once inhabited by Neanderthals [5]. That was the first evidence for drawing in any extinct species. But until the discovery of the shell engraving, nothing approximating art has been ascribed to Homo erectus. The species emerged in Africa about 2 million years ago and trekked as far as the Indonesian island of Java, before going extinct around 140,000 years ago. Most palaeoanthropologists consider the species to be the direct ancestor of both humans and Neanderthals. © 2014 Nature Publishing Group

Keyword: Evolution
Link ID: 20390 - Posted: 12.04.2014

Jia You Ever wonder how cockroaches scurry around in the dark while you fumble to switch on the kitchen light? Scientists know the insect navigates with its senses of touch and smell, but now they have found a new piece to the puzzle: A roach can also see its environment in pitch darkness, by pooling visual signals from thousands of light-sensitive cells in each of its compound eyes, known as photoreceptors. To test the sensitivity of roach vision, researchers created a virtual reality system for the bugs, knowing that when the environment around a roach rotates, the insect spins in the same direction to stabilize its vision. First, they placed the roach on a trackball, where it couldn’t navigate with its mouthpart or antennae. Then the scientists spun black and white gratings around the insect, illuminated by light at intensities ranging from a brightly lit room to a moonless night. The roach responded to its rotating environment in light as dim as 0.005 lux, when each of its photoreceptors was picking up only one photon every 10 seconds, the researchers report online today in The Journal of Experimental Biology. They suggest that the cockroach must rely on unknown neural processing in the deep ganglia, an area in the base of the brain involved in coordinating movements, to process such complex visual information. Understanding this mechanism could help scientists design better imaging systems for night vision. © 2014 American Association for the Advancement of Science.

Keyword: Vision
Link ID: 20389 - Posted: 12.04.2014

Katharine Sanderson Although we do not have X-ray vision like Superman, we have what could seem to be another superpower: we can see infrared light — beyond what was traditionally considered the visible spectrum. A series of experiments now suggests that this little-known, puzzling effect could occur when pairs of infrared photons simultaneously hit the same pigment protein in the eye, providing enough energy to set in motion chemical changes that allow us to see the light. Received wisdom, and the known chemistry of vision, say that human eyes can see light with wavelengths between 400 (blue) and 720 nanometres (red). Although this range is still officially known as the 'visible spectrum', the advent of lasers with very specific infrared wavelengths brought reports that people were seeing laser light with wavelengths above 1,000 nm as white, green and other colours. Krzysztof Palczewski, a pharmacologist at Case Western Reserve University in Cleveland, Ohio, says that he has seen light of 1,050 nm from a low-energy laser. “You see it with your own naked eye,” he says. To find out whether this ability is common or a rare occurrence, Palczewski scanned the retinas of 30 healthy volunteers with a low-energy beam of light, and changed its wavelength. As the wavelength increased into the infrared (IR), participants found the light at first harder to detect, but at around 1,000 nm the light became easier to see. How humans can do this has puzzled scientists for years. Palczewski wanted to test two leading hypotheses to explain infrared vision. © 2014 Nature Publishing Group
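The two-photon hypothesis described above can be illustrated with standard photon-energy arithmetic (the 1,050 nm figure comes from the article; the calculation itself is textbook physics, not taken from the study):

```python
# Photon energy is E = hc / wavelength, so two photons absorbed together
# deliver the energy of a single photon at half the wavelength.

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s

def photon_energy_j(wavelength_nm: float) -> float:
    """Energy of one photon at the given wavelength, in joules."""
    return H * C / (wavelength_nm * 1e-9)

ir_nm = 1050.0                             # IR laser line Palczewski reports seeing
combined = 2 * photon_energy_j(ir_nm)      # two IR photons arriving at once
effective_nm = H * C / combined * 1e9      # one photon carrying that total energy
print(round(effective_nm))                 # 525 nm: squarely in the visible (green) range
```

This is consistent with the report that infrared laser light is sometimes perceived as green, since two 1,050 nm photons together carry the energy of a single 525 nm photon.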

Keyword: Vision
Link ID: 20388 - Posted: 12.03.2014

By CHRISTOPHER F. CHABRIS and DANIEL J. SIMONS NEIL DEGRASSE TYSON, the astrophysicist and host of the TV series “Cosmos,” regularly speaks to audiences on topics ranging from cosmology to climate change to the appalling state of science literacy in America. One of his staple stories hinges on a line from President George W. Bush’s speech to Congress after the 9/11 terrorist attacks. In a 2008 talk, for example, Dr. Tyson said that in order “to distinguish we from they” — meaning to divide Judeo-Christian Americans from fundamentalist Muslims — Mr. Bush uttered the words “Our God is the God who named the stars.” Dr. Tyson implied that President Bush was prejudiced against Islam in order to make a broader point about scientific awareness: Two-thirds of the named stars actually have Arabic names, given to them at a time when Muslims led the world in astronomy — and Mr. Bush might not have said what he did if he had known this fact. This is a powerful example of how our biases can blind us. But not in the way Dr. Tyson thought. Mr. Bush wasn’t blinded by religious bigotry. Instead, Dr. Tyson was fooled by his faith in the accuracy of his own memory. In his post-9/11 speech, Mr. Bush actually said, “The enemy of America is not our many Muslim friends,” and he said nothing about the stars. Mr. Bush had indeed once said something like what Dr. Tyson remembered; in 2003 Mr. Bush said, in tribute to the astronauts lost in the Columbia space shuttle explosion, that “the same creator who names the stars also knows the names of the seven souls we mourn today.” Critics pointed these facts out; some accused Dr. Tyson of lying and argued that the episode should call into question his reliability as a scientist and a public advocate. © 2014 The New York Times Company

Keyword: Learning & Memory
Link ID: 20387 - Posted: 12.03.2014

Katie Langin In the first couple of years after birth, sea lion sons seem to be more reliant on their mothers—consuming more milk and sticking closer to home—than sea lion daughters are, according to a study on Galápagos sea lions published in the December issue of the journal Animal Behaviour. The young males venture out to sea on occasion, but their female counterparts dive for their own food much more often. The curious thing is, it's not like the young males aren't capable of diving. As one-year-olds, males can dive to the same depth as females (33 feet, or 10 meters, on a typical dive). It's also not like their mother's milk is always on hand. Sea lion moms frequently leave their growing offspring for days at a time to find food at sea. (Watch a video of a Galápagos sea lion giving birth.) And yet, despite all this, for some reason sons are far less likely than daughters to take to the sea and seek out their own food. "We always saw the [young] males around the colony surfing in tide pools, pulling the tails of marine iguanas, resting, sleeping," said Paolo Piedrahita, a Ph.D. student at Bielefeld University in Germany and the lead author of the study. "It's amazing. You can see an animal—40 kilograms [88 pounds]—just resting, waiting for mom." © 1996-2014 National Geographic Society.

Keyword: Sexual Behavior; Evolution
Link ID: 20384 - Posted: 12.03.2014

Close to 8 percent of Americans have depression of some kind, but only about a third of those are getting treated for it, a major federal survey finds. The most depressed group? Women ages 40 to 59. More than 12 percent of women that age say they're depressed. The least? Teenage boys. Just 4 percent of them have been diagnosed with depression. "During 2009-2012, 7.6 percent of Americans aged 12 and over had depression (moderate or severe depressive symptoms in the past 2 weeks)," Laura Pratt and Debra Brody of the National Center for Health Statistics wrote. "About 3 percent of Americans aged 12 and over had severe depressive symptoms," they added. "Of those with severe symptoms, 35 percent reported having contact with a mental health professional in the past year." This is troubling, because depression can be difficult to treat and responds best when people are given a combination of drugs and counseling. People living below the poverty level were more than twice as likely to have depression as people making more money. Almost 43 percent of people with severe depressive symptoms reported serious difficulties in work, home and social activities.

Keyword: Depression
Link ID: 20383 - Posted: 12.03.2014

By Ryan Bradley Five years ago Viviana Gradinaru was slicing thin pieces of mouse brain in a neurobiology lab, slowly compiling images of the two-dimensional slivers for a three-dimensional computer rendering. In her spare time, she would go to see the Body Worlds exhibit. She was especially fascinated by the “plasticized” remains of the human circulatory system on display. It struck her that much of what she was doing in the lab could be done more efficiently with a similar process. “Tissue clearing” has been around for more than a century, but existing methods involve soaking tissue samples in solvents, which is slow and usually destroys the fluorescent proteins necessary for marking certain cells of interest. To create a better approach, Gradinaru, at the time a graduate student, and her colleagues in neuroscientist Karl Deisseroth's lab focused on replacing the tissue's lipid molecules, which make it opaque. To keep the tissue from collapsing, however, the replacement would need to give it structure, as lipids do. The first step was to euthanize a rodent and pump formaldehyde into its body, through its heart. Next they removed the skin and filled its blood vessels with acrylamide monomers, white, odorless, crystalline compounds. The monomers created a supportive hydrogel mesh, replacing the lipids and clearing the tissue. Before long, they could render an entire mouse body transparent in two weeks. Soon they were using transparent mice to map complete mouse nervous systems. The transparency made it possible for them to identify peripheral nerves—tiny bundles of nerves that are poorly understood—and to map the spread of viruses across the mouse's blood-brain barrier, which they did by marking the virus with a fluorescent agent, injecting it into the mouse's tail and watching it spread into the brain. © 2014 Scientific American

Keyword: Brain imaging
Link ID: 20382 - Posted: 12.03.2014

By Joyce Cohen Like many people, George Rue loved music. He played guitar in a band. He attended concerts often. In his late 20s, he started feeling a dull ache in his ears after musical events. After a blues concert almost nine years ago, “I left with terrible ear pain and ringing, and my life changed forever,” said Mr. Rue, 45, of Waterford, Conn. He perceived all but the mildest sounds as not just loud, but painful. It hurt to hear. Now, he has constant, burning pain in his ears, along with ringing, or tinnitus, so loud it’s “like a laser beam cutting a sheet of steel.” Everyday noise, like a humming refrigerator, adds a feeling of “needles shooting into my ears,” said Mr. Rue, who avoids social situations and was interviewed by email because talking by phone causes pain. Mr. Rue was given a diagnosis of hyperacusis, a nonspecific term that has assorted definitions, including “sound sensitivity,” “decreased sound tolerance,” and “a loudness tolerance problem.” But hyperacusis sometimes comes with ear pain, too, a poorly understood medical condition that is beginning to receive more serious attention. “This is clearly an emerging field,” said Richard Salvi of the Department of Communicative Disorders and Sciences at the University at Buffalo and a scientific adviser to Hyperacusis Research, a nonprofit group that funds research on the condition. “Further work is required to understand the symptoms, etiology and underlying neural mechanisms.” Loud noises, even when they aren’t painful, can damage both the sensory cells and sensory nerve fibers of the inner ear over time, causing hearing impairment, said M. Charles Liberman, a professor of otology at Harvard Medical School, who heads a hearing research lab at the Massachusetts Eye and Ear Infirmary. And for some people who are susceptible, possibly because of some combination of genes that gives them “tender” ears, noise sets in motion “an anomalous response,” he said. © 2014 The New York Times Company

Keyword: Hearing
Link ID: 20381 - Posted: 12.02.2014

By David Z. Hambrick If you’ve spent more than about 5 minutes surfing the web, listening to the radio, or watching TV in the past few years, you will know that cognitive training—better known as “brain training”—is one of the hottest new trends in self improvement. Lumosity, which offers web-based tasks designed to improve cognitive abilities such as memory and attention, boasts 50 million subscribers and advertises on National Public Radio. Cogmed claims to be “a computer-based solution for attention problems caused by poor working memory,” and BrainHQ will help you “make the most of your unique brain.” The promise of all of these products, implied or explicit, is that brain training can make you smarter—and make your life better. Yet, according to a statement released by the Stanford University Center on Longevity and the Berlin Max Planck Institute for Human Development, there is no solid scientific evidence to back up this promise. Signed by 70 of the world’s leading cognitive psychologists and neuroscientists, the statement minces no words: "The strong consensus of this group is that the scientific literature does not support claims that the use of software-based “brain games” alters neural functioning in ways that improve general cognitive performance in everyday life, or prevent cognitive slowing and brain disease." The statement also cautions that although some brain training companies “present lists of credentialed scientific consultants and keep registries of scientific studies pertinent to cognitive training…the cited research is [often] only tangentially related to the scientific claims of the company, and to the games they sell.” © 2014 Scientific American

Keyword: Learning & Memory
Link ID: 20380 - Posted: 12.02.2014

By Nicholas Bakalar Short-term psychotherapy may be an effective way to prevent repeated suicide attempts. Using detailed Danish government health records, researchers studied 5,678 people who had attempted suicide and then received a program of short-term psychotherapy based on needs, including crisis intervention, cognitive therapy, behavioral therapy, and psychodynamic and psychoanalytic treatment. They compared them with 17,034 people who had attempted suicide but received standard care, including admission to a hospital, referral for treatment or discharge with no referral. They were able to match the groups in more than 30 genetic, health, behavioral and socioeconomic characteristics. The study is online in Lancet Psychiatry. Treatment focused on suicide prevention and comprised eight to 10 weeks of individual sessions. Over a 20-year follow-up, 16.5 percent of the treated group attempted suicide again, compared with 19.1 percent of the untreated group. In the treated group, 1.6 percent died by suicide, compared with 2.2 percent of the untreated. “Suicide is a rare event,” said the lead author, Annette Erlangsen, an associate professor at the Johns Hopkins Bloomberg School of Public Health, “and you need a huge sample to study it. We had that, and we were able to find a significant effect.” The authors estimate that therapy prevented 145 suicide attempts and 30 deaths by suicide in the group studied. © 2014 The New York Times Company
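The authors' headline estimate can be sanity-checked with back-of-the-envelope arithmetic from the figures quoted above (the rates and group size are from the article; this is an illustration only, not the study's actual matched analysis, which explains the small discrepancy):

```python
# Rough check of the reported effect, using only figures quoted in the article.

treated_n = 5_678        # received short-term psychotherapy
control_n = 17_034       # received standard care

# Repeat suicide attempts over the 20-year follow-up
treated_attempt_rate = 0.165
control_attempt_rate = 0.191

# If the treated group had reattempted at the untreated group's rate,
# roughly this many additional attempts would have occurred:
attempts_prevented = (control_attempt_rate - treated_attempt_rate) * treated_n
print(round(attempts_prevented))  # ~148, close to the authors' estimate of 145

# Deaths by suicide
treated_death_rate = 0.016
control_death_rate = 0.022
deaths_prevented = (control_death_rate - treated_death_rate) * treated_n
print(round(deaths_prevented))    # ~34, in line with the authors' estimate of 30
```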

Keyword: Depression
Link ID: 20379 - Posted: 12.02.2014

By Sarah C. P. Williams Craving a stiff drink after the holiday weekend? Your desire to consume alcohol, as well as your body’s ability to break down the ethanol that makes you tipsy, dates back about 10 million years, researchers have discovered. The new finding not only helps shed light on the behavior of our primate ancestors, but also might explain why alcoholism—or even the craving for a single drink—exists in the first place. “The fact that they could put together all this evolutionary history was really fascinating,” says Brenda Benefit, an anthropologist at New Mexico State University, Las Cruces, who was not involved in the study. Scientists knew that the human ability to metabolize ethanol—allowing people to consume moderate amounts of alcohol without getting sick—relies on a set of proteins including the alcohol dehydrogenase enzyme ADH4. Although all primates have ADH4, which performs the crucial first step in breaking down ethanol, not all can metabolize alcohol; lemurs and baboons, for instance, have a version of ADH4 that’s less effective than the human one. Researchers didn’t know how long ago people evolved the more active form of the enzyme. Some scientists suspected it didn’t arise until humans started fermenting foods about 9000 years ago. Matthew Carrigan, a biologist at Santa Fe College in Gainesville, Florida, and colleagues sequenced ADH4 proteins from 19 modern primates and then worked backward to determine the sequence of the protein at different points in primate history. Then they created copies of the ancient proteins coded for by the different gene versions to test how efficiently each metabolized ethanol. They showed that the most ancient forms of ADH4—found in primates as far back as 50 million years ago—only broke down small amounts of ethanol very slowly. 
But about 10 million years ago, the team reports online today in the Proceedings of the National Academy of Sciences, a common ancestor of humans, chimpanzees, and gorillas evolved a version of the protein that was 40 times more efficient at ethanol metabolism. © 2014 American Association for the Advancement of Science.

Keyword: Drug Abuse; Evolution
Link ID: 20378 - Posted: 12.02.2014

by Andy Coghlan What would Stuart Little make of it? Mice have been created whose brains are half human. As a result, the animals are smarter than their siblings. The idea is not to mimic fiction, but to advance our understanding of human brain diseases by studying them in whole mouse brains rather than in dishes. The altered mice still have mouse neurons – the "thinking" cells that make up around half of all their brain cells. But practically all the glial cells in their brains, the ones that support the neurons, are human. "It's still a mouse brain, not a human brain," says Steve Goldman of the University of Rochester Medical Center in New York. "But all the non-neuronal cells are human." Goldman's team extracted immature glial cells from donated human fetuses. They injected them into mouse pups where they developed into astrocytes, a star-shaped type of glial cell. Within a year, the mouse glial cells had been completely usurped by the human interlopers. The 300,000 human cells each mouse received multiplied until they numbered 12 million, displacing the native cells. "We could see the human cells taking over the whole space," says Goldman. "It seemed like the mouse counterparts were fleeing to the margins." Astrocytes are vital for conscious thought, because they help to strengthen the connections between neurons, called synapses. Their tendrils (see image) are involved in coordinating the transmission of electrical signals across synapses. © Copyright Reed Business Information Ltd.

Keyword: Learning & Memory; Glia
Link ID: 20375 - Posted: 12.01.2014

By CATHERINE SAINT LOUIS Nearly 55 percent of infants nationwide are put to bed with soft blankets or covered by a comforter, even though such bedding raises the chances of suffocation or sudden infant death syndrome, federal researchers reported Monday. Their study, published in the journal Pediatrics, is the first to estimate how many infants sleep with potentially hazardous quilts, bean bags, blankets or pillows. Despite recommendations to avoid putting anything but a baby in a crib, two-thirds of black and Latino parents still use bedding that is both unnecessary and unsafe, the study also found. “I was startled a little bit by the number of people still using bedding in the sleep area,” said Dr. Michael Goodstein, a neonatologist in York, Pa., who serves on a task force on sleep-related infant deaths at the American Academy of Pediatrics. “Sleeping face down on soft bedding increases the risks of SIDS 21-fold.” Among the risk factors for SIDS, “bedding has fallen through the cracks,” said Dr. Thomas G. Keens, the chairman of the California SIDS Advisory Council. “This article is a wake-up call.” The new analysis looked at data gathered from 1993 to 2010 in the National Infant Sleep Position Study, which surveyed a random sample of nearly 19,000 parents by telephone. Use of infant bedding declined roughly 23 percent annually from 1993 to 2000. In recent years, however, the declines have slowed or stalled entirely. From 2001 to 2010, use of inappropriate bedding for white and Hispanic infants declined just 5 to 7 percent annually. There was no decline in the use of such bedding for black infants. Parents in the new study were not asked their reasons for using bedding. Previous research has found that they worry infants will be cold, or that the crib mattress is too hard. © 2014 The New York Times Company

Keyword: Sleep; Development of the Brain
Link ID: 20374 - Posted: 12.01.2014

By BILL PENNINGTON It happens dozens of times in every N.F.L. game. There is a fierce collision, or perhaps a running back is slammed to the ground. Most of the time, all the players rise to their feet uneventfully. Other times, as the pileup unravels, a player gets up slowly. His gait may be unsteady. For decades in the N.F.L., the operative term for the situation was that someone “got dinged.” It was a cute, almost harmless-sounding description of what was often a concussion or a worrying subconcussive blow to the head. But with the N.F.L. agreeing to pay hundreds of millions of dollars to settle a lawsuit brought by about 5,000 former players who said the league hid from them the dangers of repeated hits to the head, a backpedaling league has corrected its lingo and hastily amended its methodology. The N.F.L. now has a concussion management protocol, outlined in an inches-thick document that commands teams to institute a specific, detailed game-day and postconcussion course of action. Once, the treatment of players with head injuries varied from team to team and could be haphazard. Beginning last season, all players suspected of having a head injury — should they lose consciousness from a collision or experience symptoms like a headache, dizziness or disorientation — were required to go through the concussion protocol system. It features a broad cast: a head-injury spotter in the press box, athletic trainers on the bench, doctors and neuro-trauma specialists on the sideline and experts in neuro-cognitive testing in the locker room. The system is far from foolproof — players with serious symptoms remain in games. But as the N.F.L. grapples with a sobering threat to the welfare of its work force, not to mention a public-relations nightmare, the new concussion protocol is meant to establish a systemic, itemized policy on how potential brain injuries should be handled. © 2014 The New York Times Company

Keyword: Brain Injury/Concussion
Link ID: 20372 - Posted: 12.01.2014

By John Edward Terrell We will certainly hear it said many times between now and the 2016 elections that the country’s two main political parties have “fundamental philosophical differences.” But what exactly does that mean? At least part of the schism between Republicans and Democrats is based in differing conceptions of the role of the individual. We find these differences expressed in the frequent heated arguments about crucial issues like health care and immigration. In a broad sense, Democrats, particularly the more liberal among them, are more likely to embrace the communal nature of individual lives and to strive for policies that emphasize that understanding. Republicans, especially libertarians and Tea Party members on the ideological fringe, however, often trace their ideas about freedom and liberty back to Enlightenment thinkers of the 17th and 18th centuries, who argued that the individual is the true measure of human value, and each of us is naturally entitled to act in our own best interests free of interference by others. Self-described libertarians generally also pride themselves on their high valuation of logic and reasoning over emotion. The basic unit of human social life is not and never has been the selfish and self-serving individual. Philosophers from Aristotle to Hegel have emphasized that human beings are essentially social creatures, that the idea of an isolated individual is a misleading abstraction. So it is not just ironic but instructive that modern evolutionary research, anthropology, cognitive psychology and neuroscience have come down on the side of the philosophers who have argued that the basic unit of human social life is not and never has been the selfish, self-serving individual. Contrary to libertarian and Tea Party rhetoric, evolution has made us a powerfully social species, so much so that the essential precondition of human survival is and always has been the individual plus his or her relationships with others. 
© 2014 The New York Times Company

Keyword: Evolution
Link ID: 20371 - Posted: 12.01.2014

Daniel Freeman and Jason Freeman “Although it is a waste of time to argue with a paranoid patient about his delusions, he may still be persuaded to keep them to himself, to repress them as far as possible and to forgo the aggressive action they might suggest, in general to conduct his life as if they did not exist.” This quote from Clinical Psychiatry, a hugely influential textbook in the 1950s and 1960s, epitomises the way in which unusual mental states were generally understood for much of the 20th century. Delusions (such as paranoid thoughts) and hallucinations (hearing voices, for example) were of interest purely as symptoms of psychosis, or what used to be called madness. Apart from their utility in diagnosis, they were deemed to be meaningless: the incomprehensible effusions of a diseased brain. Or in the jargon: “empty speech acts, whose informational content refers to neither world nor self”. There’s a certain irony here, of course, in experts supposedly dedicated to understanding the way the mind works dismissing certain thoughts as unworthy of attention or explanation. The medical response to these phenomena, which were considered to be an essentially biological problem, was to eradicate them with powerful antipsychotic drugs. This is not to say that other strategies weren’t attempted: in one revealing experiment in the 1970s, patients in a ward for “paranoid schizophrenics” in Vermont, US, were rewarded with tokens for avoiding “delusional talk”. These tokens could be exchanged for items including “meals, extra dessert, visits to the canteen, cigarettes, time off the ward, time in the TV and game room, time in bedroom between 8am and 9pm, visitors, books and magazines, recreation, dances on other wards.” (It didn’t work: most patients modified their behaviour temporarily, but “changes in a patient’s delusional system and general mental status could not be detected by a psychiatrist”.) © 2014 Guardian News and Media Limited

Keyword: Schizophrenia
Link ID: 20369 - Posted: 11.29.2014

By Jason G. Goldman A sharp cry pierces the air. Soon a worried mother deer approaches the source of the sound, expecting to find her fawn. But the sound is coming from a speaker system, and the call isn't that of a baby deer at all. It's an infant fur seal's. Because deer and seals do not live in the same habitats, mother deer should not know how baby seal screams sound, reasoned biologists Susan Lingle of the University of Winnipeg and Tobias Riede of Midwestern University, who were running the acoustic experiment. So why did a mother deer react with concern? Over two summers, the researchers treated herds of mule deer and white-tailed deer on a Canadian farm to modified recordings of the cries of a wide variety of infant mammals—elands, marmots, bats, fur seals, sea lions, domestic cats, dogs and humans. By observing how mother deer responded, Lingle and Riede discovered that as long as the fundamental frequency was similar to that of their own infants' calls, those mothers approached the speaker as if they were looking for their offspring. Such a reaction suggests deep commonalities among the cries of most young mammals. (The mother deer did not show concern for white noise, birdcalls or coyote barks.) Lingle and Riede published their findings in October in the American Naturalist. Researchers had previously proposed that sounds made by different animals during similar experiences—when they were in pain, for example—would share acoustic traits. “As humans, we often ‘feel’ for the cry of young animals,” Lingle says. That empathy may arise because emotions are expressed in vocally similar ways among mammals. © 2014 Scientific American

Keyword: Language; Evolution
Link ID: 20368 - Posted: 11.29.2014

By Virginia Morell When we listen to someone talking, we hear some sounds that combine to make words and other sounds that convey such things as the speaker’s emotions and gender. The left hemisphere of our brain manages the first task, while the right hemisphere specializes in the second. Dogs also have this kind of hemispheric bias when listening to the sounds of other dogs. But do they have it with human sounds? To find out, two scientists had dogs sit facing two speakers. The researchers then played a recorded short sentence—“Come on, then”—and watched which way the dogs turned. When the animals heard recordings in which individual words were strongly emphasized, they turned to the right—indicating that their left hemispheres were engaged. But when they listened to recordings that had exaggerated intonations, they turned to the left—a sign that the right hemisphere was responding. Thus, dogs seem to process the elements of speech very similarly to the way humans do, the scientists report online today in Current Biology. According to the researchers, the findings support the idea that our canine pals are indeed paying close attention not only to who we are and how we say things, but also to what we say. © 2014 American Association for the Advancement of Science.

Keyword: Animal Communication; Language
Link ID: 20366 - Posted: 11.29.2014