Most Recent Links



Featured Article

'Language Gene' Has a Partner

Few genes have made the headlines as much as FOXP2. The first gene associated with language disorders, it was later implicated in the evolution of human speech. Girls make more of the FOXP2 protein, which may help explain their precociousness in learning to talk. Now, neuroscientists have figured out how one of its molecular partners helps Foxp2 exert its effects.

The findings may eventually lead to new therapies for inherited speech disorders, says Richard Huganir, a neurobiologist at Johns Hopkins University School of Medicine in Baltimore, Maryland, who led the work. Foxp2 controls the activity of a gene called Srpx2, he notes, which helps some of the brain's nerve cells beef up their connections to other nerve cells. By establishing what SRPX2 does, researchers can look for defective copies of it in people who have trouble talking or learning to talk.

Until 2001, scientists were not sure how genes influenced language. Then Simon Fisher, a neurogeneticist now at the Max Planck Institute for Psycholinguistics in Nijmegen, the Netherlands, and his colleagues fingered FOXP2 as the culprit in a family with several members who had trouble with pronunciation, putting words together, and understanding speech. These people cannot move their tongue and lips precisely enough to talk clearly, so even family members often can't figure out what they are saying. It “opened a molecular window on the neural basis of speech and language,” Fisher says.

Photo credit: Yoichi Araki, Ph.D.



Hannah Devlin, science correspondent They may stop short of singing The Bells of Saint Mary’s, as demonstrated by the mouse organ in Monty Python, but scientists have discovered that male mice woo females with ultrasonic songs. The study shows for the first time that mouse song varies depending on the context and that male mice have a specific style of vocalisation reserved for when they smell a female in the vicinity. In turn, females appear to be more interested in this specific style of serenade than other types of squeak that male mice produce. “It was surprising to me how much change occurs to these songs in different social contexts, when the songs are thought to be innate,” said Erich Jarvis, who led the work at Duke University in North Carolina. “It is clear that the mouse’s ability to vocalise is a lot more limited than a songbird’s or human’s, and yet it’s remarkable that we can find these differences in song complexity.” The findings place mice in an elite group of animal vocalisers that was once thought to be limited to birds, whales, and some primates. Mouse song is too high-pitched for the human ear to detect, but when listened to at a lower frequency, it sounds somewhere between birdsong and the noise of clean glass being scrubbed. The Duke University team recorded the male mice when they were roaming around their cages, when they were exposed to the smell of female urine and when they were placed in the presence of a female mouse. They found that males sing louder and more complex songs when they smell a female but don’t see her. By comparison, the songs were longer and simpler when they were directly addressing their potential mate, according to the findings published in Frontiers in Behavioral Neuroscience. © 2015 Guardian News and Media Limited

Keyword: Hearing; Sexual Behavior
Link ID: 20751 - Posted: 04.02.2015

Davide Castelvecchi Boots rigged with a simple spring-and-ratchet mechanism are the first devices that do not require power aids such as batteries to make walking more energy efficient. People walking in the boots expend 7% less energy than they do walking in normal shoes, the devices’ inventors report on 1 April in Nature. That may not sound like much, but the mechanics of the human body have been shaped by millions of years of evolution, and some experts had doubted that there was room for further improvement in human locomotion, short of skating along on wheels. “It is the first paper of which I’m aware that demonstrates that a passive system can reduce energy expenditure during walking,” says Michael Goldfarb, a mechanical engineer at Vanderbilt University in Nashville, Tennessee, who develops exoskeletons for aiding people with disabilities. As early as the 1890s, inventors tried to boost the efficiency of walking by using devices such as rubber bands, says study co-author Gregory Sawicki, a biomedical engineer and locomotion physiologist at North Carolina State University in Raleigh. More recently, engineers have built unpowered exoskeletons that enable people to do tasks such as lifting heavier weights — but do not cut down the energy they expend. (Biomechanists still debate whether the running ‘blades’ made famous by South African sprinter Oscar Pistorius are more energetically efficient than human feet.) For their device, Sawicki and his colleagues built a mechanism that parallels human physiology. When a person swings a leg forward to walk, elastic energy is stored mostly in the Achilles tendon of their standing leg. That energy is released when the standing leg's foot pushes into the ground and the heel lifts off, propelling the body forwards. “There is basically a catapult in our ankle,” Sawicki says. © 2015 Nature Publishing Group

Keyword: Robotics
Link ID: 20750 - Posted: 04.02.2015

By Catherine Saint Louis Joni Mitchell, 71, was taken to a hospital in Los Angeles on Tuesday after she was found unconscious at her Los Angeles home. In recent years, the singer has complained of a number of health problems, including one particularly unusual ailment: Morgellons disease. People who believe they have the condition report lesions that don’t heal, “fibers” extruding from their skin and uncomfortable sensations like pins-and-needles tingling or stinging. Sufferers may also report fatigue and problems with short-term memory and concentration. But Morgellons is not a medically accepted diagnosis. Scientists have struggled for nearly a decade to find a cause and have come up mostly empty-handed. Researchers at the Centers for Disease Control and Prevention studied 115 people who said they had the condition. In a report published in 2012, they said they were unable to identify an infectious source for the patients’ “unexplained dermopathy.” There was no evidence of an environmental link, and the “fibers” from patients resembled those from clothing that had gotten trapped in a scab or crusty skin. The investigators cast doubt on Morgellons as a distinct condition and said that it might be something doctors were already familiar with: delusional infestation, a psychiatric condition characterized by an unshakable but erroneous belief that one’s skin is infested with bugs or parasites. Drug use can contribute to such delusions, and the investigators noted evidence of drug use — prescription or illicit — in half of the people they examined. Of the 36 participants who completed neuropsychological testing, 11 percent had high scores for depression, and 63 percent, unsurprisingly, were preoccupied with health issues. © 2015 The New York Times Company

Keyword: Pain & Touch
Link ID: 20749 - Posted: 04.02.2015

The commonly prescribed drug acetaminophen or paracetamol does nothing to help low back pain, and may affect the liver when used regularly, a large new international study has confirmed. Reporting in today's issue of the British Medical Journal, researchers also say the benefits of the drug are unlikely to be worth the risks when it comes to treating osteoarthritis in the hip or knee. "Paracetamol has been widely recommended as being a safe medication, but what we are saying now is that paracetamol doesn't bring any benefit for patients with back pain, and it brings only trivial benefits to those with osteoarthritis," Gustavo Machado of The George Institute for Global Health and the University of Sydney tells the Australian Broadcasting Corporation. "In addition to that it might bring harm to those patients." Most international clinical guidelines recommend acetaminophen as the "first choice" of treatment for low back pain and osteoarthritis of the hip and knee. However, despite a trial last year questioning the use of acetaminophen to treat low back pain, there has never been a systematic review of the evidence for this. Machado and colleagues analyzed three clinical trials and confirmed that acetaminophen is no better than placebo at treating low back pain. An analysis of 10 other clinical trials by the researchers quantified for the first time the effect acetaminophen has on reducing pain from osteoarthritis in the knee and hip. "We concluded that it is too small to be clinically worthwhile," says Machado. He says the effects of acetaminophen on the human body are not well understood and just because it can stop headaches, it doesn't mean the drug will work in all circumstances. ©2015 CBC/Radio-Canada.

Keyword: Pain & Touch
Link ID: 20748 - Posted: 04.02.2015

Alison Abbott Historian of psychology Douwe Draaisma knows well how to weave science, history and literature into irresistible tales. Forgetting, his latest collection of essays around the theme of memory, is — like his successful Nostalgia Factory (Yale University Press, 2013) — hard to put down. His vivid tour through the history of memory-repression theories brings home how dangerous and wrong, yet persistent, were the ideas of Sigmund Freud and his intellectual heirs. Freud thought that traumatic memories and shameful thoughts could be driven from the consciousness, but not forgotten. They would simmer in the unconscious, influencing behaviour. He maintained that forcing them out with psychoanalysis, and confronting patients with them, would be curative. Draaisma relates the case of an 18-year-old whom Freud dubbed Dora, diagnosed in 1900 with 'hysteria'. Dora's family refused to believe that the husband of her father's mistress had made sexual advances to her. Among other absurdities, Freud told Dora that her nervous cough reflected her repressed desire to fellate the man. Dora broke off the therapy, which Freud saw as proof of his theory. He thought that patients will naturally resist reawakening painful thoughts. What Dora did not buy, plenty of others did. Psychoanalysis boomed, becoming lucrative. Its principles were adopted in the 1990s by an unlikely alliance of lawyers and some feminists, who argued that repressed memories of childhood abuse could be recovered with techniques such as hypnosis, and used as evidence in court. Many judges went along with it; the rush of claims cast a shadow over genuine cases of abuse, Draaisma points out. We now know from studies of post-traumatic stress disorder that traumatic memories are impossible to repress. They flood into the conscious mind in horrifying flashbacks. © 2015 Macmillan Publishers Limited

Keyword: Learning & Memory
Link ID: 20747 - Posted: 04.02.2015

Helen Shen An ambitious plan is afoot to build the world’s largest public catalogue of neuronal structures. The BigNeuron project, announced on 31 March by the Allen Institute for Brain Science in Seattle, Washington, is designed to help researchers to simulate and understand the human brain. The project might also push neuroscientists to wrestle with fundamental — sometimes even emotional — questions about how to classify neurons. It is the era of the mega-scale brain initiative: Europe’s Human Brain Project aims to model the human brain in a supercomputer, and the US BRAIN Initiative hopes to unravel how networks of neurons work together to produce thoughts and actions. Standing in the way of these projects is a surprising limitation. “We still don’t know how many classes of neurons are in the brain,” says neuroscientist Rafael Yuste at Columbia University in New York City. BigNeuron aims to generate detailed descriptions of tens of thousands of individual neurons from various species, including fruit flies, zebrafish, mice and humans, and to suggest the best computer algorithms for extracting the finely branched shapes of these cells from microscopy data — a difficult and error-prone process. Getting the details of the shapes right is crucial to accurately modelling the behaviour of neurons: their geometry helps to determine how they process and transmit information through electrical and chemical signals. © 2015 Nature Publishing Group

Keyword: Brain imaging
Link ID: 20746 - Posted: 04.01.2015

By Virginia Morell Rats and mice in pain make facial expressions similar to those in humans—so similar, in fact, that a few years ago researchers developed rodent “grimace scales,” which help them assess an animal’s level of pain simply by looking at its face. But scientists have questioned whether these expressions convey anything to other rodents, or if they are simply physiological reactions devoid of meaning. Now, researchers report that other rats do pay attention to the emotional expressions of their fellows, leaving an area when they see a rat that’s suffering. “It’s a finding we thought might be true, and are glad that someone figured out how to do an experiment that shows it,” says Jeffrey Mogil, a neuroscientist at McGill University in Montreal, Canada. Mogil’s lab developed pain grimace scales for rats and mice in 2006, and it discovered that mice experience pain when they see a familiar mouse suffering—a psychological phenomenon known as emotional contagion. According to Mogil, a rodent in pain expresses its anguish through narrowed eyes, flattened ears, and a swollen nose and cheeks. Because people can read these visual cues and gauge the intensity of the animal’s pain, Mogil has long thought that other rats could do so as well. In Japan, Satoshi Nakashima, a social cognition psychologist at NTT Communication Science Laboratories in Kanagawa, thought the same thing. And, knowing that other scientists had recently shown that mice can tell the difference between paintings by Picasso and Renoir, he decided to see if rodents could also discriminate between photographs of their fellows’ expressions. He designed the current experiments as part of his doctoral research. © 2015 American Association for the Advancement of Science

Keyword: Pain & Touch; Emotions
Link ID: 20745 - Posted: 04.01.2015

Mo Costandi During the 1960s, neuroscientists Ronald Melzack and Patrick Wall proposed an influential new theory of pain. At the time, researchers were struggling to explain the phenomenon. Some believed that specific nerve fibres carry pain signals up into the brain, while others argued that the pain signals are transmitted by intense firing of non-specific fibres. Neither idea was entirely satisfactory, because they could not explain why spinal surgery often fails to abolish pain, why gentle touch and other innocuous stimuli can sometimes cause excruciating pain, or why intensely painful stimuli are not always experienced as such. Melzack and Wall’s Gate Control Theory stated that inhibitory neurons in the spinal cord control the relay of pain signals into the brain. Despite having some holes in it, the theory provided a revolutionary new framework for understanding the neural basis of pain, and ushered in the modern era of pain research. Now, almost exactly 50 years after the publication of Melzack and Wall’s theory, European researchers provide direct evidence of gatekeeper cells that control the flow of pain and itch signals from the spinal cord to the brain. The experience that we call “pain” is an extremely complex one that often involves emotional aspects. Researchers therefore distinguish it from nociception, the process by which the nervous system detects noxious stimuli. Nociception is mediated by primary sensory neurons, whose cell bodies are clumped together in the dorsal root ganglia that run alongside the spinal cord. Each has a single fibre that splits in two not far from the cell body, sending one branch out to the skin surface and the other into the spinal cord. © 2015 Guardian News and Media Limited

Keyword: Pain & Touch; Emotions
Link ID: 20744 - Posted: 04.01.2015

By LAWRENCE K. ALTMAN, M.D. WASHINGTON — Even before Ronald Reagan became the oldest elected president, his mental state was a political issue. His adversaries often suggested his penchant for contradictory statements, forgetting names and seeming absent-mindedness could be linked to dementia. In 1980, Mr. Reagan told me that he would resign the presidency if White House doctors found him mentally unfit. Years later, those doctors and key aides told me they had not detected any changes in his mental abilities while in office. Now a clever new analysis has found that during his two terms in office, subtle changes in Mr. Reagan’s speaking patterns linked to the onset of dementia were apparent years before doctors diagnosed his Alzheimer’s disease in 1994. The findings, published in The Journal of Alzheimer’s Disease by researchers at Arizona State University, do not prove that Mr. Reagan exhibited signs of dementia that would have adversely affected his judgment and ability to make decisions in office. But the research does suggest that alterations in speech one day might be used to predict development of Alzheimer’s and other neurological conditions years before symptoms are clinically perceptible. Detection of dementia at the earliest stages has become a high priority. Many experts now believe that yet-to-be-developed treatments are likely to be effective at preventing or slowing progression of dementia only if it is found before it significantly damages the brain. The “highly innovative” methods used by the researchers may eventually help “to further clarify the extent to which spoken-word changes are associated with normal aging or predictive of subsequent progression to the clinical stages of Alzheimer’s disease,” said Dr. Eric Reiman, the director of the Banner Alzheimer’s Institute in Phoenix, who was not involved in the new study. © 2015 The New York Times Company

Keyword: Alzheimers; Language
Link ID: 20743 - Posted: 04.01.2015

Scientists have found that a compound originally developed as a cancer therapy potentially could be used to treat Alzheimer’s disease. The team demonstrated that the drug, saracatinib, restores memory and reverses brain problems in mouse models of Alzheimer’s, and now the researchers are testing saracatinib’s effectiveness in humans. The study was funded by the National Institutes of Health as part of an innovative crowdsourcing initiative to repurpose experimental drugs. Researchers from the Yale University School of Medicine, New Haven, Connecticut, conducted the animal study, published for early view on March 21 in the Annals of Neurology, with support from the National Center for Advancing Translational Sciences (NCATS) through its Discovering New Therapeutic Uses for Existing Molecules (New Therapeutic Uses) program. Launched in May 2012, this program matches scientists with a selection of pharmaceutical industry assets that have undergone significant research and development by industry, including safety testing in humans, to test potential ideas for new therapeutic uses. Alzheimer’s disease is the most common form of dementia, a group of disorders that cause progressive loss of memory and other mental processes. An estimated 5 million Americans have Alzheimer’s disease, which causes clumps of amyloid beta protein to build up in the brain, and these protein clusters damage and ultimately kill brain cells (neurons). Alzheimer’s disease also leads to loss of synapses, which are the spaces between neurons through which the cells talk to each other and form memories. Current Alzheimer’s drug therapies can only ease symptoms without stopping disease progression. New treatments are needed that can halt the condition by targeting its underlying mechanisms.

Keyword: Alzheimers
Link ID: 20742 - Posted: 04.01.2015

Sara Reardon A new study finds that children's cognitive skills are linked to family income. The stress of growing up poor can hurt a child’s brain development starting before birth, research suggests — and even very small differences in income can have major effects on the brain. Researchers have long suspected that children’s behaviour and cognitive abilities are linked to their socioeconomic status, particularly for those who are very poor. The reasons have never been clear, although stressful home environments, poor nutrition, exposure to industrial chemicals such as lead and lack of access to good education are often cited as possible factors. In the largest study of its kind, published on 30 March in Nature Neuroscience, a team led by neuroscientists Kimberly Noble from Columbia University in New York City and Elizabeth Sowell from Children's Hospital Los Angeles, California, looked into the biological underpinnings of these effects. They imaged the brains of 1,099 children, adolescents and young adults in several US cities. Because people with lower incomes in the United States are more likely to be from minority ethnic groups, the team mapped each child’s genetic ancestry and then adjusted the calculations so that the effects of poverty would not be skewed by the small differences in brain structure between ethnic groups. The brains of children from the lowest income bracket — less than US$25,000 — had up to 6% less surface area than did those of children from families making more than US$150,000, the researchers found. In children from the poorest families, income disparities of a few thousand dollars were associated with major differences in brain structure, particularly in areas associated with language and decision-making skills. Children's scores on tests measuring cognitive skills, such as reading and memory ability, also declined with parental income. © 2015 Nature Publishing Group

Keyword: Development of the Brain; Stress
Link ID: 20741 - Posted: 03.31.2015

By Anna Azvolinsky Differences in male and female rodent sexual behaviors are programmed during brain development, but how exactly this occurs is not clear. In the preoptic area (POA) of the brain—a region necessary for male sex behavior—the female phenotype results from repression of male-linked genes by DNA methylation, according to a study published today (March 30) in Nature Neuroscience. There is very little known about how the brain is masculinized—and even less about how it is feminized—even though the question has been studied for more than 50 years, said Bridget Nugent, study author and now a postdoctoral fellow at the University of Pennsylvania. These sex differences in the brain are programmed toward the end of fetal development, through to one week after birth in rodents. In males, testicular hormones drive masculinization of the brain; this was thought to occur by direct induction of gene expression by hormone-associated transcription factors. Because a feminized brain occurred in the absence of ovarian hormone signals, most researchers assumed that the female brain and behavior was a sort of default state, programmed during development when no male hormones are present. But the downstream mechanisms of how hormones can modify gene expression were not previously known. “This study reveals that DNA methylation plays an important role in regulating sexual differentiation,” said Nirao Shah, who also studies the neural basis for sex-specific behaviors at the University of California, San Francisco, but was not involved with the work. © 1986-2015 The Scientist

Keyword: Sexual Behavior; Epigenetics
Link ID: 20740 - Posted: 03.31.2015

By Maggie Fox and Jane Derenowski A new strain of the polio-like virus EV-D68 may be causing the rare and mystifying cases of muscle weakness that's affected more than 100 kids across the United States, researchers reported Monday. They say they've found the strongest evidence yet that the virus caused the polio-like syndrome, but they also say it appears to be rare and might have to do with the genetic makeup of the patients. No other germ appears to be responsible, the team reports in the journal Lancet Infectious Diseases. But because most kids were tested many days after they first got sick, it may be impossible to ever know for sure. The body will have cleared the virus itself by then, said Dr. Charles Chiu of the University of California San Francisco, who helped conduct the study. "This is a virus that causes the common cold," Chiu told NBC News. "Parents don't bring their kids in until they are really sick. By that time, typically, the viral levels may be very, very low or undetectable." "Every single virus that we found in the children corresponded to a new strain of the virus, called B-1." Enterovirus D68 (EV-D68) is one of about 100 different enteroviruses that infect people. They include polio but also a range of viruses that cause cold-like symptoms. Polio's the only one that is vaccinated against; before widespread vaccination it crippled 35,000 people a year in the United States.

Keyword: Movement Disorders
Link ID: 20739 - Posted: 03.31.2015

By Lawrence Berger A cognitive scientist and a German philosopher walk into the woods and come upon a tree in bloom: What does each one see? And why does it matter? While that may sound like the set-up to a joke making the rounds at a philosophy conference, I pose it here sincerely, as a way to explore the implications of two distinct strains of thought — that of cognitive science and that of phenomenology, in particular, the thought of Martin Heidegger, who offers a most compelling vision of the ultimate significance of our being here, and what it means to be fully human. It can be argued that cognitive scientists tend to ignore the importance of what many consider to be essential features of human existence, preferring to see us as information processors rather than full-blooded human beings immersed in worlds of significance. In general, their intent is to explain human activity and life as we experience it on the basis of physical and physiological processes, the implicit assumption being that this is the domain of what is ultimately real. Since virtually everything that matters to us as human beings can be traced back to life as it is experienced, such thinking is bound to be unsettling. For instance, an article in The Times last year by Michael S. A. Graziano, a professor of psychology and neuroscience at Princeton, about whether we humans are “really conscious,” argued, among other things, that “we don’t actually have inner feelings in the way most of us think we do.” © 2015 The New York Times Company

Keyword: Attention; Consciousness
Link ID: 20738 - Posted: 03.31.2015

by Bethany Brookshire Music displays all the harmony and discord the auditory world has to offer. The perfect pair of notes at the end of the Kyrie in Mozart’s Requiem fills churches and concert halls with a single chord of ringing, echoing consonance. Composers such as Arnold Schönberg explored the depths of dissonance — groups of notes that, played together, exist in unstable antagonism, their frequencies crashing and banging against each other. Dissonant chords are difficult to sing and often painful to hear. But they may get less painful with age. As we age, our brains may lose the clear-cut representations of these consonant and dissonant chords, a new study shows. The loss may affect how older people engage with music and shows that age-related hearing loss is more complex than just having to reach for the volume controls. The main mechanism behind age-related hearing loss is the deterioration of the outer hair cells in the cochlea, a coiled structure within our inner ear. When sound waves enter the ear, a membrane vibrates, pulling the hair cells to and fro and kicking off a series of events that produce electrical signals that will be sent onward to the brain. As we age, we lose some of these outer hair cells, and with them goes our ability to hear extremely high frequencies. In a new study, researchers tested how people perceive consonant pairs of musical notes, which are harmonious and generally pleasing, or dissonant ones, which can be harsh and tense. © Society for Science & the Public 2000 - 2015

Keyword: Hearing; Development of the Brain
Link ID: 20737 - Posted: 03.31.2015

By Anne Skomorowsky Because Germanwings pilot Andreas Lubitz killed himself when he purposefully drove a plane carrying 149 other people into a mountain in the Alps, there has been an assumption that he suffered from “depression”—an assumption strengthened by the discovery of antidepressants in his home and reports that he had been treated in psychiatry and neurology clinics. Many patients and other interested parties are rightly concerned that Lubitz’s murderous behavior will further stigmatize the mentally ill. It is certainly true that stigma may lead those in need to avoid treatment. When I was a psychiatrist at an HIV clinic, I was baffled by the shame associated with a visit to see me. Patients at the clinic had advanced AIDS, often contracted through IV drug use or sex work, and many had unprotected sex despite their high viral loads. Some were on parole. Many had lost custody of their children. Many lived in notorious single-room occupancy housing and used cocaine daily. But these issues, somehow, were less embarrassing than the suggestion that they be evaluated by a psychiatrist. For my clinic patients, it was shameful to be mentally ill. But to engage in antisocial behavior as a way of life? Not so bad. I think my patients were on to something. Bad behavior—even suicidal behavior—is not the same as depression. It is a truism in psychiatry that depression is underdiagnosed. But as a psychiatrist confronted daily with “problem” patients in the general hospital where I work, I find that depression is also overdiagnosed. Even doctors invoke “depression” to explain anything a reasonable adult wouldn’t do. © 2014 The Slate Group LLC.

Keyword: Depression
Link ID: 20736 - Posted: 03.31.2015

By Emily Underwood Shielded by the skull and packed with fatty tissue, the living brain is perhaps the most difficult organ for scientists to probe. Functional magnetic resonance imaging (fMRI), which noninvasively measures changes in blood flow and oxygen consumption as a proxy for neuronal activation, lags far behind the actual speed of thought. Now, a new technique may provide the fastest method yet of measuring blood flow in the brain, scientists report online today in Nature Methods. The technique, which bounces laser beams off red blood cells, has a resolution of under a millisecond—slightly less time than it takes a neuron to fire—and it has a far higher spatial resolution than fMRI. Even the most powerful fMRI machines, used only on animals, can image only millimeter-wide swaths of tissue containing thousands of cells. The new technique, which takes its measurements from sonic waves produced by the beams, can image structures as small as individual blood vessels and cells. Although the technique is not likely to be feasible in humans due to safety concerns, it could provide an important tool to better understand how blood flow and oxygen consumption are related to brain activity. That’s a key question for those relying on cruder and safer tools, such as fMRI, to study the human brain, researchers say. It is also a powerful tool for studying how errant eddies and whorls of blood in blood vessels can sometimes lead to stroke, they say. © 2015 American Association for the Advancement of Science

Keyword: Brain imaging
Link ID: 20735 - Posted: 03.31.2015

By Dwayne Godwin and Jorge Cham Our minds are veritable memory machines. © 2015 Scientific American

Keyword: Learning & Memory
Link ID: 20734 - Posted: 03.31.2015

Nicholette Zeliadt One afternoon in October 2012, a communication therapist from Manchester visited the home of Laura and her three children. Laura sat down at a small white table in a dimly lit room to feed her 10-month-old daughter, Bethany, while the therapist set up a video camera to record the pair’s every movement. (Names of research participants have been changed to protect privacy.) Bethany sat quietly in her high chair, nibbling on macaroni and cheese. She picked up a slimy noodle with her tiny fingers, looked up at Laura and thrust out her hand. “Oh, Mommy’s going to have some, yum,” Laura said. “Clever girl!” Bethany beamed a toothy grin at her mother and let out a brief squeal of laughter, and then turned her head to peer out the window as a bus rumbled by. “Oh, you can hear the bus,” Laura said. “Can you say ‘bus?’” “Bah!” Bethany exclaimed. “Yeah, bus!” Laura said. This ordinary domestic moment, immortalized in the video, is part of the first rigorous test of a longstanding idea: that the everyday interactions between caregiver and child can shape the course of autism. The dynamic exchanges with a caregiver are a crucial part of any child’s development. As Bethany and her mother chatter away, responding to each other’s glances and comments, for example, the little girl is learning how to combine gestures and words to communicate her thoughts. In a child with autism, however, this ‘social feedback loop’ might go awry. An infant who avoids making eye contact, pays little attention to faces and doesn’t respond to his or her name gives parents few opportunities to engage. The resulting lack of social interaction may reinforce the baby’s withdrawal, funneling into a negative feedback loop that intensifies mild symptoms into a full-blown disorder. © 2015 Guardian News and Media Limited

Keyword: Autism
Link ID: 20733 - Posted: 03.30.2015

By Roni Jacobson As intangible as they may seem, memories have a firm biological basis. According to textbook neuroscience, they form when neighboring brain cells send chemical communications across the synapses, or junctions, that connect them. Each time a memory is recalled, the connection is reactivated and strengthened. The idea that synapses store memories has dominated neuroscience for more than a century, but a new study by scientists at the University of California, Los Angeles, may fundamentally upend it: instead memories may reside inside brain cells. If supported, the work could have major implications for the treatment of post-traumatic stress disorder (PTSD), a condition marked by painfully vivid and intrusive memories. More than a decade ago scientists began investigating the drug propranolol for the treatment of PTSD. Propranolol was thought to prevent memories from forming by blocking production of proteins required for long-term storage. Unfortunately, the research quickly hit a snag. Unless administered immediately after the traumatic event, the treatment was ineffective. Lately researchers have been crafting a work-around: evidence suggests that when someone recalls a memory, the reactivated connection is not only strengthened but becomes temporarily susceptible to change, a process called memory reconsolidation. Administering propranolol (and perhaps also therapy, electrical stimulation and certain other drugs) during this window can enable scientists to block reconsolidation, wiping out the synapse on the spot. The possibility of purging recollections caught the eye of David Glanzman, a neurobiologist at U.C.L.A., who set out to study the process in Aplysia, a sluglike mollusk commonly used in neuroscience research. Glanzman and his team zapped Aplysia with mild electric shocks, creating a memory of the event expressed as new synapses in the brain. The scientists then transferred neurons from the mollusk into a petri dish and chemically triggered the memory of the shocks in them, quickly followed by a dose of propranolol. © 2015 Scientific American

Keyword: Learning & Memory; Stress
Link ID: 20732 - Posted: 03.30.2015