Most Recent Links



Links 61 - 80 of 19645

By Alina Tugend Many workers now feel as if they’re doing the job of three people. They are on call 24 hours a day. They rush their children from tests to tournaments to tutoring. The stress is draining, both mentally and physically. At least that is the standard story about stress. It turns out, though, that many of the common beliefs about stress don’t necessarily give the complete picture. MISCONCEPTION NO. 1 Stress is usually caused by having too much work. While being overworked can be overwhelming, research increasingly shows that being underworked can be just as challenging. In essence, boredom is stressful. “We tend to think of stress in the original engineering way, that too much pressure or too much weight on a bridge causes it to collapse,” said Paul E. Spector, a professor of psychology at the University of South Florida. “It’s more complicated than that.” Professor Spector and others say too little to do — or underload, as he calls it — can cause many of the physical discomforts we associate with being overloaded, like muscle tension, stomachaches and headaches. A study published this year in the journal Experimental Brain Research found that measurements of people’s heart rates, hormonal levels and other factors showed greater signs of stress in those watching a boring movie — men hanging laundry — than in those watching a sad movie. “We tend to think of boredom as someone lazy, as a couch potato,” said James Danckert, a professor of neuroscience at the University of Waterloo in Ontario, Canada, and a co-author of the paper. “It’s actually when someone is motivated to engage with their environment and all attempts to do so fail. It’s aggressively dissatisfying.” © 2014 The New York Times Company

Keyword: Stress; Aggression
Link ID: 20161 - Posted: 10.04.2014

by Michael Marshall When we search for the seat of humanity, are we looking at the wrong part of the brain? Most neuroscientists assume that the neocortex, the brain's distinctive folded outer layer, is the thing that makes us uniquely human. But a new study suggests that another part of the brain, the cerebellum, grew much faster in our ape ancestors. "Contrary to traditional wisdom, in the human lineage the cerebellum was the part of the brain that accelerated its expansion most rapidly, rather than the neocortex," says Rob Barton of Durham University in the UK. With Chris Venditti of the University of Reading in the UK, Barton examined how the relative sizes of different parts of the brain changed as primates evolved. During the evolution of monkeys, the neocortex and cerebellum grew in tandem, a change in one being swiftly followed by a change in the other. But starting with the first apes around 25 million years ago through to chimpanzees and humans, the cerebellum grew much faster. As a result, the cerebellums of apes and humans contain far more neurons than the cerebellum of a monkey, even if that monkey were scaled up to the size of an ape. "The difference in ape cerebellar volume, relative to a scaled monkey brain, is equal to 16 billion extra neurons," says Barton. "That's the number of neurons in the entire human neocortex." © Copyright Reed Business Information Ltd.

Keyword: Evolution; Aggression
Link ID: 20160 - Posted: 10.04.2014

By Daisy Yuhas Do we live in a holographic universe? How green is your coffee? And could drinking too much water actually kill you? Before you click those links you might consider how your knowledge-hungry brain is preparing for the answers. A new study from the University of California, Davis, suggests that when our curiosity is piqued, changes in the brain ready us to learn not only about the subject at hand, but incidental information, too. Neuroscientist Charan Ranganath and his fellow researchers asked 19 participants to review more than 100 questions, rating each in terms of how curious they were about the answer. Next, each subject revisited 112 of the questions—half of which strongly intrigued them whereas the rest they found uninteresting—while the researchers scanned their brain activity using functional magnetic resonance imaging (fMRI). During the scanning session participants would view a question then wait 14 seconds and view a photograph of a face totally unrelated to the trivia before seeing the answer. Afterward the researchers tested participants to see how well they could recall and retain both the trivia answers and the faces they had seen. Ranganath and his colleagues discovered that greater interest in a question would predict not only better memory for the answer but also for the unrelated face that had preceded it. A follow-up test one day later found the same results—people could better remember a face if it had been preceded by an intriguing question. Somehow curiosity could prepare the brain for learning and long-term memory more broadly. The findings are somewhat reminiscent of the work of U.C. Irvine neuroscientist James McGaugh, who has found that emotional arousal can bolster certain memories. But, as the researchers reveal in the October 2 Neuron, curiosity involves very different pathways. © 2014 Scientific American

Keyword: Learning & Memory; Aggression
Link ID: 20159 - Posted: 10.04.2014

By Kevin Hartnett You may have seen that deliberately annoying “View of the World from Ninth Avenue” map featured on the cover of the New Yorker a while back. It shows the distorted way geography appears to a Manhattanite: 9th and 10th avenues are the center of the world, New Jersey appears, barely, and everywhere else is just a blip if it registers at all. As it turns out, a similar kind of map exists for the human body — with at least some basis in neuroscience. In August I wrote a story for Ideas on the rise of face transplants and spoke to Michael Sims, author of the book, “Adam’s Navel: A Natural and Cultural History of the Human Form.” During our conversation Sims mentioned an odd diagram published in 1951 by a neurosurgeon named Wilder Penfield. The diagram is known as “Homunculus” (a name taken from a weird and longstanding art form that depicts small human beings); it shows the human body scaled according to the amount of brain tissue dedicated to each part, and arranged according to the locations in the brain that control them. In the diagram, the eyes, lips, nose, and tongue appear grotesquely large, indicating that we devote an outsized amount of brain tissue to operating and receiving sensation from these parts of the body. (Sims’s point was that we devote a lot of processing power to the face, and for that reason find it biologically disorienting that faces could be changeable.) The hand is quite large, too, while the toes, legs, trunks, shoulders, and arms are tiny, the equivalents of Kansas City and Russia on the New Yorker map. “Homunculus” seems like the kind of thing that would have long since been superseded by modern brain science, but it actually continues to have a surprising amount of authority, and often appears in neuroscience textbooks.

Keyword: Pain & Touch
Link ID: 20158 - Posted: 10.04.2014

By John Bohannon The victim peers across the courtroom, points at a man sitting next to a defense lawyer, and confidently says, "That's him!" Such moments have a powerful sway on jurors who decide the fate of thousands of people every day in criminal cases. But how reliable is eyewitness testimony? A new report concludes that the use of eyewitness accounts needs tighter control, and among its recommendations is a call for a more scientific approach to how eyewitnesses identify suspects during the classic police lineup. For decades, researchers have been trying to nail down what influences eyewitness testimony and how much confidence to place in it. After a year of sifting through the scientific evidence, a committee of psychologists and criminologists organized by the U.S. National Research Council (NRC) has now gingerly weighed in. "This is a serious issue with major implications for our justice system," says committee member Elizabeth Phelps, a psychologist at New York University in New York City. Their 2 October report, Identifying the Culprit: Assessing Eyewitness Identification, is likely to change the way that criminal cases are prosecuted, says Elizabeth Loftus, a psychologist at the University of California, Irvine, who was an external reviewer of the report. As Loftus puts it, "just because someone says something confidently doesn't mean it's true." Jurors can't help but find an eyewitness’s confidence compelling, even though experiments have shown that a person's confidence in their own memory is sometimes undiminished even in the face of evidence that their memory of an event is false. © 2014 American Association for the Advancement of Science.

Keyword: Learning & Memory
Link ID: 20157 - Posted: 10.04.2014

Helen Thomson You'll have heard of Pavlov's dogs, conditioned to expect food at the sound of a bell. You might not have heard that a scarier experiment – arguably one of psychology's most unethical – was once performed on a baby. In it, a 9-month-old, at first unfazed by the presence of animals, was conditioned to feel fear at the sight of a rat. The infant was presented with the animal as someone struck a metal pole with a hammer above his head. This was repeated until he cried at merely the sight of any furry object – animate or inanimate. The "Little Albert" experiment, performed in 1919 by John Watson of Johns Hopkins University Hospital in Baltimore, Maryland, was the first to show that a human could be classically conditioned. The fate of Albert B has intrigued researchers ever since. Hall Beck at the Appalachian State University in Boone, North Carolina, has been one of the most tenacious researchers on the case. Watson's papers stated that Albert B was the son of a wet nurse who worked at the hospital. Beck spent seven years exploring potential candidates and used facial analysis to conclude in 2009 that Little Albert was Douglas Merritte, son of hospital employee Arvilla. Douglas was born on the same day as Albert and several other points tallied with Watson's notes. Tragically, medical records showed that Douglas had severe neurological problems and died at an early age of hydrocephalus, or water on the brain. According to his records, this seems to have resulted in vision problems, so much so that at times he was considered blind. © Copyright Reed Business Information Ltd.

Keyword: Emotions; Aggression
Link ID: 20156 - Posted: 10.04.2014

By Fredrick Kunkle Years ago, many scientists assumed that a woman’s heart worked pretty much the same as a man’s. But as more women entered the male-dominated field of cardiology, many such assumptions vanished, opening the way for new approaches to research and treatment. A similar shift is underway in the study of Alzheimer’s disease. It has long been known that more women than men get the deadly neurodegenerative disease, and an emerging body of research is challenging the common wisdom as to why. Although the question is by no means settled, recent findings suggest that biological, genetic and even cultural influences may play heavy roles. Of the more than 5 million people in the United States who have been diagnosed with Alzheimer’s, the leading cause of dementia, two-thirds are women. Because advancing age is considered the biggest risk factor for the disease, researchers largely have attributed that disparity to women’s longer life spans. The average life expectancy for women is 81 years, compared with 76 for men. Yet “even after taking age into account, women are more at risk,” said Richard Lipton, a physician who heads the Einstein Aging Study at Albert Einstein College of Medicine in New York. With the number of Alzheimer’s cases in the United States expected to more than triple by 2050, some researchers are urging a greater focus on understanding the underlying reasons women are more prone to the disease and on developing gender-specific treatments.

Keyword: Alzheimers; Aggression
Link ID: 20155 - Posted: 10.04.2014

Carl Zimmer As much as we may try to deny it, Earth’s cycle of day and night rules our lives. When the sun sets, the encroaching darkness sets off a chain of molecular events spreading from our eyes to our pineal gland, which oozes a hormone called melatonin into the brain. When the melatonin latches onto neurons, it alters their electrical rhythm, nudging the brain into the realm of sleep. At dawn, sunlight snuffs out the melatonin, forcing the brain back to its wakeful pattern again. We fight these cycles each time we stay up late reading our smartphones, suppressing our nightly dose of melatonin and waking up grumpy the next day. We fly across continents as if we could instantly reset our inner clocks. But our melatonin-driven sleep cycle lags behind, leaving us drowsy in the middle of the day. Scientists have long wondered how this powerful cycle got its start. A new study on melatonin hints that it evolved some 700 million years ago. The authors of the study propose that our nightly slumbers evolved from the rise and fall of our tiny oceangoing ancestors, as they swam up to the surface of the sea at twilight and then sank in a sleepy fall through the night. To explore the evolution of sleep, scientists at the European Molecular Biology Laboratory in Germany study the activity of genes involved in making melatonin and other sleep-related molecules. Over the past few years, they’ve compared the activity of these genes in vertebrates like us with their activity in a distantly related invertebrate — a marine worm called Platynereis dumerilii. The scientists studied the worms at an early stage, when they were ball-shaped 2-day-old larvae. The ocean swarms with juvenile animals like these. Many of them spend their nights near the ocean surface, feeding on algae and other bits of food. Then they spend the day at lower depths, where they can hide from predators and the sun’s ultraviolet rays. © 2014 The New York Times Company

Keyword: Sleep; Aggression
Link ID: 20154 - Posted: 10.02.2014

By Bethany Brookshire In this sweet, sweet world we live in, losing weight can be a dull and flavorless experience. Lovely stove-popped popcorn drenched in butter gives way to dry microwaved half-burnt kernels covered in dusty yellow powder. The cookies and candy that help us get through the long afternoons are replaced with virtuous but boring apples and nuts. Even the sugar that livens up our coffee gets a skeptical eye: That’s an extra 23 calories per packet you shouldn’t be eating. What makes life sweet for those of us who are counting calories is artificial sweeteners. Diet soda gives a sweet carbonated fix. A packet of artificial sweetener in your coffee or tea makes it a delicious morning dose. But a new study, published September 17 in Nature, found that the artificial sweetener saccharin has an unintended side effect: It alters the bacterial composition of the gut in mice and humans. The new bacterial neighborhood brings with it higher blood glucose levels, putting both the humans and their murine counterparts at risk for diabetes. Many people wondered if the study’s effects were real. We all knew that sugar was bad, but now the scientists are coming for our Splenda! It seems more than a little unfair. But this study was a long time coming. The scientific community has been studying artificial sweeteners and their potential hazards for a long time. And while the new study adds to the literature, there are other studies, currently ongoing and planned for the future, that will determine the extent and necessity of our artificially sweetened future. © Society for Science & the Public 2000 - 2014.

Keyword: Obesity; Aggression
Link ID: 20153 - Posted: 10.02.2014

James Hamblin Mental exercises to build (or rebuild) attention span have shown promise recently as adjuncts or alternatives to amphetamines in addressing symptoms common to Attention Deficit Hyperactivity Disorder (ADHD). Building cognitive control, to be better able to focus on just one thing, or single-task, might involve regular practice with a specialized video game that reinforces "top-down" cognitive modulation, as was the case in a popular paper in Nature last year. Cool but still notional. More insipid but also more clearly critical to addressing what's being called the ADHD epidemic is plain old physical activity. This morning the medical journal Pediatrics published research that found kids who took part in a regular physical activity program showed important enhancement of cognitive performance and brain function. The findings, according to University of Illinois professor Charles Hillman and colleagues, "demonstrate a causal effect of a physical program on executive control, and provide support for physical activity for improving childhood cognition and brain health." If it seems odd that this is something that still needs support, that's because it is odd, yes. Physical activity is clearly a high, high-yield investment for all kids, but especially those attentive or hyperactive. This brand of research is still published and written about as though it were a novel finding, in part because exercise programs for kids remain underfunded and underprioritized in many school curricula, even though exercise is clearly integral to maximizing the utility of time spent in class. The improvements in this case came in executive control, which consists of inhibition (resisting distraction, maintaining focus), working memory, and cognitive flexibility (switching between tasks). The images above show the brain activity in the group of kids who did the program as opposed to the group that didn't. It's the kind of difference that's so dramatic it's a little unsettling. The study only lasted nine months, but when you're only seven years old, nine months is a long time to be sitting in class with a blue head. © 2014 by The Atlantic Monthly Group.

Keyword: ADHD
Link ID: 20152 - Posted: 10.02.2014

By Nathan Collins Step aside, huge magnets and radioactive tracers—soon some brain activity will be revealed by simply training dozens of red lights on the scalp. A new study in Nature Photonics finds this optical technique can replicate functional MRI experiments, and it is more comfortable, more portable and less expensive. The method is an enhancement of diffuse optical tomography (DOT), in which a device shines tiny points of red light at a subject's scalp and analyzes the light that bounces back. The red light reflects off red hemoglobin in the blood but does not interact as much with tissues of other colors, which allows researchers to recover an fMRI-like image of changing blood flow in the brain at work. For years researchers attempting to use DOT have been limited by the difficulty of packing many heavy light sources and detectors into the small area around the head. They also needed better techniques for analyzing the flood of data that the detectors collected. Now researchers at Washington University in St. Louis and the University of Birmingham in England report they have solved those problems and made the first high-density DOT (HD-DOT) brain scans. The team first engineered a “double halo” structure to support the weight of 96 lights and 92 detectors, more than double the number in earlier arrays. The investigators also dealt with the computing challenges associated with that many lights—for example, they figured out how to filter out interference from blood flow in the scalp and other tissues. The team then used HD-DOT to successfully replicate fMRI studies of vision and language processing—a task impossible for other fMRI alternatives, such as functional near-infrared spectroscopy or electroencephalography, which do not cover a large enough swath of the brain or have sufficient resolution to pinpoint active brain areas. Finally, the team scanned the brains of people who have implanted electrodes for Parkinson's disease—something fMRI can never do because the machine generates electromagnetic waves that can destroy electronic devices such as pacemakers. © 2014 Scientific American

Keyword: Brain imaging
Link ID: 20151 - Posted: 10.02.2014
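A note on the HD-DOT entry above: the core conversion in diffuse optical imaging, from small changes in detected red or near-infrared light to changes in oxygenated (HbO) and deoxygenated (HbR) hemoglobin, is usually done with the modified Beer-Lambert law. The sketch below shows only that step; the wavelengths, extinction coefficients and pathlength factors are illustrative placeholders, and the study's actual reconstruction pipeline is not described here.

```python
# Minimal sketch (assumed, not taken from the article): the modified Beer-Lambert
# law used throughout diffuse optical imaging to turn light-intensity changes into
# hemoglobin-concentration changes. All numerical values below are illustrative.
import numpy as np

def hemoglobin_change(I, I0, eps, separation_cm, dpf):
    """I, I0: detected and baseline intensities, one value per wavelength.
    eps: 2x2 extinction coefficients, rows = wavelengths, cols = [HbO, HbR].
    separation_cm: source-detector distance; dpf: differential pathlength factor per wavelength."""
    delta_od = -np.log10(np.asarray(I, float) / np.asarray(I0, float))  # optical density change
    # delta_od[i] = (eps[i,0]*dHbO + eps[i,1]*dHbR) * separation_cm * dpf[i]  -> solve a 2x2 system
    A = eps * (separation_cm * np.asarray(dpf, float))[:, None]
    d_hbo, d_hbr = np.linalg.solve(A, delta_od)
    return d_hbo, d_hbr

# Illustrative call: a slight dimming at both wavelengths reads out as a rise in HbO.
eps = np.array([[0.69, 1.80],   # ~750 nm: [HbO, HbR] (placeholder values)
                [1.15, 0.78]])  # ~850 nm: [HbO, HbR] (placeholder values)
print(hemoglobin_change(I=[0.95, 0.92], I0=[1.0, 1.0], eps=eps, separation_cm=3.0, dpf=[6.0, 5.0]))
```

Full HD-DOT goes further, solving a tomographic inverse problem across all source-detector pairs to localize these changes within the brain, which is what the dense 96-source, 92-detector array makes possible.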

By CATHERINE SAINT LOUIS Driven by a handful of reports of poliolike symptoms in children, federal health officials have asked the nation’s physicians to report cases of children with limb weakness or paralysis along with specific spinal-cord abnormalities on a magnetic resonance imaging test. As a respiratory illness known as enterovirus 68 is sickening thousands of children from coast to coast, officials are trying to figure out if the weakness could be linked to the virus. The emergence of several cases of limb weakness among children in Colorado put doctors on alert in recent months. The Centers for Disease Control and Prevention issued an advisory on Friday, and this week, other cases of unexplained muscle weakness or paralysis came to light in Michigan, Missouri and Massachusetts. The C.D.C. is investigating the cases of 10 children hospitalized at Children’s Hospital Colorado with unexplained arm or leg weakness since Aug. 9. Some of the children, who range in age from 1 to 18, also developed symptoms like facial drooping, double vision, or difficulty swallowing or talking. Four of them tested positive for enterovirus 68, also known as enterovirus D68, which has recently caused severe respiratory illness in children in 41 states and the District of Columbia. One tested positive for rhinovirus, which can cause the common cold. Two tested negative. Two patients’ specimens are still being processed; another was never tested. It is unclear whether the muscle weakness is connected to the viral outbreak. “It’s one possibility we are looking at, but certainly not the only possibility,” said Mark Pallansch, director of the C.D.C.’s division of viral diseases. © 2014 The New York Times Company

Keyword: Movement Disorders
Link ID: 20150 - Posted: 10.02.2014

By Smitha Mundasad, health reporter, BBC News Measuring people's sense of smell in later life could help doctors predict how likely they are to be alive in five years' time, a PLOS One study suggests. A survey of 3,000 adults found 39% with the poorest sense of smell were dead within five years - compared to just 10% who identified odours correctly. Scientists say the loss of smell sense does not cause death directly, but may be an early warning sign. They say anyone with long-lasting changes should seek medical advice. Researchers from the University of Chicago asked a representative sample of adults between the ages of 57-85 to take part in a quick smell test. The assessment involved identifying distinct odours encased on the tips of felt-tip pens. The smells included peppermint, fish, orange, rose and leather. Five years later some 39% of adults who had the lowest scores (4-5 errors) had passed away, compared with 19% with moderate smell loss and just 10% with a healthy sense of smell (0-1 errors). And despite taking issues such as age, nutrition, smoking habits, poverty and overall health into account, researchers found those with the poorest sense of smell were still at greatest risk. Lead scientist, Prof Jayant Pinto, said: "We think loss of the sense of smell is like the canary in the coal mine. BBC © 2014

Keyword: Chemical Senses (Smell & Taste); Aggression
Link ID: 20149 - Posted: 10.02.2014
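The headline figure in the smell-test entry above can be reproduced from the quoted percentages alone as an unadjusted relative risk; the snippet below is only that raw arithmetic, whereas the published analysis adjusted for age, smoking, nutrition and overall health, so its estimates differ.

```python
# Back-of-the-envelope arithmetic from the percentages quoted in the entry above.
# Unadjusted and purely illustrative; the study itself reports covariate-adjusted risks.
five_year_mortality = {
    "poorest smell (4-5 errors)": 0.39,
    "moderate smell loss": 0.19,
    "healthy smell (0-1 errors)": 0.10,
}

baseline = five_year_mortality["healthy smell (0-1 errors)"]
for group, risk in five_year_mortality.items():
    print(f"{group}: {risk:.0%} died within five years, "
          f"{risk / baseline:.1f}x the rate of the best scorers")
```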

By Fredrick Kunkle Here’s something to worry about: A recent study suggests that middle-age women whose personalities tend toward the neurotic run a higher risk of developing Alzheimer’s disease later in life. The study by researchers at the University of Gothenburg in Sweden followed a group of women in their 40s, whose disposition made them prone to anxiety, moodiness and psychological distress, to see how many developed dementia over the next 38 years. In line with other research, the study suggested that women who were the most easily upset by stress — as determined by a commonly used personality test — were two times more likely to develop Alzheimer’s disease than women who were least prone to neuroticism. In other words, personality really is — in some ways — destiny. “Most Alzheimer’s research has been devoted to factors such as education, heart and blood risk factors, head trauma, family history and genetics,” study author Lena Johansson said in a written statement. “Personality may influence the individual’s risk for dementia through its effect on behavior, lifestyle or reactions to stress.” The researchers cautioned that the results cannot be extrapolated to men because they were not included in the study and that further work is needed to determine possible causes for the link. The study, which appeared Wednesday in the American Academy of Neurology’s journal, Neurology, examined 800 women whose average age in 1968 was 46 years to see whether neuroticism — which involves being easily distressed and subject to excessive worry, jealousy or moodiness — might have a bearing on the risk of dementia.

Keyword: Alzheimers; Aggression
Link ID: 20148 - Posted: 10.02.2014

Have you ever wrongly suspected that other people are out to harm you? Have you been convinced that you’re far more talented and special than you really are? Do you sometimes hear things that aren’t actually there? These experiences – paranoia, grandiosity and hallucinations in the technical jargon – are more common among the general population than is usually assumed. But are people who are susceptible simply “made that way”? Are they genetically predisposed, in other words, or have their life experiences made them more vulnerable to these things? It’s an old debate: which is more important, nature or nurture? Scientists nowadays tend to agree that human psychology is a product of a complex interaction between genes and experience – which is all very well, but where does the balance lie? Scientists (including one of the authors of this blog) recently conducted the first ever study among the general population of the relative contributions of genes and environment to the experience of paranoia, grandiosity and hallucinations. How did we go about the research? First, it is important to be clear about the kinds of experience we measured. By paranoia, we mean the unfounded or excessive fear that other people are out to harm us. Grandiosity denotes an unrealistic conviction of one’s abilities and talents. Hallucinations are sensory experiences (hearing voices, for instance) that aren’t caused by external events. Led by Dr Angelica Ronald at Birkbeck, University of London, the team analysed data on almost 5,000 pairs of 16-year-old twins. This is the classical twin design, a standard method for gauging the relative influence of genes and environment. Looking simply at family traits isn’t sufficient: although family members share many genes, they also tend to share many of the same experiences. This is why studies involving twins are so useful. © 2014 Guardian News and Media Limited

Keyword: Schizophrenia
Link ID: 20147 - Posted: 10.02.2014
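The "classical twin design" invoked in the entry above has a textbook shortcut, Falconer's formula, for splitting trait variance into genetic and environmental components from the gap between identical-twin and fraternal-twin correlations. A minimal sketch follows; the correlations are invented for illustration, and the study itself would have relied on formal model fitting across its roughly 5,000 twin pairs rather than this shortcut.

```python
# Falconer's approximation for the classical twin design (a hedged illustration,
# not the analysis used in the study above). r_mz and r_dz are trait correlations
# for identical and fraternal twin pairs; the values below are placeholders.
def falconer_ace(r_mz, r_dz):
    a2 = 2 * (r_mz - r_dz)   # A: additive genetic variance ("heritability")
    c2 = 2 * r_dz - r_mz     # C: shared (family) environment
    e2 = 1 - r_mz            # E: unique environment plus measurement error
    return a2, c2, e2

print(falconer_ace(r_mz=0.50, r_dz=0.30))  # roughly (0.4, 0.1, 0.5) with these placeholder values
```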

by Jason M. Breslow As the NFL nears an end to its long-running legal battle over concussions, new data from the nation’s largest brain bank focused on traumatic brain injury has found evidence of a degenerative brain disease in 76 of the 79 former players it’s examined. The findings represent a more than twofold increase in the number of cases of chronic traumatic encephalopathy, or CTE, that have been reported by the Department of Veterans Affairs’ brain repository in Bedford, Mass. Researchers there have now examined the brain tissue of 128 football players who, before their deaths, played the game professionally, semi-professionally, in college or in high school. Of that sample, 101 players, or just under 80 percent, tested positive for CTE. To be sure, players represented in the data represent a skewed population. CTE can only be definitively identified posthumously, and many of the players who have donated their brains for research suspected that they may have had the disease while still alive. For example, former Chicago Bears star Dave Duerson committed suicide in 2011 by shooting himself in the chest, reportedly to preserve his brain for examination. Nonetheless, Dr. Ann McKee, the director of the brain bank, believes the findings suggest a clear link between football and traumatic brain injury. “Obviously this high percentage of living individuals is not suffering from CTE,” said McKee, a neuropathologist who directs the brain bank as part of a collaboration between the VA and Boston University’s CTE Center. But “playing football, and the higher the level you play football and the longer you play football, the higher your risk.” ©1995-2014 WGBH Educational Foundation

Keyword: Brain Injury/Concussion
Link ID: 20146 - Posted: 10.01.2014

By Gretchen Reynolds Exercise may help to safeguard the mind against depression through previously unknown effects on working muscles, according to a new study involving mice. The findings may have broad implications for anyone whose stress levels threaten to become emotionally overwhelming. Mental health experts have long been aware that even mild, repeated stress can contribute to the development of depression and other mood disorders in animals and people. Scientists have also known that exercise seems to cushion against depression. Working out somehow makes people and animals emotionally resilient, studies have shown. But precisely how exercise, a physical activity, can lessen someone’s risk for depression, a mood state, has been mysterious. So for the new study, which was published last week in Cell, researchers at the Karolinska Institute in Stockholm delved into the brains and behavior of mice in an intricate and novel fashion. Mouse emotions are, of course, opaque to us. We can’t ask mice if they are feeling cheerful or full of woe. Instead, researchers have delineated certain behaviors that indicate depression in mice. If animals lose weight, stop seeking out a sugar solution when it’s available — because, presumably, they no longer experience normal pleasures — or give up trying to escape from a cold-water maze and just freeze in place, they are categorized as depressed. And in the new experiment, after five weeks of frequent but intermittent, low-level stress, such as being restrained or lightly shocked, mice displayed exactly those behaviors. They became depressed. The scientists could then have tested whether exercise blunts the risk of developing depression after stress by having mice run first. But, frankly, from earlier research, they knew it would. They wanted to parse how. So they bred pre-exercised mice. © 2014 The New York Times Company

Keyword: Depression
Link ID: 20145 - Posted: 10.01.2014

By Sarah C. P. Williams A wind turbine, a roaring crowd at a football game, a jet engine running full throttle: Each of these things produces sound waves that are well below the frequencies humans can hear. But just because you can’t hear the low-frequency components of these sounds doesn’t mean they have no effect on your ears. Listening to just 90 seconds of low-frequency sound can change the way your inner ear works for minutes after the noise ends, a new study shows. “Low-frequency sound exposure has long been thought to be innocuous, and this study suggests that it’s not,” says audiology researcher Jeffery Lichtenhan of the Washington University School of Medicine in St. Louis, who was not involved in the new work. Humans can generally sense sounds at frequencies between 20 and 20,000 cycles per second, or hertz (Hz)—although this range shrinks as a person ages. Prolonged exposure to loud noises within the audible range has long been known to cause hearing loss over time. But establishing the effect of sounds with frequencies under about 250 Hz has been harder. Even though they’re above the lower limit of 20 Hz, these low-frequency sounds tend to be either inaudible or barely audible, and people don’t always know when they’re exposed to them. For the new study, neurobiologist Markus Drexl and colleagues at the Ludwig Maximilian University in Munich, Germany, asked 21 volunteers with normal hearing to sit inside soundproof booths and then played a 30-Hz sound for 90 seconds. The deep, vibrating noise, Drexl says, is about what you might hear “if you open your car windows while you’re driving fast down a highway.” Then, they used probes to record the natural activity of the ear after the noise ended, taking advantage of a phenomenon dubbed spontaneous otoacoustic emissions (SOAEs) in which the healthy human ear itself emits faint whistling sounds. © 2014 American Association for the Advancement of Science

Keyword: Hearing
Link ID: 20144 - Posted: 10.01.2014

By Brian Bienkowski and Environmental Health News Babies born to mothers with high levels of perchlorate during their first trimester are more likely to have lower IQs later in life, according to a new study. The research is the first to link pregnant women's perchlorate levels to their babies’ brain development. It adds to evidence that the drinking water contaminant may disrupt thyroid hormones that are crucial for proper brain development. Perchlorate, which is both naturally occurring and manmade, is used in rocket fuel, fireworks and fertilizers. It has been found in 4 percent of U.S. public water systems serving an estimated 5 to 17 million people, largely near military bases and defense contractors in the U.S. West, particularly around Las Vegas and in Southern California. “We would not recommend action on perchlorate levels from this study alone, although our report highlights a pressing need for larger studies of perchlorate levels from the general pregnant population and those with undetected hypothyroidism,” the authors from the United Kingdom, Italy and Boston wrote in the study published in The Journal of Clinical Endocrinology & Metabolism. The Environmental Protection Agency for decades has debated setting a national drinking water standard for perchlorate. The agency in 2011 announced it would start developing a standard, reversing an earlier decision. In the meantime, two states, California and Massachusetts, have set their own standards. © 2014 Scientific American

Keyword: Development of the Brain; Aggression
Link ID: 20143 - Posted: 10.01.2014

Michael Häusser Use light to read out and control neural activity! This idea, so easily expressed and understood, has fired the imagination of neuroscientists for decades. The advantages of using light as an effector are obvious [1]: it is noninvasive, can be precisely targeted with exquisite spatial and temporal precision, can be used simultaneously at multiple wavelengths and locations, and can report the presence or activity of specific molecules. However, despite early progress [2] and encouragement [3], it is only recently that widely usable approaches for optical readout and manipulation of specific neurons have become available. These new approaches rely on genetically encoded proteins that can be targeted to specific neuronal subtypes, giving birth to the term 'optogenetics' to signal the combination of genetic targeting and optical interrogation [4]. On the readout side, highly sensitive probes have been developed for imaging synaptic release, intracellular calcium (a proxy for neural activity) and membrane voltage. On the manipulation side, a palette of proteins for both activation and inactivation of neurons with millisecond precision using different wavelengths of light has been identified and optimized. The extraordinary versatility and power of these new optogenetic tools are spurring a revolution in neuroscience research, and they have rapidly become part of the standard toolkit of thousands of research labs around the world. Although optogenetics may not yet be a household word (though try it on your mother; she may surprise you), there can be no better proof that optogenetics has become part of the scientific mainstream than the 2013 Brain Prize being awarded to the sextet that pioneered optogenetic manipulation (http://www.thebrainprize.org/flx/prize_winners/prize_winners_2013/) and the incorporation of optogenetics as a central plank in the US National Institutes of Health BRAIN Initiative [5]. Moreover, there is growing optimism about the prospect of using optogenetic probes not only to understand mechanisms of disease in animal models but also to treat disease in humans, particularly in more accessible parts of the brain such as the retina [6]. © 2014 Macmillan Publishers Limited

Keyword: Brain imaging
Link ID: 20142 - Posted: 10.01.2014