Most Recent Links
By DANIEL J. LEVITIN

This month, many Americans will take time off from work to go on vacation, catch up on household projects and simply be with family and friends. And many of us will feel guilty for doing so. We will worry about all of the emails piling up at work, and in many cases continue to compulsively check email during our precious time off.

But beware the false break. Make sure you have a real one. The summer vacation is more than a quaint tradition. Along with family time, mealtime and weekends, it is an important way that we can make the most of our beautiful brains.

Every day we’re assaulted with facts, pseudofacts, news feeds and jibber-jabber, coming from all directions. According to a 2011 study, on a typical day, we take in the equivalent of about 174 newspapers’ worth of information, five times as much as we did in 1986. As the world’s 21,274 television stations produce some 85,000 hours of original programming every day (by 2003 figures), we watch an average of five hours of television per day. For every hour of YouTube video you watch, there are 5,999 hours of new video just posted!

If you’re feeling overwhelmed, there’s a reason: The processing capacity of the conscious mind is limited. This is a result of how the brain’s attentional system evolved. Our brains have two dominant modes of attention: the task-positive network and the task-negative network (they’re called networks because they comprise distributed networks of neurons, like electrical circuits within the brain). The task-positive network is active when you’re actively engaged in a task, focused on it, and undistracted; neuroscientists have taken to calling it the central executive. The task-negative network is active when your mind is wandering; this is the daydreaming mode. These two attentional networks operate like a seesaw in the brain: when one is active the other is not.

© 2014 The New York Times Company
By KATHARINE Q. SEELYE

SPARTA, N.J. — When Gail Morris came home late one night after taking her daughter to college, she saw her teenage son, Alex, asleep on the sofa in the family room. Nothing seemed amiss. An unfinished glass of apple juice sat on the table. She tucked him in under a blanket and went to bed. The next morning, he would not wake up. He was stiff and was hardly breathing.

Over the next several hours, Ms. Morris was shocked to learn that her son had overdosed on heroin. She was told he would not survive. He did survive, but barely. He was in a coma for six weeks. He went blind and had no function in his arms or legs. He could not speak or swallow. Hospitalized for 14 months, Alex, who is 6-foot-1, dropped to 90 pounds. One of his doctors said that Alex had come as close to dying as anyone he knew who had not actually died.

Most people who overdose on heroin either die or fully recover. But Alex plunged into a state that was neither dead nor functional. There are no national statistics on how often opioid overdose leads to cases like Alex’s, but doctors say they worry that with the dramatic increase in heroin abuse and overdoses, they will see more such outcomes.

“I would expect that we will,” said Dr. Nora Volkow, director of the National Institute on Drug Abuse. “They are starting to report isolated cases like this. And I would not be surprised if you have more intermediate cases with more subtle impairment.”

More than 660,000 Americans used heroin in 2012, the federal government says, double the number of five years earlier. Officials attribute much of the increase to a crackdown on prescription painkillers, prompting many users to turn to heroin, which is cheaper and easier to get than other opioids.

© 2014 The New York Times Company
By William Skaggs

One of the most frustrating and mysterious medical conditions affecting the mind is impaired consciousness, as can occur with brain damage. Patients in a coma or a vegetative or minimally conscious state sometimes spontaneously recover to varying degrees, but in most cases there is little that doctors can do to help.

Now a rigorous study by a group at Liège University Hospital Center in Belgium has found that a simple treatment called transcranial direct-current stimulation (tDCS) can temporarily raise awareness in minimally conscious patients. In tDCS, electrodes are glued to the scalp, and a weak electric current is passed through them to stimulate the underlying brain tissue. Scientists led by neurologist Steven Laureys applied the electric current for 20 minutes to patients' left prefrontal cortex, an area known to be involved in attentiveness and working memory. Afterward, the effects on consciousness were measured by doctors who did not know whether the patient had received real tDCS or a sham treatment, in which the apparatus ran, but no current was delivered.

For patients in a vegetative state, who display no communication or purposeful behavior, the stimulation might have led to improvement in two patients, but no statistically compelling evidence emerged. Yet 13 of 30 patients in a minimally conscious state—defined by occasional moments of low-level awareness—showed measurable gains in their responses to questions and sensory stimuli. Some had only recently been injured, but others had been minimally conscious for months.

© 2014 Scientific American
By SERGE F. KOVALESKI

Nearly four years ago, Dr. Sue Sisley, a psychiatrist at the University of Arizona, sought federal approval to study marijuana’s effectiveness in treating military veterans with post-traumatic stress disorder. She had no idea how difficult it would be.

The proposal, which has the support of veterans groups, was hung up at several regulatory stages, requiring the research’s private sponsor to resubmit multiple times. After the proposed study received final approval in March from federal health officials, the lone federal supplier of research marijuana said it did not have the strains the study needed and would have to grow more — potentially delaying the project until at least early next year.

Then, in June, the university fired Dr. Sisley, later citing funding and reorganization issues. But Dr. Sisley is convinced the real reason was her outspoken support for marijuana research. “They could never get comfortable with the idea of this controversial, high-profile research happening on campus,” she said.

Dr. Sisley’s case is an extreme example of the obstacles and frustrations scientists face in trying to study the medical uses of marijuana. Dating back to 1999, the Department of Health and Human Services has indicated it does not see much potential for developing marijuana in smoked form into an approved prescription drug. In guidelines issued that year for research on medical marijuana, the agency quoted from an accompanying report that stated, “If there is any future for marijuana as a medicine, it lies in its isolated components, the cannabinoids and their synthetic derivatives.” Scientists say this position has had a chilling effect on marijuana research.

© 2014 The New York Times Company
by Aviva Rutkin

What can the human brain do for a computer? There's at least one team of researchers that thinks it might have the answer. Working at IBM Research–Almaden in San Jose, California, they have just released more details of TrueNorth, a computer chip composed of one million digital "neurons". Under way for several years, the project abandons traditional computer architecture for one inspired by biological synapses and axons.

The latest results, published in Science, provide a timely reminder of the promise of brain-inspired computing. The human brain still crushes any modern machine when it comes to tasks like vision or voice recognition. What's more, it manages to do so with less energy than it takes to power a light bulb. Building those qualities into a computer is an alluring prospect to many researchers, like Kwabena Boahen of Stanford University in California. "The first time I learned how computers worked, I thought it was ridiculous," he says. "I basically felt there had to be a better way."

Aping the brain's structure could help us build computers that are far more powerful and efficient than today's, says TrueNorth team leader Dharmendra Modha. "We want to approximate the anatomy and physiology, the structure and dynamics of the brain, within today's silicon technology," he says. "I think that the chip and the associated ecosystem have the potential to transform science, technology, business, government and society."

But how best to go about building a proper artificial brain is a matter of debate.

© Copyright Reed Business Information Ltd
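The "digital neurons" on chips like TrueNorth communicate through discrete spikes rather than a continuous stream of numbers, which is where much of the energy efficiency comes from. As a rough illustration of the idea only — this is a generic leaky integrate-and-fire unit, not TrueNorth's actual (more elaborate) neuron model — such a neuron can be sketched in a few lines:

```python
def simulate_lif(input_current, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire neuron: each step, the membrane
    potential decays by the leak factor, then integrates the new
    input; crossing the threshold emits a spike (1) and resets."""
    potential = 0.0
    spikes = []
    for current in input_current:
        potential = potential * leak + current  # leak, then integrate
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0                     # reset after spiking
        else:
            spikes.append(0)
    return spikes

# A steady sub-threshold drip of input accumulates until the neuron
# fires, producing a sparse, event-driven spike train:
print(simulate_lif([0.3] * 10))  # → [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

The sparsity is the point: between spikes there is nothing to compute or transmit, so hardware built around such units can stay largely idle, unlike a conventional processor driven by a global clock.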
Posted by Ewen Callaway

More than 130 leading population geneticists have condemned a book arguing that genetic variation between human populations could underlie global economic, political and social differences.

“A Troublesome Inheritance”, by science journalist Nicholas Wade, was published in June by Penguin Press in New York. The 278-page work garnered widespread criticism, much of it from scientists, for suggesting that genetic differences (rather than culture) explain, for instance, why Western governments are more stable than those in African countries. Wade is a former staff reporter and editor at the New York Times, Science and Nature.

But the letter — signed by a who’s who of population genetics and human evolution researchers, and to be published in the 10 August New York Times — represents a rare unified statement from scientists in the field and includes many whose work was cited by Wade. “It’s just a measure of how unified people are in their disdain for what was done with the field,” says Michael Eisen, a geneticist at the University of California, Berkeley, who co-drafted the letter.

“Wade juxtaposes an incomplete and inaccurate explanation of our research on human genetic differences with speculation that recent natural selection has led to worldwide differences in I.Q. test results, political institutions and economic development. We reject Wade’s implication that our findings substantiate his guesswork. They do not,” states the letter, which is a response to a critical review of the book published in the New York Times.

“This letter is driven by politics, not science,” Wade said in a statement. “I am confident that most of the signatories have not read my book and are responding to a slanted summary devised by the organizers.”

© 2014 Macmillan Publishers Limited
by Bethany Brookshire

For most of us, where our birthday falls in the year doesn’t matter much in the grand scheme of things. A July baby doesn’t make more mistakes than a Christmas kid — at least, not because of their birthdays. But for neurons, birth date plays an important role in how these cells find their connections in the brain, a new study finds. Nerve cells that form early in development will make lots of connections — and lots of mistakes. Neurons formed later are much more precise in their targeting.

The findings are an important clue to help scientists understand how the brain wires itself during development. And with more information on how the brain forms its network, scientists might begin to see what happens when that network is injured or malformed.

Many, many brain cells are born as the brain develops. Each one has to reach out and make connections, sometimes to other cells around them and sometimes to other regions of the brain. To do this, these nerve cells send out axons, long, incredibly thin projections that reach out to other regions. How mammalian axons end up at their final destination in the growing brain remains a mystery.

To find out how developing brains get wired up, Jessica Osterhout and colleagues at the University of California, San Diego started in the eye. They looked at retinal ganglion cells, neurons that connect the brain and the eye. “It’s easy to access,” explains Andrew Huberman, a neuroscientist at UC San Diego and an author on the paper. “Your retina is basically part of the central nervous system that got squeezed into your eye during development.” Retinal ganglion cells all have the same function: to convey visual information from the eyes to the brain. But they are not all the same.

© Society for Science & the Public 2000 - 2013
Ian Sample, science editor

Stroke patients who took part in a small pilot study of a stem cell therapy have shown tentative signs of recovery six months after receiving the treatment. Doctors said the condition of all five patients had improved after the therapy, but that larger trials were needed to confirm whether the stem cells played any part in their progress.

Scans of the patients' brains found that damage caused by the stroke had reduced over time, but similar improvements are often seen in stroke patients as part of the normal recovery process. At a six-month check-up, all of the patients fared better on standard measures of disability and impairment caused by stroke, but again their improvement may have happened with standard hospital care. The pilot study was designed to assess only the safety of the experimental therapy and with so few patients and no control group to compare them with, it is impossible to draw conclusions about the effectiveness of the treatment.

Paul Bentley, a consultant neurologist at Imperial College London, said his group was applying for funding to run a more powerful randomised controlled trial on the therapy, which could see around 50 patients treated next year. "The improvements we saw in these patients are very encouraging, but it's too early to draw definitive conclusions about the effectiveness of the therapy," said Soma Banerjee, a lead author and consultant in stroke medicine at Imperial College Healthcare NHS Trust. "We need to do more tests to work out the best dose and timescale for treatment before starting larger trials."

The five patients in the pilot study were treated within seven days of suffering a severe stroke. Each had a bone marrow sample taken, from which the scientists extracted stem cells that give rise to blood cells and blood vessel lining cells. These stem cells were infused into an artery that supplied blood to the brain.

© 2014 Guardian News and Media Limited
Simon Makin

Fish that have been exposed to a common anti-anxiety drug are more active and have better chances of survival than unexposed fish, researchers report in Environmental Research Letters. The results suggest that standard methods for assessing the environmental impact of pharmaceuticals in waterways might miss some of the drugs' effects because they focus exclusively on harms, according to the authors.

In the study, researchers led by Jonatan Klaminder at Umeå University in Sweden exposed Eurasian perch (Perca fluviatilis) to oxazepam, one of a widely used class of anti-anxiety drugs called benzodiazepines.

Standard ecotoxicology experiments use unstressed, healthy fish that have been bred in labs. Control groups are designed to have 100% survival rates so that decreases in survival in the test group are easy to detect by comparison. But it is difficult to detect any increase in survival rates when the control group already has nearly complete survival. So Klaminder and his colleagues used the opposite approach. They exposed fish to the drug at two sensitive life stages: two-year-old wild fish taken from a Swedish lake that had only recently thawed after winter, and strings of roes — fish eggs that contain embryos undergoing development. These are more realistic conditions, the researchers say, as animals in the wild often experience high mortality.

The researchers used oxazepam at a high concentration of 1,000 micrograms per litre and at a low concentration of 1 μg l⁻¹. The low dose is relevant to aquatic environments in urban areas, because oxazepam concentrations of 1.9 μg l⁻¹ have been measured in effluents from wastewater treatment plants.

© 2014 Nature Publishing Group
by Laura Sanders

In their first year, babies grow and change in all sorts of obvious and astonishing ways. As their bodies become longer, heavier and stronger, so do their brains. Between birth and a child’s first birthday, her brain nearly triples in size as torrents of newborn nerve cells create neural pathways.

This incredible growth can be influenced by a baby’s early life environment, scientists have found. Tragic cases of severe neglect or abuse can throw brain development off course, resulting in lifelong impairments. But in happier circumstances, warm caregivers influence a baby’s brain, too.

A new study in rats provides a glimpse of how motherly actions influence a pup’s brain. Scientists recorded electrical activity in the brains of rat pups as their mamas nursed, licked and cared for their offspring. The results, published in the July 21 Current Biology, offer a fascinating minute-to-minute look at the effects of parenting.

Researchers led by Emma Sarro of New York University’s medical school implanted electrodes near six pups’ brains to record neural activity. Video cameras captured mother-pup interactions, allowing the scientists to link specific maternal behaviors to certain sorts of brain activity. Two types of brain patterns emerged: a highly alert state and a sleepier, zoned-out state, Sarro and colleagues found. Pups’ brains were alert while they were drinking milk and getting groomed by mom. Pups’ brains were similarly aroused when the pups were separated from their mom and siblings. Some scientists think that these bursts of brain activity help young brains form the right connections between regions.

© Society for Science & the Public 2000 - 2013.
By Emily Underwood

The early signs of Creutzfeldt-Jakob disease (CJD)—a rare, incurable brain disorder caused by infectious, misshapen proteins called prions—are difficult to interpret. At first, people may simply feel depressed and can undergo personality changes or bouts of psychosis. By the time memory failure, blindness, and coma set in, typically within a year of infection, death is usually imminent. Now, researchers report that a simple nasal swab may help physicians detect the disease far more accurately and earlier than current methods.

Finding simple, noninvasive diagnostic tests is “one of the holy grails” for CJD and other prion diseases, says biochemist Byron Caughey of the National Institute of Allergy and Infectious Diseases’ Rocky Mountain Laboratories in Hamilton, Montana, who helped lead the new work. Although there’s no cure for CJD, early diagnosis is important because it can help rule out other, treatable disorders, and it allows medical personnel to take precautions that prevent the disease from spreading to others through exposure to brain tissue or spinal fluid, he says.

Researchers made a major stride toward better diagnostic methods in 2010, when Caughey and other researchers first described a new technique called the RT-QuIC test. The test requires removing cerebrospinal fluid (CSF) from patients by means of a spinal tap, putting samples into a bath of normally shaped prion proteins, and agitating the solution to encourage any abnormal prion “seeds” in the tissue to latch onto the regular proteins. If even trace amounts of pathogenic protein are present, they rapidly use the normal proteins to create millions of insoluble, fibrous amyloid strands. Researchers believe that these amyloid aggregates, also seen in other neurodegenerative diseases such as Alzheimer’s disease, ultimately cause CJD by interfering with or killing off neurons en masse. After death, the brains of people affected by CJD are so badly damaged that they often resemble Swiss cheese or sponges.

© 2014 American Association for the Advancement of Science.
By Sandhya Somashekhar

The first time Jeremy Clark met his 18-year-old client, the teenager was sitting in his vice principal’s office, the drawstrings of his black hoodie pulled tight. Jacob had recently disclosed to his friends on Facebook that he was hearing voices, and their reaction had been less than sympathetic.

So Clark was relieved when a beaming Jacob showed up on time for their next meeting, at a comic book shop. As the pair bantered about “Star Wars” and a recent Captain America movie, however, Clark picked up troubling signs: Jacob said he was “detaching” from his family, often huddling alone in his room. As the visit ended, Clark gave the teen a bear hug and made a plan. “Let’s get together again next week,” he said.

The visit was part of a new approach being used nationwide to find and treat teenagers and young adults with early signs of schizophrenia. The goal is to bombard them with help even before they have had a psychotic episode — a dramatic and often devastating break with reality that is a telltale sign of the disease. The program involves an intensive two-year course of socialization, family therapy, job and school assistance, and, in some cases, antipsychotic medication. What makes the treatment unique is that it focuses deeply on family relationships, and occurs early in the disease, often before a diagnosis.

So far, the results have been striking: In Portland, Maine, where the treatment was pioneered, the rate of hospitalizations for first psychotic episodes fell by 34 percent over a six-year period, according to a March study. And just last month, a peer-reviewed study published in the journal Schizophrenia Bulletin found that young people undergoing the treatment at six sites around the country were more likely to be in school or working than adolescents who were not in the program. The research was funded by a $17 million grant from the Robert Wood Johnson Foundation.
Ian Sample, science correspondent

The human brain can judge the apparent trustworthiness of a face from a glimpse so fleeting, the person has no idea they have seen it, scientists claim. Researchers in the US found that brain activity changed in response to how trustworthy a face appeared to be when the face in question had not been consciously perceived.

Scientists made the surprise discovery during a series of experiments that were designed to shed light on the neural processes that underpin the snap judgments people make about others. The findings suggest that parts of our brains are doing more complex subconscious processing of the outside world than many researchers thought. Jonathan Freeman at New York University said the results built on previous work that shows "we form spontaneous judgments of other people that can be largely outside awareness."

The study focused on the activity of the amygdala, a small almond-shaped region deep inside the brain. The amygdala is intimately involved with processing strong emotions, such as fear. Its central nucleus sends out the signals responsible for the famous and evolutionarily crucial "fight-or-flight" response.

Prior to the study, Freeman asked a group of volunteers to rate the trustworthiness of a series of faces. People tend to agree when they rank trustworthiness – faces with several key features, such as more furrowed brows and shallower cheekbones, are consistently rated as less trustworthy. Freeman then invited a different group of people to take part in the experiments. Each lay in an MRI scanner while images of faces flashed up on a screen before them. Each trustworthy or untrustworthy face flashed up for a matter of milliseconds. Though their eyes had glimpsed the images, the participants were not aware they had seen the faces.

© 2014 Guardian News and Media Limited
Older people who have a severe vitamin D deficiency have an increased risk of developing dementia, a study has suggested. UK researchers, writing in Neurology, looked at about 1,650 people aged over 65. This is not the first study to suggest a link - but its authors say it is the largest and most robust. However, experts say it is still too early to say elderly people should take vitamin D as a preventative treatment.

There are 800,000 people with dementia in the UK, with numbers set to rise to more than one million by 2021. Vitamin D comes from foods such as oily fish, from supplements, and from exposing skin to sunlight. However, older people's skin can be less efficient at converting sunlight into vitamin D, making them more likely to be deficient and reliant on other sources.

The international team of researchers, led by Dr David Llewellyn at the University of Exeter Medical School, followed people for six years. All were free from dementia, cardiovascular disease and stroke at the start of the study. At the end of the study they found the 1,169 with good levels of vitamin D had a one in 10 chance of developing dementia. Seventy were severely deficient - and they had around a one in five risk of dementia.

'Delay or even prevent'

Dr Llewellyn said: "We expected to find an association between low vitamin D levels and the risk of dementia and Alzheimer's disease, but the results were surprising - we actually found that the association was twice as strong as we anticipated." He said further research was needed to establish if eating vitamin D rich foods such as oily fish - or taking vitamin D supplements - could "delay or even prevent" the onset of Alzheimer's disease and dementia. But Dr Llewellyn added: "We need to be cautious at this early stage and our latest results do not demonstrate that low vitamin D levels cause dementia."

BBC © 2014
The gurgles made by a hungry belly are familiar to us all, but they are not just the side effect of an empty stomach. Brain cells not normally associated with communication send out a signal when they detect blood glucose levels are running low, and this triggers the stomach contractions. Richard Rogers of the Pennington Biomedical Research Center at Louisiana State University and colleagues used a drug called fluorocitrate to knock out the function of certain astrocytes and neurons in the brains of rats, blocking the sensation of hunger. Only when astrocyte function was restored did the gastric grumbles return, showing that it is these cells that respond to low glucose levels (Journal of Neuroscience, DOI: 10.1523/JNEUROSCI.1406-14.2014). The feeling of discomfort you get when hungry is called "hypoglycaemia awareness". "For most people this is only slightly unpleasant, but for diabetics whose glucose levels can drop significantly, [being hungry] can be dangerous," says Rogers. "It's important to understand how this mechanism works." © Copyright Reed Business Information Ltd.
by Bethany Brookshire

Every day sees a new research article on addiction, be it cocaine, heroin, food or porn. Each one takes a specific angle on how addiction works in the brain. Perhaps it’s a disorder of reward, with drugs hijacking a natural system that is meant to respond to food, sex and friendship. Possibly addiction is a disorder of learning, where our brains learn bad habits and responses. Maybe we should think of addiction as a combination of an environmental stimulus and vulnerable genes. Or perhaps it’s an inappropriate response to stress, where bad days trigger a relapse to the cigarette, syringe or bottle.

None of these views are wrong. But none of them are complete, either. Addiction is a disorder of reward, a disorder of learning. It has genetic, epigenetic and environmental influences. It is all of that and more. Addiction is a display of the brain’s astounding ability to change — a feature called plasticity — and it showcases what we know and don’t yet know about how brains adapt to all that we throw at them.

“A lot of people think addiction is what happens when someone finds a drug to be the most rewarding thing they’ve ever experienced,” says neuroscientist George Koob, director of the National Institute on Alcohol Abuse and Alcoholism in Bethesda, Md. “But drug abuse is not just feeling good about drugs. Your brain is changed when you misuse drugs. It is changed in ways that perpetuate the problem.”

The changes associated with drug use affect how addicts respond to drug cues, like the smell of a cigarette or the sight of a shot of vodka. Drug abuse also changes how other rewards, such as money or food, are processed, decreasing their relative value.

© Society for Science & the Public 2000 - 2013
By Tori Rodriguez and Victoria Stern

A growing number of people are seeking alternatives to antidepressant medications, and new research suggests that acupuncture could be a promising option. One new study found the traditional Chinese practice to be as effective as antidepressants, and a different study found that acupuncture may help treat the medications' side effects.

In acupuncture, a practitioner inserts needles into the skin at points of the body thought to correspond with specific organs. Western research suggests the needles may activate natural painkillers in the brain; in traditional Chinese medicine, the process is believed to improve functioning by correcting energy blocks or imbalances in the organs.

A study published last fall in the Journal of Alternative and Complementary Medicine found that electroacupuncture—in which a mild electric current is transmitted through the needles—was just as effective as fluoxetine (the generic name of Prozac) in reducing symptoms of depression. For six weeks, patients underwent either electroacupuncture five times weekly or a standard daily dose of fluoxetine. The researchers, the majority of whom specialize in traditional Chinese medicine, assessed participants' symptoms every two weeks and tracked their levels of glial cell line–derived neurotrophic factor (GDNF), a neuroprotective protein. Previous studies have found lower amounts of GDNF among patients with major depressive disorder, and in other research levels of the protein rose after treatment with antidepressant medication.

© 2014 Scientific American
Sarah C. P. Williams

Every fall, grizzly bears pack on the pounds in preparation for their winter hibernation. In humans, such extreme weight gain would likely lead to diabetes or other metabolic diseases, but the bears manage to stay healthy year after year. Their ability to remain diabetes-free, researchers have now discovered, can be chalked up to the shutting down of a protein found in fat cells. The discovery could lead to new diabetes drugs that turn off the same pathway in humans.

The findings are “provocative and interesting,” says biologist Sandy Martin of the University of Colorado, Denver, who was not involved in the new work. “They found a natural solution to a problem that we haven’t been able to solve.”

As people gain weight, fat, liver, and muscle cells typically become less sensitive to the hormone insulin—which normally helps control blood sugar levels—and insulin levels rise. In turn, that increased insulin prevents the breakdown of fat cells, causing a vicious cycle that can lead to full-blown insulin resistance, or diabetes.

Developing new diabetes drugs has been hampered by the fact that findings from many mouse models of diabetes have not translated to humans. So Kevin Corbit, a senior scientist at Thousand Oaks, California–based drug company Amgen, decided to start looking at obesity and metabolic disease in other animals. “When I was thinking about things that are quite fat, one of the first things I thought of was bears, and what they do to prepare to go into hibernation,” he says. “But of course you don’t see bears running around with diabetes and heart disease.”

© 2014 American Association for the Advancement of Science
By DOUGLAS QUENQUA

A tiny part of the brain keeps track of painful experiences and helps determine how we will react to them in the future, scientists say. The findings could be a boon to depression treatments.

The habenula (pronounced ha-BEN-you-la), a part of the brain less than half the size of a pea, has been shown in animal studies to activate during painful or unpleasant episodes. Using M.R.I.s to produce powerful brain scans, researchers at University College London tracked the habenulas in subjects who were hooked up to electric shock machines. The subjects were presented with a series of photographs, some of which were followed by increasingly strong shocks. Soon, when the subjects were shown pictures associated with shocks, their habenulas would light up. “The habenula seems to track the associations with electric shocks becoming stronger and stronger,” said Jonathan Roiser, a neuroscientist at the college and an author of the study, published in The Proceedings of the National Academy of Sciences.

The habenula appeared to have an effect on motivation, too. The subjects had been asked to occasionally press a button, just to show they were awake. They were much slower to do so when their habenula was active. In fact, the more slowly they responded, the more reliably their habenulas tracked associations with shocks. In animals, the habenula has been shown to suppress production of dopamine, a chemical that drives motivation. Perhaps, the researchers say, an overactive habenula can cause the feelings of impending doom and low motivation common in people with depression.

© 2014 The New York Times Company
By Emily Underwood

Old age may make us wiser, but it rarely makes us quicker. In addition to slowing down physically, most people lose points on intelligence tests as they enter their golden years. Now, new research suggests the loss of certain types of cognitive skills with age may stem from problems with basic sensory tasks, such as making quick judgments based on visual information. Although there’s no clear causal link between the two types of thinking yet, the new work could provide a simple, affordable way to track mental decline in senior citizens, scientists say.

Since the 1970s, researchers who study intelligence have hypothesized that smartness, as measured on standard IQ tests, may hinge on the ability to quickly and efficiently sample sensory information from the environment, says Stuart Ritchie, a psychologist at the University of Edinburgh in the United Kingdom. Today it’s well known that people who score high on such tests do, indeed, tend to process such information more quickly than those who do poorly, but it’s not clear how these measures change with age, Ritchie says.

Studying older people over time can be challenging given their uncertain health, but Ritchie and his colleagues had an unusual resource in the Lothian Birth Cohort, a group of people born in 1936 whose mental function has been periodically tested by the Scottish government since 1947—their first IQ test was at age 11. After recruiting more than 600 cohort members for their study, Ritchie and colleagues tracked their scores on a simple visual task three times over 10 years, repeating the test at the mean ages of 70, 73, and 76.

© 2014 American Association for the Advancement of Science