Most Recent Links
Yves Frégnac & Gilles Laurent Launched in October 2013, the Human Brain Project (HBP) was sold by charismatic neurobiologist Henry Markram as a bold new path towards understanding the brain, treating neurological diseases and building information technology. It is one of two 'flagship' proposals funded by the European Commission's Future and Emerging Technologies programme (see go.nature.com/icotmi). Selected after a multiyear competition, the project seemed like an exciting opportunity to bring together neuroscience and IT to generate practical applications for health and medicine (see go.nature.com/2eocv8). Contrary to public assumptions that the HBP would generate knowledge about how the brain works, the project is turning into an expensive database-management project with a hunt for new computing architectures. In recent months, the HBP executive board revealed plans to drastically reduce its experimental and cognitive neuroscience arm, provoking wrath in the European neuroscience community. The crisis culminated with an open letter from neuroscientists (including one of us, G.L.) to the European Commission on 7 July 2014 (see www.neurofuture.eu), which has now gathered more than 750 signatures. Many signatories are scientists in experimental and theoretical fields, and the list includes former HBP participants. The letter incorporates a pledge of non-participation in a planned call for 'partnering projects' that must raise about half of the HBP's total funding. This pledge could seriously lower the quality of the project's final output and leave the planned databases empty. © 2014 Nature Publishing Group
Keyword: Brain imaging
Link ID: 20033 - Posted: 09.04.2014
By MATTHEW PERRONE AP Health Writer WASHINGTON (AP) — The Food and Drug Administration says there is little evidence that testosterone-boosting drugs taken by millions of American men are beneficial, though the agency is also unconvinced by studies suggesting the hormone carries serious risks. The agency posted its review online Wednesday ahead of a public meeting to discuss the benefits and risks of treatments that raise levels of the male hormone. Regulators agreed to convene the September 17 meeting after two federally funded studies found links between testosterone therapy and heart problems in men. The scrutiny comes amid an industry marketing blitz for new pills, patches and formulations that has transformed testosterone into a multibillion-dollar market. Advertisements for prescription gels like Fortesta and Androgel promise aging men relief from “Low-T,” a condition they link to low libido, fatigue and weight gain. But FDA reviewers state that “the need to replace testosterone in these older men remains debatable.” While testosterone levels naturally decline after age 40, it’s unclear whether those lower levels actually lead to the signs commonly associated with aging, including decreased energy and loss of muscle. The FDA first approved testosterone injections in the 1950s for men who had been diagnosed with hypogonadism, a form of abnormally low testosterone caused by injury or medical illness. But the recent advertising push is focused on otherwise healthy men who simply have lower-than-normal levels of testosterone.
By GRETCHEN REYNOLDS Amyotrophic lateral sclerosis has been all over the news lately because of the ubiquitous A.L.S. ice bucket challenge. That attention has also reinvigorated a long-simmering scientific debate about whether participating in contact sports or even vigorous exercise might somehow contribute to the development of the fatal neurodegenerative disease, an issue that two important new studies attempt to answer. Ever since the great Yankees first baseman Lou Gehrig died of A.L.S. in 1941 at age 37, many Americans have vaguely connected A.L.S. with athletes and sports. In Europe, the possible linkage has been more overtly discussed. In the past decade, several widely publicized studies indicated that professional Italian soccer players were disproportionately prone to A.L.S., with about a sixfold higher incidence than would have been expected numerically. Players were often diagnosed while in their 30s; the normal onset is after 60. These findings prompted some small, follow-up epidemiological studies of A.L.S. patients in Europe. To the surprise and likely consternation of the researchers, they found weak but measurable associations between playing contact sports and a heightened risk for A.L.S. The data even showed links between being physically active — meaning exercising regularly — and contracting the disease, raising concerns among scientists that exercise might somehow be inducing A.L.S. in susceptible people, perhaps by affecting brain neurons or increasing bodily stress. But these studies were extremely small and had methodological problems. So to better determine what role sports and exercise might play in the risk for A.L.S., researchers from across Europe recently combined their efforts into two major new studies. The results should reassure those of us who exercise. The numbers showed that physical activity — whether at work, in sports or during exercise — did not increase people’s risk of developing A.L.S. © 2014 The New York Times Company
Keyword: ALS-Lou Gehrig's Disease
Link ID: 20031 - Posted: 09.03.2014
By Kate Wong In 1871 Charles Darwin surmised that humans were evolutionarily closer to the African apes than to any other species alive. The recent sequencing of the gorilla, chimpanzee and bonobo genomes confirms that supposition and provides a clearer view of how we are connected: chimps and bonobos in particular take pride of place as our nearest living relatives, sharing approximately 99 percent of our DNA, with gorillas trailing at 98 percent. Yet that tiny portion of unshared DNA makes a world of difference: it gives us, for instance, our bipedal stance and the ability to plan missions to Mars. Scientists do not yet know how most of the DNA that is uniquely ours affects gene function. But they can conduct whole-genome analyses—with intriguing results. For example, comparing the 33 percent of our genome that codes for proteins with our relatives' genomes reveals that although the sum total of our genetic differences is small, the individual differences pervade the genome, affecting each of our chromosomes in numerous ways. © 2014 Scientific American
By Jonathan Webb Science reporter, BBC News Monkeys at the top and bottom of the social pecking order have physically different brains, research has found. A particular network of brain areas was bigger in dominant animals, while other regions were bigger in subordinates. The study suggests that primate brains, including ours, can be specialised for life at either end of the hierarchy. The differences might reflect inherited tendencies toward leading or following, or the brain adapting to an animal's role in life - or a little of both. Neuroscientists made the discovery, which appears in the journal Plos Biology, by comparing brain scans from 25 macaque monkeys that were already "on file" as part of ongoing research at the University of Oxford. "We were also looking at learning and memory and decision-making, and the changes that are going on in your brain when you're doing those things," explained Dr MaryAnn Noonan, the study's first author. The decision to look at the animals' social status produced an unexpectedly clear result, Dr Noonan said. "It was surprising. All our monkeys were of different ages and different genders - but with fMRI (functional magnetic resonance imaging) you can control for all of that. And we were consistently seeing these same networks coming out." BBC © 2014
By Madhuvanthi Kannan We humans assume we are the smartest of all creations. In a world with over 8.7 million species, only we have the ability to understand the inner workings of our body while also unraveling the mysteries of the universe. We are the geniuses, the philosophers, the artists, the poets and savants. We are amused by a dog playing ball, a dolphin jumping through rings, or a monkey imitating man because we think of these as remarkable acts for animals that, we presume, aren’t as smart as us. But what is smart? Is it just about having ideas, or being good at language and math? Scientists have shown, time and again, that many animals have an extraordinary intellect. Unlike an average human brain that can barely recall a vivid scene from the last hour, chimps have a photographic memory and can memorize patterns they see in the blink of an eye. Sea lions and elephants can remember faces from decades ago. Animals also have a unique sense perception. Sniffer dogs can detect the first signs of colon cancer by the scents of patients, while doctors flounder in early diagnosis. So the point is animals are smart too. But that’s not the upsetting realization. What happens when, for just once, a chimp or a dog challenges man to one of their feats? Well, for one, a precarious face-off – like the one Matt Reeves conceived in Dawn of the Planet of the Apes – would seem a tad less unlikely than we thought. In a recent study by psychologists Colin Camerer and Tetsuro Matsuzawa, chimps and humans played a strategy game – and unexpectedly, the chimps outplayed the humans. Chimps are a scientist’s favorite model for understanding the human brain and behavior. Chimp and human DNAs overlap by a whopping 99 percent, which makes us closer to chimps than horses are to zebras. Yet at some point, we evolved differently. Our behavior and personalities, molded to some extent by our distinct societies, are strikingly different from those of our fellow primates.
Chimps are aggressive and status-hungry within their hierarchical societies, knit around a dominant alpha male. We are, perhaps, a little less so. So the question arises whether competitive behavior is hard-wired in them. © 2014 Scientific American
By Virginia Morell Figaro, a Goffin’s cockatoo (Cacatua goffini) housed at a research lab in Austria, stunned scientists a few years ago when he began spontaneously making stick tools from the wooden beams of his aviary. The Indonesian parrots are not known to use tools in the wild, yet Figaro confidently employed his sticks to rake in nuts outside his wire enclosure. Wondering if Figaro’s fellow cockatoos could learn by watching his methods, scientists set up experiments for a dozen of them. One group watched as Figaro used a stick to reach a nut placed inside an acrylic box with a wire-mesh front panel; others saw “ghost demonstrators”—magnets that were hidden beneath a table and that the researchers controlled—displace the treats. Each bird was then placed in front of the box, with a stick just like Figaro’s lying nearby. The group of three males and three females that had watched Figaro also picked up the sticks, and made some efforts reminiscent of his actions. But only those three males, such as the one in the photo above, became proficient with the tool and successfully retrieved the nuts, the scientists report online today in the Proceedings of the Royal Society B. None of the females did so; nor did any of the birds, male or female, in the ghost demonstrator group. Because the latter group failed entirely, the study shows that the birds need living teachers, the scientists say. Intriguingly, the clever observers developed a better technique than Figaro’s for getting the treat. Thus, the cockatoos weren’t copying his exact actions, but emulating them—a distinction that implies some degree of creativity. Two of the successful cockatoos were later given a chance to make a tool of their own. One did so immediately (as in the video above), and the other succeeded after watching Figaro. It may be that by learning to use a tool, the birds are stimulated to make tools of their own, the scientists say. © 2014 American Association for the Advancement of Science.
Keyword: Learning & Memory
Link ID: 20027 - Posted: 09.03.2014
by Chris Higgins Neuroscientists have pinpointed where imagination hides in the brain and found it to be functionally distinct from related processes such as memory. The team from Brigham Young University (BYU), Utah -- including research proposer, undergraduate student Stefania Ashby -- used functional magnetic resonance imaging (fMRI) to observe brain activity when subjects were remembering specific experiences and putting themselves in novel ones. "I was thinking a lot about planning for my own future and imagining myself in the future, and I started wondering how memory and imagination work together," Ashby said. "I wondered if they were separate or if imagination is just taking past memories and combining them in different ways to form something I've never experienced before." The two processes of remembering and imagining have previously been proposed to be the same cognitive task, and so thought to be carried out by the same areas of the brain. However, the experiments devised by Ashby and her mentor (and coauthor), BYU professor Brock Kirwan, have refuted these ideas. The studies -- published in the journal Cognitive Neuroscience -- required participants to submit 60 photographs of previous life events and use them to create prompts for the "remember" sections. Participants then completed a questionnaire before entering the MRI scanner, to determine which scenarios were the most novel to them and would therefore require genuine imagination. Then, under fMRI testing, the subjects were prompted with various scenarios, and the areas of their brain that became active during each scenario were correlated with the scene's familiarity -- pure memory, or imagination. © Condé Nast UK 2014
By JOHN ROGERS LOS ANGELES (AP) — The founder of a Los Angeles-based nonprofit that provides free music lessons to low-income students from gang-ridden neighborhoods began to notice a hopeful sign several years ago: Kids were graduating from high school and heading off to UCLA, Tulane and other big universities. That’s when Margaret Martin asked how the children in the Harmony Project were beating the odds. Researchers at Northwestern University in Illinois believe that the students’ music training played a role in their educational achievement: as Martin noticed, 90 percent of them graduated from high school, while 50 percent or more of students from those same neighborhoods did not. A two-year study of 44 children in the program shows that the training changes the brain in ways that make it easier for youngsters to process sounds, according to results reported in Tuesday’s edition of The Journal of Neuroscience. That increased ability, the researchers say, is linked directly to improved skills in such subjects as reading and speech. But there is one catch: People have to actually play an instrument to get smarter. They can’t just crank up the tunes on their iPod. Nina Kraus, the study’s lead researcher and director of Northwestern’s auditory neuroscience laboratory, compared the difference to that of building up one’s body through exercise. “I like to say to people: You’re not going to get physically fit just watching sports,” she said.
By Greta Kaul Stanford researchers say poor sleep may be an independent risk factor for suicide in adults over 65. Researchers used data from a previous epidemiological study to compare the sleep quality of 20 older adults who committed suicide and 400 who didn't, over 10 years. Researchers found that those who didn't sleep well were 1.4 times more likely to commit suicide within a decade. Older adults have disproportionately high suicide rates in the first place, especially older men. The Stanford researchers believe that on its own, sleeping poorly could be a risk factor for suicide later in life. It may even be a more powerful predictor of suicide risk than symptoms of depression. They found that the strongest predictor of suicide was the combination of bad sleep and depression. Unlike many biological, psychological and social risk factors for suicide, sleep disorders tend to be treatable, said Rebecca Bernert, the lead author of the study. Sleep disorders are also less stigmatized than other suicide risk factors. Bernert is now studying whether treating insomnia is effective in preventing depression and suicide. The study was published in JAMA Psychiatry in August. © 2014 Hearst Communications, Inc.
By JOHN MARKOFF STANFORD, Calif. — In factories and warehouses, robots routinely outdo humans in strength and precision. Artificial intelligence software can drive cars, beat grandmasters at chess and leave “Jeopardy!” champions in the dust. But machines still lack a critical element that will keep them from eclipsing most human capabilities anytime soon: a well-developed sense of touch. Consider Dr. Nikolas Blevins, a head and neck surgeon at Stanford Health Care who routinely performs ear operations requiring that he shave away bone deftly enough to leave an inner surface as thin as the membrane in an eggshell. Dr. Blevins is collaborating with the roboticists J. Kenneth Salisbury and Sonny Chan on designing software that will make it possible to rehearse these operations before performing them. The program blends X-ray and magnetic resonance imaging data to create a vivid three-dimensional model of the inner ear, allowing the surgeon to practice drilling away bone, to take a visual tour of the patient’s skull and to virtually “feel” subtle differences in cartilage, bone and soft tissue. Yet no matter how thorough or refined, the software provides only the roughest approximation of Dr. Blevins’s sensitive touch. “Being able to do virtual surgery, you really need to have haptics,” he said, referring to the technology that makes it possible to mimic the sensations of touch in a computer simulation. The software’s limitations typify those of robotics, in which researchers lag in designing machines to perform tasks that humans routinely do instinctively. Since the first robotic arm was designed at the Stanford Artificial Intelligence Laboratory in the 1960s, robots have learned to perform repetitive factory work, but they can barely open a door, pick themselves up if they fall, pull a coin out of a pocket or twirl a pencil. © 2014 The New York Times Company
By Jill U. Adams Our noses are loaded with bitter taste receptors, but they're not helping us taste or smell lunch. Ever since researchers at the University of Iowa came to this conclusion in 2009, scientists have been looking for an explanation for why the receptors are there. One speculation is that they warn us of noxious substances. But they may play another role too: helping to fight infections. In addition to common bitter compounds, the nose's bitter receptors also react to chemicals that bacteria use to communicate. That got Noam Cohen, a University of Pennsylvania otolaryngologist, wondering whether the receptors detect pathogens that cause sinus infections. In a 2012 study, his team found that bacterial chemicals elicited two bacteria-fighting responses in cells from the nose and upper airways: movement of the cells' projections that divert noxious things out of the body and release of nitric oxide, which kills bacteria. The findings may have clinical applications. When Cohen recently analyzed bitter taste receptor genes from his patients with chronic sinus infections, he noticed that practically none were supertasters, even though supertasters make up an estimated 25 percent of the population. Supertasters are extra sensitive to bitter compounds in foods. People are either supertasters or nontasters, or somewhere in between, reflecting the genes they carry for a receptor known as T2R38. Cohen thinks supertasters react vigorously to bacterial bitter compounds in the nose and are thus resistant to sinus infections. In nontasters the reaction is weaker, bacteria thrive and sinus infections ensue. These results suggest that a simple taste test could be used to predict who is at risk for recurrent infections and might need more aggressive medical treatment. © 2014 Scientific American
By Meeri Kim The pervasive glow of electronic devices may be an impediment to a good night’s sleep. That’s particularly noticeable now, when families are adjusting to early wake-up times for school. Teenagers can find it especially hard to get started in the morning. As lamps switch off in teens’ bedrooms across America, the lights from their computer screens, smartphones and tablets often stay on throughout the night. These devices emit light of all colors, but it’s the blues in particular that pose a danger to sleep. Blue light is especially good at preventing the release of melatonin, a hormone associated with nighttime. For nocturnal animals, melatonin spurs activity. For daytime species such as humans, it signals that it’s time to sleep. Ordinarily, the pineal gland, a pea-size organ in the brain, begins to release melatonin a couple of hours before your regular bedtime. The hormone is no sleeping pill, but it does reduce alertness and make sleep more inviting. However, light — particularly of the blue variety — can keep the pineal gland from releasing melatonin, thus warding off sleepiness. You don’t have to be staring directly at a television or computer screen: If enough blue light hits the eye, the gland can stop releasing melatonin. So easing into bed with a tablet or a laptop makes it harder to take a long snooze, especially for sleep-deprived teenagers who are more vulnerable to the effects of light than adults. During adolescence, the circadian rhythm shifts, and teens feel more awake later at night. Switching on a TV show or video game just before bedtime will push off sleepiness even later, even if they have to be up by 6 a.m. to get to school on time.
By RONI CARYN RABIN Pregnant women often go to great lengths to give their babies a healthy start in life. They quit smoking, skip the chardonnay, switch to decaf, forgo aspirin. They say no to swordfish and politely decline Brie. Yet they rarely wean themselves from popular selective serotonin reuptake inhibitor antidepressants like Prozac, Celexa and Zoloft despite an increasing number of studies linking prenatal exposure to birth defects, complications after birth and even developmental delays and autism. Up to 14 percent of pregnant women take antidepressants, and the Food and Drug Administration has issued strong warnings that one of them, paroxetine (Paxil), may cause birth defects. But the prevailing attitude among doctors has been that depression during pregnancy is more dangerous to mother and child than any drug could be. Now a growing number of critics are challenging that assumption. “If antidepressants made such a big difference, and women on them were eating better, sleeping better and taking better care of themselves, then one would expect to see better birth outcomes among the women who took medication than among similar women who did not,” said Barbara Mintzes, an associate professor at the University of British Columbia School of Population and Public Health. “What’s striking is that there’s no research evidence showing that.” On the contrary, she said, “when you look for it, all you find are harms.” S.S.R.I.s are believed to work in part by blocking reabsorption (or reuptake) of serotonin, altering levels of this important neurotransmitter in the brain and elsewhere in the body. Taken by a pregnant woman, the drugs cross the placental barrier, affecting the fetus. © 2014 The New York Times Company
Moheb Costandi Autism can be baffling, appearing in various forms and guises and thwarting our best attempts to understand the minds of people affected by it. Anything we know for sure about the disorder can probably be traced back to the pioneering research of the developmental psychologist Uta Frith. Frith was the first to propose that people with autism lack theory of mind, the ability to attribute beliefs, intentions and desires to others. She also recognized the superior perceptual abilities of many with the disorder — and their tendency to be unable to see the forest for the trees. Frith, now affiliated with the Institute of Cognitive Neuroscience at University College London (UCL), has shaped autism research for an entire generation of investigators. Meanwhile, her husband Chris Frith formulated a new view of schizophrenia, a mental illness marked by hallucinations, disordered thinking and apathy. His work explored how the disorder affects the experience of agency, the sense that we are in control of our bodies and responsible for our actions. And his innovations in brain imaging helped researchers examine the relationship between brain and mind. Independently, husband and wife explored the social and cognitive aspects of these psychiatric disorders. Together, they helped lay the foundations of cognitive neuroscience, the discipline that seeks to understand the biological basis of thought processes. Trevor Robbins, a cognitive neuroscientist at the University of Cambridge in the U.K., calls them “tremendously influential pioneers,” in particular because both brought a social perspective to cognitive neuroscience. © Copyright 2014 Simons Foundation
Link ID: 20019 - Posted: 09.02.2014
By ANAHAD O’CONNOR People who avoid carbohydrates and eat more fat, even saturated fat, lose more body fat and have fewer cardiovascular risks than people who follow the low-fat diet that health authorities have favored for decades, a major new study shows. The findings are unlikely to be the final salvo in what has been a long and often contentious debate about what foods are best to eat for weight loss and overall health. The notion that dietary fat is harmful, particularly saturated fat, arose decades ago from comparisons of disease rates among large national populations. But more recent clinical studies in which individuals and their diets were assessed over time have produced a more complex picture. Some have provided strong evidence that people can sharply reduce their heart disease risk by eating fewer carbohydrates and more dietary fat, with the exception of trans fats. The new findings suggest that this strategy more effectively reduces body fat and also lowers overall weight. The new study was financed by the National Institutes of Health and published in the Annals of Internal Medicine. It included a racially diverse group of 150 men and women — a rarity in clinical nutrition studies — who were assigned to follow diets for one year that limited either the amount of carbs or fat that they could eat, but not overall calories. “To my knowledge, this is one of the first long-term trials that’s given these diets without calorie restrictions,” said Dariush Mozaffarian, the dean of the Friedman School of Nutrition Science and Policy at Tufts University, who was not involved in the new study. “It shows that in a free-living setting, cutting your carbs helps you lose weight without focusing on calories. And that’s really important because someone can change what they eat more easily than trying to cut down on their calories.” © 2014 The New York Times Company
Link ID: 20018 - Posted: 09.02.2014
Carl Zimmer An unassuming single-celled organism called Toxoplasma gondii is one of the most successful parasites on Earth, infecting an estimated 11 percent of Americans and perhaps half of all people worldwide. It’s just as prevalent in many other species of mammals and birds. In a recent study in Ohio, scientists found the parasite in three-quarters of the white-tailed deer they studied. One reason for Toxoplasma’s success is its ability to manipulate its hosts. The parasite can influence their behavior, so much so that hosts can put themselves at risk of death. Scientists first discovered this strange mind control in the 1990s, but it’s been hard to figure out how they manage it. Now a new study suggests that Toxoplasma can turn its host’s genes on and off — and it’s possible other parasites use this strategy, too. Toxoplasma manipulates its hosts to complete its life cycle. Although it can infect any mammal or bird, it can reproduce only inside of a cat. The parasites produce cysts that get passed out of the cat with its feces; once in the soil, the cysts infect new hosts. Toxoplasma returns to cats via their prey. But a host like a rat has evolved to avoid cats as much as possible, taking evasive action from the very moment it smells feline odor. Experiments on rats and mice have shown that Toxoplasma alters their response to cat smells. Many infected rodents lose their natural fear of the scent. Some even seem to be attracted to it. Manipulating the behavior of a host is a fairly common strategy among parasites, but it’s hard to fathom how they manage it. A rat’s response to cat odor, for example, emerges from complex networks of neurons that detect an odor, figure out its source and decide on the right response in a given moment. © 2014 The New York Times Company
By JAMIE EDGIN and FABIAN FERNANDEZ LAST week the biologist Richard Dawkins sparked controversy when, in response to a woman’s hypothetical question about whether to carry to term a child with Down syndrome, he wrote on Twitter: “Abort it and try again. It would be immoral to bring it into the world if you have the choice.” In further statements, Mr. Dawkins suggested that his view was rooted in the moral principle of reducing overall suffering whenever possible — in this case, that of individuals born with Down syndrome and their families. But Mr. Dawkins’s argument is flawed. Not because his moral reasoning is wrong, necessarily (that is a question for another day), but because his understanding of the facts is mistaken. Recent research indicates that individuals with Down syndrome can experience more happiness and potential for success than Mr. Dawkins seems to appreciate. There are, of course, many challenges facing families caring for children with Down syndrome, including a high likelihood that their children will face surgery in infancy and Alzheimer’s disease in adulthood. But at the same time, studies have suggested that families of these children show levels of well-being that are often greater than those of families with children with other developmental disabilities, and sometimes equivalent to those of families with nondisabled children. These effects are prevalent enough to have been dubbed the “Down syndrome advantage.” In 2010, researchers reported that parents of preschoolers with Down syndrome experienced lower levels of stress than parents of preschoolers with autism. In 2007, researchers found that the divorce rate in families with a child with Down syndrome was lower on average than that in families with a child with other congenital abnormalities and in those with a nondisabled child. © 2014 The New York Times Company
Memory can be boosted by using a magnetic field to stimulate part of the brain, a study has shown. The effect lasts at least 24 hours after the stimulation is given, improving the ability of volunteers to remember words linked to photos of faces. Scientists believe the discovery could lead to new treatments for loss of memory function caused by ageing, strokes, head injuries and early Alzheimer's disease. Dr Joel Voss, from Northwestern University in Chicago, said: "We show for the first time that you can specifically change memory functions of the brain in adults without surgery or drugs, which have not proven effective. "This non-invasive stimulation improves the ability to learn new things. It has tremendous potential for treating memory disorders." The scientists focused on associative memory, the ability to learn and remember relationships between unrelated items. An example of associative memory would be linking someone to a particular restaurant where you both once dined. It involves a network of different brain regions working in concert with a key memory structure called the hippocampus, which has been compared to an "orchestra conductor" directing brain activity. Stimulating the hippocampus caused the "musicians" – the brain regions – to "play" more in time, thereby tightening up their performance. A total of 16 volunteers aged 21-40 took part in the study, agreeing to undergo 20 minutes of transcranial magnetic stimulation (TMS) every day for five days. © 2014 Guardian News and Media Limited
Keyword: Learning & Memory
Link ID: 20015 - Posted: 08.30.2014
by Michael Slezak It's odourless, colourless, tasteless and mostly non-reactive – but it may help you forget. Xenon gas has been shown to erase fearful memories in mice, raising the possibility that it could be used to treat post-traumatic stress disorder (PTSD) if the results are replicated in a human trial next year. The method exploits a neurological process known as "reconsolidation". When memories are recalled, they seem to get re-encoded, almost like a new memory. When this process is taking place, the memories become malleable and can be subtly altered. This new research suggests that at least in mice, the reconsolidation process might be partially blocked by xenon, essentially erasing fearful memories. Among other things, xenon is used as an anaesthetic. Frozen in fear Edward Meloni and his colleagues at Harvard Medical School in Boston trained mice to be afraid of a sound by placing them in a cage and giving them an electric shock after the sound was played. Thereafter, if the mice heard the noise, they would become frightened and freeze. Later, the team played the sound and then gave the mice either a low dose of xenon gas for an hour or just exposed them to normal air. Mice that were exposed to xenon froze for less time in response to the sound than the other mice. © Copyright Reed Business Information Ltd.
Keyword: Learning & Memory
Link ID: 20014 - Posted: 08.30.2014