Most Recent Links

Follow us on Facebook and Twitter, or subscribe to our mailing list, to receive news updates. Learn more.


Links 61 - 80 of 19512

by Chris Higgins
Neuroscientists have pinpointed where imagination hides in the brain and found it to be functionally distinct from related processes such as memory. The team from Brigham Young University (BYU), Utah -- including the study's proposer, undergraduate student Stefania Ashby -- used functional magnetic resonance imaging (fMRI) to observe brain activity while subjects were remembering specific experiences and imagining themselves in novel ones. "I was thinking a lot about planning for my own future and imagining myself in the future, and I started wondering how memory and imagination work together," Ashby said. "I wondered if they were separate or if imagination is just taking past memories and combining them in different ways to form something I've never experienced before." The two processes of remembering and imagining have previously been proposed to be the same cognitive task, and so were thought to be carried out by the same areas of the brain. However, the experiments devised by Ashby and her mentor (and coauthor), BYU professor Brock Kirwan, have refuted these ideas. The studies -- published in the journal Cognitive Neuroscience -- required participants to submit 60 photographs of previous life events, which were used to create prompts for the "remember" sections. Participants then completed a questionnaire, before entering the MRI scanner, to determine which scenarios were the most novel to them and would therefore force them into imagination. Then, under fMRI testing, the subjects were prompted with various scenarios, and the areas of their brain that became active during each scenario were correlated with each scene's familiarity -- pure memory, or imagination. © Condé Nast UK 2014

Keyword: Learning & Memory
Link ID: 20026 - Posted: 09.03.2014

By JOHN ROGERS
LOS ANGELES (AP) — The founder of a Los Angeles-based nonprofit that provides free music lessons to low-income students from gang-ridden neighborhoods began to notice a hopeful sign several years ago: kids were graduating from high school and heading off to UCLA, Tulane and other big universities. That’s when Margaret Martin asked how the children in the Harmony Project were beating the odds. Researchers at Northwestern University in Illinois believe that the students’ music training played a role in their educational achievement; as Martin noticed, 90 percent of them graduate from high school, while 50 percent or more of students from those same neighborhoods do not. A two-year study of 44 children in the program shows that the training changes the brain in ways that make it easier for youngsters to process sounds, according to results reported in Tuesday’s edition of The Journal of Neuroscience. That increased ability, the researchers say, is linked directly to improved skills in such subjects as reading and speech. But there is one catch: people have to actually play an instrument to get smarter. They can’t just crank up the tunes on their iPod. Nina Kraus, the study’s lead researcher and director of Northwestern’s auditory neuroscience laboratory, compared the difference to that of building up one’s body through exercise. ‘‘I like to say to people: You’re not going to get physically fit just watching sports,’’ she said.

Keyword: Hearing
Link ID: 20025 - Posted: 09.03.2014

Greta Kaul
Stanford researchers say poor sleep may be an independent risk factor for suicide in adults over 65. Researchers used data from a previous epidemiological study to compare the sleep quality of 20 older adults who committed suicide and 400 who didn't, over 10 years. They found that those who didn't sleep well were 1.4 times more likely to commit suicide within a decade. Older adults have disproportionately high suicide rates in the first place, especially older men. The Stanford researchers believe that on its own, sleeping poorly could be a risk factor for suicide later in life. It may even be a more powerful predictor of suicide risk than symptoms of depression, though the strongest predictor of all was the combination of bad sleep and depression. Unlike many biological, psychological and social risk factors for suicide, sleep disorders tend to be treatable, said Rebecca Bernert, the lead author of the study. Sleep disorders are also less stigmatized than other suicide risk factors. Bernert is now studying whether treating insomnia is effective in preventing depression and suicide. The study was published in JAMA Psychiatry in August. © 2014 Hearst Communications, Inc.

Keyword: Depression
Link ID: 20024 - Posted: 09.03.2014

By JOHN MARKOFF
STANFORD, Calif. — In factories and warehouses, robots routinely outdo humans in strength and precision. Artificial intelligence software can drive cars, beat grandmasters at chess and leave “Jeopardy!” champions in the dust. But machines still lack a critical element that will keep them from eclipsing most human capabilities anytime soon: a well-developed sense of touch. Consider Dr. Nikolas Blevins, a head and neck surgeon at Stanford Health Care who routinely performs ear operations requiring that he shave away bone deftly enough to leave an inner surface as thin as the membrane in an eggshell. Dr. Blevins is collaborating with the roboticists J. Kenneth Salisbury and Sonny Chan on designing software that will make it possible to rehearse these operations before performing them. The program blends X-ray and magnetic resonance imaging data to create a vivid three-dimensional model of the inner ear, allowing the surgeon to practice drilling away bone, to take a visual tour of the patient’s skull and to virtually “feel” subtle differences in cartilage, bone and soft tissue. Yet no matter how thorough or refined, the software provides only the roughest approximation of Dr. Blevins’s sensitive touch. “Being able to do virtual surgery, you really need to have haptics,” he said, referring to the technology that makes it possible to mimic the sensations of touch in a computer simulation. The software’s limitations typify those of robotics, in which researchers lag in designing machines to perform tasks that humans routinely do instinctively. Since the first robotic arm was designed at the Stanford Artificial Intelligence Laboratory in the 1960s, robots have learned to perform repetitive factory work, but they can barely open a door, pick themselves up if they fall, pull a coin out of a pocket or twirl a pencil. © 2014 The New York Times Company

Keyword: Robotics
Link ID: 20023 - Posted: 09.02.2014

By Jill U. Adams
Our noses are loaded with bitter taste receptors, but they're not helping us taste or smell lunch. Ever since researchers at the University of Iowa came to this conclusion in 2009, scientists have been looking for an explanation for why the receptors are there. One speculation is that they warn us of noxious substances. But they may play another role too: helping to fight infections. In addition to common bitter compounds, the nose's bitter receptors also react to chemicals that bacteria use to communicate. That got Noam Cohen, a University of Pennsylvania otolaryngologist, wondering whether the receptors detect pathogens that cause sinus infections. In a 2012 study, his team found that bacterial chemicals elicited two bacteria-fighting responses in cells from the nose and upper airways: movement of the cells' projections that divert noxious things out of the body and release of nitric oxide, which kills bacteria. The findings may have clinical applications. When Cohen recently analyzed bitter taste receptor genes from his patients with chronic sinus infections, he noticed that practically none were supertasters, even though supertasters make up an estimated 25 percent of the population. Supertasters are extra sensitive to bitter compounds in foods. People are either supertasters or nontasters, or somewhere in between, reflecting the genes they carry for a receptor known as T2R38. Cohen thinks supertasters react vigorously to bacterial bitter compounds in the nose and are thus resistant to sinus infections. In nontasters the reaction is weaker, bacteria thrive and sinus infections ensue. These results suggest that a simple taste test could be used to predict who is at risk for recurrent infections and might need more aggressive medical treatment. © 2014 Scientific American

Keyword: Chemical Senses (Smell & Taste)
Link ID: 20022 - Posted: 09.02.2014

By Meeri Kim
The pervasive glow of electronic devices may be an impediment to a good night’s sleep. That’s particularly noticeable now, when families are adjusting to early wake-up times for school. Teenagers can find it especially hard to get started in the morning. As lamps switch off in teens’ bedrooms across America, the lights from their computer screens, smartphones and tablets often stay on throughout the night. These devices emit light of all colors, but it’s the blues in particular that pose a danger to sleep. Blue light is especially good at preventing the release of melatonin, a hormone associated with nighttime: for nocturnal animals, melatonin spurs activity, while for daytime species such as humans, it signals that it’s time to sleep. Ordinarily, the pineal gland, a pea-size organ in the brain, begins to release melatonin a couple of hours before your regular bedtime. The hormone is no sleeping pill, but it does reduce alertness and make sleep more inviting. However, light — particularly of the blue variety — can keep the pineal gland from releasing melatonin, thus warding off sleepiness. You don’t have to be staring directly at a television or computer screen: if enough blue light hits the eye, the gland can stop releasing melatonin. So easing into bed with a tablet or a laptop makes it harder to take a long snooze, especially for sleep-deprived teenagers who are more vulnerable to the effects of light than adults. During adolescence, the circadian rhythm shifts, and teens feel more awake later at night. Switching on a TV show or video game just before bedtime will push off sleepiness even later, even if they have to be up by 6 a.m. to get to school on time.

Keyword: Biological Rhythms
Link ID: 20021 - Posted: 09.02.2014

By RONI CARYN RABIN
Pregnant women often go to great lengths to give their babies a healthy start in life. They quit smoking, skip the chardonnay, switch to decaf, forgo aspirin. They say no to swordfish and politely decline Brie. Yet they rarely wean themselves from popular selective serotonin reuptake inhibitor antidepressants like Prozac, Celexa and Zoloft despite an increasing number of studies linking prenatal exposure to birth defects, complications after birth and even developmental delays and autism. Up to 14 percent of pregnant women take antidepressants, and the Food and Drug Administration has issued strong warnings that one of them, paroxetine (Paxil), may cause birth defects. But the prevailing attitude among doctors has been that depression during pregnancy is more dangerous to mother and child than any drug could be. Now a growing number of critics are challenging that assumption. “If antidepressants made such a big difference, and women on them were eating better, sleeping better and taking better care of themselves, then one would expect to see better birth outcomes among the women who took medication than among similar women who did not,” said Barbara Mintzes, an associate professor at the University of British Columbia School of Population and Public Health. “What’s striking is that there’s no research evidence showing that.” On the contrary, she said, “when you look for it, all you find are harms.” S.S.R.I.s are believed to work in part by blocking reabsorption (or reuptake) of serotonin, altering levels of this important neurotransmitter in the brain and elsewhere in the body. Taken by a pregnant woman, the drugs cross the placental barrier, affecting the fetus. © 2014 The New York Times Company

Keyword: Depression
Link ID: 20020 - Posted: 09.02.2014

Moheb Costandi
Autism can be baffling, appearing in various forms and guises and thwarting our best attempts to understand the minds of people affected by it. Anything we know for sure about the disorder can probably be traced back to the pioneering research of the developmental psychologist Uta Frith. Frith was the first to propose that people with autism lack theory of mind, the ability to attribute beliefs, intentions and desires to others. She also recognized the superior perceptual abilities of many with the disorder — and their tendency to miss the forest for the trees. Frith, now affiliated with the Institute of Cognitive Neuroscience at University College London (UCL), has shaped autism research for an entire generation of investigators. Meanwhile, her husband Chris Frith formulated a new view of schizophrenia, a mental illness marked by hallucinations, disordered thinking and apathy. His work explored how the disorder affects the experience of agency, the sense that we are in control of our bodies and responsible for our actions. And his innovations in brain imaging helped researchers examine the relationship between brain and mind. Independently, husband and wife explored the social and cognitive aspects of these psychiatric disorders. Together, they helped lay the foundations of cognitive neuroscience, the discipline that seeks to understand the biological basis of thought processes. Trevor Robbins, a cognitive neuroscientist at the University of Cambridge in the U.K., calls them “tremendously influential pioneers,” in particular because both brought a social perspective to cognitive neuroscience. © Copyright 2014 Simons Foundation

Keyword: Autism
Link ID: 20019 - Posted: 09.02.2014

By ANAHAD O’CONNOR People who avoid carbohydrates and eat more fat, even saturated fat, lose more body fat and have fewer cardiovascular risks than people who follow the low-fat diet that health authorities have favored for decades, a major new study shows. The findings are unlikely to be the final salvo in what has been a long and often contentious debate about what foods are best to eat for weight loss and overall health. The notion that dietary fat is harmful, particularly saturated fat, arose decades ago from comparisons of disease rates among large national populations. But more recent clinical studies in which individuals and their diets were assessed over time have produced a more complex picture. Some have provided strong evidence that people can sharply reduce their heart disease risk by eating fewer carbohydrates and more dietary fat, with the exception of trans fats. The new findings suggest that this strategy more effectively reduces body fat and also lowers overall weight. The new study was financed by the National Institutes of Health and published in the Annals of Internal Medicine. It included a racially diverse group of 150 men and women — a rarity in clinical nutrition studies — who were assigned to follow diets for one year that limited either the amount of carbs or fat that they could eat, but not overall calories. “To my knowledge, this is one of the first long-term trials that’s given these diets without calorie restrictions,” said Dariush Mozaffarian, the dean of the Friedman School of Nutrition Science and Policy at Tufts University, who was not involved in the new study. “It shows that in a free-living setting, cutting your carbs helps you lose weight without focusing on calories. And that’s really important because someone can change what they eat more easily than trying to cut down on their calories.” © 2014 The New York Times Company

Keyword: Obesity
Link ID: 20018 - Posted: 09.02.2014

Carl Zimmer
An unassuming single-celled organism called Toxoplasma gondii is one of the most successful parasites on Earth, infecting an estimated 11 percent of Americans and perhaps half of all people worldwide. It’s just as prevalent in many other species of mammals and birds. In a recent study in Ohio, scientists found the parasite in three-quarters of the white-tailed deer they studied. One reason for Toxoplasma’s success is its ability to manipulate its hosts. The parasite can influence their behavior, so much so that hosts can put themselves at risk of death. Scientists first discovered this strange mind control in the 1990s, but it’s been hard to figure out how they manage it. Now a new study suggests that Toxoplasma can turn its host’s genes on and off — and it’s possible other parasites use this strategy, too. Toxoplasma manipulates its hosts to complete its life cycle. Although it can infect any mammal or bird, it can reproduce only inside of a cat. The parasites produce cysts that get passed out of the cat with its feces; once in the soil, the cysts infect new hosts. Toxoplasma returns to cats via their prey. But a host like a rat has evolved to avoid cats as much as possible, taking evasive action from the very moment it smells feline odor. Experiments on rats and mice have shown that Toxoplasma alters their response to cat smells. Many infected rodents lose their natural fear of the scent. Some even seem to be attracted to it. Manipulating the behavior of a host is a fairly common strategy among parasites, but it’s hard to fathom how they manage it. A rat’s response to cat odor, for example, emerges from complex networks of neurons that detect an odor, figure out its source and decide on the right response in a given moment. © 2014 The New York Times Company

Keyword: Emotions
Link ID: 20017 - Posted: 08.30.2014

By JAMIE EDGIN and FABIAN FERNANDEZ
LAST week the biologist Richard Dawkins sparked controversy when, in response to a woman’s hypothetical question about whether to carry to term a child with Down syndrome, he wrote on Twitter: “Abort it and try again. It would be immoral to bring it into the world if you have the choice.” In further statements, Mr. Dawkins suggested that his view was rooted in the moral principle of reducing overall suffering whenever possible — in this case, that of individuals born with Down syndrome and their families. But Mr. Dawkins’s argument is flawed. Not because his moral reasoning is wrong, necessarily (that is a question for another day), but because his understanding of the facts is mistaken. Recent research indicates that individuals with Down syndrome can experience more happiness and potential for success than Mr. Dawkins seems to appreciate. There are, of course, many challenges facing families caring for children with Down syndrome, including a high likelihood that their children will face surgery in infancy and Alzheimer’s disease in adulthood. But at the same time, studies have suggested that families of these children show levels of well-being that are often greater than those of families with children with other developmental disabilities, and sometimes equivalent to those of families with nondisabled children. These effects are prevalent enough that researchers have coined the term “Down syndrome advantage.” In 2010, researchers reported that parents of preschoolers with Down syndrome experienced lower levels of stress than parents of preschoolers with autism. In 2007, researchers found that the divorce rate in families with a child with Down syndrome was lower on average than that in families with a child with other congenital abnormalities and in those with a nondisabled child. © 2014 The New York Times Company

Keyword: Genes & Behavior
Link ID: 20016 - Posted: 08.30.2014

Memory can be boosted by using a magnetic field to stimulate part of the brain, a study has shown. The effect lasts at least 24 hours after the stimulation is given, improving the ability of volunteers to remember words linked to photos of faces. Scientists believe the discovery could lead to new treatments for loss of memory function caused by ageing, strokes, head injuries and early Alzheimer's disease. Dr Joel Voss, from Northwestern University in Chicago, said: "We show for the first time that you can specifically change memory functions of the brain in adults without surgery or drugs, which have not proven effective. "This non-invasive stimulation improves the ability to learn new things. It has tremendous potential for treating memory disorders." The scientists focused on associative memory, the ability to learn and remember relationships between unrelated items. An example of associative memory would be linking someone to a particular restaurant where you both once dined. It involves a network of different brain regions working in concert with a key memory structure called the hippocampus, which has been compared to an "orchestra conductor" directing brain activity. Stimulating the hippocampus caused the "musicians" – the brain regions – to "play" more in time, thereby tightening up their performance. A total of 16 volunteers aged 21-40 took part in the study, agreeing to undergo 20 minutes of transcranial magnetic stimulation (TMS) every day for five days. © 2014 Guardian News and Media Limited

Keyword: Learning & Memory
Link ID: 20015 - Posted: 08.30.2014

by Michael Slezak
It's odourless, colourless, tasteless and mostly non-reactive – but it may help you forget. Xenon gas has been shown to erase fearful memories in mice, raising the possibility that it could be used to treat post-traumatic stress disorder (PTSD) if the results are replicated in a human trial next year. The method exploits a neurological process known as "reconsolidation". When memories are recalled, they seem to get re-encoded, almost like a new memory. When this process is taking place, the memories become malleable and can be subtly altered. This new research suggests that at least in mice, the reconsolidation process might be partially blocked by xenon, essentially erasing fearful memories. Among other things, xenon is used as an anaesthetic. Frozen in fear Edward Meloni and his colleagues at Harvard Medical School in Boston trained mice to be afraid of a sound by placing them in a cage and giving them an electric shock after the sound was played. Thereafter, if the mice heard the noise, they would become frightened and freeze. Later, the team played the sound and then gave the mice either a low dose of xenon gas for an hour or just exposed them to normal air. Mice that were exposed to xenon froze for less time in response to the sound than the other mice. © Copyright Reed Business Information Ltd.

Keyword: Learning & Memory
Link ID: 20014 - Posted: 08.30.2014

By GARY GREENBERG
Joel Gold first observed the Truman Show delusion — in which people believe they are the involuntary subjects of a reality television show whose producers are scripting the vicissitudes of their lives — on Halloween night 2003 at Bellevue Hospital, where he was the chief attending psychiatrist. “Suspicious Minds,” which he wrote with his brother, Ian, an associate professor of philosophy and psychology at McGill University, is an attempt to use this delusion, which has been observed by many clinicians, to pose questions that have gone out of fashion in psychiatry over the last half-century: Why does a mentally ill person have the delusions he or she has? And, following the lead of the medical historian Roy Porter, who once wrote that “every age gets the lunatics it deserves,” what can we learn about ourselves and our times from examining the content of madness? The Golds’ answer is a dual broadside: against a psychiatric profession that has become infatuated with neuroscience as part of its longstanding attempt to establish itself as “real medicine,” and against a culture that has become too networked for its own good. Current psychiatric practice is to treat delusions as the random noise generated by a malfunctioning (and mindless) brain — a strategy that would be more convincing if doctors had a better idea of how the brain produced madness and how to cure it. According to the Golds, ignoring the content of delusions like T.S.D. can only make mentally ill people feel more misunderstood, even as it distracts the rest of us from the true significance of the delusion: that we live in a society that has put us all under surveillance. T.S.D. sufferers may be paranoid, but that does not mean they are wrong to think the whole world is watching. This is not to say they aren’t crazy. Mental illness may be “just a frayed, weakened version of mental health,” but what is in tatters for T.S.D. patients is something crucial to negotiating social life, and that, according to the Golds, is the primary purpose toward which our big brains have evolved: the ability to read other people’s intentions or, as cognitive scientists put it, to have a theory of mind. This capacity is double-edged. “The better you are at ToM,” they write, “the greater your capacity for friendship.” © 2014 The New York Times Company

Keyword: Schizophrenia
Link ID: 20013 - Posted: 08.30.2014

By Virginia Morell
A dog’s bark may sound like nothing but noise, but it encodes important information. In 2005, scientists showed that people can tell whether a dog is lonely, happy, or aggressive just by listening to his bark. Now, the same group has shown that dogs themselves distinguish between the barks of pooches they’re familiar with and the barks of strangers and respond differently to each. The team tested pet dogs’ reactions to barks by playing back recorded barks of a familiar and unfamiliar dog. The recordings were made in two different settings: when the pooch was alone, and when he was barking at a stranger at his home’s fence. When the test dogs heard a strange dog barking, they stayed closer to and for a longer period of time at their home’s gate than when they heard the bark of a familiar dog. But when they heard an unknown and lonely dog barking, they stayed close to their house and away from the gate, the team reports this month in Applied Animal Behaviour Science. They also moved closer toward their house when they heard a familiar dog’s barks, and they barked more often in response to a strange dog barking. Dogs, the scientists conclude from this first study of pet dogs barking in their natural environment (their owners’ homes), do indeed pay attention to and glean detailed information from their fellows’ barks. © 2014 American Association for the Advancement of Science

Keyword: Animal Communication
Link ID: 20012 - Posted: 08.30.2014

One of the best things about being a neuroscientist used to be the aura of mystery around it. It was once so mysterious that some people didn’t even know it was a thing. When I first went to university and people asked what I studied, they thought I was saying I was a “Euroscientist”, which is presumably someone who studies the science of Europe. I’d get weird questions such as “what do you think of Belgium?” and I’d have to admit that, in all honesty, I never think of Belgium. That’s how mysterious neuroscience was, once. Of course, you could say this confusion was due to my dense Welsh accent, or the fact that I only had the confidence to talk to strangers after consuming a fair amount of alcohol, but I prefer to go with the mystery. It’s not like that any more. Neuroscience is “mainstream” now, to the point where the press coverage of it can be studied extensively. When there’s such a thing as Neuromarketing (well, there isn’t actually such a thing, but there’s a whole industry that would claim otherwise), it’s impossible to maintain that neuroscience is “cool” or “edgy”. It’s a bad time for us neurohipsters (which are the same as regular hipsters, except the designer beards are on the frontal lobes rather than the jaw-line). One way that we professional neuroscientists could maintain our superiority was by correcting misconceptions about the brain, but lately even that avenue looks to be closing to us. The recent film Lucy is based on the most classic brain misconception: that we only use 10% of our brain. But it’s taken a considerable amount of flak for this already, suggesting that many people are wise to this myth. We also saw the recent release of Susan Greenfield’s new book Mind Change, all about how technology is changing (damaging?) our brains. This is a worryingly evidence-free but very common claim by Greenfield. Depressingly common, as this blog has pointed out many times. But now even the non-neuroscientist reviewers aren’t buying her claims.
© 2014 Guardian News and Media Limited

Keyword: Miscellaneous
Link ID: 20011 - Posted: 08.30.2014

By PAM BELLUCK
Memories and the feelings associated with them are not set in stone. You may have happy memories about your family’s annual ski vacation, but if you see a tragic accident on the slopes, those feelings may change. You might even be afraid to ski that mountain again. Now, using a technique in which light is used to switch neurons on and off, neuroscientists at the Massachusetts Institute of Technology appear to have unlocked some secrets about how the brain attaches emotions to memories and how those emotions can be adjusted. Their research, published Wednesday in the journal Nature, was conducted on mice, not humans, so the findings cannot immediately be translated to the treatment of patients. But experts said the experiments may eventually lead to more effective therapies for people with psychological problems such as depression, anxiety or post-traumatic stress disorder. “Imagine you can go in and find a particular traumatic memory and turn it off or change it somehow,” said David Moorman, an assistant professor of psychological and brain sciences at the University of Massachusetts Amherst, who was not involved in the research. “That’s still science fiction, but with this we’re getting a lot closer to it.” The M.I.T. scientists labeled neurons in the brains of mice with a light-sensitive protein and used pulses of light to switch the cells on and off, a technique called optogenetics. Then they identified patterns of neurons activated when mice created a negative memory or a positive one. A negative memory formed when mice received a mild electric shock to their feet; a positive one was formed when the mice, all male, were allowed to spend time with female mice. © 2014 The New York Times Company

Keyword: Learning & Memory
Link ID: 20010 - Posted: 08.28.2014

by Penny Sarchet
Memory is a fickle beast. A bad experience can turn a once-loved coffee shop or holiday destination into a place to be avoided. Now experiments in mice have shown how such associations can be reversed. When forming a memory of a place, the details of the location and the associated emotions are encoded in different regions of the brain. Memories of the place are formed in the hippocampus, whereas positive or negative associations are encoded in the amygdala. In experiments with mice in 2012, a group led by Susumu Tonegawa of the Massachusetts Institute of Technology managed to trigger the fear part of a memory associated with a location when the animals were in a different location. They used a technique known as optogenetics, which involves genetically engineering mice so that their brains produce a light-sensitive protein in response to a certain cue. In this case, the cue was the formation of the location memory. This meant the team could make the mouse recall the location just by flashing pulses of light down an optical fibre embedded in the skull. The mice were given electric shocks while their memories of the place were being formed, so that the animals learned to associate that location with pain. Once trained, the mice were put in a new place and a pulse of light was flashed into their brains. This activated the neurons associated with the original location memory and the mice froze, terrified of a shock, demonstrating that the emotion associated with the original location could be induced by reactivating the memory of the place. © Copyright Reed Business Information Ltd.

Keyword: Learning & Memory
Link ID: 20009 - Posted: 08.28.2014

Learning is easier when it only requires nerve cells to rearrange existing patterns of activity than when the nerve cells have to generate new patterns, a study of monkeys has found. The scientists explored the brain’s capacity to learn through recordings of electrical activity of brain cell networks. The study was partly funded by the National Institutes of Health. “We looked into the brain and may have seen why it’s so hard to think outside the box,” said Aaron Batista, Ph.D., an assistant professor at the University of Pittsburgh and a senior author of the study published in Nature, with Byron Yu, Ph.D., assistant professor at Carnegie Mellon University, Pittsburgh. The human brain contains nearly 86 billion neurons, which communicate through intricate networks of connections. Understanding how they work together during learning can be challenging. Dr. Batista and his colleagues combined two innovative technologies, brain-computer interfaces and machine learning, to study patterns of activity among neurons in monkey brains as the animals learned to use their thoughts to move a computer cursor. “This is a fundamental advance in understanding the neurobiological patterns that underlie the learning process,” said Theresa Cruz, Ph.D., a program official at the National Center for Medical Rehabilitation Research at NIH’s Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD). “The findings may eventually lead to new treatments for stroke as well as other neurological disorders.”

Keyword: Learning & Memory
Link ID: 20008 - Posted: 08.28.2014

Erin Allday
It's well established that chronic pain afflicts people with more than just pain. With the pain come fatigue and sleeplessness, depression and frustration, and a noticeable disinterest in so many of the activities that used to fill a day. It makes sense that chronic pain would leave patients feeling weary and unmotivated - most people wouldn't want to go to work or shop for a week's worth of groceries or even meet friends for dinner when they're exhausted and in pain. But experts in pain and neurology say the connection between chronic pain and a lousy mood may be biochemical, something more complicated than a dour mood brought on from persistent, long-term discomfort alone. Now, a team of Stanford neurologists has found evidence that chronic pain triggers a series of molecular changes in the brain that may sap patients' motivation. "There is an actual physiologic change that happens," said Dr. Neil Schwartz, a post-doctoral scientist who helped lead the Stanford research. "The behavior changes seem quite primary to the pain itself. They're not just a consequence of living with it." Schwartz and his colleagues hope their work could someday lead to new treatments for the behavior changes that come with chronic pain. In the short term, the research improves understanding of the biochemical effects of chronic pain and may be a comfort to patients who blame themselves for their lack of motivation, pain experts said. © 2014 Hearst Communications, Inc.

Keyword: Pain & Touch
Link ID: 20007 - Posted: 08.28.2014