Chapter 13. Memory, Learning, and Development

By Tori Rodriguez Imagining your tennis serve or mentally running through an upcoming speech might help you perform better, studies have shown, but the reasons why have been unclear. A common theory is that mental imagery activates some of the same neural pathways involved in the actual experience, and a recent study in Psychological Science lends support to that idea. Scientists at the University of Oslo conducted five experiments investigating whether eye pupils adjust to imagined light as they do to real light, in an attempt to see whether mental imagery can trigger automatic neural processes such as pupil dilation. Using infrared eye-tracking technology, they measured the diameter of participants' pupils as they viewed shapes of varying brightness and as they imagined the shapes they viewed or visualized a sunny sky or a dark room. In response to imagined light, pupils constricted 87 percent as much as they did during actual viewing, on average; in response to imagined darkness, pupils dilated to 56 percent of their size during real perception. Two other experiments ruled out the possibility that participants were able to adjust their pupil size at will or that pupils were changing in response to mental effort, which can cause dilation. The finding helps to explain why imagined rehearsals can improve your game. The mental picture activates and strengthens the very neural circuits—even subconscious ones that control automated processes like pupil dilation—that you will need to recruit when it is time to perform. © 2014 Scientific American
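
The 87 percent and 56 percent figures are ratios: how large the pupil response to an imagined scene was compared with the response to the real thing. A minimal sketch of how such a ratio could be computed from eye-tracker readings; the function and the millimetre values below are hypothetical, not data from the Oslo study.

```python
# Hypothetical pupil-diameter readings in millimetres (not data from the study).
def pupil_change(baseline_mm, response_mm):
    """Signed change in pupil diameter relative to the pre-trial baseline."""
    return response_mm - baseline_mm

# Real viewing of a bright shape vs. merely imagining the same shape.
real = pupil_change(baseline_mm=4.00, response_mm=3.00)      # -1.00 mm (constriction)
imagined = pupil_change(baseline_mm=4.00, response_mm=3.13)  # -0.87 mm (constriction)

# Express the imagined response as a fraction of the real one.
print(f"Imagined constriction is {imagined / real:.0%} of the real constriction")  # 87%
```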

Keyword: Learning & Memory
Link ID: 20176 - Posted: 10.08.2014

By Sarah C. P. Williams If you sailed through school with high grades and perfect test scores, you probably did it with traits beyond sheer smarts. A new study of more than 6000 pairs of twins finds that academic achievement is influenced by genes affecting motivation, personality, confidence, and dozens of other traits, in addition to those that shape intelligence. The results may lead to new ways to improve childhood education. “I think this is going to end up being a really classic paper in the literature,” says psychologist Lee Thompson of Case Western Reserve University in Cleveland, Ohio, who has studied the genetics of cognitive skills and who was not involved in the work. “It’s a really firm foundation from which we can build on.” Researchers have previously shown that a person’s IQ is highly influenced by genetic factors, and have even identified certain genes that play a role. They’ve also shown that performance in school has genetic factors. But it’s been unclear whether the same genes that influence IQ also influence grades and test scores. In the new study, researchers at King’s College London turned to a cohort of more than 11,000 pairs of both identical and nonidentical twins born in the United Kingdom between 1994 and 1996. Rather than focus solely on IQ, as many previous studies had, the scientists analyzed 83 different traits, which had been reported on questionnaires that the twins, at age 16, and their parents filled out. The traits ranged from measures of health and overall happiness to ratings of how much each teen liked school and how hard they worked. © 2014 American Association for the Advancement of Science
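
The article does not describe the statistics, but the classical twin design it refers to compares trait correlations in identical (MZ) and fraternal (DZ) pairs; a common first-pass summary is Falconer's formula, H² = 2(r_MZ − r_DZ). A minimal sketch with invented correlations, purely to show how the decomposition works:

```python
def falconer(r_mz, r_dz):
    """Classical twin-design decomposition (Falconer's formula).

    r_mz, r_dz : trait correlations in identical and fraternal twin pairs
    Returns rough estimates of heritability, shared environment, unique environment.
    """
    h2 = 2 * (r_mz - r_dz)   # additive genetic contribution
    c2 = r_mz - h2           # shared (family) environment
    e2 = 1 - r_mz            # unique environment plus measurement error
    return h2, c2, e2

# Invented correlations for one trait, not results from the King's College London study.
h2, c2, e2 = falconer(r_mz=0.75, r_dz=0.45)
print(f"heritability ~{h2:.2f}, shared environment ~{c2:.2f}, unique environment ~{e2:.2f}")
```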

Keyword: Genes & Behavior
Link ID: 20170 - Posted: 10.07.2014

By LAWRENCE K. ALTMAN A British-American scientist and a pair of Norwegian researchers were awarded this year’s Nobel Prize in Physiology or Medicine on Monday for discovering “an inner GPS in the brain” that enables virtually all creatures to navigate their surroundings. John O’Keefe, 75, a British-American scientist, will share the prize of $1.1 million with May-Britt Moser, 51, and Edvard I. Moser, 52, only the second married couple to win a Nobel in medicine, who will receive the other half. The three scientists’ discoveries “have solved a problem that has occupied philosophers and scientists for centuries — how does the brain create a map of the space surrounding us and how can we navigate our way through a complex environment?” said the Karolinska Institute in Sweden, which chooses the laureates. The positioning system they discovered helps us know where we are, find our way from place to place and store the information for the next time, said Goran K. Hansson, secretary of the Karolinska’s Nobel Committee. The researchers documented that certain cells are responsible for the higher cognitive function that steers the navigational system. Dr. O’Keefe began using neurophysiological methods in the late 1960s to study how the brain controls behavior and sense of direction. In 1971, he discovered the first component of the inner navigational system in rats. He identified nerve cells in the hippocampus region of the brain that were always activated when a rat was at a certain location. © 2014 The New York Times Company
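
A "place cell" in this account fires most strongly when the animal occupies one particular spot. The study itself recorded real neurons, but the standard textbook idealization is a firing field that falls off with distance from the cell's preferred location; the sketch below uses that idealization with arbitrary parameters, not O'Keefe's data.

```python
import math

def place_cell_rate(x, y, centre=(0.5, 0.5), width=0.1, peak_hz=20.0):
    """Idealized place-cell firing rate: a 2-D Gaussian bump over position.

    x, y    : the rat's current position in a 1 m x 1 m box (metres)
    centre  : the cell's preferred location (its place-field centre)
    width   : field width (standard deviation, metres)
    peak_hz : firing rate at the field centre
    """
    d2 = (x - centre[0]) ** 2 + (y - centre[1]) ** 2
    return peak_hz * math.exp(-d2 / (2 * width ** 2))

print(round(place_cell_rate(0.5, 0.5), 1))   # ~20.0 Hz at the preferred place
print(round(place_cell_rate(0.9, 0.9), 4))   # nearly silent far from it
```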

Keyword: Learning & Memory
Link ID: 20169 - Posted: 10.07.2014

By Gretchen Vogel Research on how the brain knows where it is has bagged the 2014 Nobel Prize in Physiology or Medicine, the Nobel Committee has announced from Stockholm. One half of the prize goes to John O'Keefe, director of the Sainsbury Wellcome Centre for Neural Circuits and Behaviour at University College London. The other is for a husband-wife couple: May-Britt Moser, who is director of the Centre for Neural Computation in Trondheim, and Edvard Moser, director of the Kavli Institute for Systems Neuroscience in Trondheim. "In 1971, John O'Keefe discovered the first component of this positioning system," the Nobel Committee says in a statement that was just released. "He found that a type of nerve cell in an area of the brain called the hippocampus that was always activated when a rat was at a certain place in a room. Other nerve cells were activated when the rat was at other places. O'Keefe concluded that these “place cells” formed a map of the room." "More than three decades later, in 2005, May-Britt and Edvard Moser discovered another key component of the brain’s positioning system," the statement goes on to explain. "They identified another type of nerve cell, which they called “grid cells”, that generate a coordinate system and allow for precise positioning and pathfinding. Their subsequent research showed how place and grid cells make it possible to determine position and to navigate." © 2014 American Association for the Advancement of Science
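
Grid cells, by contrast, fire at many locations arranged in a hexagonal lattice, which is what makes them useful as a coordinate system. A common idealization (not taken from the Nobel statement) builds that lattice from three cosine gratings oriented 60 degrees apart; a minimal sketch with arbitrary parameters:

```python
import math

def grid_cell_rate(x, y, spacing=0.4, peak_hz=15.0):
    """Idealized grid-cell firing rate at position (x, y) in metres.

    Summing three plane waves 60 degrees apart yields the hexagonal
    firing pattern characteristic of entorhinal grid cells.
    """
    k = 4 * math.pi / (math.sqrt(3) * spacing)   # wavevector magnitude for the grid spacing
    total = sum(
        math.cos(k * (x * math.cos(theta) + y * math.sin(theta)))
        for theta in (0.0, math.pi / 3, 2 * math.pi / 3)
    )
    # Rescale the sum from [-1.5, 3] to [0, 1] before applying the peak rate.
    return peak_hz * (total + 1.5) / 4.5

print(round(grid_cell_rate(0.0, 0.0), 1))    # a firing-field vertex (~15 Hz)
print(round(grid_cell_rate(0.1, 0.05), 1))   # lower rate between vertices
```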

Keyword: Learning & Memory
Link ID: 20163 - Posted: 10.06.2014

Alison Abbott The fact that Edvard and May-Britt Moser have collaborated for 30 years — and been married for 28 — has done nothing to dull their passion for the brain. They talk about it at breakfast. They discuss its finer points at their morning lab meeting. And at a local restaurant on a recent summer evening, they are still deep into a back-and-forth about how their own brains know where they are and will guide them home. “Just to walk there, we have to understand where we are now, where we want to go, when to turn and when to stop,” says May-Britt. “It's incredible that we are not permanently lost.” If anyone knows how we navigate home, it is the Mosers. They shot to fame in 2005 with their discovery of grid cells deep in the brains of rats. These intriguing cells, which are also present in humans, work much like the Global Positioning System, allowing animals to understand their location. The Mosers have since carved out a niche studying how grid cells interact with other specialized neurons to form what may be a complete navigation system that tells animals where they are going and where they have been. Studies of grid cells could help to explain how memories are formed, and why recalling events so often involves re-envisioning a place, such as a room, street or landscape. While pursuing their studies, the two scientists have become a phenomenon. Tall and good-looking, they operate like a single brain in two athletic bodies in their generously funded lab in Trondheim, Norway — a remote corner of northern Europe just 350 kilometres south of the Arctic Circle. They publish together and receive prizes as a single unit — most recently, the Nobel Prize in Physiology or Medicine, which they won this week with their former supervisor, neuroscientist John O’Keefe at University College London. In 2007, while still only in their mid-40s, they won a competition by the Kavli Foundation of Oxnard, California, to build and direct one of only 17 Kavli Institutes around the world. The Mosers are now minor celebrities in their home country, and their institute has become a magnet for other big thinkers in neuroscience. “It is definitely intellectually stimulating to be around them,” says neurobiologist Nachum Ulanovsky from the Weizmann Institute of Science in Rehovot, Israel, who visited the Trondheim institute for the first time in September. © 2014 Nature Publishing Group

Keyword: Learning & Memory
Link ID: 20162 - Posted: 10.06.2014

By ALINA TUGEND MANY workers now feel as if they’re doing the job of three people. They are on call 24 hours a day. They rush their children from tests to tournaments to tutoring. The stress is draining, both mentally and physically. At least that is the standard story about stress. It turns out, though, that many of the common beliefs about stress don’t necessarily give the complete picture. MISCONCEPTION NO. 1 Stress is usually caused by having too much work. While being overworked can be overwhelming, research increasingly shows that being underworked can be just as challenging. In essence, boredom is stressful. “We tend to think of stress in the original engineering way, that too much pressure or too much weight on a bridge causes it to collapse,” said Paul E. Spector, a professor of psychology at the University of South Florida. “It’s more complicated than that.” Professor Spector and others say too little to do — or underload, as he calls it — can cause many of the physical discomforts we associate with being overloaded, like muscle tension, stomachaches and headaches. A study published this year in the journal Experimental Brain Research found that measurements of people’s heart rates, hormonal levels and other factors while watching a boring movie — men hanging laundry — showed greater signs of stress than those watching a sad movie. “We tend to think of boredom as someone lazy, as a couch potato,” said James Danckert, a professor of neuroscience at the University of Waterloo in Ontario, Canada, and a co-author of the paper. “It’s actually when someone is motivated to engage with their environment and all attempts to do so fail. It’s aggressively dissatisfying.” © 2014 The New York Times Company

Keyword: Stress
Link ID: 20161 - Posted: 10.04.2014

by Michael Marshall When we search for the seat of humanity, are we looking at the wrong part of the brain? Most neuroscientists assume that the neocortex, the brain's distinctive folded outer layer, is the thing that makes us uniquely human. But a new study suggests that another part of the brain, the cerebellum, grew much faster in our ape ancestors. "Contrary to traditional wisdom, in the human lineage the cerebellum was the part of the brain that accelerated its expansion most rapidly, rather than the neocortex," says Rob Barton of Durham University in the UK. With Chris Venditti of the University of Reading in the UK, Barton examined how the relative sizes of different parts of the brain changed as primates evolved. During the evolution of monkeys, the neocortex and cerebellum grew in tandem, a change in one being swiftly followed by a change in the other. But starting with the first apes around 25 million years ago through to chimpanzees and humans, the cerebellum grew much faster. As a result, the cerebellums of apes and humans contain far more neurons than the cerebellum of a monkey, even if that monkey were scaled up to the size of an ape. "The difference in ape cerebellar volume, relative to a scaled monkey brain, is equal to 16 billion extra neurons," says Barton. "That's the number of neurons in the entire human neocortex." © Copyright Reed Business Information Ltd.
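
The "scaled monkey brain" comparison is an allometric one: fit the monkey trend relating cerebellum size to overall brain size, extrapolate it to an ape-sized brain, and measure how far the real ape cerebellum sits above that prediction. A rough sketch of that logic with placeholder volumes (the numbers below are invented, not the study's measurements):

```python
import numpy as np

# Placeholder volumes in cubic centimetres, invented for illustration only.
monkey_brain = np.array([60.0, 75.0, 90.0, 110.0])
monkey_cerebellum = np.array([8.0, 9.8, 11.5, 13.8])

# Fit the monkey scaling relationship: log(cerebellum) = a * log(brain) + b
a, b = np.polyfit(np.log(monkey_brain), np.log(monkey_cerebellum), deg=1)

ape_brain, ape_cerebellum = 400.0, 70.0            # hypothetical ape values
expected = np.exp(a * np.log(ape_brain) + b)       # the "scaled-up monkey" prediction
print(f"cerebellum expected from the monkey trend: {expected:.1f} cc")
print(f"observed ape cerebellum: {ape_cerebellum:.1f} cc "
      f"({ape_cerebellum / expected:.2f}x the prediction)")
```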

Keyword: Evolution
Link ID: 20160 - Posted: 10.04.2014

|By Daisy Yuhas Do we live in a holographic universe? How green is your coffee? And could drinking too much water actually kill you? Before you click those links you might consider how your knowledge-hungry brain is preparing for the answers. A new study from the University of California, Davis, suggests that when our curiosity is piqued, changes in the brain ready us to learn not only about the subject at hand, but incidental information, too. Neuroscientist Charan Ranganath and his fellow researchers asked 19 participants to review more than 100 questions, rating each in terms of how curious they were about the answer. Next, each subject revisited 112 of the questions—half of which strongly intrigued them whereas the rest they found uninteresting—while the researchers scanned their brain activity using functional magnetic resonance imaging (fMRI). During the scanning session participants would view a question then wait 14 seconds and view a photograph of a face totally unrelated to the trivia before seeing the answer. Afterward the researchers tested participants to see how well they could recall and retain both the trivia answers and the faces they had seen. Ranganath and his colleagues discovered that greater interest in a question would predict not only better memory for the answer but also for the unrelated face that had preceded it. A follow-up test one day later found the same results—people could better remember a face if it had been preceded by an intriguing question. Somehow curiosity could prepare the brain for learning and long-term memory more broadly. The findings are somewhat reminiscent of the work of U.C. Irvine neuroscientist James McGaugh, who has found that emotional arousal can bolster certain memories. But, as the researchers reveal in the October 2 Neuron, curiosity involves very different pathways. © 2014 Scientific American
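
The central comparison is recall for material that followed high-curiosity questions versus low-curiosity ones, measured separately for the trivia answers and for the incidental faces. A minimal sketch of that contrast on invented trial records (the data structure and numbers are assumptions, not the study's data):

```python
# Invented trials: (curiosity_rating_1_to_6, recalled_answer, recalled_face)
trials = [
    (6, True, True), (5, True, False), (6, True, True), (5, False, True),
    (2, False, False), (1, False, False), (2, True, False), (1, False, True),
]

def recall_rate(rows, column):
    values = [row[column] for row in rows]
    return sum(values) / len(values)

high = [t for t in trials if t[0] >= 4]   # high-curiosity questions
low = [t for t in trials if t[0] < 4]     # low-curiosity questions

for label, column in (("trivia answers", 1), ("incidental faces", 2)):
    print(f"{label}: recall {recall_rate(high, column):.0%} after high curiosity, "
          f"{recall_rate(low, column):.0%} after low curiosity")
```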

Keyword: Learning & Memory
Link ID: 20159 - Posted: 10.04.2014

By John Bohannon The victim peers across the courtroom, points at a man sitting next to a defense lawyer, and confidently says, "That's him!" Such moments have a powerful sway on jurors who decide the fate of thousands of people every day in criminal cases. But how reliable is eyewitness testimony? A new report concludes that the use of eyewitness accounts needs tighter control, and among its recommendations is a call for a more scientific approach to how eyewitnesses identify suspects during the classic police lineup. For decades, researchers have been trying to nail down what influences eyewitness testimony and how much confidence to place in it. After a year of sifting through the scientific evidence, a committee of psychologists and criminologists organized by the U.S. National Research Council (NRC) has now gingerly weighed in. "This is a serious issue with major implications for our justice system," says committee member Elizabeth Phelps, a psychologist at New York University in New York City. Their 2 October report, Identifying the Culprit: Assessing Eyewitness Identification, is likely to change the way that criminal cases are prosecuted, says Elizabeth Loftus, a psychologist at the University of California, Irvine, who was an external reviewer of the report. As Loftus puts it, "just because someone says something confidently doesn't mean it's true." Jurors can't help but find an eyewitness’s confidence compelling, even though experiments have shown that a person's confidence in their own memory is sometimes undiminished even in the face of evidence that their memory of an event is false. © 2014 American Association for the Advancement of Science.

Keyword: Learning & Memory
Link ID: 20157 - Posted: 10.04.2014

Helen Thomson You'll have heard of Pavlov's dogs, conditioned to expect food at the sound of a bell. You might not have heard that a scarier experiment – arguably one of psychology's most unethical – was once performed on a baby. In it, a 9-month-old, at first unfazed by the presence of animals, was conditioned to feel fear at the sight of a rat. The infant was presented with the animal as someone struck a metal pole with a hammer above his head. This was repeated until he cried at merely the sight of any furry object – animate or inanimate. The "Little Albert" experiment, performed in 1919 by John Watson of Johns Hopkins University Hospital in Baltimore, Maryland, was the first to show that a human could be classically conditioned. The fate of Albert B has intrigued researchers ever since. Hall Beck at the Appalachian State University in Boone, North Carolina, has been one of the most tenacious researchers on the case. Watson's papers stated that Albert B was the son of a wet nurse who worked at the hospital. Beck spent seven years exploring potential candidates and used facial analysis to conclude in 2009 that Little Albert was Douglas Merritte, son of hospital employee Arvilla. Douglas was born on the same day as Albert and several other points tallied with Watson's notes. Tragically, medical records showed that Douglas had severe neurological problems and died at an early age of hydrocephalus, or water on the brain. According to his records, this seems to have resulted in vision problems, so much so that at times he was considered blind. © Copyright Reed Business Information Ltd.
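
The procedure described here is classical conditioning: a neutral stimulus (the rat) repeatedly paired with an aversive one (the hammer striking a metal pole) until it evokes the fear response on its own. A later and widely used way to summarize how such an association grows is the Rescorla-Wagner rule; the sketch below is that textbook model with arbitrary parameters, not anything Watson computed.

```python
def rescorla_wagner(n_pairings, alpha=0.3, lam=1.0):
    """Associative strength of the conditioned stimulus after repeated pairings.

    alpha : learning rate (salience of the pairing)
    lam   : maximum strength supported by the unconditioned stimulus (the loud noise)
    """
    v, history = 0.0, []
    for _ in range(n_pairings):
        v += alpha * (lam - v)   # prediction-error update: fast early gains, then a plateau
        history.append(round(v, 3))
    return history

# The fear association strengthens pairing by pairing toward its asymptote.
print(rescorla_wagner(7))   # [0.3, 0.51, 0.657, 0.76, 0.832, 0.882, 0.918]
```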

Keyword: Emotions
Link ID: 20156 - Posted: 10.04.2014

By Fredrick Kunkle Years ago, many scientists assumed that a woman’s heart worked pretty much the same as a man’s. But as more women entered the male-dominated field of cardiology, many such assumptions vanished, opening the way for new approaches to research and treatment. A similar shift is underway in the study of Alzheimer’s disease. It has long been known that more women than men get the deadly neurodegenerative disease, and an emerging body of research is challenging the common wisdom as to why. Although the question is by no means settled, recent findings suggest that biological, genetic and even cultural influences may play heavy roles. Of the more than 5 million people in the United States who have been diagnosed with Alzheimer’s, the leading cause of dementia, two-thirds are women. Because advancing age is considered the biggest risk factor for the disease, researchers largely have attributed that disparity to women’s longer life spans. The average life expectancy for women is 81 years, compared with 76 for men. Yet “even after taking age into account, women are more at risk,” said Richard Lipton, a physician who heads the Einstein Aging Study at Albert Einstein College of Medicine in New York. With the number of Alzheimer’s cases in the United States expected to more than triple by 2050, some researchers are urging a greater focus on understanding the underlying reasons women are more prone to the disease and on developing gender-specific treatments.

Keyword: Alzheimers
Link ID: 20155 - Posted: 10.04.2014

By Smitha Mundasad Health reporter, BBC News Measuring people's sense of smell in later life could help doctors predict how likely they are to be alive in five years' time, a PLOS One study suggests. A survey of 3,000 adults found 39% with the poorest sense of smell were dead within five years - compared to just 10% who identified odours correctly. Scientists say the loss of smell sense does not cause death directly, but may be an early warning sign. They say anyone with long-lasting changes should seek medical advice. Researchers from the University of Chicago asked a representative sample of adults between the ages of 57-85 to take part in a quick smell test. The assessment involved identifying distinct odours encased on the tips of felt-tip pens. The smells included peppermint, fish, orange, rose and leather. Five years later some 39% of adults who had the lowest scores (4-5 errors) had passed away, compared with 19% with moderate smell loss and just 10% with a healthy sense of smell (0-1 errors). And despite taking issues such as age, nutrition, smoking habits, poverty and overall health into account, researchers found those with the poorest sense of smell were still at greatest risk. Lead scientist, Prof Jayant Pinto, said: "We think loss of the sense of smell is like the canary in the coal mine. BBC © 2014
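
The percentages quoted translate directly into relative risks against the healthy-smell group. A small sketch of that arithmetic, using only the figures given in the article:

```python
# Five-year mortality by smell-test performance, as quoted in the article.
mortality = {
    "poorest sense of smell (4-5 errors)": 0.39,
    "moderate smell loss": 0.19,
    "healthy sense of smell (0-1 errors)": 0.10,
}

baseline = mortality["healthy sense of smell (0-1 errors)"]
for group, rate in mortality.items():
    print(f"{group}: {rate:.0%} died within five years "
          f"(relative risk {rate / baseline:.1f}x vs. healthy smell)")
```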

Keyword: Chemical Senses (Smell & Taste)
Link ID: 20149 - Posted: 10.02.2014

By Fredrick Kunkle Here’s something to worry about: A recent study suggests that middle-age women whose personalities tend toward the neurotic run a higher risk of developing Alzheimer’s disease later in life. The study by researchers at the University of Gothenburg in Sweden followed a group of women in their 40s, whose disposition made them prone to anxiety, moodiness and psychological distress, to see how many developed dementia over the next 38 years. In line with other research, the study suggested that women who were the most easily upset by stress — as determined by a commonly used personality test — were two times more likely to develop Alzheimer’s disease than women who were least prone to neuroticism. In other words, personality really is — in some ways — destiny. “Most Alzheimer’s research has been devoted to factors such as education, heart and blood risk factors, head trauma, family history and genetics,” study author Lena Johansson said in a written statement. “Personality may influence the individual’s risk for dementia through its effect on behavior, lifestyle or reactions to stress.” The researchers cautioned that the results cannot be extrapolated to men because they were not included in the study and that further work is needed to determine possible causes for the link. The study, which appeared Wednesday in the American Academy of Neurology’s journal, Neurology, examined 800 women whose average age in 1968 was 46 years to see whether neuroticism — which involves being easily distressed and subject to excessive worry, jealousy or moodiness — might have a bearing on the risk of dementia.

Keyword: Alzheimers
Link ID: 20148 - Posted: 10.02.2014

By Brian Bienkowski and Environmental Health News Babies born to mothers with high levels of perchlorate during their first trimester are more likely to have lower IQs later in life, according to a new study. The research is the first to link pregnant women's perchlorate levels to their babies’ brain development. It adds to evidence that the drinking water contaminant may disrupt thyroid hormones that are crucial for proper brain development. Perchlorate, which is both naturally occurring and manmade, is used in rocket fuel, fireworks and fertilizers. It has been found in 4 percent of U.S. public water systems serving an estimated 5 to 17 million people, largely near military bases and defense contractors in the U.S. West, particularly around Las Vegas and in Southern California. “We would not recommend action on perchlorate levels from this study alone, although our report highlights a pressing need for larger studies of perchlorate levels from the general pregnant population and those with undetected hypothyroidism,” the authors from the United Kingdom, Italy and Boston wrote in the study published in The Journal of Clinical Endocrinology & Metabolism. The Environmental Protection Agency for decades has debated setting a national drinking water standard for perchlorate. The agency in 2011 announced it would start developing a standard, reversing an earlier decision. In the meantime, two states, California and Massachusetts, have set their own standards. © 2014 Scientific American

Keyword: Development of the Brain
Link ID: 20143 - Posted: 10.01.2014

Wild marmosets in the Brazilian forest can learn quite successfully from video demonstrations featuring other marmosets, Austrian scientists have reported, showing not only that marmosets are even better learners than previously known, but that video can be used successfully in experiments in the wild. Tina Gunhold, a cognitive biologist at the University of Vienna, had worked with a population of marmoset monkeys in a bit of Brazilian forest before this particular experiment. The forest is not wilderness. It lies near some apartment complexes, and the marmosets are somewhat used to human beings. But the monkeys are wild, and each extended family group has its own foraging territory. Dr. Gunhold and her colleagues reported in the journal Biology Letters this month that they had tested 12 family groups, setting up a series of video monitors, each with a kind of complicated box that they called an “artificial fruit.” All the boxes contained food. Six of the monitors showed just an unchanging image of a marmoset near a similar box. Three of them showed a marmoset opening the box by pulling a drawer, and three others a marmoset lifting a lid to get at the food. Marmosets are very territorial and would not tolerate a strange individual on their turf, but the image of a strange marmoset on video didn’t seem to bother them. Individual marmosets “differed in their reactions to the video,” Dr. Gunhold said. “Some were more shy, some more bold. The younger ones were more attracted to the video, perhaps because of greater curiosity.” © 2014 The New York Times Company

Keyword: Learning & Memory
Link ID: 20138 - Posted: 09.30.2014

By David Z. Hambrick, Fernanda Ferreira, and John M. Henderson A decade ago, Magnus Carlsen, who at the time was only 13 years old, created a sensation in the chess world when he defeated former world champion Anatoly Karpov at a chess tournament in Reykjavik, Iceland, and the next day played then-top-rated Garry Kasparov—who is widely regarded as the best chess player of all time—to a draw. Carlsen’s subsequent rise to chess stardom was meteoric: grandmaster status later in 2004; a share of first place in the Norwegian Chess Championship in 2006; youngest player ever to reach World No. 1 in 2010; and highest-rated player in history in 2012. What explains this sort of spectacular success? What makes someone rise to the top in music, games, sports, business, or science? This question is the subject of one of psychology’s oldest debates. In the late 1800s, Francis Galton—founder of the scientific study of intelligence and a cousin of Charles Darwin—analyzed the genealogical records of hundreds of scholars, artists, musicians, and other professionals and found that greatness tends to run in families. For example, he counted more than 20 eminent musicians in the Bach family. (Johann Sebastian was just the most famous.) Galton concluded that experts are “born.” Nearly half a century later, the behaviorist John Watson countered that experts are “made” when he famously guaranteed that he could take any infant at random and “train him to become any type of specialist [he] might select—doctor, lawyer, artist, merchant-chief and, yes, even beggar-man and thief, regardless of his talents.” In one chess study the authors discuss, one player needed 22 times more deliberate practice than another player to become a master. © 2014 The Slate Group LLC.

Keyword: Learning & Memory
Link ID: 20136 - Posted: 09.30.2014

By Gary Stix If it’s good for the heart, it could also be good for the neurons, astrocytes and oligodendrocytes, cells that make up the main items on the brain’s parts list. The heart-brain adage comes from epidemiological studies that show that people with cardiovascular risk factors such as high blood pressure and elevated cholesterol levels may be more at risk for Alzheimer’s and other dementias. This connection between heart and brain has also led to some disappointments: clinical trials of lipid-lowering statins have not helped patients diagnosed with Alzheimer’s, although epidemiological studies suggest that long-term use of the drugs may help prevent Alzheimer’s and other dementias. The link between head and heart is still being pursued because new Alzheimer’s drugs have failed time and again. One approach that is now drawing some interest looks at the set of proteins that carry around fats in the brain. These lipoproteins could potentially act as molecular sponges that mop up the amyloid-beta peptide that clogs up connections among brain cells in Alzheimer’s. One of these proteins—Apolipoprotein J, also known as clusterin—intrigues researchers because of the way it interacts with amyloid-beta and the status of its gene as a risk factor for Alzheimer’s. A researcher from the University of Minnesota, Ling Li, recently presented preliminary work at the Alzheimer’s Disease Drug Discovery Foundation annual meeting that showed that, at least in a lab dish, a molecule made up of a group of amino acids from APOJ is capable of protecting against the toxicity of the amyloid-beta peptide. It also quelled inflammation and promoted the health of synapses—the junctions where one brain cell encounters another. Earlier work by another group showed that the peptide prevented the development of lesions in the blood vessels of animals.

Keyword: Alzheimers
Link ID: 20135 - Posted: 09.30.2014

by Elijah Wolfson @elijahwolfson The class was the most difficult of the fall 2013 semester, and J.D. Leadam had missed all but one lecture. His grandfather’s health had worsened, and he left San Jose State, where he was studying for a degree in business, to return home to Los Angeles to help out. Before he knew it, midterm exams had almost arrived. At this point, Leadam had, for a while, been playing around with transcranial direct-current stimulation, or tDCS, an experimental treatment for all sorts of health issues that, at its most basic, involves running a very weak electric current through the brain. When he first came across tDCS, Leadam was immediately intrigued but thought, “There’s no way I’m gonna put electrodes on my head. It’s just not going to happen.” After extensive research, though, he changed his mind. He looked into buying a device online, but there wasn’t much available — just one extremely expensive machine and then a bare-bones $40 device that didn’t even have a switch. So he dug around online and figured he could build one himself. He bought all the pieces he needed and put it together. He tried it a few times, but didn’t notice much, so he put it aside. But now, with the test looming, he picked it back up. The professor had written a book, and Leadam knew all the information he’d be tested on was written in its pages. “But I’m an auditory learner,” he said, “so I knew it wouldn’t work to just read it.” He strapped on the device, turned it on and read the chapters. “Nothing,” he thought. But when he got to the classroom and put pen to paper, he had a revelation. “I could remember concepts down to the exact paragraphs in the textbook,” Leadam said. “I actually ended up getting an A on the test. I couldn’t believe it.”

Keyword: Learning & Memory
Link ID: 20130 - Posted: 09.29.2014

By Smitha Mundasad Health reporter, BBC News A spice commonly found in curries may boost the brain's ability to heal itself, according to a report in the journal Stem Cell Research and Therapy. The German study suggests a compound found in turmeric could encourage the growth of nerve cells thought to be part of the brain's repair kit. Scientists say this work, carried out in rats, may pave the way for future drugs for strokes and Alzheimer's disease. But they say more trials are needed to see whether this applies to humans. Researchers from the Institute of Neuroscience and Medicine in Julich, Germany, studied the effects of aromatic-turmerone - a compound found naturally in turmeric. Rats were injected with the compound and their brains were then scanned. Particular parts of the brain, known to be involved in nerve cell growth, were seen to be more active after the aromatic-turmerone infusion. Scientists say the compound may encourage a proliferation of brain cells. In a separate part of the trial, researchers bathed rodent neural stem cells (NSCs) in different concentrations of aromatic-turmerone extract. NSCs have the ability to transform into any type of brain cell and scientists suggest they could have a role in repair after damage or disease. Dr Maria Adele Rueger, who was part of the research team, said: "In humans and higher developed animals their abilities do not seem to be sufficient to repair the brain but in fish and smaller animals they seem to work well." [Image caption: Turmeric belongs to the same plant family as ginger.] BBC © 2014

Keyword: Alzheimers
Link ID: 20119 - Posted: 09.27.2014

By Dick Miller, CBC News Dan Campbell felt the bullets whiz past his head. The tracer rounds zipped between his legs. It was his first firefight as a Canadian soldier in Afghanistan. "I was completely frightened and scared like I’d never been before in my life,” he says. As the attack continued, the sights, sounds and smells started to form memories inside his brain. The fear he felt released the hormone norepinephrine, and in the complex chemistry of the brain, the memories of the battle became associated with the fear. Six years later, a sight or sound such as a firecracker or car backfiring can remind him of that night in 2008. The fear comes back and he relives rather than remembers the moments. "It can be hard. Physically, you know, there’s the tapping foot, my heart beating,” he says. Like so many soldiers and victims of assault or people who have experienced horrific accidents, Campbell was diagnosed with post-traumatic stress disorder. Now a newspaper reporter in Yellowknife, Campbell thinks one day he may get therapy. But for now he is working on his own to control the fear and anger the memories bring. "I think one day, hopefully in the not-too-distant future, we will be able to delete a memory," says Dr. Sheena Josselyn, a senior scientist at the Hospital for Sick Children Research Institute. © CBC 2014

Keyword: Stress
Link ID: 20111 - Posted: 09.24.2014