Chapter 17. Learning and Memory




Aftab Ali People who were born prematurely are less intelligent later in life and earn less money as a result, according to a new study by the University of Warwick. Researchers at the Coventry-based institution said they found a link connecting pre-term birth with low reading and, in particular, maths skills, which in turn affect the amount of wealth accumulated in adulthood. Funded by the Nuffield Foundation, the researchers examined data from two other large studies, following children born more than a decade apart, one group from 1958 and the other from 1970. Each study recruited all children born in a single week in England, Scotland and Wales; in total, more than 15,000 individuals were surveyed. Data were examined for all individuals who were born at between 28 and 42 weeks gestational age and who had wealth information available at the age of 42. Participants born pre-term – at less than 37 weeks – were compared with those born full-term, and both groups’ mathematical ability in childhood was found to have a direct effect on how much they earned as adults, regardless of later educational qualifications. To measure adult wealth, the researchers looked at factors including family income and social class, housing and employment status, and the participants’ own perceptions of their financial situation. To gauge academic abilities, they examined validated measures of mathematics, reading and intelligence, along with ratings from teachers and parents. © independent.co.uk

Keyword: Development of the Brain; Intelligence
Link ID: 21375 - Posted: 09.02.2015

By Gretchen Reynolds At the age of 93, Olga Kotelko — one of the most successful and acclaimed nonagenarian track-and-field athletes in history — traveled to the University of Illinois to let scientists study her brain. Ms. Kotelko held a number of world records and had won hundreds of gold medals in masters events. But she was of particular interest to the scientific community because she hadn’t begun serious athletic training until age 77. So scanning her brain could potentially show scientists what late-life exercise might do for brains. Ms. Kotelko died last year at the age of 95, but the results of that summer brain scan were published last month in Neurocase. And indeed, Ms. Kotelko’s brain looked quite different from those of other volunteers aged 90-plus who participated in the study, the scans showed. The white matter of her brain — the tissue of nerve fibers that connects brain regions and helps to transmit messages from one part of the brain to another — showed fewer abnormalities than the brains of other people her age. And her hippocampus, a portion of the brain involved in memory, was larger than that of similarly aged volunteers (although it was somewhat shrunken in comparison with the brains of volunteers decades younger than she was). Over all, her brain seemed younger than her age. But because the scientists didn’t have a scan showing Ms. Kotelko’s brain before she began training, it’s impossible to know whether becoming an athlete late in life improved her brain’s health or whether her naturally healthy brain allowed her to become a stellar masters athlete. © 2015 The New York Times Company

Keyword: Learning & Memory; Development of the Brain
Link ID: 21372 - Posted: 09.02.2015

By SINDYA N. BHANOO The human eye has a blind spot, though few of us realize it. Now, a new study suggests that it is possible to reduce the spot with training. The optic nerve, which carries visual signals to the brain, passes through the retina, a light-sensitive layer of tissue. There are no photoreceptors — the light-sensing cells — at the point where the optic nerve meets the retina. The right eye generally compensates for the left eye’s blind spot and vice versa, so the spot is hardly noticed. Researchers trained 10 people using a computer monitor and an eye patch. The participants were shown a waveform in the visual field of their blind spot day after day. After 20 days of this repeated stimulation, the blind spot shrank by about 10 percent. The researchers believe that neurons at the periphery of the blind spot became more responsive, effectively reducing the extent of functional blindness. The findings add to a growing body of research suggesting that the human eye can be trained, said Paul Miller, a psychologist at the University of Queensland in Australia and an author of the study, which appeared in the journal Current Biology. This kind of training may help researchers develop better treatments for visual impairments like macular degeneration. “This is the leading cause of blindness in the western world,” Mr. Miller said. © 2015 The New York Times Company

Keyword: Vision; Learning & Memory
Link ID: 21367 - Posted: 09.01.2015

We all have days when we feel like our brain is going at a snail’s pace, when our neurons forgot to get out of bed. And psychologists have shown that IQ can fluctuate from day to day. So if we’re in good health and don’t have a sleep deficit from last night’s shenanigans to blame, what’s the explanation? Sophie von Stumm, a psychologist at Goldsmiths, University of London, set about finding out. In particular, she wanted to know whether mood might explain the brain’s dimmer switch. Although it seems intuitively obvious that feeling low could compromise intellectual performance, von Stumm says research to date has been inconclusive, with some studies finding an effect and others not. “On bad mood days, we tend to feel that our brains are lame and work or study is particularly challenging. But scientists still don’t really know if our brains work better when we are happy compared to when we are sad.” To see if she could pin down mood’s effect on IQ more convincingly, von Stumm recruited 98 participants. Over five consecutive days they completed questionnaires to assess their mood, as well as tests to measure cognitive functions, such as short-term memory, working memory and processing speed. Surprisingly, being in a bad mood didn’t translate into worse cognitive performance. However, when people reported feeling positive, von Stumm saw a modest boost in their processing speed. © Copyright Reed Business Information Ltd.

Keyword: Depression; Intelligence
Link ID: 21359 - Posted: 08.29.2015

By Laura Sanders By tweaking a single gene, scientists have turned average mice into supersmart daredevils. The findings are preliminary but hint at therapies that may one day ease the symptoms of such disorders as Alzheimer’s disease and schizophrenia, scientists report August 14 in Neuropsychopharmacology. The altered gene provides instructions for a protein called phosphodiesterase-4B, or PDE4B, which has been implicated in schizophrenia. It’s too early to say whether PDE4B will turn out to be a useful target for drugs that treat these disorders, cautions pharmacologist Ernesto Fedele of the University of Genoa in Italy. Nonetheless, the protein certainly deserves further investigation, he says. The genetic change interfered with PDE4B’s ability to do its job breaking down a molecular messenger called cAMP. Mice designed to have this disabled form of PDE4B showed a suite of curious behaviors, including signs of smarts, says study coauthor Alexander McGirr of the University of British Columbia. Compared with normal mice, these mice more quickly learned which objects in a cage had been moved to a new location, for instance, and could better recognize a familiar mouse after 24 hours. “The system is primed and ready to learn, and it doesn’t require the same kind of input as a normal mouse,” McGirr says. These mice also spent more time than usual exploring brightly lit spaces, spots that normal mice avoid. But this devil-may-care attitude sometimes made the “smart” mice blind to risky situations. The mice were happy to spend time poking around an area that had been sprinkled with bobcat urine. “Not being afraid of cat urine is not a good thing for a mouse,” McGirr says. © Society for Science & the Public 2000 - 2015

Keyword: Learning & Memory; Schizophrenia
Link ID: 21338 - Posted: 08.26.2015

By Zoe Kleinman Technology reporter, BBC News More than 200 academics have signed an open letter criticising controversial new research suggesting a link between violent video games and aggression. The findings were released by the American Psychological Association, which set up a taskforce that reviewed hundreds of studies and papers published between 2005 and 2013. The association concluded that while there was "no single risk factor" to blame for aggression, violent video games did contribute. "The research demonstrates a consistent relation between violent video game use and increases in aggressive behaviour, aggressive cognitions and aggressive affect, and decreases in pro-social behaviour, empathy and sensitivity to aggression," said the report. "It is the accumulation of risk factors that tends to lead to aggressive or violent behaviour. The research reviewed here demonstrates that violent video game use is one such risk factor." However, a large group of academics said they felt the methodology of the research was deeply flawed, as a significant part of the material included in the study had not been subjected to peer review. "I fully acknowledge that exposure to repeated violence may have short-term effects - you would be a fool to deny that - but the long-term consequences of crime and actual violent behaviour, there is just no evidence linking violent video games with that," Dr Mark Coulson, associate professor of psychology at Middlesex University and one of the signatories of the letter, told the BBC. "If you play three hours of Call of Duty you might feel a little bit pumped, but you are not going to go out and mug someone." © 2015 BBC

Keyword: Aggression
Link ID: 21310 - Posted: 08.19.2015

By Kate Kelland LONDON (Reuters) - Scientists have genetically modified mice to be super-intelligent and found they are also less anxious, a discovery that may help the search for treatments for disorders such as Alzheimer's, schizophrenia and post-traumatic stress disorder (PTSD). Researchers from Britain and Canada found that altering a single gene to block the phosphodiesterase-4B (PDE4B) enzyme, which is found in many organs including the brain, made mice cleverer and at the same time less fearful. "Our work using mice has identified phosphodiesterase-4B as a promising target for potential new treatments," said Steve Clapcote, a lecturer in pharmacology at Britain's Leeds University, who led the study. He said his team is now working on developing drugs that will specifically inhibit PDE4B. The drugs will be tested first in animals to see whether any of them might be suitable to go forward into clinical trials in humans. In the experiments, published on Friday in the journal Neuropsychopharmacology, the scientists ran a series of behavioral tests on the PDE4B-inhibited mice and found they tended to learn faster, remember events longer and solve complex problems better than normal mice. The "brainy" mice were better at recognizing a mouse they had seen the previous day, the researchers said, and were also quicker at learning the location of a hidden escape platform.

Keyword: Learning & Memory; Genes & Behavior
Link ID: 21306 - Posted: 08.18.2015

Alison Abbott The octopus genome offers clues to how cephalopods evolved intelligence to rival the craftiest vertebrates. With its eight prehensile arms lined with suckers, camera-like eyes, elaborate repertoire of camouflage tricks and spooky intelligence, the octopus is like no other creature on Earth. Added to those distinctions is an unusually large genome, described in Nature on 12 August, that helps to explain how a mere mollusc evolved into an otherworldly being. “It’s the first sequenced genome from something like an alien,” jokes neurobiologist Clifton Ragsdale of the University of Chicago in Illinois, who co-led the genetic analysis of the California two-spot octopus (Octopus bimaculoides). The work was carried out by researchers from the University of Chicago, the University of California, Berkeley, the University of Heidelberg in Germany and the Okinawa Institute of Science and Technology in Japan. The scientists also investigated gene expression in twelve different types of octopus tissue. “It’s important for us to know the genome, because it gives us insights into how the sophisticated cognitive skills of octopuses evolved,” says neurobiologist Benny Hochner at the Hebrew University of Jerusalem in Israel, who has studied octopus neurophysiology for 20 years. Researchers want to understand how the cephalopods, a class of free-floating molluscs, produced a creature that is clever enough to navigate highly complex mazes and open jars filled with tasty crabs. © 2015 Nature Publishing Group

Keyword: Intelligence; Genes & Behavior
Link ID: 21295 - Posted: 08.13.2015

Ashley Yeager A mouse scurries across a round table rimmed with Dixie cup–sized holes. Without much hesitation, the rodent heads straight for the hole that drops it into a box lined with cage litter. Any other hole would have led to a quick fall to the floor. But this mouse was more than lucky. It had an advantage — human glial cells were growing in its brain. Glia are thought of as the support staff for the brain’s nerve cells, or neurons, which transmit and receive the brain’s electrical and chemical signals. Named for the Greek term for “glue,” glia have been known for nearly 170 years as the cells that hold the brain’s bits together. Some glial cells help feed neurons. Other glia insulate nerve cell branches with myelin. Still others attack brain invaders responsible for infection or injury. Glial cells perform many of the brain’s most important maintenance jobs. But recent studies suggest they do a lot more. Glia can shape the conversation between neurons, speeding or slowing the electrical signals and strengthening neuron-to-neuron connections. When scientists coaxed human glia to grow in the brains of baby mice, the mice grew up to be supersmart, navigating tabletops full of holes and mastering other tasks much faster than normal mice. This experiment and others suggest that glia may actually orchestrate learning and memory, says neuroscientist R. Douglas Fields. “Glia aren’t doing vibrato. That’s for the neurons,” says Fields, of the National Institute of Child Health and Human Development in Bethesda, Md. “Glia are the conductors.” © Society for Science & the Public 2000 - 2015

Keyword: Learning & Memory; Glia
Link ID: 21289 - Posted: 08.12.2015

Could taking iodine pills in pregnancy help to raise children’s IQ? Some researchers suggest women in the UK should take such supplements, but others say the evidence is unclear, and that it could even harm development. Iodine is found in dairy foods and fish, and is used in the body to make thyroid hormone, which is vital for brain development in the womb. In some parts of the world, such as inland areas where little fish is consumed or the soil is low in iodine, severe deficiencies can markedly lower intelligence in some people. In most affected areas, iodine is now added to salt. The UK was not thought to need this step, but in 2013 a large study of urine samples from pregnant women found that about two-thirds had mild iodine deficiency, and that the children of those with the lowest levels had the lowest IQs. Now another team has combined data from this study with other data to calculate that if all women in the UK were given iodine supplements from three months before pregnancy until they finished breastfeeding, average IQ would increase by 1.2 points per child. And the children of mothers who were most iodine deficient would probably benefit more, says Kate Jolly of the University of Birmingham, who was involved in the study. “We are talking about very small differences but on a population basis it could mean quite a lot,” she says. The team calculated that providing these iodine supplements would be worth the cost to the UK’s National Health Service because it would boost the country’s productivity. © Copyright Reed Business Information Ltd.

Keyword: Development of the Brain; Intelligence
Link ID: 21286 - Posted: 08.12.2015

April Dembosky Developers of a new video game for your brain say theirs is more than just another get-smarter-quick scheme. Akili, a Northern California startup, insists on taking the game through a full battery of clinical trials so it can get approval from the Food and Drug Administration — a process that will take lots of money and several years. So why would a game designer go to all that trouble when there's already a robust market of consumers ready to buy games that claim to make you smarter and improve your memory? Think about all the ads you've heard for brain games. Maybe you've even passed a store selling them. There's one at the mall in downtown San Francisco — just past the cream puff stand and across from Jamba Juice — staffed on my visit by a guy named Dominic Firpo. "I'm a brain coach here at Marbles: The Brain Store," he says. Brain coach? "Sounds better than sales person," Firpo explains. "We have to learn all 200 games in here and become great sales people so we can help enrich peoples' minds." He heads to the "Word and Memory" section of the store and points to one product that says it will improve your focus and reduce stress in just three minutes a day. "We sold out of it within the first month of when we got it," Firpo says. The market for these "brain fitness" games is worth about $1 billion and is expected to grow to $6 billion in the next five years. Game makers appeal to both the young and the older with the common claim that if you exercise your memory, you'll be able to think faster and be less forgetful. Maybe bump up your IQ a few points. "That's absurd," says psychology professor Randall Engle from the Georgia Institute of Technology. © 2015 NPR

Keyword: ADHD; Intelligence
Link ID: 21278 - Posted: 08.10.2015

Tina Hesman Saey Memory Transfer Seen — Experiments with rats, showing how chemicals from one rat brain influence the memory of an untrained animal, indicate that tinkering with the brain of humans is also possible. In the rat tests, brain material from an animal trained to go for food either at a light flash or at a sound signal was injected into an untrained rat. The injected animals then "remembered" whether light or sound meant food. — Science News Letter, August 21, 1965 Update: After this report, scientists from eight labs attempted to repeat the memory transplants. They failed, as they reported in Science in 1966. Science fiction authors and futurists often predict that a person’s memories might be transferred to another person or a computer, but the idea is likely to remain speculation, says neuroscientist Eric Kandel, who won a Nobel Prize in 2000 for his work on memory. Brain wiring is too intricate and complicated to be exactly replicated, and scientists are still learning about how memories are made, stored and retrieved. W. L. Byrne et al. Technical Comments: Memory Transfer. Science Vol. 153, August 5, 1966, p. 658. doi:10.1126/science.153.3736.658 © Society for Science & the Public 2000 - 2015

Keyword: Learning & Memory
Link ID: 21271 - Posted: 08.08.2015

By Christian Jarrett One of the saddest things about loneliness is that it leads to what psychologists call a “negative spiral.” People who feel isolated come to dread bad social experiences and they lose faith that it’s possible to enjoy good company. The usual result, as Melissa Dahl recently noted, is more loneliness. This hardly seems adaptive, but experts say it’s because we’ve evolved to enter a self-preservation mode when we’re alone. Without the backup of friends and family, our brains become alert to threat, especially the potential danger posed by strangers. Until now, much of the evidence to support this account has come from behavioral studies. For example, when shown a video depicting a social scene, lonely people spend more time than others looking at signs of social threat, such as a person being ignored by their friends or one person turning their back on another. Unpublished work also shows that lonely people’s attention seems to be grabbed more quickly by words that pertain to social threat, such as rejected or unwanted. Now the University of Chicago’s husband-and-wife research team of Stephanie and John Cacioppo — leading authorities on the psychology and neuroscience of loneliness — have teamed up with their colleague, Stephen Balogh, to provide the first evidence that lonely people’s brains, compared to the non-lonely, are exquisitely alert to the difference between social and nonsocial threats. The finding, reported online in the journal Cortex, supports their broader theory that, for evolutionary reasons, loneliness triggers a cascade of brain-related changes that put us into a socially nervous, vigilant mode. The researchers used a loneliness questionnaire to recruit 38 very lonely people and 32 people who didn’t feel lonely (note that loneliness was defined here as the subjective feeling of isolation, as opposed to the number of friends or close relatives one has). 
Next, the researchers placed an electrode array of 128 sensors on each of the participants’ heads, allowing them to record the participants’ brain waves using an established technique known as electro-encephalography (EEG) that’s particularly suited to measuring brain activity changes over very short time periods. © 2015, New York Media LLC.

Keyword: Depression; Learning & Memory
Link ID: 21267 - Posted: 08.05.2015

Steve Connor A computer game designed by neuroscientists has helped patients with schizophrenia to recover their ability to carry out everyday tasks that rely on having good memory, a study has found. Patients who played the game regularly for a month were four times better than non-players at remembering the kind of things that are critical for normal, day-to-day life, researchers said. The computer game was based on scientific principles that are known to “train” the brain in episodic memory, which helps people to remember events such as where they parked a car or placed a set of keys, said Professor Barbara Sahakian of Cambridge University, the lead author of the study. People recovering from schizophrenia suffer serious lapses in episodic memory which prevent them from returning to work or studying at university, so anything that can improve the ability of the brain to remember everyday events will help them to lead a normal life, Professor Sahakian said. Schizophrenia affects about one in every hundred people and results in hallucinations and delusions. “This kind of memory is essential for everyday learning and everything we do really both at home and at work. We have formulated an iPad game that could drive the neural circuitry behind episodic memory by stimulating the ability to remember where things were on the screen,” Professor Sahakian said. © independent.co.uk

Keyword: Schizophrenia; Learning & Memory
Link ID: 21251 - Posted: 08.02.2015

Michael Sullivan It's 5:45 in the morning, and in a training field outside Siem Reap, home of Angkor Wat, Cambodia's demining rats are already hard at work. Their noses are close to the wet grass, darting from side to side, as they try to detect explosives buried just beneath the ground. Each rat is responsible for clearing a 200-square-meter (239-square-yard) patch of land. Their Cambodian supervisor, Hulsok Heng, says they're good at it. "They are very good," he says. "You see this 200 square meters? They clear in only 30 minutes or 35 minutes. If you compare that to a deminer, maybe two days or three days. The deminer will pick up all the fragmentation, the metal in the ground, but the rat picks up only the smell of TNT. Not fragmentation or metal or a nail or a piece of crap in the ground." That's right: Someone using a metal-detecting machine will take a lot longer to detect a land mine than a rat using its nose. There's plenty of work for the rats here in Cambodia. The government estimates there are 4 million to 6 million land mines or other pieces of unexploded ordnance — including bombs, shells and grenades — littering the countryside, remnants of decades of conflict. Neighboring Vietnam and Laos also have unexploded ordnance left over from the Vietnam War. Dozens of people are killed or maimed in the region every year — and there's a financial toll as well, since the presence of these potentially deadly devices decreases the amount of land available to farmers. © 2015 NPR

Keyword: Chemical Senses (Smell & Taste); Learning & Memory
Link ID: 21246 - Posted: 08.01.2015

By Robert Gebelhoff Just in case sea snails aren't slow enough, new research has found that they get more sluggish when they grow old — and the discovery is helping us to understand how memory loss happens in humans. It turns out that the sea snail, which has a one-year lifespan, is actually a good model for studying nerve cells and how the nervous system works in people. How neurons work is fundamentally identical in almost all animals, and the simplicity of the snail's body gives researchers the chance to view how the system works more directly. "You can count the number of nerve cells that are relevant to a reflex," said Lynne Fieber, a professor at the University of Miami who leads research with the snails at the school. She and a team of researchers have been using the slimy little critters to learn how nerve cells respond to electric shock. They "taught" the snails to quickly contract their muscular tails by administering electric shocks and then poking the tails, a process called "sensitization." They then studied the responses at various ages. The scientists, whose work was published this week in the journal PLOS ONE, found that the older specimens did not learn to contract in response to the shock very well. As the snails grew older, their tail startle reflex lessened, and then disappeared. So I guess you could say the frail snails' tails fail to avail (okay, I'll stop).

Keyword: Learning & Memory; Development of the Brain
Link ID: 21245 - Posted: 08.01.2015

By Bret Stetka The brain is extraordinarily good at alerting us to threats. Loud noises, noxious smells, approaching predators: they all send electrical impulses buzzing down our sensory neurons, pinging our brain’s fear circuitry and, in some cases, causing us to fight or flee. The brain is also adept at knowing when an initially threatening or startling stimulus turns out to be harmless or resolved. But sometimes this system fails and unpleasant associations stick around, a malfunction thought to be at the root of post-traumatic stress disorder (PTSD). New research has identified a neuronal circuit responsible for the brain’s ability to purge bad memories, findings that could have implications for treating PTSD and other anxiety disorders. Like most emotions, fear is neurologically complicated. But previous work has consistently implicated two specific areas of the brain as contributing to and regulating fear responses. The amygdala, two small arcs of brain tissue deep beneath our temples, is involved in emotional reactions, and it flares with activity when we are scared. If a particular threat turns out to be harmless, a brain region behind the forehead called the prefrontal cortex steps in and the fright subsides. Our ability to extinguish painful memories is known to involve some sort of coordinated effort between the amygdala and the prefrontal cortex. The new study, led by Andrew Holmes at the National Institutes of Health, however, confirms that a working connection between the two brain regions is necessary to do away with fear. Normally mice that repeatedly listen to a sound previously associated with a mild foot shock will learn that on its own the tone is harmless, and they will stop being afraid. Using optogenetic stimulation technology, or controlling specific neurons and animal behavior using light, the authors found that disrupting the amygdala–prefrontal cortex connection prevents mice from overcoming the negative association with the benign tone. 
In neurobiology speak, memory “extinction” fails to occur. They also found that the opposite is true—that stimulating the circuit results in increased extinction of fearful memories. © 2015 Scientific American

Keyword: Learning & Memory; Stress
Link ID: 21243 - Posted: 08.01.2015

By Neuroskeptic According to British biochemist Donald R. Forsdyke in a new paper in Biological Theory, the existence of people who seem to be missing most of their brain tissue calls into question some of the “cherished assumptions” of neuroscience. I’m not so sure. Forsdyke discusses the disease called hydrocephalus (‘water on the brain’). Some people who suffer from this condition as children are cured thanks to prompt treatment. Remarkably, in some cases, these post-hydrocephalics turn out to have grossly abnormal brain structure: huge swathes of their brain tissue are missing, replaced by fluid. Even more remarkably, in some cases, these people have normal intelligence and display no obvious symptoms, despite their brains being mostly water. This phenomenon was first noted by a British pediatrician called John Lorber. Lorber never published his observations in a scientific journal, although a documentary was made about them. However, his work was famously discussed in Science in 1980 by Lewin in an article called “Is Your Brain Really Necessary?“. There have been a number of other more recent published cases. Forsdyke argues that such cases pose a problem for mainstream neuroscience. If a post-hydrocephalic brain can store the same amount of information as a normal brain, he says, then “brain size does not scale with information quantity”, therefore, “it would seem timely to look anew at possible ways our brains might store their information.”

Keyword: Consciousness; Learning & Memory
Link ID: 21227 - Posted: 07.29.2015

Chris Woolston A study that did not find cognitive benefits of musical training for young children triggered a “media firestorm”. Researchers often complain about inaccurate science stories in the popular press, but few air their grievances in a journal. Samuel Mehr, a PhD student at Harvard University in Cambridge, Massachusetts, discussed in a Frontiers in Psychology article some examples of media missteps from his own field — the effects of music on cognition. The opinion piece gained widespread attention online. Mehr gained first-hand experience of the media as the first author of a 2013 study in PLoS ONE. The study involved two randomized, controlled trials of a total of 74 four-year-olds. For children who did six weeks of music classes, there was no sign that musical activities improved scores on specific cognitive tests compared to children who did six weeks of art projects or took part in no organized activities. The authors cautioned, however, that the lack of effect of the music classes could have been a result of how they did the studies. The intervention in the trials was brief and not especially intensive — the children mainly sang songs and played with rhythm instruments — and older children might have had a different response than the four-year-olds. There are many possible benefits of musical training, Mehr said in an interview, but finding them was beyond the scope of the study. © 2015 Nature Publishing Group

Keyword: Hearing; Intelligence
Link ID: 21216 - Posted: 07.25.2015

Kashmira Gander Performing well at school and going on to have a complex job could lower the risk of dementia, scientists have found. Conversely, loneliness, watching too much TV and a sedentary lifestyle can make a person’s cognitive abilities decline more quickly, according to new research being presented to experts at the Alzheimer's Association International Conference in Washington DC. Researchers are also due to show attendees the results from trials of solanezumab – believed to be the first drug to halt the progression of the disease if a patient is diagnosed early enough. One study, involving 7,500 people aged 65 and above in Sweden over a 20-year period, showed that dementia rates were 21 per cent higher in those whose grades were in the bottom fifth of the population. Meanwhile, participants with complex jobs involving data and numbers saw their chance of developing the disease cut by 23 per cent. In a separate study in Sweden, scientists followed the lives of 440 people aged 75 or over for nine years, and discovered that those in the bottom fifth for school grades had a 50 per cent increase in the risk of developing dementia. © independent.co.uk

Keyword: Alzheimers; Learning & Memory
Link ID: 21195 - Posted: 07.21.2015