Chapter 13. Memory, Learning, and Development
By JOHN MARKOFF PALO ALTO, Calif. — Computers have entered the age when they are able to learn from their own mistakes, a development that is about to turn the digital world on its head. The first commercial version of the new kind of computer chip is scheduled to be released in 2014. Not only can it automate tasks that now require painstaking programming — for example, moving a robot’s arm smoothly and efficiently — but it can also sidestep and even tolerate errors, potentially making the term “computer crash” obsolete. The new computing approach, already in use by some large technology companies, is based on the biological nervous system, specifically on how neurons react to stimuli and connect with other neurons to interpret information. It allows computers to absorb new information while carrying out a task, and adjust what they do based on the changing signals. In coming years, the approach will make possible a new generation of artificial intelligence systems that will perform some functions that humans do with ease: see, speak, listen, navigate, manipulate and control. That can hold enormous consequences for tasks like facial and speech recognition, navigation and planning, which are still in elementary stages and rely heavily on human programming. Designers say the computing style can clear the way for robots that can safely walk and drive in the physical world, though a thinking or conscious computer, a staple of science fiction, is still far off on the digital horizon. “We’re moving from engineering computing systems to something that has many of the characteristics of biological computing,” said Larry Smarr, an astrophysicist who directs the California Institute for Telecommunications and Information Technology, one of many research centers devoted to developing these new kinds of computer circuits. © 2013 The New York Times Company
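The chips described above take their cue from biological neurons, which integrate incoming signals and fire once a threshold is crossed. As a rough illustration of that behavior (a sketch only, not the design of any actual chip), here is a minimal leaky integrate-and-fire neuron in Python; the threshold and leak constants are arbitrary choices for the example.

```python
# Minimal leaky integrate-and-fire neuron: the membrane potential leaks
# toward rest, accumulates each input, and emits a spike (1) when it
# crosses a threshold, after which it resets.
def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Return a list of 0/1 spikes for a sequence of input currents."""
    v = 0.0
    spikes = []
    for current in inputs:
        v = v * leak + current    # decay toward rest, then integrate input
        if v >= threshold:
            spikes.append(1)      # fire
            v = 0.0               # reset after the spike
        else:
            spikes.append(0)
    return spikes

# Three sub-threshold inputs accumulate until the neuron fires once.
print(simulate_lif([0.5, 0.5, 0.5, 0.0, 0.3]))
```

The point of the sketch is that the neuron's response depends on its recent history, not just the current input, which is the property these chips exploit to adjust behavior as signals change.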
Tomas Jivanda Being pulled into the world of a gripping novel can trigger actual, measurable changes in the brain that linger for at least five days after reading, scientists have said. The new research, carried out at Emory University in the US, found that reading a good book may cause heightened connectivity in the brain and neurological changes that persist in a similar way to muscle memory. The changes were registered in the left temporal cortex, an area of the brain associated with receptivity for language, as well as in the primary sensory motor region of the brain. Neurons of this region have been associated with tricking the mind into thinking it is doing something it is not, a phenomenon known as grounded cognition: for example, just thinking about running can activate the neurons associated with the physical act of running. “The neural changes that we found associated with physical sensation and movement systems suggest that reading a novel can transport you into the body of the protagonist,” said neuroscientist Professor Gregory Berns, lead author of the study. “We already knew that good stories can put you in someone else’s shoes in a figurative sense. Now we’re seeing that something may also be happening biologically.” Twenty-one students took part in the study, with all participants reading the same book, Pompeii, a 2003 thriller by Robert Harris, which was chosen for its page-turning plot. “The story follows a protagonist, who is outside the city of Pompeii and notices steam and strange things happening around the volcano,” said Prof Berns. “It depicts true events in a fictional and dramatic way. It was important to us that the book had a strong narrative line.” © independent.co.uk
By CARL ZIMMER There are many things that make humans a unique species, but a couple stand out. One is our mind, the other our brain. The human mind can carry out cognitive tasks that other animals cannot, like using language, envisioning the distant future and inferring what other people are thinking. The human brain is exceptional, too. At three pounds, it is gigantic relative to our body size. Our closest living relatives, chimpanzees, have brains that are only a third as big. Scientists have long suspected that our big brain and powerful mind are intimately connected. Starting about three million years ago, fossils of our ancient relatives record a huge increase in brain size. Once that cranial growth was underway, our forerunners started leaving behind signs of increasingly sophisticated minds, like stone tools and cave paintings. But scientists have long struggled to understand how a simple increase in size could lead to the evolution of those faculties. Now, two Harvard neuroscientists, Randy L. Buckner and Fenna M. Krienen, have offered a powerful yet simple explanation. In our smaller-brained ancestors, the researchers argue, neurons were tightly tethered in a relatively simple pattern of connections. When our ancestors’ brains expanded, those tethers ripped apart, enabling our neurons to form new circuits. Dr. Buckner and Dr. Krienen call their idea the tether hypothesis, and present it in a paper in the December issue of the journal Trends in Cognitive Sciences. “I think it presents some pretty exciting ideas,” said Chet C. Sherwood, an expert on human brain evolution at George Washington University who was not involved in the research. Dr. Buckner and Dr. Krienen developed their hypothesis after making detailed maps of the connections in the human brain using f.M.R.I. scanners. When they compared their maps with those of other species’ brains, they saw some striking differences. © 2013 The New York Times Company
By Regina Harrell (Pulse) I am a primary-care doctor who makes house calls in and around Tuscaloosa, Ala. Today my rounds start at a house located down a dirt road a few miles outside town. Gingerly, I cross the front walk; Mrs. Edgars told me that she killed a rattlesnake in her flowerbed last year. She is at the door, expecting my visit. Mr. Edgars sits on the couch, unable to recall that I am his doctor, or even that I am a doctor, but happy to see me nonetheless. We chat about the spring garden and the rain, then we move on to Mr. Edgars’s arthritis. Earlier in his dementia, he wandered the woods, and his wife was afraid he would get lost and die, although the entire family agreed that this was how he would want it. Now, in a strange twist, his knee arthritis has worsened enough that it has curtailed his wanderings. I suspect that Mrs. Edgars is undertreating the pain to decrease the chance that he’ll wander off again. We talk about how anxious he grows whenever she’s out of his sight and how one of his children comes to sit with him so that she can run errands. She shows me a quilt remnant found in a log cabin on their property; it likely belonged to her husband’s grandfather, making the rough-edged fabric about a century old. I leave carrying a parting gift from her — a jar of homegrown pickled okra. When I get back to the office, I turn on the computer to write a progress note in Mr. Edgars’s electronic health record, or EHR. In addition to recording the details of our visit, I must try to meet the new federal criteria for “meaningful use,” criteria that have been adopted by my office with threats that I won’t get paid for my work if I don’t. © 1996-2013 The Washington Post
Link ID: 19067 - Posted: 12.24.2013
Helen Shen In the film Eternal Sunshine of the Spotless Mind, unhappy lovers undergo an experimental brain treatment to erase all memories of each other from their minds. No such fix exists for real-life couples, but researchers report today in Nature Neuroscience that a targeted medical intervention helps to reduce specific negative memories in patients who are depressed [1]. "This is one time I would say that science is better than art," says Karim Nader, a neuroscientist at McGill University in Montreal, Canada, who was not involved in the research. "It's a very clever study." The technique, electroconvulsive therapy (ECT), also known as electroshock therapy, induces seizures by passing current into the brain through electrode pads placed on the scalp. Despite its sometimes negative reputation, ECT is an effective last-resort treatment for severe depression, and is used today in combination with anaesthesia and muscle relaxants. Marijn Kroes, a neuroscientist at Radboud University Nijmegen in the Netherlands, and his colleagues found that by strategically timing ECT bursts, they could target and disrupt patients' memory of a disturbing episode. A matter of time The strategy relies on a theory called memory reconsolidation, which proposes that memories are taken out of 'mental storage' each time they are accessed and 're-written' over time back onto the brain's circuits. Results from animal studies and limited evidence in humans suggest that during reconsolidation, memories are vulnerable to alteration or even erasure [2-4]. © 2013 Nature Publishing Group
By Alexandra Sifferlin It’s always been conventional wisdom that girls reach maturity more quickly than boys, but now scientists have provided some proof. In new research published in the journal Cerebral Cortex, an international group of researchers led by a team from Newcastle University in England found that girls’ brains march through the reorganization and pruning typical of normal brain development earlier than boys’ brains. In the study, in which 121 people between the ages of 4 and 40 were scanned using MRI, the scientists documented the ebb and flow of new neural connections, and found that some brain fibers that bridged far-flung regions of the brain tended to remain stable, while shorter connections, many of which were redundant, were edited away. And the entire reorganization seemed to occur sooner in girls’ brains than in boys’ brains. Females also tended to have more connections across the two hemispheres of the brain. The researchers believe that the earlier reorganization makes girls’ brains work more efficiently and reach a more mature state for processing the environment. What drives the gender-based difference in timing isn’t clear from the current study, but the results suggest that it may be a question worth investigating. © 2013 Time Inc.
Don’t worry about watching all those cat videos on the Internet. You’re not wasting time when you are at your computer—you’re honing your fine-motor skills. A study of people’s ability to translate training that involves clicking and twiddling a computer mouse reveals that the brain can apply that expertise to other fine-motor tasks requiring the hands. We know that computers are altering the way that people think. For example, using the Internet changes the way that you remember information. But what about use of the computer itself? You probably got to this story by using a computer mouse, for example, and that is a bizarre task compared with the activities that we’ve encountered in our evolutionary history. You made tiny movements of your hand in a horizontal plane to cause tiny movements of a cursor in a completely disconnected vertical plane. But with daily practice—the average computer user makes more than 1000 mouse clicks per day—you have become such an expert that you don’t even think about this amazing feat of dexterity. Scientists would love to know if that practice affects other aspects of your brain’s control of your body. The problem is finding people with no computer experience. So Konrad Kording, a psychologist at Northwestern University’s Rehabilitation Institute of Chicago in Illinois, and his former postdoc Kunlin Wei, now at Peking University in Beijing, turned to migrant Chinese workers. The country’s vast population covers the whole socioeconomic spectrum, from elite computer hackers to agricultural laborers whose lifestyles have changed little over the past century. The country’s economic boom is bringing people in waves from the countryside to cities in search of employment. © 2013 American Association for the Advancement of Science
Keyword: Learning & Memory
Link ID: 19060 - Posted: 12.21.2013
By Felicity Muth This might seem perplexing to some, but I’ve just spent two days listening to talks and meeting with people who all work on social insects. And it was great. I was at Royal Holloway, University of London, where the IUSSI meeting was taking place. The IUSSI is the ‘International Union for the Study of Social Insects’, although they seem to let people in who work on social spiders too (a nice inclusive attitude if you ask me). This meeting was specifically for researchers who are in the UK and North-West Europe, of which there are a surprisingly large number. The talks were really good, covering a lot of the recent research on social insects, and I thought I’d share my highlight of the first day’s events here. One of my favourite talks from the first day was from Elli Leadbeater, who spoke about work carried out primarily by Erika Dawson. I’ve written before about ‘social learning’ in monkeys and whales, where one animal can learn something from observing another animal, normally of the same species. Dawson and her colleagues were looking specifically at whether there is actually anything ‘social’ about ‘social learning’, or whether it can be explained by the same mechanism as other types of learning. In the simplest form of learning, associative learning, an animal learns to associate a particular stimulus (for example a particular colour, smell or sound) with a reward (usually food). The classic example of this was Pavlov’s dogs, who learned to associate the sound of a metronome with food. When Pavlov then sounded the metronome, the dogs salivated even when there was no food present. © 2013 Scientific American
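The associative learning described in that entry is often modeled as a prediction-error update (a Rescorla-Wagner-style rule, which the article itself does not name; the learning rate below is an arbitrary choice for the sketch). Repeated stimulus-reward pairings, like Pavlov's metronome and food, drive the learned association toward the reward value:

```python
# Simple associative (Pavlovian) learning: the associative strength v of a
# stimulus is nudged toward the outcome it predicts on every trial.
def train(trials, alpha=0.3):
    """trials: list of outcomes (1 = food present, 0 = absent).
    Returns the associative strength after each trial."""
    v = 0.0
    history = []
    for reward in trials:
        v += alpha * (reward - v)   # learn in proportion to prediction error
        history.append(round(v, 3))
    return history

# Five metronome-food pairings: the association climbs toward 1.
print(train([1, 1, 1, 1, 1]))
```

Because the update depends only on the mismatch between prediction and outcome, the same mechanism can in principle account for learning from social cues as well, which is the question Dawson's work probes.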
Keyword: Learning & Memory
Link ID: 19058 - Posted: 12.21.2013
Amanda Mascarelli In children with certain gene variants, symptoms similar to common learning disabilities could be omens of serious psychiatric conditions. People who carry high-risk genetic variants for schizophrenia and autism have impairments reminiscent of disorders such as dyslexia, even when they do not yet have a mental illness, a new study has found. The findings offer a window into the brain changes that precede severe mental illness and hold promise for early intervention and even prevention, researchers say. Rare genetic alterations called copy number variants (CNVs), in which certain segments of the genome have an abnormal number of copies, play an important part in psychiatric disorders: Individuals who carry certain CNVs have a several-fold increased risk of developing schizophrenia or autism [1]. But previous studies were based on individuals who already have a psychiatric disorder, and until now, no one had looked at what effects these CNVs have in the general population. In a study published today in Nature [2], researchers report that people with these variants but no diagnosis of autism or a mental illness still show subtle brain changes and impairments in cognitive function. “In psychiatry we always have the problem that disorders are defined by symptoms that patients experience or tell us about, or that we observe,” says study co-author Andreas Meyer-Lindenberg, a psychiatrist and the director of the Central Institute of Mental Health in Mannheim, Germany, affiliated with the University of Heidelberg. This work, on the other hand, provides a glimpse into the biological underpinnings of people who are at risk of psychiatric disorders, he says. The team searched a genealogical database of more than 100,000 Icelanders, focusing on 26 genetic variants that have been shown to increase the risk of schizophrenia or autism. They found that 1,178 people in the database, or 1.16% of the sample, carried one or more of these CNVs. © 2013 Nature Publishing Group
By DANNY HAKIM LONDON — European food regulators said on Tuesday that a class of pesticides linked to the deaths of large numbers of honey bees might also harm human health, and they recommended that the European Commission further restrict their use. The commission, which requested the review, has already taken a tougher stance than regulators in other parts of the world against neonicotinoids, a relatively new nicotine-derived class of pesticide. Earlier this year, some were temporarily banned for use on many flowering crops in Europe that attract honey bees, an action that the pesticides’ makers are opposing in court. Now European Union regulators say the same class of pesticides “may affect the developing human nervous system” of children. They focused on two specific versions of the pesticide, acetamiprid and imidacloprid, saying they were safe to use only in smaller amounts than currently allowed. Imidacloprid was one of the pesticides placed under a two-year ban this year. The review was prompted by a Japanese study that raised similar concerns last year. Imidacloprid is one of the most popular insecticides, and is used in agricultural and consumer products. It was developed by Bayer, the German chemicals giant, and is the active ingredient in products like Bayer Advanced Fruit, Citrus & Vegetable Insect Control, which can be purchased at stores internationally, including Home Depot in the United States. Acetamiprid is sold by Nisso Chemical, a German branch of a Japanese company, though it was developed with Bayer’s help. It is used in consumer products like Ortho Flower, Fruit & Vegetable Insect Killer. The action by European regulators could affect the entire category of neonicotinoid pesticides, however. 
James Ramsay, a spokesman for the European Food Safety Authority, which conducted the review, said the agency was recommending a mandatory submission of studies related to developmental neurotoxicity “as part of the authorization process in the E.U.” © 2013 The New York Times Company
By DONALD G. McNEIL Jr. A long-awaited study has confirmed the fears of Somali residents in Minneapolis that their children suffer from higher rates of a disabling form of autism compared with other children there. The study — by the University of Minnesota, the Centers for Disease Control and Prevention, and the research and advocacy group Autism Speaks — found high rates of autism in two populations: About one Somali child in 32 and one white child in 36 in Minneapolis were on the autism spectrum. The national average is one child in 88, according to Coleen A. Boyle, who directs the C.D.C.’s Center on Birth Defects and Developmental Disabilities. But the Somali children were less likely than the whites to be “high-functioning” and more likely to have I.Q.s below 70. (The average I.Q. score is 100.) The study offered no explanation of the statistics. “We do not know why more Somali and white children were identified,” said Amy S. Hewitt, the project’s primary investigator and director of the University of Minnesota’s Research and Training Center on Community Living. “This project was not designed to answer these questions.” The results echoed those of a Swedish study published last year finding that children from immigrant families in Stockholm — many of them Somali — were more likely to have autism with intellectual disabilities. The Minneapolis study also found that Somali children with autism received their diagnoses late. Age 5 was the average, while autism and learning disabilities can be diagnosed as early as age 2, and children get the most benefit from behavioral treatment when it is started early. Black American-born children and Hispanic children in Minneapolis had much lower autism rates: one in 62 for the former and one in 80 for the latter. © 2013 The New York Times Company
Link ID: 19044 - Posted: 12.17.2013
by Bethany Brookshire “You are what you eat.” We’ve all heard that one. What we eat can affect our growth, life span and whether we develop disease. These days, we know that we also are what our mother eats. Or rather, what our mothers ate while we were in the womb. But are we also what our father eats? A new study shows that in mice, a dietary deficiency in dad can be a big downer for baby. The dietary staple in the study was folic acid, or folate. Folate is one of the B vitamins and is found in dark leafy greens (eat your kale!) and has even been added to some foods like cereals. It is particularly essential to get in the diet because we cannot synthesize it on our own. And it plays roles in DNA repair and DNA synthesis, as well as methylation of DNA. It’s particularly important during development. Without adequate folate, developing fetuses are prone to neural tube disorders, such as spina bifida. Some of the neural tube disorders caused by folate deficiency could result from breaks in the DNA itself. But folic acid is also important in the epigenome. Epigenetics is a mechanism that allows cells to change how genes are used without changing the genes themselves. Instead of altering the DNA itself, epigenetic alterations put chemical “marks” or “notes” —methyl or acetyl groups — on the DNA and the proteins associated with it. The marks can either make a gene more accessible (acetylation) or less accessible (methylation), making it more or less likely to be made into a protein. This means that each cell type can have a different epigenome, allowing a neuron to function differently than a muscle cell, even though they contain the same DNA. Folate affects DNA synthesis, but it can also affect DNA methylation. In fact, DNA methylation requires the presence of folate. So low folate could affect whether genes are turned off or on and by how much. In a developing fetus, that could contribute to developmental problems. © Society for Science & the Public 2000 - 2013.
A study in mice shows how a breakdown of the brain’s blood vessels may amplify or cause problems associated with Alzheimer’s disease. The results published in Nature Communications suggest that blood vessel cells called pericytes may provide novel targets for treatments and diagnoses. “This study helps show how the brain’s vascular system may contribute to the development of Alzheimer’s disease,” said study leader Berislav V. Zlokovic, M.D., Ph.D., director of the Zilkha Neurogenetic Institute at the Keck School of Medicine of the University of Southern California, Los Angeles. The study was co-funded by the National Institute of Neurological Disorders and Stroke (NINDS) and the National Institute on Aging (NIA), parts of the National Institutes of Health. Alzheimer’s disease is the leading cause of dementia. It is an age-related disease that gradually erodes a person’s memory, thinking, and ability to perform everyday tasks. Brains from Alzheimer’s patients typically have abnormally high levels of plaques made up of accumulations of beta-amyloid protein next to brain cells, tau protein that clumps together to form neurofibrillary tangles inside neurons, and extensive neuron loss. Vascular dementias, the second leading cause of dementia, are a diverse group of brain disorders caused by a range of blood vessel problems. Brains from Alzheimer’s patients often show evidence of vascular disease, including ischemic stroke, small hemorrhages, and diffuse white matter disease, plus a buildup of beta-amyloid protein in vessel walls. Furthermore, previous studies suggest that APOE4, a genetic risk factor for Alzheimer’s disease, is linked to brain blood vessel health and integrity.
Link ID: 19033 - Posted: 12.14.2013
Skepticism about repressed traumatic memories has increased over time, but new research shows that psychology researchers and practitioners still tend to hold different beliefs about whether such memories occur and whether they can be accurately retrieved. The findings are published in Psychological Science, a journal of the Association for Psychological Science. “Whether repressed memories are accurate or not, and whether they should be pursued by therapists, or not, is probably the single most practically important topic in clinical psychology since the days of Freud and the hypnotists who came before him,” says researcher Lawrence Patihis of the University of California, Irvine. According to Patihis, the new findings suggest that there remains a “serious split in the field of psychology” in beliefs about how memory works. Controversy surrounding repressed memory – sometimes referred to as the “memory wars” – came to a head in the 1990s. While some believed that traumatic memories could be repressed for years only to be recovered later in therapy, others questioned the concept, noting the lack of scientific evidence in support of repressed memory. Spurred by impressions that both researchers and clinicians believed the debate had been resolved, Patihis and colleagues wanted to investigate whether and how beliefs about memory may have changed since the 1990s. To find out, the researchers recruited practicing clinicians and psychotherapists, research psychologists, and alternative therapists to complete an online survey. © Association for Psychological Science
Smoking tobacco or marijuana, taking prescription painkillers, or using illegal drugs during pregnancy is associated with double or even triple the risk of stillbirth, according to research funded by the National Institutes of Health. Researchers based their findings on measurements of the chemical byproducts of nicotine in maternal blood samples; and cannabis, prescription painkillers and other drugs in umbilical cords. Taking direct measurements provided more precise information than did previous studies of stillbirth and substance use that relied only on women’s self-reporting. The study findings appear in the journal Obstetrics & Gynecology. “Smoking is a known risk factor for stillbirth, but this analysis gives us a much clearer picture of the risks than before,” said senior author Uma M. Reddy, M.D., MPH, of the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), the NIH institute that supported the study. “Additionally, results from the latest findings also showed that likely exposure to secondhand smoke can elevate the risk of stillbirth.” Dr. Reddy added, “With the legalization of marijuana in some states, it is especially important for pregnant women and health care providers to be aware that cannabis use can increase stillbirth risk.” The study enrolled women between March 2006 and September 2008 in five geographically defined areas delivering at 59 hospitals participating in the Stillbirth Collaborative Research Network External Web Site Policy. Women who experienced a stillbirth and those who gave birth to a live infant participated in the study. The researchers tested blood samples at delivery from the two groups of women and the umbilical cords from their deliveries to measure the exposure to the fetus. They also asked participants to self-report smoking and drug use during pregnancy.
by Rowan Hooper BIOENGINEERS dream of growing spare parts for our worn-out or diseased bodies. They have already succeeded with some tissues, but one has always eluded them: the brain. Now a team in Sweden has taken the first step towards this ultimate goal. Growing artificial body parts in the lab starts with a scaffold. This acts as a template on which to grow cells from the patient's body. This has been successfully used to grow lymph nodes, heart cells and voice boxes from a person's stem cells. Bioengineers have even grown and transplanted an artificial kidney in a rat. Growing nerve tissue in the lab is much more difficult, though. In the brain, new neural cells grow in a complex and specialised matrix of proteins. This matrix is so important that damaged nerve cells don't regenerate without it. But its complexity is difficult to reproduce. To try to get round this problem, Paolo Macchiarini and Silvia Baiguera at the Karolinska Institute in Stockholm, Sweden, and colleagues combined a scaffold made from gelatin with a tiny amount of rat brain tissue that had already had its cells removed. This "decellularised" tissue, they hoped, would provide enough of the crucial biochemical cues to enable seeded cells to develop as they would in the brain. When the team added mesenchymal stem cells – taken from another rat's bone marrow – to the mix, they found evidence that the stem cells had started to develop into neural cells (Biomaterials, doi.org/qfh). The method has the advantage of combining the benefits of natural tissue with the mechanical properties of an artificial matrix, says Alex Seifalian, a regenerative medicine specialist at University College London, who wasn't involved in the study. © Copyright Reed Business Information Ltd.
Keyword: Development of the Brain
Link ID: 19029 - Posted: 12.12.2013
By Janelle Weaver Children with a large vocabulary experience more success at school and in the workplace. How much parents talk to their children plays a major role, but new research shows that it is not just the quantity but also the quality of parental input that matters. Helpful gestures and meaningful glances may allow kids to grasp concepts more easily than they otherwise would. In a study published in June in the Proceedings of the National Academy of Sciences USA, Erica Cartmill of the University of Chicago and her collaborators videotaped parents in their homes as they read books and played games with their 14- or 18-month-old children. The researchers created hundreds of 40-second muted video clips of these interactions. Another set of study participants watched the videos and used clues from the scenes to guess which nouns the parents were saying at various points in the sequences. The researchers used the accuracy of these guesses to rate how well a parent used nonverbal cues, such as gesturing toward and looking at objects, to clarify a word's meaning. Cartmill and her team found that the quality of parents' nonverbal signaling predicted the size of their children's vocabulary three years later. Surprisingly, socioeconomic status did not play a role in the quality of the parents' nonverbal signaling. This result suggests that the well-known differences in children's vocabulary size across income levels are likely the result of how much parents talk to their children, which is known to differ by income, rather than how much nonverbal help they offer during those interactions. © 2013 Scientific American
Ian Sample, science correspondent Differences in children's exam results at secondary school owe more to genetics than teachers, schools or the family environment, according to a study published yesterday. The research drew on the exam scores of more than 11,000 16-year-olds who sat GCSEs at the end of their secondary school education. In the compulsory core subjects of English, maths and science, genetics accounted for on average 58% of the differences in scores that children achieved. Grades in the sciences, such as physics, biology and chemistry, were more heritable than those in humanities subjects, such as art and music, at 58% and 42% respectively. The findings do not mean that children's performance at school is determined by their genes, or that schools and the child's environment have no influence. The overall effect of a child's environment – including their home and school life – accounted for 36% of the variation seen in students' exam scores across all subjects, the study found. "The question we are asking is why do children differ in their GCSE scores? People immediately think it's schools. But if schools accounted for all the variance, then children in one classroom would all be the same," said Robert Plomin, an expert in behavioural genetics who led the study at King's College London. To tease out the genetic contribution to children's school grades, the researchers studied GCSE scores of identical twins (who share 100% of their genes) and non-identical twins (who share on average half of the genes that normally vary between people). Both groups share their environments to a similar extent. © 2013 Guardian News and Media Limited
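The twin comparison described in that study is often summarized with Falconer's formula, h^2 = 2 * (r_MZ - r_DZ): because identical (MZ) twins share essentially all of their segregating genes while fraternal (DZ) twins share about half, the gap between the two correlations estimates the genetic contribution. The study itself uses more sophisticated models; the correlations below are hypothetical values chosen only so the sketch reproduces the 58% core-subject figure.

```python
# Falconer's formula: partition trait variance from twin correlations.
def falconer_heritability(r_mz, r_dz):
    """r_mz, r_dz: within-pair correlations for identical and fraternal twins.
    Returns (h2, c2, e2): heritability, shared environment, non-shared
    environment (plus measurement error)."""
    h2 = 2 * (r_mz - r_dz)   # genetic contribution
    c2 = r_mz - h2           # shared-environment contribution
    e2 = 1 - r_mz            # non-shared environment and error
    return h2, c2, e2

# Hypothetical correlations yielding h2 = 0.58, as for core GCSE subjects.
h2, c2, e2 = falconer_heritability(0.79, 0.50)
print(h2, c2, e2)
```

The decomposition makes Plomin's classroom point concrete: even with h^2 = 0.58, the remaining variance belongs to the environment, which is why children taught in the same classroom still differ.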
Taking some heartburn medications for more than two years is linked to a higher risk of vitamin B12 deficiency in adults, a U.S. study suggests. Left untreated, vitamin B12 deficiency can lead to dementia, neurological damage, anemia and other complications. Knowing that stomach acid aids in vitamin B12 absorption, researchers set out to test whether suppressing the acid can lead to vitamin deficiency.

The drugs in question are known as proton pump inhibitors, and they include such well-known brands as Losec, Nexium, Prevacid and Pariet. Doses of more than 1.5 pills per day were more strongly associated with vitamin B12 deficiency than doses of less than 0.75 pills per day, Dr. Douglas Corley, a gastroenterologist and research scientist with the Kaiser Permanente Division of Research in Oakland, Calif., and his co-authors said in Wednesday's issue of the Journal of the American Medical Association.

"This research raises the question of whether people who are taking acid-suppressing medications long term should be screened for vitamin B12 deficiency," Corley said in a release. "It's a relatively simple blood test, and vitamin supplements are an effective way of managing the vitamin deficiency, if it is found."

For the study, researchers looked at the electronic health records of 25,956 adults diagnosed with vitamin B12 deficiency in Northern California between January 1997 and June 2011, and compared them with 184,199 patients without B12 deficiency during the same period. Among the 25,956 patients who had vitamin B12 deficiency, 12 per cent had used proton pump inhibitors for at least two years, compared with 7.2 per cent of those in the control group.

© CBC 2013
By Ingfei Chen

The way doctors diagnose Alzheimer's disease may be starting to change. Traditionally, clinicians have relied on tests of memory and reasoning skills and reports of social withdrawal to identify patients with Alzheimer's. Such assessments can, in expert hands, be fairly conclusive—but they are not infallible. Around one in five people who are told they have the neurodegenerative disorder actually have other forms of dementia or, sometimes, another problem altogether, such as depression.

To know for certain that someone has Alzheimer's, doctors must remove small pieces of the brain, examine the cells under a microscope and count the number of protein clumps called amyloid plaques. An unusually high number of plaques is a key indicator of Alzheimer's. Because such a procedure risks further impairing a patient's mental abilities, it is almost always performed posthumously.

In the past 10 years, however, scientists have developed sophisticated brain scans that can estimate the amount of plaque in the brain while people are still alive. In the laboratory, these scans have been very useful in studying the earliest stages of Alzheimer's, before overt symptoms appear. The results are reliable enough that last year the Food and Drug Administration approved one such test, called Amyvid, to help evaluate patients with memory deficits or other cognitive difficulties.

Despite the FDA's approval, lingering doubts about the exact role of amyloid in Alzheimer's and ambivalence about the practical value of the information provided by the scan have fueled debate about when to order an Amyvid test. Not everyone who has an excessive amount of amyloid plaque develops Alzheimer's, and at the moment there is generally no way to predict who the unlucky ones will be. Recent studies have shown that roughly one third of older people in good mental health have moderate to high levels of plaque, with no noticeable ill effects. And raising the specter of the disorder in the absence of symptoms may upset more people than it helps, because no effective treatments exist—at least not yet.

© 2013 Scientific American