Chapter 13. Memory, Learning, and Development


By Anil Ananthaswamy To understand human consciousness, we need to know why it exists in the first place. New experimental evidence suggests it may have evolved to help us learn and adapt to changing circumstances far more rapidly and effectively. We used to think consciousness was a uniquely human trait, but neuroscientists now believe we share it with many other animals, including mammals, birds and octopuses. While plants and arguably some animals like jellyfish seem able to respond to the world around them without any conscious awareness, many other animals consciously experience and perceive their environment. In the 19th century, Thomas Henry Huxley and others argued that such consciousness is an “epiphenomenon” – a side effect of the workings of the brain that has no causal influence, the way a steam whistle has no effect on the way a steam engine works. More recently, neuroscientists have suggested that consciousness enables us to integrate information from different senses or keep such information active for long enough in the brain that we can experience the sight and sound of a car passing by, for example, as one unified perception, even though sound and light travel at different speeds. © Copyright New Scientist Ltd.

Keyword: Consciousness; Learning & Memory
Link ID: 23785 - Posted: 06.28.2017

By Rod McCullom Facebook has a problem — a very significant problem — with the violent and gruesome content that has quickly found its way, in numerous instances, onto the social network and its Facebook Live feature, which was introduced to American users in January 2016. The disturbing litany of murders, suicides and assaults has already produced a series of macabre technological milestones. These include the murder of Robert Godwin Sr., the 74-year-old father of nine and grandfather of 14 who was selected by a gunman at random and killed in a video posted to Facebook in mid-April. One week later, a man in Thailand streamed the murder of his 11-month-old daughter on Facebook Live before taking his own life. The beating and torture of an 18-year-old man with intellectual and developmental disabilities was live-streamed on the service in January, and the tragic shooting death of two-year-old Lavontay White Jr. followed a month later on Valentine’s Day. “At least 45 instances of violence — shootings, rapes, murders, child abuse, torture, suicides, and attempted suicides — have been broadcast via Live [since] December 2015,” BuzzFeed’s Alex Kantrowitz reported this month. “That’s an average rate of about two instances per month.” Copyright 2017 Undark

Keyword: Aggression; Robotics
Link ID: 23778 - Posted: 06.27.2017

Rebecca Hersher The first problem with the airplane bathroom was its location. It was March. Greg O'Brien and his wife, Mary Catherine, were flying back to Boston from Los Angeles, sitting in economy seats in the middle of the plane. "We're halfway, probably over Chicago," Greg remembers, "and Mary Catherine said, 'Go to the bathroom.'" "It just sounded like my mother," Greg says. "So I said 'no.'" Mary Catherine persisted, urging her husband of 40 years to use the restroom. People started looking at them. "It was kind of funny," says Greg. Mary Catherine was more alarmed than amused. Greg has early-onset Alzheimer's, which makes it increasingly hard for him to keep track of thoughts and feelings over the course of minutes or even seconds. It's easy to get into a situation where you feel like you need to use the bathroom, but then forget. And they had already been on the plane for hours. Finally, Greg started toward the restroom at the back of the plane, only to find the aisle was blocked by an attendant serving drinks. Mary Catherine gestured to him. "Use the one in first class!" At that point, on top of the mild anxiety most people feel when they slip into first class to use the restroom, Greg was feeling overwhelmed by the geography of the plane. He pulled back the curtain dividing the seating sections. "This flight attendant looks at me like she has no use for me. I just said 'Look, I really have to go to the bathroom,' and she says 'OK, just go.'" © 2017 npr

Keyword: Alzheimers; Learning & Memory
Link ID: 23772 - Posted: 06.26.2017

Andrea Hsu Intuitively, we tend to think of forgetting as failure, as something gone wrong in our ability to remember. Now, Canadian neuroscientists with the University of Toronto are challenging that notion. In a paper published Wednesday in the journal Neuron, they review the current research into the neurobiology of forgetting and hypothesize that our brains purposefully work to forget information in order to help us live our lives. I spoke with Blake Richards, one of the co-authors of the paper, who applies artificial intelligence theories to his study of how the brain learns. He says that in the AI world, there's something called over-fitting — a phenomenon in which a machine stores too much information, hindering its ability to behave intelligently. He hopes that greater understanding of how our brains decide what to keep and what to forget will lead to better AI systems that are able to interact with the world and make decisions in the way that we do. We hear a lot about the study of memory. Is the study of forgetting a relatively new thing? Within psychology, there's a long history of work examining forgetting. So that's not a new field of study. But the neuroscientists — those of us who work with the biology of how the brain works — have not really examined forgetting much in the past. Generally, the focus for the last few decades in neuroscience has been the question of how do the cells in our brains change themselves in order to store information and remember things. It's only been in the last few years that there's been an upswing in scientific studies looking at what's happening inside our brains at the cellular level that might actually produce forgetting. © 2017 npr
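
The "over-fitting" Richards describes is easy to see in a toy model. The sketch below is purely illustrative (it is not from the paper and assumes NumPy and scikit-learn are installed): a curve-fitting model with too much capacity "remembers" the noise in its training points and generalizes worse than a simpler one that discards detail.

```python
# Toy illustration of over-fitting: a model that "remembers" every detail of
# its training data does worse on new data than one that keeps only the gist.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 30))[:, None]                  # 30 noisy training points
y = np.sin(2 * np.pi * x).ravel() + rng.normal(0, 0.2, 30)

x_new = np.linspace(0, 1, 200)[:, None]                      # unseen points from the same curve
y_new = np.sin(2 * np.pi * x_new).ravel()

for degree in (3, 15):                                       # modest vs. excessive "storage capacity"
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(x, y)
    train_err = mean_squared_error(y, model.predict(x))
    new_err = mean_squared_error(y_new, model.predict(x_new))
    print(f"degree {degree:2d}: training error {train_err:.3f}, error on new data {new_err:.3f}")
```

The higher-degree fit scores better on the data it has already seen but worse on new points — roughly the trade-off Richards argues the brain avoids by forgetting.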

Keyword: Learning & Memory
Link ID: 23771 - Posted: 06.24.2017

by Laura Sanders When we brought our first baby home from the hospital, our pediatrician advised us to have her sleep in our room. We put our tiny new roommate in a crib near our bed (though other containers that were flat, firm and free of blankets, pillows or stuffed animals would have worked, too). The advice aims to reduce the risk of sleep-related deaths, including sudden infant death syndrome, or SIDS. Studies suggest that in their first year of life, babies who bunk with their parents (but not in the same bed) are less likely to die from SIDS than babies who sleep in their own room. The reasons aren’t clear, but scientists suspect it has to do with lighter sleep: Babies who sleep near parents might more readily wake themselves up and avoid the deep sleep that’s a risk factor for SIDS. That’s an important reason to keep babies close. Room sharing also makes sense from a logistical standpoint. Middle-of-the-night feedings and diaper changes are easier when there’s less distance between you and the babe. But babies get older. They start snoring a little louder and eating less frequently, and it’s quite natural to wonder how long this room sharing should last. That’s a question without a great answer. In November 2016, the American Academy of Pediatrics task force on SIDS updated its sleep guidelines. The earlier recommendation was that babies ought to sleep in parents’ bedrooms for an entire year. The new suggestion softens that a bit to say infants should be there “ideally for the first year of life, but at least for the first 6 months.” © Society for Science & the Public 2000 - 2017

Keyword: Sleep; Drug Abuse
Link ID: 23766 - Posted: 06.23.2017

Staring down a packed room at the Hyatt Regency Hotel in downtown San Francisco this March, Randy Gallistel gripped a wooden podium, cleared his throat, and presented the neuroscientists sprawled before him with a conundrum. “If the brain computed the way people think it computes,” he said, “it would boil in a minute.” All that information would overheat our CPUs. Humans have been trying to understand the mind for millennia. And metaphors from technology—like cortical CPUs—are one of the ways that we do it. Maybe it’s comforting to frame a mystery in the familiar. In ancient Greece, the brain was a hydraulics system, pumping the humors; in the 18th century, philosophers drew inspiration from the mechanical clock. Early neuroscientists from the 20th century described neurons as electric wires or phone lines, passing signals like Morse code. And now, of course, the favored metaphor is the computer, with its hardware and software standing in for the biological brain and the processes of the mind. In this technology-ridden world, it’s easy to assume that the seat of human intelligence is similar to our increasingly smart devices. But the reliance on the computer as a metaphor for the brain might be getting in the way of advancing brain research. As Gallistel continued his presentation to the Cognitive Neuroscience Society, he described the problem with the computer metaphor. If memory works the way most neuroscientists think it does—by altering the strength of connections between neurons—storing all that information would be way too energy-intensive, especially if memories are encoded in Shannon information, high-fidelity signals encoded in binary.

Keyword: Learning & Memory; Consciousness
Link ID: 23764 - Posted: 06.23.2017

By Sharon Begley, STAT To anyone who is aware that efforts to develop Alzheimer’s drug treatments have met failure after failure, and who has therefore decided that prevention is the only hope, a U.S. panel of experts issued a sobering message on Thursday: Don’t count on it. From physical activity to avoiding high blood pressure to brain training, a 17-member committee assembled by the National Academies of Sciences concluded, no interventions are “supported by high-strength evidence.” Instead, some high-quality studies found that one or another intervention worked, but other equally rigorous studies found they didn’t.

1. Cognitive training: The evidence for programs aimed at boosting reasoning, problem-solving, memory, and speed of processing does include randomized trials that reported benefits from brain training, but the report calls that evidence “low to moderate strength.” One problem: There seemed to be benefits for two years, but not after five or 10. Results in other randomized studies were even more equivocal. There are also data from studies that are less rigorous, leading the committee to conclude that brain training (computer-based or not) can delay or slow age-related cognitive decline—but not Alzheimer’s.

2. Controlling blood pressure: Evidence that this helps is weaker still. © 2017 Scientific American

Keyword: Alzheimers
Link ID: 23763 - Posted: 06.23.2017

Ian Sample Science editor Older men tend to have “geekier” sons who are more aloof, have higher IQs and a more intense focus on their interests than those born to younger fathers, researchers claim. The finding, which emerged from a study of nearly 8,000 British twins, suggests that having an older father may benefit children and boost their performance in technical subjects at secondary school. Researchers in the UK and the US analysed questionnaires from 7,781 British twins and scored them according to their non-verbal IQ at 12 years old, as well as parental reports on how focused and socially aloof they were. The scientists then combined these scores into an overall “geek index”. Magdalena Janecka at King’s College London said the project came about after she and her colleagues had brainstormed what traits and skills helped people to succeed in the modern age. “If you look at who does well in life right now, it’s geeks,” she said. Drawing on the twins’ records, the scientists found that children born to older fathers tended to score slightly higher on the geek index. For a father aged 25 or younger, the average score of the children was 39.6. That figure rose to 41 in children with fathers aged 35 to 44, and to 47 for those with fathers aged over 50. The effect was strongest in boys, where the geek index rose by about 1.5 points for every extra five years of paternal age. The age of the children’s mothers seemed to have almost no effect on the geek index. © 2017 Guardian News and Media Limited
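
For readers curious how a "points per five years of paternal age" figure is obtained, here is a minimal sketch using ordinary linear regression. The numbers below are synthetic stand-ins, not the study's data, and the actual analysis of the twin cohort was far more involved.

```python
# Minimal sketch: regress a "geek index" score on paternal age and express the
# slope per five years of age, as in the figure quoted above. Synthetic data only.
import numpy as np

rng = np.random.default_rng(1)
father_age = rng.uniform(20, 55, 1000)                        # paternal age at child's birth, years
geek_index = 35 + 0.3 * father_age + rng.normal(0, 8, 1000)   # made-up scores with a small age effect

slope, intercept = np.polyfit(father_age, geek_index, 1)      # ordinary least-squares line
print(f"geek index change per 5 years of paternal age: {5 * slope:.2f} points")
```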

Keyword: Epigenetics; Development of the Brain
Link ID: 23757 - Posted: 06.21.2017

Heidi Ledford By 13 weeks of gestation, human fetuses have developed a much more unusual immune system than previously thought. A human fetus in its second trimester is extraordinarily busy. It is developing skin and bones, the ability to hear and swallow, and working on its first bowel movement. Now, a study published on 14 June in Nature finds that fetuses are also acquiring a functioning immune system — one that can recognize foreign proteins, but is less inclined than a mature immune system to go on the attack (N. McGovern et al. Nature http://dx.doi.org/10.1038/nature22795; 2017). The results add to a growing body of literature showing that the fetal immune system is more active than previously appreciated. “In general textbooks, you see this concept of a non-responsive fetus is still prevailing,” says immunologist Jakob Michaelsson at the Karolinska Institute in Stockholm. But the fetal immune system is unique, he says. “It’s not just immature, it’s special.” A developing fetus is constantly exposed to foreign proteins and cells, which are transferred from the mother through the placenta. In humans, this exposure is more extensive than in many other mammals, says immunologist Mike McCune at the University of California, San Francisco. As a result, laboratory mice have proved a poor model for studying the developing human fetal immune system. But fully understanding that development could reveal the reasons for some miscarriages, as well as explain conditions such as pre-eclampsia, which is associated with abnormal immune responses to pregnancy and causes up to 40% of premature births. © 2017 Macmillan Publishers Limited,

Keyword: Development of the Brain; Neuroimmunology
Link ID: 23746 - Posted: 06.15.2017

Jon Hamilton Researchers are working to revive a radical treatment for Parkinson's disease. The treatment involves transplanting healthy brain cells to replace cells killed off by the disease. It's an approach that was tried decades ago and then set aside after disappointing results. Now, groups in Europe, the U.S. and Asia are preparing to try again, using cells they believe are safer and more effective. "There have been massive advances," says Claire Henchcliffe, a neurologist at Weill Cornell Medicine in New York. "I'm optimistic." "We are very optimistic about the ability of [the new] cells to improve patients' symptoms," says Viviane Tabar, a neurosurgeon and stem cell biologist at Memorial Sloan Kettering Cancer Center in New York. Henchcliffe and Tabar joined several other prominent scientists to describe plans to revive brain cell transplants during a session Tuesday at the International Society for Stem Cell Research meeting in Boston. Their upbeat message marks a dramatic turnaround for the approach. During the 1980s and 1990s, researchers used cells taken directly from the brains of aborted fetuses to treat hundreds of Parkinson's patients. The goal was to halt the disease. © 2017 npr

Keyword: Parkinsons; Stem Cells
Link ID: 23738 - Posted: 06.14.2017

By Lenny Bernstein A mother’s fever during pregnancy, especially in the second trimester, is associated with a higher risk that her child will be diagnosed with autism spectrum disorder, researchers reported Tuesday. Three or more fevers after 12 weeks of gestation may be linked to an even greater risk of the condition. The study by researchers at Columbia University’s Mailman School of Public Health adds support for the theory that infectious agents that trigger a pregnant woman’s immune response may disrupt a fetus’s brain development and lead to disorders such as autism. “Fever seems to be the driving force here,” not the infection itself, said Mady Hornig, director of translational research at the school’s Center for Infection and Immunity. Fever can be part of the body’s immune response to an infection, and molecules produced by a mother’s immune system may be crossing into the baby’s neurological system at a critical time, she said. The research, published in the journal Molecular Psychiatry, comes at a time when the scientifically discredited theory that some childhood vaccines cause autism has gained new attention. President Trump has promoted this myth, energizing some anti-vaccine groups. Some families say that their children developed autism after vaccinations. The timing is a coincidence, however; symptoms of autism typically become clear at around two years of age, which happens to be the age when children get certain vaccines. © 1996-2017 The Washington Post

Keyword: Autism
Link ID: 23737 - Posted: 06.13.2017

By Kerry Grens Memory theories: The theory goes that as memories form, they set up temporary shop in the hippocampus, a subcortical region buried deep in the brain, but over time find permanent storage in the cortex. The details of this process are sketchy, so Takashi Kitamura, a researcher in Susumu Tonegawa’s MIT lab, and colleagues wanted to pinpoint the time memories spend in each of these regions.

Total recall: As mice were subjected to a fearful experience, the team labeled so-called memory engram cells—neurons that are stimulated during the initial exposure and whose later activity drives recollection of the original stimulus (in this case, indicated by a freezing response). Using optogenetics, Kitamura turned off these cells in the prefrontal cortex (PFC) when the memory first formed as mice were exposed to a foot shock. Short-term memory was unaffected, but a couple of weeks later, the animals could not recall the event, indicating that PFC engrams formed contemporaneously with those in the hippocampus, not later, as some had suspected, and that this early memory trace in the cortex was critical for long-term retrieval.

Going dark: Over time, as untreated mice recalled the fearful event, engrams in the hippocampus became silent as PFC engrams became more active. “It’s a see-saw situation,” says Kitamura, “this maturation of prefrontal engrams and dematuration of hippocampal engrams.”

Circuit dynamics: Stephen Maren, who researches memory at Texas A&M University and was not part of the study, says the results reveal that the network circuitry involved in memory consolidation (of which Kitamura’s team dissected just one component) is much more dynamic than previously appreciated. “It’s the most sophisticated circuit-level analysis we have to date of these processes.” © 1986-2017 The Scientist

Keyword: Learning & Memory
Link ID: 23735 - Posted: 06.13.2017

By Clare Wilson Would you have pig cells implanted in your brain? Some people with Parkinson’s disease have, in the hope it will stop their disease progressing. The approach is still in the early stages of testing, but initial results from four people look promising, with all showing some improvement 18 months after surgery. People with Parkinson’s disease, which causes tremors and difficulty moving, usually get worse over time. The disease is caused by the gradual loss of brain cells that make dopamine, a compound that helps control our movements. Current medicines replace the missing dopamine, but their effectiveness wears off over the years. So Living Cell Technologies, based in Auckland, New Zealand, has been developing a treatment that uses cells from the choroid plexus in pigs. This brain structure makes a cocktail of growth factors and signalling molecules known to help keep nerve cells healthy. Last month, surgery was completed on a further 18 people in a placebo-controlled trial, using the choroid plexus cell implants. The hope is that compounds made by these cells will nourish the remaining dopamine-producing cells in the patients’ brains, slowing further loss. © Copyright New Scientist Ltd.

Keyword: Parkinsons; Stem Cells
Link ID: 23731 - Posted: 06.12.2017

By David Noonan Sight and hearing get all the glory, but the often overlooked and underappreciated sense of smell—or problems with it—is a subject of rapidly growing interest among scientists and clinicians who battle Alzheimer’s and Parkinson’s diseases. Impaired smell is one of the earliest and most common symptoms of both, and researchers hope a better understanding will improve diagnosis and help unlock some of the secrets of these incurable conditions. The latest offering from the burgeoning field is a paper published this month in Lancet Neurology. It proposes neurotransmitter dysfunction as a possible cause of smell loss in a number of neurodegenerative diseases, including Alzheimer’s and Parkinson’s. More than 90 percent of Parkinson’s patients report some level of olfactory dysfunction. And because problems with smell progress in Alzheimer’s, nearly all of those diagnosed with moderate to severe forms of the illness have odor identification issues. “It’s important, not just because it’s novel and interesting and simple but because the evidence is strong,” says Davangere Devanand, a professor of psychiatry and neurology at Columbia University. His most recent paper on the subject, a review, was published in The American Journal of Geriatric Psychiatry in December. Studies have shown impaired smell to be even stronger than memory problems as a predictor of cognitive decline in currently healthy adults. It is especially useful for forecasting the progression from mild cognitive impairment (MCI) to full-blown Alzheimer’s. According to the Alzheimer’s Association, approximately 15 to 20 percent of people over 65 have MCI. About half of them go on to develop Alzheimer’s, Devanand says—and the sooner they are identified, the earlier doctors can begin interventions, including treatment with the few existing Alzheimer’s drugs. © 2017 Scientific American

Keyword: Alzheimers; Chemical Senses (Smell & Taste)
Link ID: 23729 - Posted: 06.12.2017

By RICHARD SANDOMIR Isabelle Rapin, a Swiss-born child neurologist who helped establish autism’s biological underpinnings and advanced the idea that autism was part of a broad spectrum of disorders, died on May 24 in Rhinebeck, N.Y. She was 89. The cause was pneumonia, said her daughter Anne Louise Oaklander, who is also a neurologist. “Calling her one of the founding mothers of autism is very appropriate,” said Dr. Thomas Frazier II, a clinical psychologist and chief science officer of Autism Speaks, an advocacy group for people with autism and their families. “With the gravity she carried, she moved us into a modern understanding of autism.” Dr. Rapin (pronounced RAP-in) taught at the Albert Einstein College of Medicine in the Bronx and over a half-century there built a reputation for rigorous scholarship. She retired in 2012 but continued working at her office and writing journal papers. The neurologist Oliver Sacks, a close friend and colleague, called her his “scientific conscience.” In his autobiography, “On the Move: A Life” (2015), Dr. Sacks wrote: “Isabelle would never permit me, any more than she permitted herself, any loose, exaggerated, uncorroborated statements. ‘Give me the evidence,’ she always says.” Dr. Rapin’s focus on autism evolved from her studies of communications and metabolic disorders that cause mental disabilities and diminish children’s ability to navigate the world. For decades she treated deaf children, whose difficulties in communicating limited their path to excelling in school and forced some into institutions. “Communications disorders were the overarching theme of my mother’s career,” Dr. Oaklander said in an interview. In a short biography written for the Journal of Child Neurology in 2001, Dr. Rapin recalled a critical moment in her work on autism. “After evaluating hundreds of autistic children,” she wrote, “I became convinced that the report by one-third of parents of autistic preschoolers, of a very early language and behavioral regression, is real and deserving of biologic investigation.” © 2017 The New York Times Company

Keyword: Autism
Link ID: 23727 - Posted: 06.12.2017

Maria Temming Fascination with faces is nature, not nurture, suggests a new study of third-trimester fetuses. Scientists have long known that babies like looking at faces more than other objects. But research published online June 8 in Current Biology offers evidence that this preference develops before birth. In the first-ever study of prenatal visual perception, fetuses were more likely to move their heads to track facelike configurations of light projected into the womb than nonfacelike shapes. Past research has shown that newborns pay special attention to faces, even if a “face” is stripped down to its bare essentials — for instance, a triangle of three dots: two up top for eyes, one below for a mouth or nose. This preoccupation with faces is considered crucial to social development. “The basic tendency to pick out a face as being different from other things in your environment, and then to actually look at it, is the first step to learning who the important people are in your world,” says Scott Johnson, a developmental psychologist at UCLA who was not involved in the study. Using a 4-D ultrasound, the researchers watched how 34-week-old fetuses reacted to seeing facelike triangles compared with seeing triangles with one dot above and two below. They projected triangles of red light in both configurations through a mother’s abdomen into the fetus’s peripheral vision. Then, they slid the light across the mom’s belly, away from the fetus’s line of sight, to see if it would turn its head to continue looking at the image. © Society for Science & the Public 2000 - 2017

Keyword: Development of the Brain; Attention
Link ID: 23726 - Posted: 06.09.2017

By PHILIP S. GUTIS My husband, Tim, and a duo of Jack Russell terriers arrived in my life 13 years ago. They were a package deal that included Osceola Jack, a champion Frisbee player who once was the Mighty Dog actor in the famous commercials, and his pup, the equally mighty Samantha. Later our family grew with Beatrice, a sweet cattle dog mix from Florida who belonged to Tim’s brother but needed a new home. As an introvert, I have not always had the best people skills, but my ability to connect with animals has never flagged. Many of my best memories involve animals. But now things are changing. Last summer, at age 54, I learned I had early onset Alzheimer’s. Amid the many worries that accompany this diagnosis, I am afraid that I will soon lose my cherished ability to bond with — or even remember — my animal companions. Since my 20s and 30s, I’ve had some weird memory gaps. I once forgot that a childhood best friend worked for me at the school newspaper at Penn State. I wrote off these memory holes to a busy life and career. I worked long days, spent hours on airplanes and trains, managed dozens of people and grappled with complicated issues. I told myself that all of that work, stress and the sheer volume of information that I was expected to retain had to take a toll on my ability to remember everything. But a few years ago, I started to notice that I just wasn’t performing as well as I used to. Keeping track of big projects became increasingly difficult. Skills that were sometimes challenging (simple math, remembering names, understanding maps and directions) became all but impossible. Some days my memory was so bad that I wanted to wear a shirt that said, “Sorry, I just cannot remember your name.” My sister found an online advertisement for people concerned about memory loss. I called the phone number and scheduled an in-person screening. Bring someone familiar with you, the woman on the phone said. I brought Tim, who stayed close as a neurologist poked and prodded me, and vials and vials of blood were drawn. And then came the memory tests. © 2017 The New York Times Company

Keyword: Alzheimers
Link ID: 23724 - Posted: 06.09.2017

By Anil Ananthaswamy A machine-learning algorithm has analysed brain scans of 6-month-old children and predicted with near-certainty whether they will show signs of autism when they reach the age of 2. The finding means we may soon be able to intervene before symptoms appear, although whether that would be desirable is a controversial issue. “We have been trying to identify autism as early as possible, most importantly before the actual behavioural symptoms of autism appear,” says team member Robert Emerson of the University of North Carolina at Chapel Hill. Previous work has identified that bundles of nerve fibres in the brain develop differently in infants with older siblings with autism from how they do in infants without this familial risk factor. The changes in these white matter tracts in the brain are visible at 6 months. For the new study, Emerson and his team did fMRI brain scans of 59 sleeping infants, all of whom were aged 6 months and had older siblings with autism, which means they are more likely to develop autism themselves. The scans collected data from 230 brain regions, showing the 26,335 connections between them. When the team followed-up with the children at the age of 2, 11 had been diagnosed with an autism-like condition. The team used the brain scans from when the babies were 6 months old and behavioural data from when the children were 2 years old to train a machine-learning program to identify any brain connectivity patterns that might be linked to later signs of autism, such as repetitive behaviour, difficulties with language, or problems relating socially to others. © Copyright New Scientist Ltd.
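
To get a rough feel for the pipeline described here: each infant's 230-region connectivity matrix contains 230 × 229 / 2 = 26,335 unique region-to-region connections, which can serve as features for a classifier evaluated with leave-one-out cross-validation. The sketch below uses random stand-in data and an ordinary logistic-regression classifier; the study's actual features, model and validation scheme differed, and NumPy plus scikit-learn are assumed to be available.

```python
# Sketch of a connectivity-based prediction pipeline (synthetic data, not the
# study's): flatten each 230x230 connectivity matrix into its 26,335 unique
# connections and classify with leave-one-out cross-validation.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(42)
n_infants, n_regions = 59, 230
iu = np.triu_indices(n_regions, k=1)                       # upper triangle: 26,335 region pairs

conn = rng.normal(size=(n_infants, n_regions, n_regions))  # stand-in connectivity values
X = conn[:, iu[0], iu[1]]                                  # feature matrix, shape (59, 26335)
y = np.zeros(n_infants, dtype=int)
y[:11] = 1                                                 # 11 children later met autism criteria

pred = cross_val_predict(LogisticRegression(max_iter=1000), X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {(pred == y).mean():.2f}")
```

On random stand-in data the accuracy sits near chance; the near-certain prediction reported above came from real connectivity patterns measured at 6 months, with behavioural outcomes at age 2 used to identify the relevant connections.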

Keyword: Autism; Brain imaging
Link ID: 23722 - Posted: 06.08.2017

Children born to women with gestational diabetes whose diet included high proportions of refined grains may have a higher risk of obesity by age 7, compared to children born to women with gestational diabetes who ate low proportions of refined grains, according to results from a National Institutes of Health study. These findings, which appear online in the American Journal of Clinical Nutrition, were part of the Diabetes & Women’s Health Study, a research project led by NIH’s Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD). Gestational diabetes, or high blood sugar during pregnancy, affects about 5 percent of all pregnancies in the United States and may lead to health problems for mothers and newborns. The authors noted that previous studies have linked diets high in refined grains — such as white rice — to obesity, type 2 diabetes and heart disease. The researchers compared records from 918 mother-child pairs who took part in the Danish National Birth Cohort, a study that followed the pregnancies of more than 91,000 women in Denmark. They found that children born to women with gestational diabetes who consumed the most refined grain (more than 156 grams per day) were twice as likely to be obese at age 7, compared to children born to women with gestational diabetes who ate the least amount of refined grain (less than 37 grams per day). The link between maternal grain consumption during pregnancy and obesity by age 7 still persisted when the researchers controlled for factors that could potentially influence the children’s weight — such as physical activity level and consumption of vegetables, fruit and sweets. The authors called for additional studies to confirm their results and to follow children through later childhood, adolescence and adulthood to see if the obesity risk persists later in life.
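
As an illustration of what "twice as likely ... when the researchers controlled for factors" means statistically, the sketch below fits a logistic regression with the grain-intake group and a single covariate, then reports the adjusted odds ratio. The data are synthetic and the model is far simpler than the cohort analysis; NumPy and statsmodels are assumed to be installed.

```python
# Illustrative only: estimate an adjusted odds ratio for obesity at age 7,
# comparing the highest vs. lowest refined-grain intake groups while
# controlling for one covariate. Synthetic data, not the Danish cohort.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 918
high_grain = rng.integers(0, 2, n)            # 1 = >156 g/day, 0 = <37 g/day (simplified to two groups)
activity = rng.normal(0, 1, n)                # stand-in covariate, e.g. physical activity level
true_logit = -2.0 + np.log(2.0) * high_grain - 0.3 * activity
obese = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

X = sm.add_constant(np.column_stack([high_grain, activity]))
result = sm.Logit(obese, X).fit(disp=0)
print(f"adjusted odds ratio, high vs. low refined grain: {np.exp(result.params[1]):.2f}")
```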

Keyword: Obesity; Development of the Brain
Link ID: 23720 - Posted: 06.08.2017

By NICHOLAS BAKALAR Chronic pain may be linked to an increasing risk for dementia. Researchers interviewed 10,065 people over 62 in 1998 and 2000, asking whether they suffered “persistent pain,” defined as being often troubled with moderate or severe pain. Then they tracked their health through 2012. After adjusting for many variables, they found that compared with those who reported no pain problems, people who reported persistent pain in both 1998 and 2000 had a 9 percent more rapid decline in memory performance. Moreover, the probability of dementia increased 7.7 percent faster in those with persistent pain compared with those without. The study, in JAMA Internal Medicine, does not prove cause and effect. But chronic pain may divert attention from other mental activity, leading to poor memory, and some studies have found that allaying pain with opioids can lead to cognitive improvements. Still, the lead author, Dr. Elizabeth L. Whitlock, an anesthesiologist at the University of California at San Francisco, acknowledged that treatment with opioids is problematic, and that safely controlling chronic pain is a problem that so far has no satisfactory solution. “I’d encourage clinicians to be aware of the cognitive implications of a simple report of pain,” she said. “It’s a simple question to ask, and the answer can be used to identify a population at high risk of functional and cognitive problems.” © 2017 The New York Times Company

Keyword: Alzheimers; Pain & Touch
Link ID: 23719 - Posted: 06.08.2017