Chapter 7. Life-Span Development of the Brain and Behavior
By Emily Underwood

Old age may make us wiser, but it rarely makes us quicker. In addition to slowing down physically, most people lose points on intelligence tests as they enter their golden years. Now, new research suggests the loss of certain types of cognitive skills with age may stem from problems with basic sensory tasks, such as making quick judgments based on visual information. Although there’s no clear causal link between the two types of thinking yet, the new work could provide a simple, affordable way to track mental decline in senior citizens, scientists say.

Since the 1970s, researchers who study intelligence have hypothesized that smartness, as measured on standard IQ tests, may hinge on the ability to quickly and efficiently sample sensory information from the environment, says Stuart Ritchie, a psychologist at the University of Edinburgh in the United Kingdom. Today it’s well known that people who score high on such tests do, indeed, tend to process such information more quickly than those who do poorly, but it’s not clear how these measures change with age, Ritchie says.

Studying older people over time can be challenging given their uncertain health, but Ritchie and his colleagues had an unusual resource in the Lothian Birth Cohort, a group of people born in 1936 whose mental function has been periodically tested by the Scottish government since 1947—their first IQ test was at age 11. After recruiting more than 600 cohort members for their study, Ritchie and colleagues tracked their scores on a simple visual task three times over 10 years, repeating the test at the mean ages of 70, 73, and 76.

© 2014 American Association for the Advancement of Science
Claudia M. Gold

In the course of working on my new book about listening to parents and children, I have had the pleasure of immersing myself in the writing of D.W. Winnicott, pediatrician turned psychoanalyst. Winnicott's professional life included both caring for countless young children and families as a pediatrician, and psychoanalytic practice, where his adult patients "regressed to dependence," giving him an opportunity to interact with their infantile qualities, but with adult capacities for communication. This combination of experiences gave him a unique vantage point from which to make his many brilliant observations about children and the nature of the parent-child relationship.

A recent New York Times Magazine article on autism prompted me to share his words of wisdom on the subject, which, though written in 1966, still have relevance today. The following is from a collection of papers, Thinking About Children:

From my point of view the invention of the term autism was a mixed blessing...I would like to say that once this term has been invented and applied, the stage was set for something which is slightly false, i.e. the discovery of a disease…Pediatricians and physically minded doctors as a whole like to think in terms of diseases which gives a tidy look to the textbooks... The unfortunate thing is that in matters psychological things are not like that.

Winnicott implores the reader instead to understand the child in relational and developmental context. He writes:

The subject quickly becomes one not of autism and not of the early roots of a disorder that might develop into autism, but rather one of the whole story of human emotional development and the relationship of the process in the individual child to the environmental provision which may or may not in any one particular case facilitate the maturational process.

©2014 Boston Globe Media Partners, LLC
Link ID: 19915 - Posted: 08.05.2014
By RUTH PADAWER

At first, everything about L.'s baby boy seemed normal. He met every developmental milestone and delighted in every discovery. But at around 12 months, B. seemed to regress, and by age 2, he had fully retreated into his own world. He no longer made eye contact, no longer seemed to hear, no longer seemed to understand the random words he sometimes spoke. His easygoing manner gave way to tantrums and head-banging. “He had been this happy, happy little guy,” L. said. “All of a sudden, he was just fading away, falling apart. I can’t even describe my sadness. It was unbearable.” More than anything in the world, L. wanted her warm and exuberant boy back.

A few months later, B. received a diagnosis of autism. His parents were devastated. Soon after, L. attended a conference in Newport, R.I., filled with autism clinicians, researchers and a few desperate parents. At lunch, L. (who asked me to use initials to protect her son’s privacy) sat across from a woman named Jackie, who recounted the disappearance of her own boy. She said the speech therapist had waved it off, blaming ear infections and predicting that Jackie’s son, Matthew, would be fine. She was wrong. Within months, Matthew acknowledged no one, not even his parents. The last word he still had was “Mama,” and by the time Jackie met L., even that was gone.

In the months and years that followed, the two women spent hours on the phone and at each other’s homes on the East Coast, sharing their fears and frustrations and swapping treatment ideas, comforted to be going through each step with someone who had experienced the same terror and confusion. When I met with them in February, they told me about all the treatments they had tried in the 1990s: sensory integration, megadose vitamins, therapeutic horseback riding, a vile-tasting powder from a psychologist who claimed that supplements treated autism. None of it helped either boy. Together the women considered applied behavior analysis, or A.B.A. — a therapy, much debated at the time, that broke down every quotidian action into tiny, learnable steps, acquired through memorization and endless repetition; they rejected it, afraid it would turn their sons into robots. But just before B. turned 3, L. and her husband read a new book by a mother claiming that she had used A.B.A. on her two children and that they “recovered” from autism.

© 2014 The New York Times Company
Link ID: 19913 - Posted: 08.02.2014
By Fredrick Kunkle

The way older people walk may provide a reliable clue about how well their brain is aging and could eventually allow doctors to determine whether they are at risk of Alzheimer’s, researchers have found. The study, involving thousands of older people in several countries, suggests that those whose walking pace begins to slow and who also have cognitive complaints are more than twice as likely to develop dementia within 12 years.

The findings are among the latest attempts to develop affordable diagnostic tools to determine whether a person is at risk for dementia. Last month, researchers attending the Alzheimer’s Association International Conference in Copenhagen presented several studies focused on locating biomarkers of dementia in its earliest stages. Among other things, scientists reported a connection between dementia and sense of smell that suggested a common scratch-and-sniff test could be used to help identify the onset of dementia, while other researchers suggested that eye scans might also someday be able to detect Alzheimer’s. Different studies found a new abnormal protein linked to Alzheimer’s and a possible link between sleep disorders and the onset of dementia.

Now, researchers at the Albert Einstein College of Medicine of Yeshiva University and Montefiore Medical Center say that a simple test to measure a patient’s cognitive abilities and walking speed could provide a new diagnostic tool to identify people at risk for dementia. It could be an especially important tool in low- and middle-income countries with less access to sophisticated and costly technology, the scientists said.
Link ID: 19910 - Posted: 08.02.2014
By PAULA SPAN

Call me nuts, but I want to talk more about sleeping pill use. Hold your fire for a few paragraphs, please. Just a week after I posted here about medical efforts to help wean older patients off sleeping pills — causing a flurry of comments, many taking exception to the whole idea as condescending or dismissive of the miseries of insomnia — researchers at the Centers for Disease Control and Prevention and Johns Hopkins published findings that reinforce concerns about these drugs.

I say “reinforce” because geriatricians and other physicians have fretted for years about the use of sedative-hypnotic medications, including benzodiazepines (like Ativan, Klonopin, Xanax and Valium) and the related “Z-drugs” (like Ambien) for treating insomnia. “I’m not comfortable writing a prescription for these medications,” said Dr. Cara Tannenbaum, the geriatrician at the University of Montreal who led the weaning study. “I haven’t prescribed a sedative-hypnotic in 15 years.” In 2013, the American Geriatrics Society put sedative-hypnotics on its first Choosing Wisely campaign list of “Five Things Physicians and Patients Should Question,” citing heightened fall and fracture risks and automobile accidents in older patients who took them.

Now the C.D.C. has reported that a high number of emergency room visits are associated with psychiatric medications in general, and zolpidem — Ambien — in particular. They’re implicated in 90,000 adult E.R. visits annually because of adverse reactions, the study found; more than 19 percent of those visits result in hospital admissions. Among those taking sedatives and anxiety-reducing drugs, “a lot of visits were because people were too sleepy or hard to arouse, or confused,” said the lead author, Dr. Lee Hampton, a medical officer at the C.D.C. “And there were also a lot of falls.”

© 2014 The New York Times Company
By Annie Sneed

It's easy to recall events of decades past—birthdays, high school graduations, visits to Grandma—yet who can remember being a baby? Researchers have tried for more than a century to identify the cause of “infantile amnesia.” Sigmund Freud blamed it on repression of early sexual experiences, an idea that has been discredited. More recently, researchers have attributed it to a child's lack of self-perception, language or other mental equipment required to encode memories.

Neuroscientists Paul Frankland and Sheena Josselyn, both at the Hospital for Sick Children in Toronto, do not think linguistics or a sense of self offers a good explanation, either. It so happens that humans are not the only animals that experience infantile amnesia. Mice and monkeys also forget their early childhood. To account for the similarities, Frankland and Josselyn have another theory: the rapid birth of many new neurons in a young brain blocks access to old memories.

In a new experiment, the scientists manipulated the rate at which hippocampal neurons grew in young and adult mice. The hippocampus is the region in the brain that records autobiographical events. The young mice with slowed neuron growth had better long-term memory. Conversely, the older mice with increased rates of neuron formation had memory loss. Based on these results, published in May in the journal Science, Frankland and Josselyn think that rapid neuron growth during early childhood disrupts the brain circuitry that stores old memories, making them inaccessible. Young children also have an underdeveloped prefrontal cortex, another region of the brain that encodes memories, so infantile amnesia may be a combination of these two factors.

© 2014 Scientific American
By DOUGLAS QUENQUA

Like Pavlov’s dogs, most organisms can learn to associate two events that usually occur together. Now, a team of researchers says it has identified a gene that enables such learning. The scientists, at the University of Tokyo, found that worms could learn to avoid unpleasant situations as long as a specific insulin receptor remained intact.

Roundworms were exposed to different concentrations of salt; some received food during the initial exposure, others did not. Later, when exposed to various concentrations of salt again, the roundworms that had been fed during the first stage gravitated toward their initial salt concentrations, while those that had been starved avoided them. But the results changed when the researchers repeated the experiment using worms with a defect in a particular receptor for insulin, a protein crucial to metabolism. Those worms could not learn to avoid the salt concentrations associated with starvation.

“We looked for different forms of the receptor and found that a new one, which we named DAF-2c, functions in taste-aversion learning,” said Masahiro Tomioka, a geneticist at the University of Tokyo and an author of the study, which was published in the journal Science. “It turned out that only this form of the receptor can support learning” in roundworms. While human insulin receptors bear some resemblance to those of a roundworm, more study is needed to determine whether they play a similar role in memory and decision-making, Dr. Tomioka said. But studies have suggested a link between insulin levels and Alzheimer’s disease in humans.

© 2014 The New York Times Company
Sara Reardon

Broad population studies are shedding light on the genetic causes of mental disorders.

Researchers seeking to unpick the complex genetic basis of mental disorders such as schizophrenia have taken a huge step towards their goal. A paper published in Nature this week ties 108 genetic locations to schizophrenia — most for the first time. The encouraging results come on the same day as a US$650-million donation to expand research into psychiatric conditions.

Philanthropist Ted Stanley gave the money to the Stanley Center for Psychiatric Research at the Broad Institute in Cambridge, Massachusetts. The institute describes the gift as the largest-ever donation for psychiatric research. “The assurance of a very long life of the centre allows us to take on ambitious long-term projects and intellectual risks,” says its director, Steven Hyman. The centre will use the money to fund genetic studies as well as investigations into the biological pathways involved in conditions such as schizophrenia, autism and bipolar disorder. The research effort will also seek better animal and cell models for mental disorders, and will investigate chemicals that might be developed into drugs.

The Nature paper was produced by the Psychiatric Genomics Consortium (PGC) — a collaboration of more than 80 institutions, including the Broad Institute. Hundreds of researchers from the PGC pooled samples from more than 150,000 people, of whom 36,989 had been diagnosed with schizophrenia. This enormous sample size enabled them to spot 108 genetic locations, or loci, where the DNA sequence in people with schizophrenia tends to differ from the sequence in people without the disease. “This paper is in some ways proof that genomics can succeed,” Hyman says.

© 2014 Nature Publishing Group
Most of the genetic risk for autism comes from versions of genes that are common in the population rather than from rare variants or spontaneous glitches, researchers funded by the National Institutes of Health have found. Heritability also outweighed other risk factors in this largest study of its kind to date. About 52 percent of the risk for autism was traced to common and rare inherited variation, with spontaneous mutations contributing a modest 2.6 percent of the total risk.

“Genetic variation likely accounts for roughly 60 percent of the liability for autism, with common variants comprising the bulk of its genetic architecture,” explained Joseph Buxbaum, Ph.D., of the Icahn School of Medicine at Mount Sinai (ISMMS), New York City. “Although each exerts just a tiny effect individually, these common variations in the genetic code add up to substantial impact, taken together.” Buxbaum and colleagues of the Population-Based Autism Genetics and Environment Study (PAGES) Consortium report on their findings in a unique Swedish sample in the journal Nature Genetics, July 20, 2014.

“Thanks to the boost in statistical power that comes with ample sample size, autism geneticists can now detect common as well as rare genetic variation associated with risk,” said Thomas R. Insel, M.D., director of the NIH’s National Institute of Mental Health (NIMH). “Knowing the nature of the genetic risk will reveal clues to the molecular roots of the disorder. Common variation may be more important than we thought.”
By Meeri Kim

Babies start with simple vowel sounds — oohs and aahs. Mere months later, the cooing turns into babbling — “bababa” — showing off a newfound grasp of consonants. A new study has found that a key part of the brain involved in forming speech is firing away in babies as they listen to voices around them. This may represent a sort of mental rehearsal leading up to the true milestone that occurs after only a year of life: baby’s first words.

Any parent knows how fast babies learn to comprehend and use language. The skill develops rapidly and seemingly without much effort, but how do they do it? Researchers at the University of Washington are a step closer to unraveling the mystery of how babies learn to speak. They had a group of 7- and 11-month-old infants listen to a series of syllables while sitting in a brain scanner. Not only did the auditory areas of their brains light up as expected, but so did a region crucial to forming higher-level speech, called Broca’s area.

[Photo caption: A year-old baby sits in a brain scanner, called magnetoencephalography — a noninvasive approach to measuring brain activity. The baby listens to speech sounds like "da" and "ta" played over headphones while researchers record her brain responses. (Institute for Learning and Brain Sciences, University of Washington)]

These findings suggest that even before babies utter their first words, they may be mentally exercising the pivotal parts of their brains in preparation. Study author and neuroscientist Patricia Kuhl says that her results reinforce the belief that talking and reading to babies from birth is beneficial for their language development, along with exaggerated speech and mouth movements (“Hiii cuuutie! How are youuuuu?”).

© 1996-2014 The Washington Post
Tania Browne

As a teenager, I lost my grandfather. But he wasn't dead. He still had his favourite music, he still loved to walk in the woods and name the flowers and plants, and he loved his soap operas. He was alive, but gone. A dignified man, a former aircraft engineer and oil company salesman, reduced to the status of a bewildered toddler lost in a shopping centre. When he died, our family felt an odd mix of relief, then guilt at the relief. The man we loved had left his body years before the body gave out.

This was 30 years ago. But while a cure is still far away, two new techniques may at least be able to forewarn us of dementia, and allow us to plan treatment for ourselves or loved ones before any outward symptoms are apparent.

According to Alzheimer's Research UK, my experience is currently shared by 24m relatives and close friends of the 800,000 diagnosed dementia sufferers in the UK. In December last year, a G8 summit was told by Alzheimer's Disease International that the worldwide figure was 44m and set to treble by 2050, as the life expectancy of people in middle- and lower-income countries soars – precisely the countries that have either depleted or non-existent healthcare systems. Dementia is a serious time bomb.

“Dementia” covers about 100 conditions, all resulting from large-scale brain cell death. People often think that when they're diagnosed they're in the early stages. Yet cell death can be occurring for 10-15 years or more before any outward symptoms appear, and by the time they're diagnosed many dementia patients have already lost one fifth of their memory cells.

© 2014 Guardian News and Media Limited
Link ID: 19856 - Posted: 07.21.2014
By Nidhi Subbaraman and SFARI.org

A team at Duke University in Durham, North Carolina, is set to launch a $40 million clinical trial to explore stem cells from umbilical cord blood as a treatment for autism. But experts caution that the trial is premature.

A $15 million grant from the Marcus Foundation, a philanthropic funding organization based in Atlanta, will bankroll the first two years of the five-year trial, which also plans to test stem cell therapy for stroke and cerebral palsy. The autism arm of the trial aims to enroll 390 children and adults. Joanne Kurtzberg, the trial’s lead investigator, has extensive experience studying the effectiveness of cord blood transplants for treating various disorders, such as leukemia and sickle cell anemia. Most recently, she showed that cord blood transplants can improve the odds of survival for babies deprived of oxygen at birth. A randomized trial of the approach for this condition is underway.

“To really sort out if [stem] cells can treat these children, we need to do randomized, controlled trials that are well designed and well controlled, and that’s what we intend to do,” says Kurtzberg, professor of pediatrics and pathology at Duke. “We firmly believe we should be moving ahead in the clinic.”

Early animal studies have shown that stem cells isolated from umbilical cord blood can stimulate cells in the spinal cord to regrow their myelin layers, and in doing so help restore connections with surrounding cells. Autism is thought to result from impaired connectivity in the brain. Because of this, some groups of children with the disorder may benefit from a stem cell transplant, Kurtzberg says.

© 2014 Scientific American
Associated Press

The rate of Alzheimer's disease and other dementias is falling in the United States and some other rich countries - good news about an epidemic that is still growing simply because more people are living to an old age, new studies show.

An American over age 60 today has a 44 percent lower chance of developing dementia than a similar-aged person did roughly 30 years ago, the longest study of these trends in the U.S. concluded. Dementia rates also are down in Germany, a study there found. "For an individual, the actual risk of dementia seems to have declined," probably because of more education and control of health factors such as cholesterol and blood pressure, said Dr. Kenneth Langa. He is a University of Michigan expert on aging who discussed the studies Tuesday at the Alzheimer's Association International Conference in Copenhagen. The opposite is occurring in some poor countries that have lagged on education and health, where dementia seems to be rising.

More than 5.4 million Americans and 35 million people worldwide have Alzheimer's, the most common form of dementia. It has no cure, and current drugs only temporarily ease symptoms. A drop in rates is a silver lining in the so-called silver tsunami - the expected wave of age-related health problems from an older population. Alzheimer's will remain a major public health issue, but countries where rates are dropping may be able to lower current projections for spending and needed services, experts said.

© 2014 Hearst Communications, Inc.
Link ID: 19838 - Posted: 07.16.2014
By PAULA SPAN

What we really want, if we’re honest, is a pill or a shot that would allow us to stop worrying about ever sinking into dementia. Instead, what we’re hearing about preventing dementia is, in many ways, the same stuff we hear about preventing other kinds of illnesses. Healthy lifestyles. Behavioral modification. Stress reduction.

At the Alzheimer’s Association International Conference in Copenhagen this week, researchers from Montefiore Medical Center and the Albert Einstein College of Medicine were among the scientists presenting findings that had little to do with amyloid in the brain and a lot to do with how people feel and act and cope with life. “A number of people have been interested in modifiable lifestyle factors for years,” said Richard Lipton, a neurologist at the college and director of the Einstein Aging Study, which has tracked cognition in elderly Bronx residents since the 1980s. But interest has increased lately, he said: “It’s at least in part a reflection of disappointing drug trials.” Medications have failed, over and over, to prevent or cure or substantially slow the ravages of dementing diseases. What else might help?

Dr. Lipton and his colleagues, who monitor about 600 people aged 70 to 105, have been exploring the impact of stress. More specifically, they have been measuring “perceived stress,” a metric not so much about unpleasant things happening as how people respond to them. They use a scale based on the answers to 13 questions like, “In the past month, how often have you felt confident about your ability to handle your personal problems?” and “In the past month, how often have you felt difficulties were piling up so high you could not overcome them?”

© 2014 The New York Times Company
Link ID: 19837 - Posted: 07.16.2014
By Gary Stix

Popular neuroscience books have made much in recent years of the possibility that the adult brain is capable of restoring lost function or even enhancing cognition through sustained mental or physical activities. One piece of evidence often cited is a 14-year-old study showing that London taxi drivers have enlarged hippocampi, brain areas that store a mental map of one’s surroundings. Taxi drivers, it is assumed, have better spatial memory because they must constantly distinguish the streets and landmarks of Shepherd’s Bush from those of Brixton.

A mini-industry now peddles books with titles like The Brain that Changes Itself or Rewire Your Brain: Think Your Way to a Better Life. Along with such self-help guides, the value of games intended to enhance what is known as neuroplasticity is still a topic of heated debate, because no one knows for sure whether they improve intelligence, memory, reaction times or any other facet of cognition.

Beyond the controversy, however, scientists have taken a number of steps in recent years to start to answer the basic biological questions that may ultimately lead to a deeper understanding of neuroplasticity. This type of research does not look at whether psychological tests used to assess cognitive deficits can be refashioned with cartoonlike graphics and marketed as games intended to improve mental skills. Rather, these studies attempt to provide a simple definition of how mutable the brain really is at all life stages, from infancy onward into adulthood.

One ongoing question that preoccupies the basic scientists pursuing this line of research is how routine everyday activities—sleep, wakefulness, even any sort of movement—may affect the ability to perceive things in the surrounding environment. One of the leaders in these efforts is Michael Stryker, who researches neuroplasticity at the University of California, San Francisco. Stryker headed a group that in 2010 published a study on what happened when mice ran on top of a Styrofoam ball floating on air. They found that neurons in a brain region that processes visual signals—the visual cortex—nearly doubled their firing rate when the mice ran on the ball.

© 2014 Scientific American
By ALEX STONE

Last summer, in a failed attempt at humor, Clorox ran an online ad that declared, “Like dogs or other house pets, new dads are filled with good intentions but lacking the judgment and fine motor skills to execute well.” Although the company pulled the ad amid a flurry of scorn from the online commentariat, it nevertheless played to a remarkably widespread stereotype — that fathers are somehow unfit to raise children.

In “Do Fathers Matter?” — spoiler alert: they do — the veteran science writer Paul Raeburn jumps to Dad’s defense, drawing on several decades of research and his own experience as a five-time father. What emerges is a thought-provoking field piece on the science of fatherhood, studded with insights on how to apply it in the real world.

Historically, developmental psychologists have largely dismissed fathers as irrelevant. Nearly half the articles on child and adolescent psychology published in leading journals from 1997 to 2005, for example, make no mention of fathers; before 1970, when fathers weren’t even allowed in delivery rooms, less than a fifth of the research on parental bonding took them into account. This bias reflects a deeply ingrained assumption that fathers play a marginal role in how their children turn out, a belief enshrined in the theory of infant attachment, which grew out of the work of the British psychiatrist John Bowlby in the second half of the 20th century. “It focused exclusively on mothers,” Mr. Raeburn writes. “The role of the father, Bowlby believed, was to provide support for the mother. In the drama of childhood, he was merely a supporting actor.” This was more or less the established view until a few decades ago, when psychologists, motivated in part by the growing number of women entering the work force, finally started paying attention to fathers.

© 2014 The New York Times Company
By Maria Burke and ChemistryWorld

The world needs to tackle head-on the market failures undermining dementia research and drug development, UK Prime Minister David Cameron told a summit of world health and finance leaders in London in June. He announced an investigation into how to get medicines to patients earlier, extend patents and facilitate research collaborations, to report this autumn. But just how much difference will these sorts of measures make when scientists are still grappling with exactly what causes different types of dementia?

Added to these problems is the fact that dementia has become a graveyard for a large number of promising drugs. A recent study looked at how 244 compounds in 413 clinical trials fared for Alzheimer's disease between 2002 and 2012. The researchers' findings paint a gloomy picture. Of those 244 compounds, only one was approved. The researchers report that this gives Alzheimer's disease drug candidates one of the highest failure rates of any disease area – 99.6%, compared with 81% for cancer.

‘Dementia is a ticking bomb costing the global economy £350 billion and yet progress with research is achingly slow,’ warned the World Dementia Envoy, Dennis Gillings. Businesses need incentives to invest in research and bring in faster, cheaper clinical trials, or the world won’t meet the ambition to find a cure or disease-modifying therapy by 2025, he added. ‘We need to free up regulation so that we can test ground-breaking new drugs, and examine whether the period for market exclusivity could be extended.’

© 2014 Scientific American
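The 99.6% figure quoted above follows directly from the trial counts in the study (244 compounds, one approval); a quick sketch of the arithmetic, using only the numbers reported in the article:

```python
# Failure-rate arithmetic behind the figures quoted above: 244 Alzheimer's
# drug candidates entered clinical trials between 2002 and 2012, and only
# one was approved. (The 81% cancer figure is reported, not derived here.)
candidates = 244
approved = 1

failure_rate = 100 * (candidates - approved) / candidates
print(f"Alzheimer's candidate failure rate: {failure_rate:.1f}%")  # 99.6%
```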
Link ID: 19828 - Posted: 07.15.2014
By Fredrick Kunkle

Sleep disturbances such as apnea may increase the risk of Alzheimer’s disease, while moderate exercise in middle age and mentally stimulating games, such as crossword puzzles, may prevent the onset of the dementia-causing disease, according to new research to be presented Monday.

The findings — which are to be introduced during the six-day Alzheimer’s Association International Conference in Copenhagen — bolster previous studies suggesting that sleep plays a critical role in the aging brain’s health, perhaps by allowing the body to cleanse itself of Alzheimer's-related compounds during down time. The studies also add to a growing body of literature suggesting that keeping the brain busy keeps it healthy.

The battle against Alzheimer’s disease has become more urgent for the United States and other developed nations as their populations turn increasingly gray. The disease is the leading cause of dementia in older people and afflicts more than 5 million Americans. At its current pace, the number is expected to soar to 16 million people by 2050. In 2012, the United States adopted a national plan to combat the disease, and the G-8 nations last year adopted a goal of providing better treatment and prevention by 2025. Erin Heintz, a spokeswoman for the Alzheimer’s Association, said U.S. government funding to combat the disease now stands at about $500 million a year. To reach its 2025 goal, the United States should be spending $2 billion a year, she said.
One in three cases of Alzheimer's disease worldwide is preventable, according to research from the University of Cambridge. The main risk factors for the disease are a lack of exercise, smoking, depression and poor education, it says. Previous research from 2011 put the estimate at one in two cases, but this new study takes into account overlapping risk factors. Alzheimer's Research UK said age was still the biggest risk factor.

Writing in The Lancet Neurology, the Cambridge team analysed population-based data to work out the seven main risk factors for Alzheimer's disease. These are: diabetes, mid-life hypertension, mid-life obesity, physical inactivity, depression, smoking and low educational attainment. They worked out that a third of Alzheimer's cases could be linked to lifestyle factors that could be modified, such as lack of exercise and smoking.

The researchers then looked at how reducing these factors could affect the number of future Alzheimer's cases. They found that by reducing each risk factor by 10%, nearly nine million cases of the disease could be prevented by 2050. In the UK, a 10% reduction in risk factors would reduce cases by 8.8%, or 200,000, by 2050, they calculated.

BBC © 2014
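As a back-of-envelope check on the UK numbers above (a sketch from the article's own figures, not part of the Cambridge study): if an 8.8% reduction in projected 2050 cases corresponds to roughly 200,000 cases, the baseline projection those percentages imply can be recovered by simple division:

```python
# Back-of-envelope check of the UK figures quoted above. If an 8.8%
# reduction in projected 2050 UK cases equals about 200,000 cases,
# the implied baseline projection is roughly 2.3 million cases.
relative_reduction = 0.088   # 8.8% fewer cases, per the article
cases_prevented = 200_000    # absolute reduction, per the article

implied_projection = cases_prevented / relative_reduction
print(f"Implied UK 2050 projection: {implied_projection / 1e6:.1f} million")
```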
Link ID: 19824 - Posted: 07.14.2014
By Fredrick Kunkle

A simple test of a person’s ability to identify odors and noninvasive eye exams might someday help doctors learn whether their patients are at risk of Alzheimer’s disease, according to research to be presented Sunday. With Alzheimer’s disease growing fast among the world’s aging population, researchers are increasingly focused on the search for new ways to detect and treat the brain-killing disease in its earliest stages.

In two separate studies on the connection between dementia and sense of smell, teams of researchers found that a decreased ability to detect odors in older people, as determined by a common scratch-and-sniff test, could point to brain cell loss and the onset of dementia. In two other studies, researchers showed that noninvasive eye exams also might offer a way to identify Alzheimer’s in its early stages. The findings — which are to be presented at the Alzheimer’s Association International Conference in Copenhagen on Sunday — raise hopes that doctors could develop simple, inexpensive diagnostic tools that would hunt down reliable biomarkers of a disease that affects more than 5 million people in the United States.

Alzheimer’s is a progressive and incurable disease that begins in areas of the brain associated with memory. It is the leading cause of dementia in older people, usually striking after the age of 65. It robs people of their cognitive abilities, speech and, ultimately, their identities. Eventually, it shuts down the most basic body functions, resulting in death.
Link ID: 19823 - Posted: 07.14.2014