Links for Keyword: Alzheimers



Links 21 - 40 of 1012

By ROBERT PEAR WASHINGTON — Federal investigators say they have found evidence of widespread overuse of psychiatric drugs by older Americans with Alzheimer’s disease, and are recommending that Medicare officials take immediate action to reduce unnecessary prescriptions. The findings will be released Monday by the Government Accountability Office, an arm of Congress, and come as the Obama administration has already been working with nursing homes to reduce the inappropriate use of antipsychotic medications like Abilify, Risperdal, Zyprexa and clozapine. But in the study, investigators said officials also needed to focus on overuse of such drugs by people with dementia who live at home or in assisted living facilities. The Department of Health and Human Services “has taken little action” to reduce the use of antipsychotic drugs by older adults living outside nursing homes, the report said. Doctors sometimes prescribe antipsychotic drugs to calm patients with dementia who display disruptive behavior like hitting, yelling or screaming, the report said. Researchers said this was often the case in nursing homes that had inadequate numbers of employees. Dementia is most commonly associated with a decline in memory, but doctors say it can also cause changes in mood or personality and, at times, agitation or aggression. Experts have raised concern about the use of antipsychotic drugs to address behavioral symptoms of Alzheimer’s and other forms of dementia. The Food and Drug Administration says antipsychotic drugs are often associated with an increased risk of death when used to treat older adults with dementia who also have psychosis. © 2015 The New York Times Company

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 16: Psychopathology: Biological Basis of Behavior Disorders
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 12: Psychopathology: Biological Basis of Behavioral Disorders
Link ID: 20638 - Posted: 03.02.2015

By Michelle Roberts Health editor, BBC News online Scientists have proposed a new idea for detecting brain conditions including Alzheimer's - a skin test. Their work, which is at an early stage, found the same abnormal proteins that accumulate in the brain in such disorders can also be found in skin. Early diagnosis is key to preventing the loss of brain tissue in dementia, which can go undetected for years. But experts said even more advanced tests, including ones of spinal fluid, were still not ready for the clinic. If they were, then doctors could begin treatment at the earliest stages, before irreversible brain damage or mental decline has taken place. Brain biomarker Investigators have been hunting for suitable biomarkers in the body - molecules in blood or exhaled breath, for example, that can be measured to accurately and reliably signal if a disease or disorder is present. Dr Ildefonso Rodriguez-Leyva and colleagues from the University of San Luis Potosi, Mexico, believe skin is a good candidate for uncovering hidden brain disorders. Skin has the same origin as brain tissue in the developing embryo and might, therefore, be a good window to what's going on in the mind in later life - at least at a molecular level - they reasoned. Post-mortem studies of people with Parkinson's also reveal that the same protein deposits which occur in the brain with this condition also accumulate in the skin. To test if the same was true in life as after death, the researchers recruited 65 volunteers - 12 who were healthy controls and the remaining 53 who had either Parkinson's disease, Alzheimer's or another type of dementia. They took a small skin biopsy from behind the ear of each volunteer to test in their laboratory for any telltale signs of disease. Specifically, they looked for the presence of two proteins - tau and alpha-synuclein. © 2015 BBC.

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 5: The Sensorimotor System
Link ID: 20612 - Posted: 02.25.2015

By Esther Landhuis Whereas cholesterol levels measured in a routine blood test can serve as a red flag for heart disease, there’s no simple screen for impending Alzheimer’s. A new Silicon Valley health start-up hopes to change that. A half million Americans die of Alzheimer’s disease each year. Most are diagnosed after a detailed medical workup and extensive neurological and psychological tests that gauge mental function and rule out other causes of dementia. Yet things begin going awry some 10 to 15 years before symptoms show. Spinal fluid analyses and positron emission tomography (PET) scans can detect a key warning sign—buildup of amyloid-beta protein in the brain. Studies suggest that adults with high brain amyloid have elevated risk for Alzheimer’s and stand the best chance of benefiting from treatments should they become available. Getting Alzheimer’s drugs to market requires long and costly clinical studies, which some experts say have failed thus far because experimental drugs were tested too late in the disease process. By the time people show signs of dementia, their brains have lost neurons and no current therapy can revive dead cells. That is why drug trials are looking to recruit seniors with preclinical Alzheimer’s who are on the verge of decline but otherwise look healthy. This is a tall order. Spinal taps are cumbersome and PET costs $3,000 per scan. “There’s no cheap, fast, noninvasive test that can accurately identify people at risk of Alzheimer’s,” says Brad Dolin, chief technology officer of Neurotrack. The company is developing a computerized visual test that might fit the bill. © 2015 Scientific American

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 10: Vision: From Eye to Brain
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 7: Vision: From Eye to Brain
Link ID: 20576 - Posted: 02.13.2015

By Andrea Anderson and Victoria Stern Blood type may affect brain function as we age, according to a new large, long-term study. People with the rare AB blood type, present in less than 10 percent of the population, have a higher than usual risk of cognitive problems as they age. University of Vermont hematologist Mary Cushman and her colleagues used data from a national study called REGARDS, which has been following 30,239 African-American and Caucasian individuals older than 45 since 2007. The aim of the study is to understand the heavy stroke toll seen in the southeastern U.S., particularly among African-Americans. Cushman's team focused on information collected twice yearly via phone surveys that evaluate cognitive skills such as learning, short-term memory and executive function. The researchers zeroed in on 495 individuals who showed significant declines on at least two of the three phone survey tests. When they compared that cognitively declining group with 587 participants whose mental muster remained robust, researchers found that impairment in thinking was roughly 82 percent more likely in individuals with AB blood type than in those with A, B or O blood types, even after taking their race, sex and geography into account. The finding was published online last September in Neurology. The seemingly surprising result has some precedent: past studies suggest non-O blood types are linked to elevated incidence of heart disease, stroke and blood clots—vascular conditions that could affect brain function. Yet these cardiovascular consequences are believed to be linked to the way non-O blood types coagulate, which did not seem to contribute to the cognitive effects described in the new study. The researchers speculate that other blood-group differences, such as how likely cells are to stick to one another or to blood vessel walls, might affect brain function. © 2015 Scientific American

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20552 - Posted: 02.05.2015

By Esther Landhuis One in nine Americans aged 65 and older has Alzheimer's disease, a fatal brain disorder with no cure or effective treatment. Therapy could come in the form of new drugs, but some experts suspect drug trials have failed so far because compounds were tested too late in the disease's progression. By the time people show signs of dementia, their brains have lost neurons. No therapy can revive dead cells, and little can be done to create new ones. So researchers running trials now seek participants who still pass as cognitively normal but are on the verge of decline. These “preclinical” Alzheimer's patients may represent a window of opportunity for therapeutic intervention. How to identify such individuals before they have symptoms presents a challenge, however. Today most Alzheimer's patients are diagnosed after a detailed medical workup and extensive tests that gauge mental function. Other tests, such as spinal fluid analyses and positron-emission tomography (PET) scans, can detect signs of approaching disease and help pinpoint the preclinical window but are cumbersome or expensive. “There's no cheap, fast, noninvasive test that can identify people at risk of Alzheimer's,” says Brad Dolin, chief technology officer of Neurotrack in Palo Alto, Calif.—a company developing a computerized visual screening test for Alzheimer's. Unlike other cognitive batteries, the Neurotrack test requires no language or motor skills. Participants view images on a monitor while a camera tracks their eye movements. The test draws on research by co-founder Stuart Zola of Emory University, who studies learning and memory in monkeys. When presented with a pair of images—one novel, the other familiar—primates fixate longer on the novel one. But if the hippocampus is damaged, as it is in people with Alzheimer's, the subject does not show a clear preference for the novel images. © 2015 Scientific American

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 14: Attention and Consciousness
Link ID: 20541 - Posted: 02.02.2015

Ian Sample, science editor People who carry a mutated gene linked to longer lifespan have extra tissue in part of the brain that seems to protect them against mental decline in old age. The finding has shed light on a biological pathway that researchers now hope to turn into a therapy that slows the progression of Alzheimer’s disease and other forms of dementia. Brain scans of more than 400 healthy men and women aged 53 and over found that those who carried a single copy of a particular gene variant had a larger brain region that deals with planning and decision making. Further tests on the group found that those with an enlarged right dorsolateral prefrontal cortex (rDLPFC), as the brain region is known, fared better on a series of mental tasks. About one in five people inherits a single copy of the gene variant, or allele, known as KL-VS, which improves heart and kidney function, and on average adds about three years to human lifespan, according to Dena Dubal, a neurologist at University of California, San Francisco. Her latest work suggests that the same genetic mutation has broader effects on the brain. While having a larger rDLPFC accounted for only 12% of the improvement in people’s mental test scores, Dubal suspects the gene alters the brain in other ways, perhaps by improving the connections that form between neurons.

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20526 - Posted: 01.28.2015

Over-the-counter sleeping aids and hayfever treatments can increase the risk of Alzheimer’s disease, a study has found. The sleeping medication Nytol and anti-allergy pills Benadryl and Piriton all belong to a class of drug highlighted in a warning from researchers. Each of these drugs has “anticholinergic” blocking effects on the nervous system that are said – at higher doses – to raise the likelihood of developing Alzheimer’s and other forms of dementia significantly over several years. Other drugs on the risk list include older “tricyclic” antidepressants such as doxepin, and the bladder control treatment Ditropan (oxybutynin). Many of these medicines are taken by vulnerable older people, according to the scientists, who say their findings have public health implications. Anticholinergic drugs block a nervous system chemical transmitter called acetylcholine, which can lead to side-effects including drowsiness, blurred vision and poor memory. People with Alzheimer’s disease are known to lack acetylcholine. The leader of the US study, Professor Shelly Gray, director of the geriatric pharmacy programme at the University of Washington School of Pharmacy, said: “Older adults should be aware that many medications – including some available without a prescription, such as over-the-counter sleep aids – have strong anticholinergic effects. And they should tell their healthcare providers. “Of course, no one should stop taking any therapy without consulting their healthcare provider. Healthcare providers should regularly review their older patients’ drug regimens – including over-the-counter medications – to look for chances to use fewer anticholinergic medications at lower doses.”

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 14: Biological Rhythms, Sleep, and Dreaming
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 10: Biological Rhythms and Sleep
Link ID: 20519 - Posted: 01.27.2015

By PAULA SPAN DEDHAM, Mass. — Jerome Medalie keeps his advance directive hanging in a plastic sleeve in his front hall closet, as his retirement community recommends. That’s where the paramedics will look if someone calls 911. Like many such documents, it declares that if he is terminally ill, he declines cardiopulmonary resuscitation, a ventilator and a feeding tube. But Mr. Medalie’s directive also specifies something more unusual: If he develops Alzheimer’s disease or another form of dementia, he refuses “ordinary means of nutrition and hydration.” A retired lawyer with a proclivity for precision, he has listed 10 triggering conditions, including “I cannot recognize my loved ones” and “I cannot articulate coherent thoughts and sentences.” If any three such disabilities persist for several weeks, he wants his health care proxy — his wife, Beth Lowd — to ensure that nobody tries to keep him alive by spoon-feeding or offering him liquids. VSED, short for “voluntarily stopping eating and drinking,” is not unheard-of as an end-of-life strategy, typically used by older adults who hope to hasten their decline from terminal conditions. But now ethicists, lawyers and older adults themselves have begun a quiet debate about whether people who develop dementia can use VSED to end their lives by including such instructions in an advance directive. Experts know of just a handful of people with directives like Mr. Medalie’s. But dementia rates and numbers have begun a steep ascent, already afflicting an estimated 30 percent of those older than 85. Baby boomers are receiving a firsthand view of the disease’s devastation and burdens as they care for aging parents. They may well prove receptive to the idea that they shouldn’t be kept alive if they develop dementia themselves, predicted Alan Meisel, the director of the University of Pittsburgh’s Center for Bioethics and Health Law. © 2015 The New York Times Company

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20495 - Posted: 01.20.2015

By Candy Schulman My mother’s greatest fear was Alzheimer’s. She got Lewy body dementia, or LBD, instead. This little-known, oddly named, debilitating illness afflicts an estimated 1.3 million Americans, the actor and comedian Robin Williams possibly among them. It is often misdiagnosed because its signs, such as hallucinations and body rigidity, do not seem like those of dementia, but in the end it robs people of themselves even more painfully. I first noticed my mother’s cognitive difficulties when she was 88. Until then, she’d led an extraordinarily active life: She was a competitive golfer with a bureau full of trophies, a painter and a sculptor. Every Hanukkah she hosted a lively feast for her eight grandchildren and nine great-grandchildren. This time, though, she needed my help planning, shopping and cooking. She was having difficulty with the guest list, trying to write every family member’s name on a piece of paper, adding up the numbers to see how many potatoes to buy for latkes. Her concentration became frayed and she kept ripping it up and starting again, close to tears. Several months before that, she had sent me a Mother’s Day card that was illustrated with childlike prose, colorful illustrations and glitter hearts. The poem on the cover was printed in a playful purple font: “For you, Mom. For kissing my boo-boos, for wiping my face. . . . For calming my fears with your loving embrace.” On Mother’s Day and the rest of the year, Mom added in a shaky script, “thanks.”

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20422 - Posted: 12.16.2014

By Emilie Reas If you carried a gene that doubled your likelihood of getting Alzheimer's disease, would you want to know? What if there was a simple lifestyle change that virtually abolished that elevated risk? People with a gene known as APOE e4 have a higher risk of cognitive impairment and dementia in old age. Even before behavioral symptoms appear, their brains show reduced metabolism, altered activity and more deterioration than those without the high-risk gene. Yet accumulating research is showing that carrying this gene is not necessarily a sentence for memory loss and confusion—if you know how to work it to your advantage with exercise. Scientists have long known that exercise can help stave off cognitive decline. Over the past decade evidence has mounted suggesting that this benefit is even greater for those at higher genetic risk for Alzheimer's. For example, two studies by a team in Finland and Sweden found that exercising at least twice a week in midlife lowers one's chance of getting dementia more than 20 years later, and this protective effect is stronger in people with the APOE e4 gene. Several others reported that frequent exercise—at least three times a week in some studies; up to more than an hour a day in others—can slow cognitive decline only in those carrying the high-risk gene. Furthermore, for those who carry the gene, being sedentary is associated with increased brain accumulation of the toxic protein beta-amyloid, a hallmark of Alzheimer's. More recent studies, including a 2012 paper published in Alzheimer's & Dementia and a 2011 paper in NeuroImage, found that high-risk individuals who exercise have greater brain activity and glucose uptake during a memory task compared with their less active counterparts or with those at low genetic risk. © 2014 Scientific American

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20421 - Posted: 12.16.2014

By Nicholas Bakalar Poor sleep in older adults may be linked to brain changes associated with dementia, a new study has found. Researchers studied 167 men who underwent sleep tests in 1999 and died by 2010. The study, in Neurology, recorded sleep duration, periods of waking up and episodes of apnea, and used pulse oximetry to measure oxygen saturation of their blood. On autopsy, they found that those in the highest one-quarter for duration of sleep at oxygen saturation of less than 95 percent were almost four times as likely to have higher levels of microinfarcts, small areas of dead tissue caused by deprivation of blood supply, as those in the lowest one-quarter. Compared with those in the lowest 25 percent for duration of slow-wave (deep) sleep, those in the highest one-quarter were about a third as likely to have moderate or high levels of generalized brain atrophy. “Prior studies have shown an association between certain types of sleep disturbance and dementia,” said the lead author, Dr. Rebecca P. Gelber, an epidemiologist with the Veterans Administration in Hawaii. “These lesions may help explain the association.” © 2014 The New York Times Company

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 14: Biological Rhythms, Sleep, and Dreaming
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 10: Biological Rhythms and Sleep
Link ID: 20420 - Posted: 12.16.2014

By Michelle Roberts Health editor, BBC News online The brain has a weak spot for Alzheimer's disease and schizophrenia, according to UK scientists who have pinpointed the region using scans. The brain area involved develops late in adolescence and degenerates early during ageing. At the moment, it is difficult for doctors to predict which people might develop either condition. The findings, in the journal PNAS, hint at a potential way to diagnose those at risk earlier, experts say, although they caution that "much more research is needed into how to bring these exciting discoveries into the clinic". The Medical Research Council team who carried out the study did MRI brain scans on 484 healthy volunteers aged between eight and 85 years. The researchers, led by Dr Gwenaëlle Douaud of Oxford University, looked at how the brain naturally changes as people age. The images revealed a common pattern - the parts of the brain that were the last to develop were also the first to show signs of age-related decline. These brain regions - a network of nerve cells or grey matter - co-ordinate "high order" information coming from the different senses, such as sight and sound. When the researchers looked at scans of patients with Alzheimer's disease and scans of patients with schizophrenia they found the same brain regions were affected. The findings fit with what other experts have suspected - that although distinct, Alzheimer's and schizophrenia are linked. Prof Hugh Perry of the MRC said: "Early doctors called schizophrenia 'premature dementia' but until now we had no clear evidence that the same parts of the brain might be associated with two such different diseases. This large-scale and detailed study provides an important, and previously missing, link between development, ageing and disease processes in the brain." BBC © 2014

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 16: Psychopathology: Biological Basis of Behavior Disorders
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 12: Psychopathology: Biological Basis of Behavioral Disorders
Link ID: 20354 - Posted: 11.25.2014

By Emma Wilkinson Health reporter, BBC News Taking vitamin B12 and folic acid supplements does not seem to cut the risk of developing dementia in healthy people, say Dutch researchers. In one of the largest studies to date, there was no difference in memory test scores between those who had taken the supplements for two years and those who were given a placebo. The research was published in the journal Neurology. Alzheimer's Research UK said longer trials were needed to be sure. B vitamins have been linked to Alzheimer's for some years, and scientists know that higher levels of a body chemical called homocysteine can raise the risk of both strokes and dementia. Vitamin B12 and folic acid are both known to lower levels of homocysteine. That, along with studies linking low vitamin B12 and folic acid intake with poor memory, had prompted scientists to view the supplements as a way to ward off dementia. Yet in the study of almost 3,000 people - with an average age of 74 - who took 400 micrograms of folic acid and 500 micrograms of vitamin B12 or a placebo every day, researchers found no evidence of a protective effect. All those taking part in the trial had high blood levels of homocysteine, which did drop more in those taking the supplements. But on four different tests of memory and thinking skills taken at the start and end of the study, there was no beneficial effect of the supplements on performance. The researchers did note that the supplements might slightly slow the rate of decline but concluded the small difference they detected could just have been down to chance. Study leader Dr Rosalie Dhonukshe-Rutten, from Wageningen University in the Netherlands, said: "Since homocysteine levels can be lowered with folic acid and vitamin B12 supplements, the hope has been that taking these vitamins could also reduce the risk of memory loss and Alzheimer's disease." BBC © 2014

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20313 - Posted: 11.15.2014

Sara Reardon Delivering medications to the brain could become easier, thanks to molecules that can escort drugs through the notoriously impervious sheath that separates blood vessels from neurons. In a proof-of-concept study in monkeys, biologists used the system to reduce levels of the protein amyloid-β, which accumulates in the brain plaques associated with Alzheimer's disease. The blood–brain barrier is a layer of cells lining the inner surface of the capillaries that feed the central nervous system. It is nature's way of protecting the delicate brain from infectious agents and toxic compounds, while letting nutrients and oxygen in and waste products out. Because the barrier strictly regulates the passage of larger molecules and often prevents drug molecules from entering the brain, it has long posed one of the most difficult challenges in developing treatments for brain disorders. Several approaches to bypassing the barrier are being tested, including nanoparticles that are small enough to pass through the barrier's cellular membranes and deliver their payload; catheters that carry a drug directly into the brain; and ultrasound pulses that push microbubbles through the barrier. But no approach has yet found broad therapeutic application. Neurobiologist Ryan Watts and his colleagues at the biotechnology company Genentech in South San Francisco have sought to break through the barrier by exploiting transferrin, a protein that sits on the surface of blood vessels and carries iron into the brain. The team created an antibody with two ends. One end binds loosely to transferrin and uses the protein to transport itself into the brain. And once the antibody is inside, its other end targets an enzyme called β-secretase 1 (BACE1), which produces amyloid-β. Crucially, the antibody binds more tightly to BACE1 than to transferrin, and this pulls it off the blood vessel and into the brain. It locks BACE1 shut and prevents it from making amyloid-β.
© 2014 Nature Publishing Group

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 20286 - Posted: 11.06.2014

By CLIVE THOMPSON “You just crashed a little bit,” Adam Gazzaley said. It was true: I’d slammed my rocket-powered surfboard into an icy riverbank. This was at Gazzaley’s San Francisco lab, in a nook cluttered with multicolored skullcaps and wires that hooked up to an E.E.G. machine. The video game I was playing wasn’t the sort typically pitched at kids or even middle-aged, Gen X gamers. Indeed, its intended users include people over 60 — because the game might just help fend off the mental decline that accompanies aging. It was awfully hard to play, even for my Call of Duty-toughened brain. Project: Evo, as the game is called, was designed to tax several mental abilities at once. As I maneuvered the surfboard down winding river pathways, I was supposed to avoid hitting the sides, which required what Gazzaley said was “visual-motor tracking.” But I also had to watch out for targets: I was tasked with tapping the screen whenever a red fish jumped out of the water. The game increased in difficulty as I improved, making the river twistier and obliging me to remember turns I’d taken. (These were “working-memory challenges.”) Soon the targets became more confusing — I was trying to tap blue birds and green fish, but the game faked me out by mixing in green birds and blue fish. This was testing my “selective attention,” or how quickly I could assess a situation and react to it. The company behind Project: Evo is now seeking approval from the Food and Drug Administration for the game. If it gets that government stamp, it might become a sort of cognitive Lipitor or Viagra, a game that your doctor can prescribe for your aging mind. After only two minutes of play, I was making all manner of mistakes, stabbing frantically at the wrong fish as the game sped up. “It’s hard,” Gazzaley said, smiling broadly as he took back the iPad I was playing on. “It’s meant to really push it.” “Brain training” games like Project: Evo have become big business, with Americans spending an estimated $1.3 billion a year on them. They are also a source of controversy. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 13: Memory, Learning, and Development
Link ID: 20238 - Posted: 10.23.2014

By Emily Underwood Aging baby boomers and seniors would be better off going for a hike than sitting down in front of one of the many video games designed to aid the brain, a group of nearly 70 researchers asserted this week in a critique of some of the claims made by the brain-training industry. With yearly subscriptions running as much as $120, an expanding panoply of commercial brain games promises to improve memory, processing speed, and problem-solving, and even, in some cases, to stave off Alzheimer’s disease. Many companies, such as Lumosity and Cogmed, describe their games as backed by solid scientific evidence and prominently note that neuroscientists at top universities and research centers helped design the programs. But the cited research is often “only tangentially related to the scientific claims of the company, and to the games they sell,” according to the statement released Monday by the Stanford Center on Longevity in Palo Alto, California, and the Max Planck Institute for Human Development in Berlin. Although the letter, whose signatories include many researchers outside those two organizations, doesn’t point to specific bad actors, it concludes that there is “little evidence that playing brain games improves underlying broad cognitive abilities, or that it enables one to better navigate a complex realm of everyday life.” A similar statement of concern was published in 2008 with a smaller number of signatories, says Ulman Lindenberger of the Max Planck Institute for Human Development, who helped organize both letters. Although Lindenberger says there was no particular trigger for the current statement, he calls it the “expression of a growing collective concern among a large number of cognitive psychologists and neuroscientists who study human cognitive aging.” © 2014 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 13: Memory, Learning, and Development
Link ID: 20237 - Posted: 10.23.2014

By Paula Span Maybe it’s something else. That’s what you tell yourself, isn’t it, when an older person begins to lose her memory, repeat herself, see things that aren’t there, lose her way on streets she’s traveled for decades? Maybe it’s not dementia. And sometimes, thankfully, it is indeed some other problem, something that mimics the cognitive destruction of Alzheimer’s disease or another dementia — but, unlike them, is fixable. “It probably happens more often than people realize,” said Dr. P. Murali Doraiswamy, a neuroscientist at Duke University Medical Center. But, he added, it doesn’t happen nearly as often as family members hope. Several confounding cases have appeared at Duke: A woman who appeared to have Alzheimer’s actually was suffering the effects of alcoholism. Another patient’s symptoms resulted not from dementia but from chronic depression. Dr. Doraiswamy estimates that when doctors suspect Alzheimer’s, they’re right 50 to 60 percent of the time. (The accuracy of Alzheimer’s diagnoses, even in specialized medical centers, is more haphazard than you would hope.) Perhaps another 25 percent of patients actually have other types of dementia, like Lewy body or frontotemporal — scarcely happy news, but because these diseases have different trajectories and can be exacerbated by the wrong drugs, the distinction matters. The remaining 15 to 25 percent “usually have conditions that can be reversed or at least improved,” Dr. Doraiswamy said. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20227 - Posted: 10.22.2014

By GINA KOLATA For the first time, and to the astonishment of many of their colleagues, researchers created what they call Alzheimer’s in a Dish — a petri dish with human brain cells that develop the telltale structures of Alzheimer’s disease. In doing so, they resolved a longstanding problem of how to study Alzheimer’s and search for drugs to treat it; the best they had until now were mice that developed an imperfect form of the disease. The key to their success, said the lead researcher, Rudolph E. Tanzi of Massachusetts General Hospital in Boston, was a suggestion by his colleague Doo Yeon Kim to grow human brain cells in a gel, where they formed networks as in an actual brain. They gave the neurons genes for Alzheimer’s disease. Within weeks they saw the hard Brillo-like clumps known as plaques and then the twisted spaghetti-like coils known as tangles — the defining features of Alzheimer’s disease. The work, which also offers strong support for an old idea about how the disease progresses, was published in Nature on Sunday. Leading researchers said it should have a big effect. “It is a giant step forward for the field,” said Dr. P. Murali Doraiswamy, an Alzheimer’s researcher at Duke University. “It could dramatically accelerate testing of new drug candidates.” Of course, a petri dish is not a brain, and the petri dish system lacks certain crucial components, like immune system cells, that appear to contribute to the devastation once Alzheimer’s gets started. But it allows researchers to quickly, cheaply and easily test drugs that might stop the process in the first place. The crucial step, of course, will be to see if drugs that work in this system stop Alzheimer’s in patients. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20203 - Posted: 10.13.2014

By Fredrick Kunkle Years ago, many scientists assumed that a woman’s heart worked pretty much the same as a man’s. But as more women entered the male-dominated field of cardiology, many such assumptions vanished, opening the way for new approaches to research and treatment. A similar shift is underway in the study of Alzheimer’s disease. It has long been known that more women than men get the deadly neurodegenerative disease, and an emerging body of research is challenging the common wisdom as to why. Although the question is by no means settled, recent findings suggest that biological, genetic and even cultural influences may play heavy roles. Of the more than 5 million people in the United States who have been diagnosed with Alzheimer’s, the leading cause of dementia, two-thirds are women. Because advancing age is considered the biggest risk factor for the disease, researchers largely have attributed that disparity to women’s longer life spans. The average life expectancy for women is 81 years, compared with 76 for men. Yet “even after taking age into account, women are more at risk,” said Richard Lipton, a physician who heads the Einstein Aging Study at Albert Einstein College of Medicine in New York. With the number of Alzheimer’s cases in the United States expected to more than triple by 2050, some researchers are urging a greater focus on understanding the underlying reasons women are more prone to the disease and on developing gender-specific treatments.

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 12: Sex: Evolutionary, Hormonal, and Neural Bases
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 8: Hormones and Sex
Link ID: 20155 - Posted: 10.04.2014

By Fredrick Kunkle Here’s something to worry about: A recent study suggests that middle-age women whose personalities tend toward the neurotic run a higher risk of developing Alzheimer’s disease later in life. The study by researchers at the University of Gothenburg in Sweden followed a group of women in their 40s, whose disposition made them prone to anxiety, moodiness and psychological distress, to see how many developed dementia over the next 38 years. In line with other research, the study suggested that women who were the most easily upset by stress — as determined by a commonly used personality test — were two times more likely to develop Alzheimer’s disease than women who were least prone to neuroticism. In other words, personality really is — in some ways — destiny. “Most Alzheimer’s research has been devoted to factors such as education, heart and blood risk factors, head trauma, family history and genetics,” study author Lena Johansson said in a written statement. “Personality may influence the individual’s risk for dementia through its effect on behavior, lifestyle or reactions to stress.” The researchers cautioned that the results cannot be extrapolated to men because they were not included in the study and that further work is needed to determine possible causes for the link. The study, which appeared Wednesday in the American Academy of Neurology’s journal, Neurology, examined 800 women whose average age in 1968 was 46 years to see whether neuroticism — which involves being easily distressed and subject to excessive worry, jealousy or moodiness — might have a bearing on the risk of dementia.

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 11: Emotions, Aggression, and Stress
Link ID: 20148 - Posted: 10.02.2014