Chapter 7. Life-Span Development of the Brain and Behavior


by Laura Sanders The way babies learn to speak is nothing short of breathtaking. Their brains are learning the differences between sounds, rehearsing mouth movements and mastering vocabulary by putting words into meaningful context. It’s a lot to fit in between naps and diaper changes. A recent study shows just how durable this early language learning is. Dutch-speaking adults who were adopted from South Korea as preverbal babies held on to latent Korean language skills, researchers report online January 18 in Royal Society Open Science. In the first months of their lives, these people had already laid down the foundation for speaking Korean — a foundation that persisted for decades undetected, only revealing itself later in careful laboratory tests. Researchers tested how well people could learn to identify and speak tricky Korean sounds. “For Korean listeners, these sounds are easy to distinguish, but for second-language learners they are very difficult to master,” says study coauthor Mirjam Broersma, a psycholinguist at Radboud University in Nijmegen, the Netherlands. For instance, a native Dutch speaker would listen to three distinct Korean sounds and hear them all as the same “t” sound. Broersma and her colleagues compared the language-absorbing skills of 29 native Dutch speakers with those of 29 South Korea-born Dutch speakers. Half of the adoptees moved to the Netherlands when they were older than 17 months — ages at which the kids had probably begun talking. The other half were adopted as preverbal babies younger than 6 months. As a group, the South Korea-born adults outperformed the native-born Dutch adults, more easily learning both to recognize and to speak the Korean sounds. © Society for Science & the Public 2000 - 2017

Keyword: Language; Development of the Brain
Link ID: 23455 - Posted: 04.06.2017

Bruce Bower Kids can have virtual out-of-body experiences as early as age 6. Oddly enough, the ability to inhabit a virtual avatar signals a budding sense that one’s self is located in one’s own body, researchers say. Grade-schoolers were stroked on their backs with a stick while viewing virtual versions of themselves undergoing the same touch. Just after the session ended, the children often reported that they had felt like the virtual body was their actual body, says psychologist Dorothy Cowie of Durham University in England. This sense of being a self in a body, which can be virtually manipulated via sight and touch, gets stronger and more nuanced throughout childhood, the scientists report March 22 in Developmental Science. By around age 10, individuals start to report feeling the touch of a stick stroking a virtual body, denoting a growing integration of sensations with the experience of body ownership, Cowie’s team finds. Even a year after that, youngsters still don’t display all the elements of identifying the self with the body that are seen in adults. During virtual reality trials, only adults perceived their actual bodies as physically moving through space toward virtual bodies receiving synchronized touches. This first-of-its-kind study opens the way to studying how a sense of self develops from childhood on, says cognitive neuroscientist Olaf Blanke of the Swiss Federal Institute of Technology in Lausanne. “The new data clearly show that kids at age 6 have brain mechanisms that generate an experience of being a self located inside one’s own body.” He suspects that a beginner’s version of “my body is me” emerges by age 4. © Society for Science & the Public 2000 - 2017

Keyword: Consciousness; Development of the Brain
Link ID: 23446 - Posted: 04.04.2017

by Emilie Reas Alzheimer’s disease (AD) has been characterized as a “complete loss of self.” Early on, when memory begins to fade, people with the disease have difficulty recalling names, their grocery list or where they put their keys. As the disease progresses, they have trouble staying focused, planning and performing basic daily activities. From the outside, dementia appears to ravage one’s intellect and personality; yet mere observers cannot ascertain how consciousness of the self and the environment is transformed by the disease. The celebrated late neurologist Oliver Sacks once suggested that “Style, neurologically, is the deepest part of one’s being and may be preserved, almost to the last, in dementia.” Is this remaining neurological “style” sufficient to preserve consciousness? Are AD patients aware of their deteriorating cognition? Do they retain a sense of identity and morality? Can they still connect with friends and loved ones? Emerging advances in neuroscience have enabled researchers to probe the AD brain more precisely, suggesting that although some aspects of consciousness are compromised by dementia, others are remarkably spared. Scientists are beginning to piece together how the selective loss of some functions, but the preservation of others, alters consciousness in AD. A recent study found that the severity of cognitive impairment strongly relates to “meta-cognition” (reflecting on one’s own condition), moral judgments and thinking about the future, whereas basic personal identity and body awareness remain intact. Perhaps the most widely observed deficit in consciousness is “anosognosia,” impaired awareness of one’s own illness; whereas individuals with mild cognitive impairment (MCI; considered a precursor to full AD) are aware of their declining memory, AD patients may be unaware of their impairments. These behavioral signs suggest that only some aspects of consciousness and self-awareness are truly lost in AD.

Keyword: Alzheimers; Consciousness
Link ID: 23444 - Posted: 04.04.2017

By Eric Boodman. MEDFORD, Mass. — They look like little more than grayish-black grains of couscous floating in water. But they are actually African clawed frogs-to-be, replete with minuscule blobs that will become eyes. “These little beans here are what I do the surgery on,” said Douglas Blackiston, a postdoctoral fellow at Tufts University’s Allen Discovery Center, holding out a Petri dish. On Thursday, Blackiston published the results of a few years’ worth of those microscopic surgeries, and the finding is bizarre: If you transplant an eye onto what will become the tadpole’s tail, that organ — misplaced though it may be — can allow the animal to see. Admittedly, it’s impossible for humans to look through a clawed frog’s eyes, and in this case, Blackiston and the director of his lab, Michael Levin, were mainly testing whether the tadpoles could perceive movement and colored light. But they say their research doesn’t just have implications for scientists’ ability to restore vision; it also sheds light on how to connect implants and grafts to the body’s own wiring. “You implant these organs, but you want them to be functionally integrated with the host nervous system; otherwise they aren’t going to work,” said Levin, the lead author of a paper published Thursday in Nature Regenerative Medicine. Do you have to “connect up every neuron,” he wondered, or can you make use of the natural ability of the nervous system to adapt and rewire itself? © 2017 Scientific American

Keyword: Development of the Brain; Vision
Link ID: 23433 - Posted: 03.31.2017

By ALICE CALLAHAN Peruse the infant formula aisle, or check out the options for prenatal nutritional supplements, and you’ll find that nearly all these products boast a “brain nourishing” omega-3 fatty acid called DHA. But despite decades of research, it’s still not clear that DHA in formula boosts brain health in babies, or that mothers need to go out of their way to take DHA supplements. A systematic review of studies published this month by the Cochrane Collaboration concluded there was no clear evidence that formula supplementation with DHA, or docosahexaenoic acid, a nutrient found mainly in fish and fish oil, improves infant brain development. At the same time, it found no harm from adding the nutrient. The findings are consistent with a review of the effects of omega-3 supplements in pregnancy and infancy published by the Agency for Healthcare Research and Quality last fall that found little evidence of benefit. Still, many experts believe there is value in including DHA in formula. “Even if you can’t easily prove it, because it’s hard to prove developmental outcomes, it makes sense to use it,” said Dr. Steven Abrams, a professor of pediatrics at Dell Medical School at the University of Texas at Austin. “It’s probably a good idea to keep it in there, and it’s certainly safe.” During pregnancy and the first few years of life, DHA accumulates in the brain and retina of the eye and plays an important role in neural and vision development. Breast milk contains DHA in varying concentrations, depending on how much is in the mother’s diet, and some DHA can be made in the body from precursor omega-3 fatty acids, although this process is inefficient. © 2017 The New York Times Company

Keyword: Development of the Brain
Link ID: 23428 - Posted: 03.30.2017

Rae Ellen Bichell Exposure to lead as a child can affect an adult decades later, according to a study out Tuesday that suggests a link between early childhood lead exposure and a dip in a person's later cognitive ability and socioeconomic status. Lead in the United States can come from lots of sources: old, peeling paint; contaminated soil; or water that's passed through lead pipes. Before policies were enacted to get rid of lead in gasoline, it could even come from particles in the fumes that leave car tailpipes. "It's toxic to many parts of the body, but in particular it can accumulate in the bloodstream and pass through the blood-brain barrier to reach the brain," says the study's first author, Aaron Reuben, a graduate student in clinical psychology at Duke University. Reuben and his colleagues published the results of a long-term study on the lingering effects of lead. Researchers had kept in touch with about 560 people for decades — starting when they were born in Dunedin, New Zealand, in the 1970s, all the way up to the present. As children, the study participants were tested on their cognitive abilities; researchers determined IQ scores based on tests of working memory, pattern recognition, verbal comprehension and ability to solve problems, among other skills. When the kids were 11 years old, researchers tested their blood for lead. (That measurement is thought to be a rough indicator of lead exposure in the few months before the blood draw.) Then, when they turned 38 years old, the cognitive ability of these study participants was tested again. As Reuben and his colleagues write in this week's issue of JAMA, the journal of the American Medical Association, they found a subtle but worrisome pattern in the data. © 2017 npr
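The core of a finding like this is an association between an exposure measured in childhood and an outcome measured decades later. The sketch below is a rough, hypothetical illustration of that idea only — it is not the study's actual analysis, which used the full cohort and statistical adjustments for family background. It fits an ordinary least-squares slope relating made-up blood-lead values at age 11 to made-up IQ changes by age 38.

```python
# Hypothetical illustration only: none of these numbers come from the Dunedin study.
# Each index pairs blood lead at age 11 (micrograms per deciliter) with the
# change in IQ between age 11 and age 38 for one made-up participant.
blood_lead_ug_dl = [4, 6, 8, 11, 14, 18, 23, 29]
iq_change = [2, 1, 0, -1, -2, -4, -5, -7]

n = len(blood_lead_ug_dl)
mean_x = sum(blood_lead_ug_dl) / n
mean_y = sum(iq_change) / n

sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(blood_lead_ug_dl, iq_change))
sxx = sum((x - mean_x) ** 2 for x in blood_lead_ug_dl)
slope = sxy / sxx  # ordinary least-squares slope: IQ points per 1 ug/dL of lead

print(f"Estimated IQ change per 1 ug/dL of childhood blood lead: {slope:.2f} points")
```

With these invented values the slope comes out negative, mirroring the direction of the reported pattern; the real analysis rests on hundreds of participants and careful handling of confounding variables.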

Keyword: Development of the Brain; Neurotoxins
Link ID: 23422 - Posted: 03.29.2017

By C. CLAIBORNE RAY Q. When four of us shared memories of our very young lives, not one of us could recall events before the age of 4 or possibly 3. Is this common? A. Yes. For adults, remembering events only after age 3½ or 4 is typical, studies have found. The phenomenon was identified late in the 19th century by the pioneering French researcher Victor Henri and his wife, Catherine, and was later named childhood amnesia by Freud. The Henris published a questionnaire on early memories in 1895, and the results from 123 people were published in 1897. Most of the participants’ earliest memories came from when they were 2 to 4 years old; the average was age 3. Very few participants recalled events from the first year of life. Many subsequent studies found similar results. Several theories have been offered to explain the timing of laying down permanent memories. One widely studied idea relates the formation of children’s earliest memories to when they start talking about past events with their mothers, suggesting a link between memories and the age of language acquisition. More recent studies, in 2010 and 2014, found discrepancies in the accuracy of young children’s estimates of when things had occurred in their lives. Another 2014 study found a progressive loss of recall as a child ages, with 5-, 6- and 7-year-olds remembering 60 percent or more of some early-life events that were discussed at age 3, while 8- and 9-year-olds remembered only 40 percent of these events. © 2017 The New York Times Company

Keyword: Learning & Memory; Development of the Brain
Link ID: 23412 - Posted: 03.28.2017

By Linda Searing The precise cause, or causes, of dementias such as Alzheimer’s disease remain unclear, but one theory points to molecules called free radicals that can damage nerve cells. This damage, called oxidative stress, may lead to changes in the brain over time that result in dementia. Might antioxidant supplements prevent this? To test that idea, a study followed 7,540 men 60 and older (average age, 67) with no indications of dementia and no history of serious head injury, substance abuse or neurological conditions that affect cognition. They were randomly assigned to take vitamin E (an antioxidant, 400 International Units daily), selenium (also an antioxidant, 200 micrograms daily), both vitamin E and selenium, or a placebo. The men also had their memory assessed periodically. In just over five years, 325 of the men (about 4 percent) developed dementia, with essentially no difference in the rate of occurrence between those who took one or both supplements and those who took the placebo. The researchers concluded that the antioxidant supplements “did not forestall dementia and are not recommended as preventive agents.” Who may be affected? Older men. The risk for dementia increases with advanced age, and the condition is most common among the very elderly. Memory loss is the most well-known symptom, but people with dementia may also have problems thinking, speaking, controlling emotions and doing daily activities such as getting dressed and eating. Alzheimer’s disease is the most common type of dementia, affecting more than 5.5 million Americans, including more than 10 percent of those 65 and older and more women than men. Caveats: Participants took the supplements for a relatively short time. Whether the findings would apply to women was not tested. The study did not prove that the dementia developed by the study participants was caused by oxidative stress. © 1996-2017 The Washington Post

Keyword: Alzheimers
Link ID: 23403 - Posted: 03.25.2017

by Laura Sanders Many babies born early spend extra time in the hospital, receiving the care of dedicated teams of doctors and nurses. For these babies, the hospital is their first home. And early experiences there, from lights to sounds to touches, may influence how babies develop. Early touches in the NICU, both pleasant and not, may shape how a baby’s brain responds to gentle touches later, a new study suggests. The results, published online March 16 in Current Biology, draw attention to the importance of touch, both in type and number. Young babies can’t see that well. But the sense of touch develops early, making it a prime way to get messages to fuzzy-eyed, pre-verbal babies. “We focused on touch because it really is some of the basis for communication between parents and child,” says study coauthor Nathalie Maitre, a neonatologist and neuroscientist at Nationwide Children’s Hospital in Columbus, Ohio. Maitre and her colleagues studied how babies’ brains responded to a light puff of air on the palms of their hands — a “very gentle and very weak touch,” she says. They measured these responses by putting adorable, tiny electroencephalogram, or EEG, caps on the babies. The researchers puffed babies’ hands shortly before they were sent home. Sixty-one of the babies were born early, at 24 to 36 weeks’ gestation. At the time of the puff experiment, they had already spent a median of 28 days in the hospital. Another group of 55 babies, born full-term, was tested in the three days after birth. © Society for Science & the Public 2000 - 2017

Keyword: Pain & Touch; Development of the Brain
Link ID: 23398 - Posted: 03.23.2017

Hannah Devlin Scientists have developed a new genetic test for Alzheimer’s risk that can be used to predict the age at which a person will develop the disease. People with a high score on the test, which is based on 31 genetic markers, can be diagnosed many years earlier than those with a low-risk genetic profile, the study found. Those ranked in the top 10% in terms of risk were more than three times as likely to develop Alzheimer’s during the course of the study, and did so more than a decade before those who ranked in the lowest 10%. Rahul Desikan of the University of California, who led the international effort, said the test could be used to calculate any individual’s risk of developing Alzheimer’s that year. “That is, if you don’t already have dementia, what is your yearly risk for AD onset, based on your age and genetic information,” he said. The so-called polygenic hazard score test was developed using genetic data from more than 70,000 individuals, including patients with Alzheimer’s disease and healthy elderly people. It is already known that genetics plays a powerful role in Alzheimer’s. Around a quarter of patients have a strong family history of the disease, and scientists have shown this is partly explained by a gene called ApoE, which comes in three versions and is known to have a powerful influence on the risk of getting the most common late-onset type of Alzheimer’s. One version of ApoE appears to reduce risk by up to 40%, while carrying two copies (one from each parent) of the high-risk version can increase risk as much as 12-fold.
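To make the general idea of a polygenic score more concrete, here is a minimal, hypothetical sketch: it adds up risk-allele counts weighted by per-marker effect sizes and scales a baseline yearly incidence by the resulting relative hazard. The marker names, weights and baseline rates below are invented for illustration; the published test combines 31 markers and was built from data on more than 70,000 people, which this toy code does not attempt to reproduce.

```python
# Hypothetical illustration only: marker names, weights and baseline rates are
# invented, not taken from the published polygenic hazard score.
import math

# Per-marker effect sizes (log hazard ratios) for a handful of made-up variants.
MARKER_WEIGHTS = {"rsA": 0.25, "rsB": -0.10, "rsC": 0.40}

# Made-up baseline yearly incidence of Alzheimer's disease at selected ages,
# for someone with an average genetic profile.
BASELINE_YEARLY_INCIDENCE = {65: 0.002, 75: 0.010, 85: 0.040}


def polygenic_score(genotype):
    """Weighted sum of risk-allele counts (0, 1 or 2 per marker)."""
    return sum(weight * genotype.get(marker, 0)
               for marker, weight in MARKER_WEIGHTS.items())


def yearly_risk(genotype, age):
    """Baseline incidence for this age, scaled by the person's relative hazard."""
    relative_hazard = math.exp(polygenic_score(genotype))
    return BASELINE_YEARLY_INCIDENCE[age] * relative_hazard


person = {"rsA": 2, "rsB": 0, "rsC": 1}  # risk-allele counts at each marker
print(f"Estimated yearly risk at age 75: {yearly_risk(person, 75):.2%}")
```

Carrying more high-weight risk alleles raises the relative hazard and, with it, the estimated yearly risk at any given age, which is the basic logic behind scores of this kind.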

Keyword: Alzheimers
Link ID: 23390 - Posted: 03.22.2017

By KIM SEVERSON SONOMA, Calif. — The first thing Paula Wolfert wants to make a guest is coffee blended with butter from grass-fed cows and something called brain octane oil. She waves a greasy plastic bottle of the oil around her jumble of a kitchen like a preacher who has taken up a serpent. Never mind that this is the woman who introduced tagines, Aleppo pepper and cassoulet to American kitchens, wrote nine cookbooks and once possessed a palate the food writer Ruth Reichl declared the best she’d ever encountered. Ms. Wolfert, 78, has dementia. She can’t cook much, even if she wanted to. Which, by the way, she doesn’t. She learned she probably had Alzheimer’s disease in 2013, but she suspected something wasn’t right long before. Words on a page sometimes made no sense. Complex questions started to baffle her. Since she has always been an audacious and kinetic conversationalist with a touch of hypochondria, friends didn’t notice anything was wrong. Doctors spoke of “senior moments.” But she knew. One day, Ms. Wolfert went to make an omelet for her husband, the crime novelist William Bayer. She had to ask him how. The woman who once marched up to the French chef Jean-Louis Palladin and told him a dish didn’t have enough salt can no longer taste the difference between a walnut and a pecan, or smell whether the mushrooms are burning. The list of eight languages she once understood has been reduced to English. Maybe 40 percent of the words she knew have evaporated. “What am I going to do, cry about it?” Ms. Wolfert said in an interview at her home this month, the slap of her Brooklyn accent still sharp. After all, she points out, her first husband left her in Morocco with two small children and $2,000: “I cried for 20 minutes and I thought, ‘This isn’t going to do any good.’” © 2017 The New York Times Company

Keyword: Alzheimers
Link ID: 23389 - Posted: 03.22.2017

By NICHOLAS BAKALAR Some research has suggested that vitamin E and selenium supplements might lower the risk for Alzheimer’s disease, but a new long-term trial has found no evidence that they will. The study began as a randomized clinical trial in 2002 testing the supplements for the prevention of prostate cancer. When that study was stopped in 2009 because no effect was found, 3,786 of the original 7,540 men participated in a continuing study to test the antioxidants as a preventive for Alzheimer’s. The study, in JAMA Neurology, randomly assigned the men, whose average age was 67 at the start, to take either vitamin E, selenium, both supplements, or a placebo. By 2015, 4.4 percent of the men had developed dementia, but there was no difference between the groups: neither selenium nor vitamin E, alone or in combination, was any more effective than a placebo. The study controlled for age, family history of Alzheimer’s disease, education, race, diabetes and other factors. The lead author, Richard J. Kryscio, a professor of statistics at the University of Kentucky, said that it is possible that different dosages or different types of selenium or vitamin E might show an effect. “We could have picked the wrong version or the wrong dose,” he said. “But there’s really no evidence that these supplements will make a difference down the road in preventing dementia.” © 2017 The New York Times Company

Keyword: Alzheimers
Link ID: 23388 - Posted: 03.22.2017

By Jill Serjeant NEW YORK (Reuters) - Long-running children's television show "Sesame Street" is welcoming a new kid to the block - a Muppet with autism called Julia. A redhead who loves to sing and remembers the words to lots of songs, Julia will debut on the show for preschoolers on April 10 after a five-year outreach effort to families and experts on autism, Sesame Workshop said on Monday. "For years, families of children with autism have asked us to address the issue," Dr. Jeanette Betancourt, senior vice president of U.S. social impact at the nonprofit Sesame Workshop, said in a statement. One in 68 American children is currently diagnosed with autism, according to the Centers for Disease Control and Prevention, an increase of some 119 percent since 2000. Autism is a developmental disorder present from early childhood, characterized by difficulty in communicating and forming relationships with other people and in using language and abstract concepts. Stacey Gordon, the puppeteer who will perform the role of Julia, and Christine Ferraro, who wrote her part, both have family members who are on the autism spectrum. "It's important for kids without autism to see what autism can look like," Gordon told the CBS show "60 Minutes" in a preview on Sunday. "Had my son's friends been exposed to his behaviors through something that they had seen on TV before they experienced them in the classroom, they might not have been frightened. They might not have been worried when he cried. They would have known that he plays in a different way and that that's okay," she added. © 2017 Scientific American

Keyword: Autism
Link ID: 23387 - Posted: 03.22.2017

By Linda Geddes A gentle touch can make all the difference. Premature babies – who miss out on the sensory experiences of late gestation – show different brain responses to gentle touch than babies that stay inside the uterus until term. This could affect later physical and emotional development, but regular skin-to-skin contact from parents and hospital staff seems to counteract it. Infants who are born early experience dramatic events at a time when babies that aren’t born until 40 weeks are still developing in the amniotic fluid. Premature babies are often separated from their parents for long periods, undergo painful procedures like operations and ventilation, and experience bigger effects of gravity on the skin and muscles. “There is substantial evidence that pain exposure during early life can cause long-term alterations in infant brain development,” says Rebeccah Slater at the University of Oxford. But it has been less clear how gentle touches shape the brains of babies, mainly because the brain’s response to light touch is about a hundredth the size of its response to pain, making it harder to study. Nathalie Maitre of the Nationwide Children’s Hospital in Columbus, Ohio, and her colleagues gently stretched soft nets of 128 electrodes over the heads of 125 preterm and full-term babies, shortly before they were discharged from hospital. These were used to record how their brains responded to a gentle puff of air on the skin. © Copyright Reed Business Information Ltd.

Keyword: Development of the Brain; Pain & Touch
Link ID: 23371 - Posted: 03.17.2017

By Mike Stobbe. Elderly people are suffering concussions and other brain injuries from falls at what appear to be unprecedented rates, according to a new report from U.S. government researchers. The reason for the increase isn't clear, the report's authors said. But one likely factor is that a growing number of elderly people are living at home and taking repeated tumbles, said one expert. "Many older adults are afraid their independence will be taken away if they admit to falling, and so they minimize it," said Dr. Lauren Southerland, an Ohio State University emergency physician who specializes in geriatric care. But what may seem like a mild initial fall may cause concussions or other problems that increase the chances of future falls — and more severe injuries, she said. Whatever the cause, the numbers are striking, according to the new report released Thursday by the Centers for Disease Control and Prevention. In 2013, one in every 45 Americans 75 and older suffered a brain injury that resulted in an emergency department visit, hospitalization or death. The rate for that age group jumped 76 per cent from 2007. The rate of these injuries for people of all ages rose 39 per cent over that time, hitting a record level, the CDC found. Falls account for 90 per cent of hip and wrist fractures and 60 per cent of head injuries among people aged 65 and older, Canadian researchers have previously reported. The report, which explored brain injuries in general, also found an increase in brain injuries from suicides and suicide attempts, mainly gunshot wounds to the head. Brain injuries from car crashes fell. But the elderly suffered at far higher rates than any other group. ©2017 CBC/Radio-Canada.

Keyword: Brain Injury/Concussion; Development of the Brain
Link ID: 23370 - Posted: 03.17.2017

By Kate Darby Rauch When Marian Diamond was growing up in Southern California, she got her first glimpse of a real brain at Los Angeles County Hospital with her dad, a physician. She was 15. Looking back now, at age 90, Diamond, a Berkeley resident, points to that moment as the start of something profound — a curiosity, wonderment, drive. “It just blew my mind, the fact that a cell could create an idea,” Diamond said in a recent interview, reflecting on her first encounter with that sinewy purple-tinged mass. She didn’t know that this was the start of a distinguished legacy that would stretch for decades, touching millions. But today, she’d be one of the first to scientifically equate that adolescent thrill with her life’s work. Because she helped prove a link. Brains, we now know, thanks in large part to research by Diamond, thrive on challenge, newness, discovery. With this enrichment, brain cells are stimulated and grow. This week, Diamond, a UC Berkeley emeritus professor of integrative biology and the first woman to earn a PhD in anatomy at Cal, is being honored by the Berkeley City Council, which is designating March 14 as Marian Diamond Day. And on March 22, KQED TV will air a new documentary film about her life’s work, My Love Affair With the Brain. © Berkeleyside All Rights Reserved.

Keyword: Development of the Brain
Link ID: 23366 - Posted: 03.16.2017

By DENISE GRADY Three women suffered severe, permanent eye damage after stem cells were injected into their eyes, in an unproven treatment at a loosely regulated clinic in Florida, doctors reported in an article published Wednesday in The New England Journal of Medicine. One, 72, went completely blind from the injections, and the others, 78 and 88, lost much of their eyesight. Before the procedure, all had some visual impairment but could see well enough to drive. The cases expose gaps in the ability of government health agencies to protect consumers from unproven treatments offered by entrepreneurs who promote the supposed healing power of stem cells. The women had macular degeneration, an eye disease that causes vision loss, and they paid $5,000 each to receive stem-cell injections in 2015 at a private clinic in Sunrise, Fla. The clinic was part of a company then called Bioheart, now called U.S. Stem Cell. Staff members there used liposuction to suck fat out of the women’s bellies, and then extracted stem cells from the fat to inject into the women’s eyes. The disastrous results were described in detail in the journal article, by doctors who were not connected to U.S. Stem Cell and treated the patients within days of the injections. An accompanying article by scientists from the Food and Drug Administration warned that stem cells from fat “are being used in practice on the basis of minimal clinical evidence of safety or efficacy, sometimes with the claims that they constitute revolutionary treatments for various conditions.” © 2017 The New York Times Company

Keyword: Vision; Stem Cells
Link ID: 23365 - Posted: 03.16.2017

By Andy Coghlan A woman in her 80s has become the first person to be successfully treated with induced pluripotent stem (iPS) cells. A sliver of laboratory-made retinal cells has protected her eyesight, fighting her age-related macular degeneration – a common form of progressive blindness. Such stem cells can be coaxed to form many other types of cell. Unlike other types of stem cell, such as those found in an embryo, induced pluripotent ones can be made from adult non-stem cells – a discovery that earned a Nobel prize in 2012. Now, more than a decade after they were created, these stem cells have helped someone. Masayo Takahashi at the RIKEN Laboratory for Retinal Regeneration in Kobe, Japan, and her team took skin cells from the woman and turned them into iPS cells. They then encouraged these to form retinal pigment epithelial cells, which are important for supporting and nourishing the retina cells that capture light for vision. The researchers made a sliver of cells measuring just 1 by 3 millimetres. Before transplanting this into the woman’s eye in 2014, they first removed diseased tissue on her retina that was gradually destroying her sight. They then inserted the small patch of cells they had created, hoping they would become a part of her eye and stop her eyesight from degenerating. © Copyright Reed Business Information Ltd.

Keyword: Vision; Stem Cells
Link ID: 23363 - Posted: 03.16.2017

Heidi Ledford Like a zombie that keeps on kicking, legal battles over mutant mice used for Alzheimer’s research are haunting the field once again — four years after the last round of lawsuits. In the latest case, the University of South Florida (USF) in Tampa has sued the US National Institutes of Health (NIH) for authorizing the distribution of a particular type of mouse used in the field. The first pre-trial hearing in the case is set to begin in a federal court on 21 March. The university holds a patent on the mouse, but the NIH has contracted the Jackson Laboratory, a non-profit organization in Bar Harbor, Maine, to supply the animals to researchers. The USF is now claiming that it deserves some of the money that went to the contractor. If the suit, filed in December 2015, is successful, it could set a precedent for other universities, cautions Robert Cook-Deegan, an intellectual-property scholar at the Washington DC centre of Arizona State University in Tempe. And that would threaten the affordability of, and access to, lab animals used to investigate the disease. “It feels greedy to me,” Cook-Deegan says. “If other universities start doing this, all it does is push up the cost of research tools.” The mice, on which the USF filed a patent in 1997, express mutated forms of two genes. These modifications help researchers to study how amyloid plaques develop in the brain, and enable them to investigate behavioural changes that manifest before those plaques appear. © 2017 Macmillan Publishers Limited

Keyword: Alzheimers
Link ID: 23356 - Posted: 03.15.2017

By Knvul Sheikh As we get older, we start to think a little bit more slowly, we are less able to multitask and our ability to remember things gets a little wobblier. This cognitive transformation is linked to a steady, widespread thinning of the cortex, the brain's outermost layer. Yet the change is not inevitable. So-called super agers retain their good memory and a thicker cortex as they age, a recent study suggests. Researchers believe that studying what makes super agers different could help unlock the secrets to healthy brain aging and improve our understanding of what happens when that process goes awry. "Looking at successful aging could provide us with biomarkers for predicting resilience and for things that might go wrong in people with age-related diseases like Alzheimer's and dementia," says study co-author Alexandra Touroutoglou, a neuroscientist at Harvard Medical School. Touroutoglou and her team gave standard recall tests to a group of 40 participants between the ages of 60 and 80 and 41 participants aged 18 to 35. Among the older participants, 17 performed as well as or better than adults four to five decades younger. When the researchers looked at MRI scans of the super agers' brains, they found that their brains not only functioned more like young brains but also looked very similar to them. Two brain networks in particular seemed to be protected from shrinking: the default mode network, which helps to store and recall new information, and the salience network, which is associated with directing attention and identifying important details. In fact, the thicker these regions were, the better the super agers' memory was. © 2017 Scientific American

Keyword: Development of the Brain; Learning & Memory
Link ID: 23349 - Posted: 03.13.2017