Chapter 13. Memory, Learning, and Development
Rae Ellen Bichell Exposure to lead as a child can affect an adult decades later, according to a study out Tuesday that suggests a link between early childhood lead exposure and a dip in a person's later cognitive ability and socioeconomic status. Lead in the United States can come from lots of sources: old, peeling paint; contaminated soil; or water that's passed through lead pipes. Before policies were enacted to get rid of lead in gasoline, it could even come from particles in the fumes that leave car tailpipes. "It's toxic to many parts of the body, but in particular it can accumulate in the bloodstream and pass through the blood-brain barrier to reach the brain," says the study's first author, Aaron Reuben, a graduate student in clinical psychology at Duke University. Reuben and his colleagues published the results of a long-term study on the lingering effects of lead. Researchers had kept in touch with about 560 people for decades — starting when they were born in Dunedin, New Zealand, in the 1970s, all the way up to the present. As children, the study participants were tested on their cognitive abilities; researchers determined IQ scores based on tests of working memory, pattern recognition, verbal comprehension and ability to solve problems, among other skills. When the kids were 11 years old, researchers tested their blood for lead. (That measurement is thought to be a rough indicator of lead exposure in the few months before the blood draw.) Then, when they turned 38 years old, the cognitive ability of these study participants was tested again. As Reuben and his colleagues write in this week's issue of JAMA, the journal of the American Medical Association, they found a subtle but worrisome pattern in the data. © 2017 npr
By C. CLAIBORNE RAY Q. When four of us shared memories of our very young lives, not one of us could recall events before the age of 4 or possibly 3. Is this common? A. Yes. For adults, remembering events only after age 3½ or 4 is typical, studies have found. The phenomenon was named childhood amnesia by Freud and identified late in the 19th century by the pioneering French researcher Victor Henri and his wife, Catherine. The Henris published a questionnaire on early memories in 1895, and the results from 123 people were published in 1897. Most of the participants’ earliest memories came from when they were 2 to 4 years old; the average was age 3. Very few participants recalled events from the first year of life. Many subsequent studies found similar results. Several theories have been offered to explain the timing of laying down permanent memories. One widely studied idea relates the formation of children’s earliest memories to when they start talking about past events with their mothers, suggesting a link between memories and the age of language acquisition. More recent studies, in 2010 and 2014, found discrepancies in the accuracy of young children’s estimates of when things had occurred in their lives. Another 2014 study found a progressive loss of recall as a child ages, with 5-, 6- and 7-year-olds remembering 60 percent or more of some early-life events that were discussed at age 3, while 8- and 9-year-olds remembered only 40 percent of these events. © 2017 The New York Times Company
By Linda Searing The precise cause, or causes, of dementias such as Alzheimer’s disease remain unclear, but one theory points to molecules called free radicals that can damage nerve cells. This damage, called oxidative stress, may lead to changes in the brain over time that result in dementia. Might antioxidant supplements prevent this? The study involved 7,540 men 60 and older (average age, 67) with no indications of dementia and no history of serious head injury, substance abuse or neurological conditions that affect cognition. They were randomly assigned to take vitamin E (an antioxidant, 400 International Units daily), selenium (also an antioxidant, 200 micrograms daily), both vitamin E and selenium or a placebo. The men also had their memory assessed periodically. In just over five years, 325 of the men (about 4 percent) developed dementia, with essentially no difference in the rate of occurrence between those who took one or both supplements and those who took the placebo. The researchers concluded that the antioxidant supplements “did not forestall dementia and are not recommended as preventive agents.” Who may be affected? Older men. The risk for dementia increases with advanced age and is most common among the very elderly. Memory loss is the most well-known symptom, but people with dementia may also have problems thinking, speaking, controlling emotions and doing daily activities such as getting dressed and eating. Alzheimer’s disease is the most common type of dementia, affecting more than 5.5 million Americans, including more than 10 percent of those 65 and older and more women than men. Caveats: Participants took the supplements for a relatively short time. Whether the findings would apply to women was not tested. The study did not prove that the dementia developed by the study participants was caused by oxidative stress. © 1996-2017 The Washington Post
Link ID: 23403 - Posted: 03.25.2017
By Jason G. Goldman In the summer of 2015 University of Oxford zoologists Antone Martinho III and Alex Kacelnik began quite the cute experiment—one involving ducklings and blindfolds. They wanted to see how the baby birds imprinted on their mothers depending on which eye was available. Why? Because birds lack a part of the brain humans take for granted. Suspended between the left and right hemispheres of our brains sits the corpus callosum, a thick bundle of nerves. It acts as an information bridge, allowing the left and right sides to rapidly communicate and act as a coherent whole. Although the hemispheres of a bird's brain are not entirely separated, the animals do not enjoy the benefits of this pathway. This quirk of avian neuroanatomy sets up a natural experiment. “I was in St. James's Park in London, and I saw some ducklings with their parents in the lake,” Martinho says. “It occurred to me that we could look at the instantaneous transfer of information through imprinting.” The researchers covered one eye of each of 64 ducklings and then presented a fake red or blue adult duck. This colored duck became “Mom,” and the ducklings followed it around. But when some of the ducklings' blindfolds were swapped so they could see out of only the other eye, they did not seem to recognize their “parent” anymore. Instead the ducklings in this situation showed equal affinity for both the red and blue ducks. It took three hours before any preferences began to emerge. Meanwhile ducklings with eyes that were each imprinted to a different duck did not show any parental preferences when allowed to use both eyes at once. The study was recently published in the journal Animal Behaviour. © 2017 Scientific American
by Laura Sanders Many babies born early spend extra time in the hospital, receiving the care of dedicated teams of doctors and nurses. For these babies, the hospital is their first home. And early experiences there, from lights to sounds to touches, may influence how babies develop. Touches early in life in the NICU, both pleasant and not, may shape how a baby’s brain responds to gentle touches later, a new study suggests. The results, published online March 16 in Current Biology, draw attention to the importance of touch, both in type and number. Young babies can’t see that well. But the sense of touch develops early, making it a prime way to get messages to fuzzy-eyed, pre-verbal babies. “We focused on touch because it really is some of the basis for communication between parents and child,” says study coauthor Nathalie Maitre, a neonatologist and neuroscientist at Nationwide Children’s Hospital in Columbus, Ohio. Maitre and her colleagues studied how babies’ brains responded to a light puff of air on the palms of their hands — a “very gentle and very weak touch,” she says. They measured these responses by putting adorable, tiny electroencephalogram, or EEG, caps on the babies. The researchers puffed babies’ hands shortly before they were sent home. Sixty-one of the babies were born early, from 24 to 36 weeks gestation. At the time of the puff experiment, they had already spent a median of 28 days in the hospital. Another group of 55 babies, born full-term, was tested in the three days after birth. © Society for Science & the Public 2000 - 2017
By David Wiegand I just did something great for my brain and you can do the same, when the documentary “My Love Affair With the Brain: The Life and Science of Dr. Marian Diamond” airs on KQED on Wednesday, March 22. According to the UC Berkeley professor emerita, the five things that contribute to the continued development of the brain at any age are: diet, exercise, newness, challenge and love. You can check off three of those elements for the day by watching the film by Catherine Ryan and Gary Weimberg. No matter how smart you are, even about anatomy and neuroscience, you will find newness in the information about the miraculous human brain, how it works, and how it keeps on working no matter how old you are. That’s one of the fundamentals of modern neuroscience, of which Diamond is one of the founders. You will also be challenged to consider your own brain, to consider how Diamond’s favorite expression — “use it or lose it” — applies to your brain and your life. You will be challenged to consider what Diamond means when she says brain plasticity (its ability to keep developing by forming new connections between its cells) makes us “the masters of our own minds. We literally create our own masterpiece.” Before Diamond and her colleagues proved otherwise, the prevailing thought was that brains developed according to a genetically determined pattern, hit a high point and then essentially began to deteriorate. Bushwa: A brain can grow — i.e., learn — at any age, and you can teach an old dog new tricks. © 2017 Hearst Corporation
Keyword: Learning & Memory
Link ID: 23392 - Posted: 03.23.2017
By Mo Costandi This map of London shows how many other streets are connected to each street, with blue representing simple streets with few connecting streets and red representing complex streets with many connecting streets. Credit: Joao Pinelo Silva The brain contains a built-in GPS that relies on memories of past navigation experiences to simulate future ones. But how does it represent new environments in order to determine how to navigate them successfully? And what happens in the brain when we enter a new space, or use satellite navigation (SatNav) technology to help us find our way around? Research published Tuesday in Nature Communications reveals two distinct brain regions that cooperate to simulate the topology of one’s environment and plan future paths through it when one is actively navigating. In addition, the research suggests both regions become inactive when people follow SatNav instructions instead of using their spatial memories. In a previous study researchers at University College London took participants on a guided tour through the streets of London’s Soho district and then used functional magnetic resonance imaging (fMRI) to scan their brains as they watched 10 different simulations of navigating those streets. Some of the movies required them to decide at intersections which way would be the shortest path to a predetermined destination; others came with instructions about which way to go at each junction. © 2017 Scientific American,
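The "simple vs. complex streets" measure shown in the map above is, in graph terms, just node degree: treat each street as a node, add an edge for every pair of streets that meet at a junction, and count each street's neighbours. A minimal sketch (the street names and junctions below are invented for illustration, not taken from the study):

```python
# Toy illustration: street "complexity" as node degree in an undirected graph.
# Street names and connections are invented; a real analysis would use the
# full street network extracted from map data.
from collections import defaultdict

def street_degrees(connections):
    """Count how many distinct other streets each street connects to."""
    graph = defaultdict(set)
    for a, b in connections:
        graph[a].add(b)
        graph[b].add(a)
    return {street: len(neighbours) for street, neighbours in graph.items()}

junctions = [
    ("Old Compton St", "Frith St"),
    ("Old Compton St", "Dean St"),
    ("Old Compton St", "Wardour St"),
    ("Frith St", "Bateman St"),
]

degrees = street_degrees(junctions)
# "Old Compton St" meets three other streets, so its degree is 3.
```

In the map described above, low-degree streets would be drawn blue and high-degree streets red.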
Keyword: Learning & Memory
Link ID: 23391 - Posted: 03.22.2017
Hannah Devlin Scientists have developed a new genetic test for Alzheimer’s risk that can be used to predict the age at which a person will develop the disease. A high score on the test, which is based on 31 genetic markers, can translate to being diagnosed many years earlier than those with a low-risk genetic profile, the study found. Those ranked in the top 10% in terms of risk were more than three times as likely to develop Alzheimer’s during the course of the study, and did so more than a decade before those who ranked in the lowest 10%. Rahul Desikan of the University of California, who led the international effort, said the test could be used to calculate any individual’s risk of developing Alzheimer’s that year. “That is, if you don’t already have dementia, what is your yearly risk for AD onset, based on your age and genetic information,” he said. The so-called polygenic hazard score test was developed using genetic data from more than 70,000 individuals, including patients with Alzheimer’s disease and healthy elderly people. It is already known that genetics plays a powerful role in Alzheimer’s. Around a quarter of patients have a strong family history of the disease, and scientists have shown this is partly explained by a gene called ApoE, which comes in three versions, and is known to have a powerful influence on the risk of getting the most common late-onset type of Alzheimer’s. One version of ApoE appears to reduce risk by up to 40%, while carrying two copies (one from each parent) of the high-risk version can increase risk 12-fold.
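At its simplest, a polygenic score of this kind is a weighted sum over markers: each variant contributes its effect weight times the number of risk alleles a person carries. The sketch below is purely illustrative: the marker IDs, weights and genotypes are invented, and the actual polygenic hazard score additionally combines such weights with age-specific incidence data to produce a yearly risk.

```python
# Illustrative polygenic risk score: a weighted sum of risk-allele counts.
# Marker IDs, effect weights and genotypes are invented for demonstration;
# a real score would use published per-variant weights for all 31 markers.

effect_weights = {
    "rs0001": 0.30,    # log-hazard-ratio-style weight (made up)
    "rs0002": 0.15,
    "rs0003": -0.10,   # a protective allele gets a negative weight
}

def polygenic_score(genotype):
    """genotype maps marker ID -> number of risk alleles carried (0, 1 or 2)."""
    return sum(effect_weights[marker] * count for marker, count in genotype.items())

higher_risk = polygenic_score({"rs0001": 2, "rs0002": 1, "rs0003": 0})  # 0.75
lower_risk = polygenic_score({"rs0001": 0, "rs0002": 0, "rs0003": 2})   # -0.20
```

Ranking people by such a score is what lets the study compare the top 10% against the bottom 10% of genetic risk.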
Link ID: 23390 - Posted: 03.22.2017
By KIM SEVERSON SONOMA, Calif. — The first thing Paula Wolfert wants to make a guest is coffee blended with butter from grass-fed cows and something called brain octane oil. She waves a greasy plastic bottle of the oil around her jumble of a kitchen like a preacher who has taken up a serpent. Never mind that this is the woman who introduced tagines, Aleppo pepper and cassoulet to American kitchens, wrote nine cookbooks and once possessed a palate the food writer Ruth Reichl declared the best she’d ever encountered. Ms. Wolfert, 78, has dementia. She can’t cook much, even if she wanted to. Which, by the way, she doesn’t. She learned she probably had Alzheimer’s disease in 2013, but she suspected something wasn’t right long before. Words on a page sometimes made no sense. Complex questions started to baffle her. Since she has always been an audacious and kinetic conversationalist with a touch of hypochondria, friends didn’t notice anything was wrong. Doctors spoke of “senior moments.” But she knew. One day, Ms. Wolfert went to make an omelet for her husband, the crime novelist William Bayer. She had to ask him how. The woman who once marched up to the French chef Jean-Louis Palladin and told him a dish didn’t have enough salt can no longer taste the difference between a walnut and a pecan, or smell whether the mushrooms are burning. The list of eight languages she once understood has been reduced to English. Maybe 40 percent of the words she knew have evaporated. “What am I going to do, cry about it?” Ms. Wolfert said in an interview at her home this month, the slap of her Brooklyn accent still sharp. After all, she points out, her first husband left her in Morocco with two small children and $2,000: “I cried for 20 minutes and I thought, ‘This isn’t going to do any good.’” © 2017 The New York Times Company
Link ID: 23389 - Posted: 03.22.2017
By NICHOLAS BAKALAR Some research has suggested that vitamin E and selenium supplements might lower the risk for Alzheimer’s disease, but a new long-term trial has found no evidence that they will. The study began as a randomized clinical trial in 2002 testing the supplements for the prevention of prostate cancer. When that study was stopped in 2009 because no effect was found, 3,786 of the original 7,540 men participated in a continuing study to test the antioxidants as a preventive for Alzheimer’s. The study, in JAMA Neurology, randomly assigned the men, whose average age was 67 at the start, to take either vitamin E, selenium, both supplements, or a placebo. By 2015, 4.4 percent of the men had dementia, but there was no difference between the groups. Neither selenium, vitamin E, nor both in combination were any more effective than a placebo. The study controlled for age, family history of Alzheimer’s disease, education, race, diabetes and other factors. The lead author, Richard J. Kryscio, a professor of statistics at the University of Kentucky, said that it is possible that different dosages or different types of selenium or vitamin E might show an effect. “We could have picked the wrong version or the wrong dose,” he said. “But there’s really no evidence that these supplements will make a difference down the road in preventing dementia.” © 2017 The New York Times Company
Link ID: 23388 - Posted: 03.22.2017
By Jill Serjeant NEW YORK (Reuters) - Long-running children's television show "Sesame Street" is welcoming a new kid to the block - a Muppet with autism called Julia. A redhead who loves to sing and remembers the words to lots of songs, Julia will debut on the show for preschoolers on April 10 after a five-year outreach effort to families and experts on autism, Sesame Workshop said on Monday. "For years, families of children with autism have asked us to address the issue," Dr. Jeanette Betancourt, senior vice president of U.S. social impact at the nonprofit Sesame Workshop, said in a statement. One in 68 American children is currently diagnosed with autism, according to the Centers for Disease Control and Prevention, an increase of some 119 percent since 2000. Autism is a developmental disorder present from early childhood, characterized by difficulty in communicating and forming relationships with other people and in using language and abstract concepts. Stacey Gordon, the puppeteer who will perform the role of Julia, and Christine Ferraro, who wrote her part, both have family members who are on the autism spectrum. "It's important for kids without autism to see what autism can look like," Gordon told the CBS show "60 Minutes" in a preview on Sunday. "Had my son's friends been exposed to his behaviors through something that they had seen on TV before they experienced them in the classroom, they might not have been frightened. They might not have been worried when he cried. They would have known that he plays in a different way and that that's okay," she added. © 2017 Scientific American
Link ID: 23387 - Posted: 03.22.2017
Laura Sanders Not too long ago, the internet was stationary. Most often, we’d browse the Web from a desktop computer in our living room or office. If we were feeling really adventurous, maybe we’d cart our laptop to a coffee shop. Looking back, those days seem quaint. Today, the internet moves through our lives with us. We hunt Pokémon as we shuffle down the sidewalk. We text at red lights. We tweet from the bathroom. We sleep with a smartphone within arm’s reach, using the device as both lullaby and alarm clock. Sometimes we put our phones down while we eat, but usually faceup, just in case something important happens. Our iPhones, Androids and other smartphones have led us to effortlessly adjust our behavior. Portable technology has overhauled our driving habits, our dating styles and even our posture. Despite the occasional headlines claiming that digital technology is rotting our brains, not to mention what it’s doing to our children, we’ve welcomed this alluring life partner with open arms and swiping thumbs. Scientists suspect that these near-constant interactions with digital technology influence our brains. Small studies are turning up hints that our devices may change how we remember, how we navigate and how we create happiness — or not. Somewhat limited, occasionally contradictory findings illustrate how science has struggled to pin down this slippery, fast-moving phenomenon. Laboratory studies hint that technology, and its constant interruptions, may change our thinking strategies. Like our husbands and wives, our devices have become “memory partners,” allowing us to dump information there and forget about it — an off-loading that comes with benefits and drawbacks. Navigational strategies may be shifting in the GPS era, a change that might be reflected in how the brain maps its place in the world. Constant interactions with technology may even raise anxiety in certain settings. © Society for Science & the Public 2000 - 2017
By Linda Geddes A gentle touch can make all the difference. Premature babies – who miss out on the sensory experiences of late gestation – show different brain responses to gentle touch from babies that stay inside the uterus until term. This could affect later physical and emotional development, but regular skin-to-skin contact from parents and hospital staff seem to counteract it. Infants who are born early experience dramatic events at a time when babies that aren’t born until 40 weeks are still developing in the amniotic fluid. Premature babies are often separated from their parents for long periods, undergo painful procedures like operations and ventilation, and they experience bigger effects of gravity on the skin and muscles. “There is substantial evidence that pain exposure during early life can cause long-term alterations in infant brain development,” says Rebeccah Slater at the University of Oxford. But it has been less clear how gentle touches shape the brains of babies, mainly because the brain’s response to light touch is about a hundredth of that it has to pain, so it’s harder to study. Nathalie Maitre of the Nationwide Children’s Hospital in Columbus, Ohio, and her colleagues have gently stretched soft nets of 128 electrodes over the heads of 125 preterm and full-term babies, shortly before they were discharged from hospital. These were used to record how their brains responded to a gentle puff of air on the skin. © Copyright Reed Business Information Ltd.
By Mike Stobbe Elderly people are suffering concussions and other brain injuries from falls at what appear to be unprecedented rates, according to a new report from U.S. government researchers. The reason for the increase isn't clear, the report's authors said. But one likely factor is that a growing number of elderly people are living at home and taking repeated tumbles, said one expert. "Many older adults are afraid their independence will be taken away if they admit to falling, and so they minimize it," said Dr. Lauren Southerland, an Ohio State University emergency physician who specializes in geriatric care. But what may seem like a mild initial fall may cause concussions or other problems that increase the chances of future falls — and more severe injuries, she said. Whatever the cause, the numbers are striking, according to the new report released Thursday by the Centers for Disease Control and Prevention. One in every 45 Americans 75 and older suffered brain injuries that resulted in emergency department visits, hospitalizations, or deaths in 2013. The rate for that age group jumped 76 per cent from 2007. The rate of these injuries for people of all ages rose 39 per cent over that time, hitting a record level, the CDC found. Falls account for 90 per cent of hip and wrist fractures and 60 per cent of head injuries among people aged 65 and older, Canadian researchers have previously reported. The report, which explored brain injuries in general, also found an increase in brain injuries from suicides and suicide attempts, mainly gunshot wounds to the head. Brain injuries from car crashes fell. But the elderly suffered at far higher rates than any other group. ©2017 CBC/Radio-Canada.
By Kate Darby Rauch When Marian Diamond was growing up in Southern California, she got her first glimpse of a real brain at Los Angeles County Hospital with her dad, a physician. She was 15. Looking back now, at age 90, Diamond, a Berkeley resident, points to that moment as the start of something profound — a curiosity, wonderment, drive. “It just blew my mind, the fact that a cell could create an idea,” Diamond said in a recent interview, reflecting on her first encounter with that sinewy purple-tinged mass. She didn’t know that this was the start of a distinguished legacy that would stretch for decades, touching millions. But today, she’d be one of the first to scientifically equate that adolescent thrill with her life’s work. Because she helped prove a link. Brains, we now know, thanks in large part to research by Diamond, thrive on challenge, newness, discovery. With this enrichment, brain cells are stimulated and grow. This week, Diamond, a UC Berkeley emeritus professor of integrative biology and the first woman to earn a PhD in anatomy at Cal, is being honored by the Berkeley City Council, which is designating March 14 as Marian Diamond Day. And on March 22, KQED TV will air a new documentary film about her life’s work, My Love Affair With the Brain. © Berkeleyside All Rights Reserved.
Keyword: Development of the Brain
Link ID: 23366 - Posted: 03.16.2017
By DENISE GRADY Three women suffered severe, permanent eye damage after stem cells were injected into their eyes, in an unproven treatment at a loosely regulated clinic in Florida, doctors reported in an article published Wednesday in The New England Journal of Medicine. One, 72, went completely blind from the injections, and the others, 78 and 88, lost much of their eyesight. Before the procedure, all had some visual impairment but could see well enough to drive. The cases expose gaps in the ability of government health agencies to protect consumers from unproven treatments offered by entrepreneurs who promote the supposed healing power of stem cells. The women had macular degeneration, an eye disease that causes vision loss, and they paid $5,000 each to receive stem-cell injections in 2015 at a private clinic in Sunrise, Fla. The clinic was part of a company then called Bioheart, now called U.S. Stem Cell. Staff members there used liposuction to suck fat out of the women’s bellies, and then extracted stem cells from the fat to inject into the women’s eyes. The disastrous results were described in detail in the journal article, by doctors who were not connected to U.S. Stem Cell and treated the patients within days of the injections. An accompanying article by scientists from the Food and Drug Administration warned that stem cells from fat “are being used in practice on the basis of minimal clinical evidence of safety or efficacy, sometimes with the claims that they constitute revolutionary treatments for various conditions.” © 2017 The New York Times Company
By Andy Coghlan A woman in her 80s has become the first person to be successfully treated with induced pluripotent stem (iPS) cells. A sliver of laboratory-made retinal cells has protected her eyesight, fighting her age-related macular degeneration – a common form of progressive blindness. Such stem cells can be coaxed to form many other types of cell. Unlike other types of stem cell, such as those found in an embryo, induced pluripotent ones can be made from adult non-stem cells – a discovery that earned a Nobel prize in 2012. Now, more than a decade after they were created, these stem cells have helped someone. Masayo Takahashi at the RIKEN Laboratory for Retinal Regeneration in Kobe, Japan, and her team took skin cells from the woman and turned them into iPS cells. They then encouraged these to form retinal pigment epithelial cells, which are important for supporting and nourishing the retina cells that capture light for vision. The researchers made a sliver of cells measuring just 1 by 3 millimetres. Before transplanting this into the woman’s eye in 2014, they first removed diseased tissue on her retina that was gradually destroying her sight. They then inserted the small patch of cells they had created, hoping they would become a part of her eye and stop her eyesight from degenerating. © Copyright Reed Business Information Ltd.
Ian Sample, science editor Researchers have overcome one of the major stumbling blocks in artificial intelligence with a program that can learn one task after another using skills it acquires on the way. Developed by Google’s AI company, DeepMind, the program has taken on a range of different tasks and performed almost as well as a human. Crucially, and uniquely, the AI does not forget how it solved past problems, and uses the knowledge to tackle new ones. The AI is not capable of the general intelligence that humans draw on when they are faced with new challenges; its use of past lessons is more limited. But the work shows a way around a problem that had to be solved if researchers are ever to build so-called artificial general intelligence (AGI) machines that match human intelligence. “If we’re going to have computer programs that are more intelligent and more useful, then they will have to have this ability to learn sequentially,” said James Kirkpatrick at DeepMind. The ability to remember old skills and apply them to new tasks comes naturally to humans. A regular rollerblader might find ice skating a breeze because one skill helps the other. But recreating this ability in computers has proved a huge challenge for AI researchers. AI programs are typically one-trick ponies that excel at one task, and one task only.
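The method behind this result, called elastic weight consolidation in the DeepMind paper, trains on a new task while adding a quadratic penalty that anchors the weights that were important for earlier tasks. The sketch below illustrates that penalty on a single weight; the toy losses, importance value and learning rate are invented for illustration, not taken from the paper.

```python
# Sketch of an elastic-weight-consolidation-style update for one weight:
# the new task's gradient, plus a quadratic penalty pulling the weight back
# toward its old-task value in proportion to its importance.
# All numbers here are toy values chosen for illustration.

def penalized_grad(w, w_old, importance, lam, new_task_grad):
    """Gradient of: new_task_loss(w) + (lam / 2) * importance * (w - w_old)**2."""
    return new_task_grad(w) + lam * importance * (w - w_old)

def new_task_grad(w):
    # Toy new task with loss (w - 2)^2, so its unconstrained optimum is w = 2.
    return 2.0 * (w - 2.0)

w_old = 0.0        # value this weight settled at on the old task
importance = 5.0   # how much the old task depends on this weight
lam = 1.0          # overall strength of the consolidation penalty

w = w_old
for _ in range(2000):
    w -= 0.01 * penalized_grad(w, w_old, importance, lam, new_task_grad)

# The weight ends up between the old value (0) and the new-task optimum (2),
# analytically at 4/7, rather than moving all the way and "forgetting".
```

With importance set to 0 the penalty vanishes and the weight would move all the way to 2, which is ordinary fine-tuning and the catastrophic forgetting the article describes.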
Laurel Hamers Mistakes can be learning opportunities, but the brain needs time for lessons to sink in. When facing a fast and furious stream of decisions, even the momentary distraction of noting an error can decrease accuracy on the next choice, researchers report in the March 15 Journal of Neuroscience. “We have a brain region that monitors and says ‘you messed up’ so that we can correct our behavior,” says psychologist George Buzzell, now at the University of Maryland in College Park. But sometimes, that monitoring system can backfire, distracting us from the task at hand and causing us to make another error. “There does seem to be a little bit of time for people, after mistakes, where you're sort of offline,” says Jason Moser, a psychologist at Michigan State University in East Lansing, who wasn’t part of the study. To test people’s response to making mistakes, Buzzell and colleagues at George Mason University in Fairfax, Va., monitored 23 participants’ brain activity while they worked through a challenging task. Concentric circles flashed briefly on a screen, and participants had to respond with one hand if the two circles were the same color and the other hand if the circles were subtly different shades. After making a mistake, participants generally answered the next question correctly if they had a second or so to recover. But when the next challenge came very quickly after an error, as little as 0.2 seconds, accuracy dropped by about 10 percent. Electrical activity recorded from the visual cortex showed that participants paid less attention to the next trial if they had just made a mistake than if they had responded correctly. |© Society for Science & the Public 2000 - 2017
Heidi Ledford Like a zombie that keeps on kicking, legal battles over mutant mice used for Alzheimer’s research are haunting the field once again — four years after the last round of lawsuits. In the latest case, the University of South Florida (USF) in Tampa has sued the US National Institutes of Health (NIH) for authorizing the distribution of a particular type of mouse used in the field. The first pre-trial hearing in the case is set to begin in a federal court on 21 March. The university holds a patent on the mouse, but the NIH has contracted the Jackson Laboratory, a non-profit organization in Bar Harbor, Maine, to supply the animals to researchers. The USF is now claiming that it deserves some of the money that went to the contractor. If the suit, filed in December 2015, is successful, it could set a precedent for other universities, cautions Robert Cook-Deegan, an intellectual-property scholar at the Washington DC centre of Arizona State University in Tempe. And that would threaten the affordability of and access to lab animals used to investigate diseases such as Alzheimer’s. “It feels greedy to me,” Cook-Deegan says. “If other universities start doing this, all it does is push up the cost of research tools.” The mice, on which the USF filed a patent in 1997, express mutated forms of two genes. These modifications help researchers to study how amyloid plaques develop in the brain, and enable them to investigate behavioural changes that manifest before those plaques appear. © 2017 Macmillan Publishers Limited.
Link ID: 23356 - Posted: 03.15.2017