Chapter 13. Memory, Learning, and Development

By Marissa Fessenden Songbirds stutter, babble when young, become mute if parts of their brains are damaged, learn how to sing from their elders and can even be "bilingual"—in other words, songbirds' vocalizations share a lot of traits with human speech. However, that similarity goes beyond behavior, researchers have found. Even though humans and birds are separated by millions of years of evolution, the genes that give us our ability to learn speech have much in common with those that lend birds their warble. A four-year effort involving more than 100 researchers around the world put the power of nine supercomputers into analyzing the genomes of 48 species of birds. The results, published this week in a package of eight articles in Science and 20 papers in other journals, provide the most complete picture of the bird family tree thus far. The project has also uncovered genetic signatures in song-learning bird brains that have surprising similarities to the genetics of speech in humans, a finding that could help scientists study human speech. The analysis suggests that most modern birds arose in an impressive speciation event, a "big bang" of avian diversification, in the 10 million years immediately following the extinction of dinosaurs. This period is more recent than posited in previous genetic analyses, but it lines up with the fossil record. By delving deeper into the rich data set, research groups identified when birds lost their teeth, investigated the relatively slow evolution of crocodiles and outlined the similarities between birds' and humans' vocal learning ability, among other findings. © 2014 Scientific American

Keyword: Language; Genes & Behavior
Link ID: 20423 - Posted: 12.16.2014

By Candy Schulman My mother’s greatest fear was Alzheimer’s. She got Lewy body dementia, or LBD, instead. This little-known, oddly named, debilitating illness afflicts an estimated 1.3 million Americans, the actor and comedian Robin Williams possibly among them. It is often misdiagnosed because its signs, such as hallucinations and body rigidity, do not seem like those of dementia, but in the end it robs people of themselves even more painfully. I first noticed my mother’s cognitive difficulties when she was 88. Until then, she’d led an extraordinarily active life: She was a competitive golfer with a bureau full of trophies, a painter and a sculptor. Every Hanukkah she hosted a lively feast for her eight grandchildren and nine great-grandchildren. This time, though, she needed my help planning, shopping and cooking. She was having difficulty with the guest list, trying to write every family member’s name on a piece of paper, adding up the numbers to see how many potatoes to buy for latkes. Her concentration became frayed, and she kept ripping up the list and starting again, close to tears. Several months before that, she had sent me a Mother’s Day card decorated with childlike prose, colorful illustrations and glitter hearts. The poem on the cover was printed in a playful purple font: “For you, Mom. For kissing my boo-boos, for wiping my face. . . . For calming my fears with your loving embrace.” On Mother’s Day and the rest of the year, Mom added in a shaky script, “thanks.”

Keyword: Alzheimers
Link ID: 20422 - Posted: 12.16.2014

By Emilie Reas If you carried a gene that doubled your likelihood of getting Alzheimer's disease, would you want to know? What if there were a simple lifestyle change that virtually abolished that elevated risk? People with a gene known as APOE e4 have a higher risk of cognitive impairment and dementia in old age. Even before behavioral symptoms appear, their brains show reduced metabolism, altered activity and more deterioration than those without the high-risk gene. Yet accumulating research is showing that carrying this gene is not necessarily a sentence for memory loss and confusion—if you know how to work it to your advantage with exercise. Scientists have long known that exercise can help stave off cognitive decline. Over the past decade evidence has mounted suggesting that this benefit is even greater for those at higher genetic risk for Alzheimer's. For example, two studies by a team in Finland and Sweden found that exercising at least twice a week in midlife lowers one's chance of getting dementia more than 20 years later, and this protective effect is stronger in people with the APOE e4 gene. Several others reported that frequent exercise—at least three times a week in some studies; up to more than an hour a day in others—can slow cognitive decline only in those carrying the high-risk gene. Furthermore, for those who carry the gene, being sedentary is associated with increased brain accumulation of the toxic protein beta-amyloid, a hallmark of Alzheimer's. More recent studies, including a 2012 paper published in Alzheimer's & Dementia and a 2011 paper in NeuroImage, found that high-risk individuals who exercise have greater brain activity and glucose uptake during a memory task compared with their less active counterparts or with those at low genetic risk. © 2014 Scientific American

Keyword: Alzheimers; Genes & Behavior
Link ID: 20421 - Posted: 12.16.2014

By Nicholas Bakalar Poor sleep in older adults may be linked to brain changes associated with dementia, a new study has found. Researchers studied 167 men who underwent sleep tests in 1999 and died by 2010. The study, in Neurology, recorded sleep duration, periods of waking up and episodes of apnea, and used pulse oximetry to measure oxygen saturation of their blood. On autopsy, they found that those in the highest one-quarter for duration of sleep at oxygen saturation of less than 95 percent were almost four times as likely to have higher levels of microinfarcts, small areas of dead tissue caused by deprivation of blood supply, as those in the lowest one-quarter. Compared with those in the lowest 25 percent for duration of slow-wave (deep) sleep, those in the highest one-quarter were about a third as likely to have moderate or high levels of generalized brain atrophy. “Prior studies have shown an association between certain types of sleep disturbance and dementia,” said the lead author, Dr. Rebecca P. Gelber, an epidemiologist with the Veterans Administration in Hawaii. “These lesions may help explain the association.” © 2014 The New York Times Company

Keyword: Alzheimers; Sleep
Link ID: 20420 - Posted: 12.16.2014

By David Noonan It was the day before Christmas, and the normally busy MIT laboratory on Vassar Street in Cambridge was quiet. But creatures were definitely stirring, including a mouse that would soon be world famous. Steve Ramirez, a 24-year-old doctoral student at the time, placed the mouse in a small metal box with a black plastic floor. Instead of curiously sniffing around, though, the animal instantly froze in terror, recalling the experience of receiving a foot shock in that same box. It was a textbook fear response, and if anything, the mouse’s posture was more rigid than Ramirez had expected. Its memory of the trauma must have been quite vivid. Which was amazing, because the memory was bogus: The mouse had never received an electric shock in that box. Rather, it was reacting to a false memory that Ramirez and his MIT colleague Xu Liu had planted in its brain. “Merry Freaking Christmas,” read the subject line of the email Ramirez shot off to Liu, who was spending the 2012 holiday in Yosemite National Park. The observation culminated more than two years of a long-shot research effort and supported an extraordinary hypothesis: Not only was it possible to identify brain cells involved in the encoding of a single memory, but those specific cells could be manipulated to create a whole new “memory” of an event that never happened. “It’s a fantastic feat,” says Howard Eichenbaum, a leading memory researcher and director of the Center for Neuroscience at Boston University, where Ramirez did his undergraduate work. “It’s a real breakthrough that shows the power of these techniques to address fundamental questions about how the brain works.”

Keyword: Learning & Memory; Emotions
Link ID: 20418 - Posted: 12.16.2014

By Gail Sullivan Chemicals found in food and common household products have been linked to lower IQ in kids exposed to high levels during pregnancy. Previous research linked higher exposure to chemicals called "phthalates" to poor mental and motor development in preschoolers. This study was said to be the first to report a link between prenatal exposure to the chemicals and childhood IQ. Researchers from Columbia University’s Mailman School of Public Health studied exposure to five types of phthalates, which are sometimes referred to as "hormone disruptors" or "endocrine disruptors." Among these, di-n-butyl phthalate (DnBP) is used in shower curtains, raincoats, hairspray, food wraps, vinyl and pill coating, among other things — but according to the EPA, the largest source of exposure may be seafood. Di-isobutyl phthalate (DiBP) and Butylbenzyl phthalate (BBzP) are added to plastics to make them flexible. These chemicals may also be used in makeup, nail polish, lacquer and explosives. The researchers linked higher prenatal exposure to phthalates to a more than six-point drop in IQ score compared with kids with lower exposure. The study, "Persistent Associations between Maternal Prenatal Exposure to Phthalates on Child IQ at Age 7 Years," was published Wednesday in the journal PLOS One. "The magnitude of these IQ differences is troubling," one of the study’s authors, Robin Whyatt, said in a press release. "A six- or seven-point decline in IQ may have substantial consequences for academic achievement and occupational potential."

Keyword: Intelligence; Neurotoxins
Link ID: 20413 - Posted: 12.13.2014

By Gary Stix Our site recently ran a great story about how brain training really doesn’t endow you instantly with genius IQ. The games you play just make you better at playing those same games. They aren’t a direct route to a Mensa membership. Just a few days before that story came out, Proceedings of the National Academy of Sciences published a report suggesting that playing action video games (Call of Duty: Black Ops II and the like) actually lets gamers learn the essentials of a particular visual task (the orientation of a Gabor signal—don’t ask) more rapidly than non-gamers, a skill that has real-world relevance beyond the confines of the artificial reality of the game itself. As psychologists say, it has “transfer effects.” Gamers appear to have learned how to do stuff like home in quickly on a target or multitask better than those who inhabit the non-gaming world. Their skills might, in theory, make them great pilots or laparoscopic surgeons, not just high scorers among their peers. Action video games are not billed as brain training, but Call of Duty and nominally accredited training programs like Lumosity are both structured as computer games. So what is going on here? Every new finding that brain training is B.S. appears to be contradicted by another that points to the promise of cognitive exercise, if that’s what you call a session with Call of Duty. It may boil down to a realization that the whole story about exercising your neurons to keep the brain supple may be a lot less simple than proponents make it out to be. © 2014 Scientific American

Keyword: Learning & Memory
Link ID: 20409 - Posted: 12.13.2014

by Helen Thomson Zapping your brain might make you better at maths tests – or worse. It depends how anxious you are about taking the test in the first place. A recent surge of studies has shown that brain stimulation can make people more creative and better at maths, and can even improve memory, but these studies tend to neglect individual differences. Now, Roi Cohen Kadosh at the University of Oxford and his colleagues have shown that brain stimulation can have completely opposite effects depending on your personality. Previous research has shown that a type of non-invasive brain stimulation called transcranial direct current stimulation (tDCS) – which enhances brain activity using an electric current – can improve mathematical ability when applied to the dorsolateral prefrontal cortex, an area involved in regulating emotion. To test whether personality traits might affect this result, Cohen Kadosh's team tried the technique on 25 people who find mental arithmetic highly stressful, and 20 people who do not. They found that, after stimulation, participants with high maths anxiety made correct responses more quickly and, after the test, showed lower levels of cortisol, an indicator of stress. On the other hand, individuals with low maths anxiety performed worse after tDCS. "It is hard to believe that all people would benefit similarly [from] brain stimulation," says Cohen Kadosh. He says that further research could shed light on how to optimise the technology and help to discover who is most likely to benefit from stimulation. © Copyright Reed Business Information Ltd.

Keyword: Brain imaging; Learning & Memory
Link ID: 20406 - Posted: 12.10.2014

Ian Sample, science editor Electrical brain stimulation equipment – which can boost cognitive performance and is easy to buy online – can also impair brain function, research from scientists at Oxford University has shown. A steady stream of reports of stimulators being able to boost brain performance, coupled with the simplicity of the devices, has led to a rise in DIY enthusiasts who cobble the equipment together themselves, or buy it assembled on the web, then zap themselves at home. In science laboratories brain stimulators have long been used to explore cognition. The equipment uses electrodes to pass gentle electric pulses through the brain, to stimulate activity in specific regions of the organ. Roi Cohen Kadosh, who led the study, published in the Journal of Neuroscience, said: “It’s not something people should be doing at home at this stage. I do not recommend people buy this equipment. At the moment it’s not therapy, it’s an experimental tool.” The Oxford scientists used a technique called transcranial direct current stimulation (tDCS) to stimulate the dorsolateral prefrontal cortex in students as they did simple sums. The results of the test were surprising. Students who became anxious when confronted with sums became calmer and solved the problems faster than when they had sham stimulation (the stimulation itself lasted only 30 seconds of the half-hour study). The shock was that the students who did not fear maths performed worse with the same stimulation.

Keyword: Brain imaging; Learning & Memory
Link ID: 20405 - Posted: 12.10.2014

Kelly Servick Anesthesiologists and surgeons who operate on children have been dogged by a growing fear—that being under anesthesia can permanently damage the developing brain. Although the few studies of children knocked out for surgeries have been inconclusive, evidence of impaired development in nematodes, zebrafish, rats, guinea pigs, pigs, and monkeys given common anesthetics has piled up in recent years. Now, the alarm is reaching a tipping point. “Anything that goes from [the roundworm] C. elegans to nonhuman primates, I've got to worry about,” Maria Freire, co-chair of the U.S. Food and Drug Administration (FDA) science advisory board, told attendees at a meeting the agency convened here last month to discuss the issue. The gathering came as anesthesia researchers and regulators consider several moves to address the concerns: a clinical trial of anesthetics in children, a consensus statement about their possible risks, and an FDA warning label on certain drugs. But each step stirs debate. Many involved in the issue are reluctant to make recommendations to parents and physicians based on animal data alone. At the same time, more direct studies of anesthesia's risks in children are plagued by confounding factors, lack of funding, and ethical issues. “We have to generate—very quickly—an action item, because I don't think the status quo is acceptable,” Freire said at the 19 November meeting. “Generating an action item without having the data is where things become very, very tricky.” © 2014 American Association for the Advancement of Science

Keyword: Sleep; Development of the Brain
Link ID: 20399 - Posted: 12.06.2014

By Bret Stetka When University of Bonn psychologist Monika Eckstein designed her latest published study, the goal was simple: administer a hormone into the noses of 62 men in hopes that their fear would go away. And for the most part, it did. The hormone was oxytocin, often called our “love hormone” due to its crucial role in mother-child relationships, social bonding, and intimacy (levels soar during sex). But it also seems to have a significant antianxiety effect. Give oxytocin to people with certain anxiety disorders, and activity in the amygdala—the primary fear center in human and other mammalian brains, two almond-shaped bits of brain tissue sitting deep beneath our temples—falls. The amygdala normally buzzes with activity in response to potentially threatening stimuli. When an organism repeatedly encounters a stimulus that at first seemed frightening but turns out to be benign—like, say, a balloon popping—a brain region called the prefrontal cortex inhibits amygdala activity. But in cases of repeated presentations of an actual threat, or in people with anxiety who continually perceive a stimulus as threatening, amygdala activity doesn’t subside and fear memories are more easily formed. To study the effects of oxytocin on the development of these fear memories, Eckstein and her colleagues first subjected study participants to Pavlovian fear conditioning, in which neutral stimuli (photographs of faces and houses) were sometimes paired with electric shocks. Subjects were then randomly assigned to receive either a single intranasal dose of oxytocin or a placebo. Thirty minutes later they received functional MRI scans while undergoing simultaneous fear extinction therapy, a standard approach to anxiety disorders in which patients are continually exposed to an anxiety-producing stimulus until they no longer find it stressful. In this case they were again exposed to images of faces and houses, but this time minus the electric shocks. © 2014 Scientific American

Keyword: Learning & Memory; Emotions
Link ID: 20397 - Posted: 12.06.2014

By recording from the brains of bats as they flew and landed, scientists have found that the animals have a "neural compass" - allowing them to keep track of exactly where and even which way up they are. These head-direction cells track bats in three dimensions as they manoeuvre. The researchers think a similar 3D internal navigation system is likely to be found throughout the animal kingdom. The findings are published in the journal Nature. Lead researcher Arseny Finkelstein, from the Weizmann Institute of Science in Rehovot, Israel, explained that this was the first time measurements had been taken from animals as they had flown around a space in any direction and even carried out their acrobatic upside-down landings. "We're the only lab currently able to conduct wireless recordings in flying animals," he told BBC News. "A tiny device attached to the bats allows us to monitor the activity of single neurons while the animal is freely moving." Decades of study of the brain's internal navigation system garnered three renowned neuroscientists this year's Nobel Prize for physiology and medicine. The research, primarily in rats, revealed how animals had "place" and "grid" cells - essentially building a map in the brain and coding for where on that map an animal was at any time. Mr Finkelstein and his colleagues' work in bats has revealed that their brains also have "pitch" and "roll" cells. These tell the animal whether it is pointing upwards or downwards and whether its head is tilted one way or the other. BBC © 2014

Keyword: Hearing; Learning & Memory
Link ID: 20393 - Posted: 12.04.2014

by Andy Coghlan How does this make you feel? Simply asking people to think about emotion-laden actions as their brains are scanned could become one of the first evidence-based tests for psychiatric illness. Assessing people in this way would be a step towards a more scientific approach to diagnosis, away from that based on how someone behaves or how they describe their symptoms. The US National Institute of Mental Health has had such a goal in mind since 2013. Marcel Just of Carnegie Mellon University in Pittsburgh, Pennsylvania, and his colleagues developed the brain scanning technique and used it to identify people with autism. "This gives us a whole new perspective to understanding psychiatric illnesses and disorders," says Just. "We've discovered a biological thought-marker for autism." The technique builds on work by the group showing that specific thoughts and emotions are represented in the brain by certain patterns of neural activation. The idea is that deviations from these patterns, what Just refers to as thought-markers, can be used to diagnose different psychiatric conditions. The team asked a group of adults to imagine 16 actions, some of which required emotional involvement, such as "hugging", "persuading" or "adoring", while they lay in an fMRI scanner. © Copyright Reed Business Information Ltd.

Keyword: Autism; Emotions
Link ID: 20392 - Posted: 12.04.2014

By CHRISTOPHER F. CHABRIS and DANIEL J. SIMONS NEIL DEGRASSE TYSON, the astrophysicist and host of the TV series “Cosmos,” regularly speaks to audiences on topics ranging from cosmology to climate change to the appalling state of science literacy in America. One of his staple stories hinges on a line from President George W. Bush’s speech to Congress after the 9/11 terrorist attacks. In a 2008 talk, for example, Dr. Tyson said that in order “to distinguish we from they” — meaning to divide Judeo-Christian Americans from fundamentalist Muslims — Mr. Bush uttered the words “Our God is the God who named the stars.” Dr. Tyson implied that President Bush was prejudiced against Islam in order to make a broader point about scientific awareness: Two-thirds of the named stars actually have Arabic names, given to them at a time when Muslims led the world in astronomy — and Mr. Bush might not have said what he did if he had known this fact. This is a powerful example of how our biases can blind us. But not in the way Dr. Tyson thought. Mr. Bush wasn’t blinded by religious bigotry. Instead, Dr. Tyson was fooled by his faith in the accuracy of his own memory. In his post-9/11 speech, Mr. Bush actually said, “The enemy of America is not our many Muslim friends,” and he said nothing about the stars. Mr. Bush had indeed once said something like what Dr. Tyson remembered; in 2003 Mr. Bush said, in tribute to the astronauts lost in the Columbia space shuttle explosion, that “the same creator who names the stars also knows the names of the seven souls we mourn today.” Critics pointed these facts out; some accused Dr. Tyson of lying and argued that the episode should call into question his reliability as a scientist and a public advocate. © 2014 The New York Times Company

Keyword: Learning & Memory
Link ID: 20387 - Posted: 12.03.2014

By David Z. Hambrick If you’ve spent more than about 5 minutes surfing the web, listening to the radio, or watching TV in the past few years, you will know that cognitive training—better known as “brain training”—is one of the hottest new trends in self-improvement. Lumosity, which offers web-based tasks designed to improve cognitive abilities such as memory and attention, boasts 50 million subscribers and advertises on National Public Radio. Cogmed claims to be “a computer-based solution for attention problems caused by poor working memory,” and BrainHQ will help you “make the most of your unique brain.” The promise of all of these products, implied or explicit, is that brain training can make you smarter—and make your life better. Yet, according to a statement released by the Stanford University Center on Longevity and the Berlin Max Planck Institute for Human Development, there is no solid scientific evidence to back up this promise. Signed by 70 of the world’s leading cognitive psychologists and neuroscientists, the statement minces no words: "The strong consensus of this group is that the scientific literature does not support claims that the use of software-based “brain games” alters neural functioning in ways that improve general cognitive performance in everyday life, or prevent cognitive slowing and brain disease." The statement also cautions that although some brain training companies “present lists of credentialed scientific consultants and keep registries of scientific studies pertinent to cognitive training…the cited research is [often] only tangentially related to the scientific claims of the company, and to the games they sell.” © 2014 Scientific American

Keyword: Learning & Memory
Link ID: 20380 - Posted: 12.02.2014

By Nicholas Bakalar Researchers have found that people diagnosed with diabetes in their 50s are significantly more likely than others to suffer mental decline by their 70s. The study, published Monday in the Annals of Internal Medicine, started in 1990. Scientists examined 13,351 black and white adults, aged 48 to 67, for diabetes and prediabetes using self-reported physician diagnoses and glucose control tests. They also administered widely used tests of memory, reasoning, problem solving and planning. About 13 percent had diabetes at the start. The researchers followed them with five periodic examinations over the following 20 years. By that time, 5,987 participants were still enrolled. After adjusting for numerous health and behavioral factors, and for the large attrition in the study, the researchers found people with diabetes suffered a 30 percent larger decline in mental acuity than those without the disease. Diabetes can impair blood circulation, and the authors suggest that the association of diabetes with thinking and memory problems may be the result of damage to small blood vessels in the brain. “People may think cognitive decline with age is inevitable, but it’s not,” said the senior author, Elizabeth Selvin, an associate professor of epidemiology at the Johns Hopkins Bloomberg School of Public Health. “Factors like diabetes are potentially modifiable. If we can better control diabetes, we can stave off cognitive decline and future dementia.” © 2014 The New York Times Company

Keyword: Obesity; Learning & Memory
Link ID: 20377 - Posted: 12.02.2014

by Andy Coghlan What would Stuart Little make of it? Mice have been created whose brains are half human. As a result, the animals are smarter than their siblings. The idea is not to mimic fiction, but to advance our understanding of human brain diseases by studying them in whole mouse brains rather than in dishes. The altered mice still have mouse neurons – the "thinking" cells that make up around half of all their brain cells. But practically all the glial cells in their brains, the ones that support the neurons, are human. "It's still a mouse brain, not a human brain," says Steve Goldman of the University of Rochester Medical Center in New York. "But all the non-neuronal cells are human." Goldman's team extracted immature glial cells from donated human fetuses. They injected them into mouse pups where they developed into astrocytes, a star-shaped type of glial cell. Within a year, the mouse glial cells had been completely usurped by the human interlopers. The 300,000 human cells each mouse received multiplied until they numbered 12 million, displacing the native cells. "We could see the human cells taking over the whole space," says Goldman. "It seemed like the mouse counterparts were fleeing to the margins." Astrocytes are vital for conscious thought, because they help to strengthen the connections between neurons, called synapses. Their tendrils are involved in coordinating the transmission of electrical signals across synapses. © Copyright Reed Business Information Ltd.

Keyword: Learning & Memory; Glia
Link ID: 20375 - Posted: 12.01.2014

by Aviva Rutkin THERE is only one real rule to conversing with a baby: talking is better than not talking. But that one rule can make a lifetime of difference. That's the message that the US state of Georgia hopes to send with Talk With Me Baby, a public health programme devoted to the art of baby talk. Starting in January, nurses will be trained in the best way to speak to babies to help them learn language, based on what the latest neuroscience says. Then they, along with teachers and nutritionists, will model this good behaviour for the parents they meet. Georgia hopes to expose every child born in 2015 in the Atlanta area to this speaking style; by 2018, the hope is to reach all 130,000 or so newborns across the state. Talk With Me Baby is the latest and largest attempt to provide "language nutrition" to infants in the US – a rich quantity and variety of words supplied at a critical time in the brain's development. Similar initiatives have popped up in Providence, Rhode Island, where children have been wearing high-tech vests that track every word they hear, and Hollywood, where the Clinton Foundation has encouraged television shows like Parenthood and Orange is the New Black to feature scenes demonstrating good baby talk. "The idea is that language is as important to the brain as food is to physical growth," says Arianne Weldon, director of Get Georgia Reading, one of several partner organisations involved in Talk With Me Baby. © Copyright Reed Business Information Ltd.

Keyword: Language; Development of the Brain
Link ID: 20367 - Posted: 11.29.2014

By Amy Ellis Nutt Scientists say the "outdoor effect" on nearsighted children is real: natural light is good for the eyes. It's long been thought kids are more at risk of nearsightedness, or myopia, if they spend hours and hours in front of computer screens or fiddling with tiny hand-held electronic devices. Not true, say scientists. But now there is research that suggests that children who are genetically predisposed to the visual deficit can improve their chances of avoiding eyeglasses just by stepping outside. Yep, sunshine is all they need – more specifically, the natural light of outdoors – and 14 hours a week of outdoor light should do it. Why this is the case is not exactly clear. "We don't really know what makes outdoor time so special," said Donald Mutti, the lead researcher of the study from Ohio State University College of Optometry, in a press release. "If we knew, we could change how we approach myopia." What is known is that UVB light (invisible ultraviolet B rays) plays a role in the cellular production of vitamin D, which is believed to help the eyes focus light on the retina. However, the Ohio State researchers think there is another possibility. "Between the ages of five and nine, a child's eye is still growing," said Mutti. "Sometimes this growth causes the distance between the lens and the retina to lengthen, leading to nearsightedness. We think these different types of outdoor light may help preserve the proper shape and length of the eye during that growth period."

Keyword: Vision; Development of the Brain
Link ID: 20365 - Posted: 11.29.2014

By BENEDICT CAREY Quick: Which American president served before slavery ended, John Tyler or Rutherford B. Hayes? If you need Google to get the answer, you are not alone. (It is Tyler.) Collective cultural memory — for presidents, for example — works according to the same laws as the individual kind, at least when it comes to recalling historical names and remembering them in a given order, researchers reported on Thursday. The findings suggest that leaders who are well known today, like the elder President George Bush and President Bill Clinton, will be all but lost to public memory in just a few decades. The particulars from the new study, which tested Americans’ ability to recollect the names of past presidents, are hardly jaw-dropping: People tend to recall best the presidents who served recently, as well as the first few in the country’s history. They also remember those who navigated historic events, like the ending of slavery (Abraham Lincoln) and World War II (Franklin D. Roosevelt). But the broader significance of the report — the first to measure forgetfulness over a 40-year period, using a constant list — is that societies collectively forget according to the same formula as, say, a student who has studied a list of words. Culture imitates biology, even though the two systems work in vastly different ways. The new paper was published in the journal Science. “It’s an exciting study, because it mixes history and psychology and finds this one-on-one correspondence” in the way memory functions, said David C. Rubin, a psychologist at Duke University who was not involved in the research. The report is based on four surveys by psychologists now at Washington University in St. Louis, conducted from 1974 to 2014. In the first three, in 1974, 1991 and 2009, Henry L. Roediger III gave college students five minutes to write down as many presidents as they could remember, in order. © 2014 The New York Times Company

Keyword: Learning & Memory
Link ID: 20364 - Posted: 11.29.2014