Chapter 13. Memory, Learning, and Development




Erin Ross The teenage brain has been characterized as a risk-taking machine, looking for quick rewards and thrills instead of acting responsibly. But these behaviors could actually make teens better than adults at certain kinds of learning. "In neuroscience, we tend to think that if healthy brains act in a certain way, there should be a reason for it," says Juliet Davidow, a postdoctoral researcher at Harvard University in the Affective Neuroscience and Development Lab and the lead author of the study, which was published Wednesday in the journal Neuron. But scientists and the public often focus on the negatives of teen behavior, so she and her colleagues set out to test the hypothesis that teenagers' drive for rewards, and the risk-taking that comes from it, exist for a reason. When it comes to what drives reward-seeking in teens, fingers have always been pointed at the striatum, a lobster-claw-shaped structure in the brain. When something surprising and good happens — say, you find $20 on the street — your brain releases the pleasure-related neurotransmitter dopamine, and the striatum responds. "Research shows that the teenage striatum is very active," says Davidow. This suggests that teens are hard-wired to seek immediate rewards. But, she adds, it's also shown that their prefrontal cortex, which helps with impulse control, isn't fully developed. Combined, these two things have given teens their risky rep. But the striatum isn't just involved in reward-seeking. It's also involved in learning from rewards, explains Daphna Shohamy, a cognitive neuroscientist at the Zuckerman Mind Brain Behavior Institute at Columbia University who worked on the study. She wanted to see if teenagers would be better at this type of learning than adults would. © 2016 npr

Keyword: Development of the Brain; Learning & Memory
Link ID: 22738 - Posted: 10.10.2016

Richard A. Friedman There’s a reason adults don’t pick up Japanese or learn how to kite surf. It’s ridiculously hard. In stark contrast, young people can learn the most difficult things relatively easily. Polynomials, Chinese, skateboarding — no problem! Neuroplasticity — the brain’s ability to form new neural connections and be influenced by the environment — is greatest in childhood and adolescence, when the brain is still a work in progress. But this window of opportunity is finite. Eventually it slams shut. Or so we thought. Until recently, the conventional wisdom within the fields of neuroscience and psychiatry has been that development is a one-way street, and once a person has passed through his formative years, experiences and abilities are very hard, if not impossible, to change. What if we could turn back the clock in the brain and recapture its earlier plasticity? This possibility is the focus of recent research in animals and humans. The basic idea is that during critical periods of brain development, the neural circuits that help give rise to mental states and behaviors are being sculpted and are particularly sensitive to the effects of experience. If we can understand what starts and stops these periods, perhaps we can restart them. Think of the brain’s sensitive periods as blown glass: The molten glass is very malleable, but you have a relatively brief time before it cools and becomes crystalline. Put it back into the furnace, and it can once again change shape. © 2016 The New York Times Company

Keyword: Development of the Brain; Learning & Memory
Link ID: 22737 - Posted: 10.10.2016

Dean Burnett Throughout history, people have always worried about new technologies. The fear that the human brain cannot cope with the onslaught of information made possible by the latest development was first voiced in response to the printing press, back in the sixteenth century. Swap “printing press” for “internet” and you have the exact same concerns today, regularly voiced in the mainstream media, and usually focused on children. But is there any legitimacy to these claims? Or are they just needless scaremongering? There are several things to bear in mind when considering how our brains deal with the internet. The human brain is always dealing with a constant stream of rich information - that’s what the real world is. First, don’t forget that “the internet” is a very vague term, given that it contains so many things across so many formats. You could, for instance, develop a gambling addiction via online casinos or poker sites. This is an example of someone’s brain being negatively affected via the internet, but it would be difficult to argue that the internet is the main culprit, any more than a gambling addiction obtained via a real world casino can be blamed on “buildings”; it’s just the context in which the problem occurred. However, the internet does give us a far more direct, constant and wide-ranging access to information than pretty much anything else in human history. So how could, or does, this affect us and our brains? © 2016 Guardian News and Media Limited

Keyword: Learning & Memory
Link ID: 22736 - Posted: 10.10.2016

By Anna Azvolinsky The human cerebral cortex experiences a burst of growth late in fetal development thanks to the expansion and migration of progenitor cells that ultimately form excitatory neurons. For a fully functional brain, in addition to excitatory neurons, inhibitory ones (called interneurons) are also necessary. Yet scientists have not been able to account for the increase in inhibitory neurons that occurs after birth. Now, in a paper published today (October 6) in Science, researchers from the University of California, San Francisco (UCSF), have shown that there is a reserve of young neurons that continue to migrate and integrate into the frontal lobes of infants. “It was thought previously that addition of new neurons to the human cortex [mostly] happens only during fetal development. This new study shows that young neurons continue to migrate on a large scale into the cerebral cortex of infants,” Benedikt Berninger, who studies brain development at the Johannes Gutenberg University of Mainz, Germany, and was not involved in the work, wrote in an email to The Scientist. “This implies that experience during the first few months could affect this migration and thereby contribute to brain plasticity.” Aside from the migration of neurons into the olfactory bulb in infants, “this is the first time anyone has been able to catch neurons in the act of moving into the cortex,” said New York University neuroscientist Gord Fishell, who penned an accompanying editorial but was not involved in the work. “We kept expecting these interneurons to be new cells but, in fact, they are immature ones hanging around and taking the long road from the bottom of the brain to the cortex.” © 1986-2016 The Scientist

Keyword: Development of the Brain; Neurogenesis
Link ID: 22734 - Posted: 10.08.2016

By Seth Mnookin When Henry Molaison died at a Connecticut nursing home in 2008, at the age of 82, a front-page obituary in The New York Times called him “the most important patient in the history of brain science.” It was no exaggeration: Much of what we know about how memory works is derived from experiments on Molaison, a patient with severe epilepsy who in 1953 had undergone an operation that left him without medial temporal lobes and the ability to form new memories. The operation didn’t completely stop Molaison’s seizures — the surgeon, William Beecher Scoville, had done little more than guess at the locus of his affliction — but by chance, it rendered him a near-perfect research subject. Not only could postoperative changes in his behavior be attributed to the precise area of his brain that had been removed, but the fact that he couldn’t remember what had happened 30 seconds earlier made him endlessly patient and eternally willing to endure all manner of experiments. It didn’t take long for those experiments to upend our understanding of the human brain. By the mid-1950s, studies on Molaison (known until his death only as Patient H.M.) had shown that, contrary to popular belief, memories were created not in the brain as a whole, but in specific regions — and that different types of memories were formed in different ways. Molaison remained a research subject until his death, and for the last 41 years of his life, the person who controlled access to him, and was involved in virtually all the research on him, was an MIT neuroscientist named Suzanne Corkin. Copyright 2016 Undark

Keyword: Learning & Memory
Link ID: 22729 - Posted: 10.05.2016

By GRETCHEN REYNOLDS A single concussion experienced by a child or teenager may have lasting repercussions on mental health and intellectual and physical functioning throughout adulthood, and multiple head injuries increase the risks of later problems, according to one of the largest, most elaborate studies to date of the impacts of head trauma on the young. You cannot be an athlete, parent of an athlete, sports fan or reader of this newspaper and not be aware that concussions appear to be both more common — and more dangerous — than most of us once thought. According to a report released last week by the health insurer Blue Cross Blue Shield, based on data from medical claims nationwide, the incidence of diagnosed concussions among people under the age of 20 climbed 71 percent between 2010 and 2015. The rates rose most steeply among girls, with the incidence soaring by 119 percent during that time, although almost twice as many concussions over all were diagnosed in boys. The report acknowledges that the startling increase may partly reflect a growing awareness of the injury among parents, sports officials and physicians, which has led to more diagnoses. But the sheer numbers also suggest that more young people, particularly young athletes, are experiencing head injuries than in the past. Similar increases have been noted among young people in other nations. But the consequences, if any, for their health during adulthood have largely remained unknown. So for the new study, which was funded primarily by the Wellcome Trust and published in August in PLOS Medicine, scientists from Oxford University, Indiana University, the Karolinska Institute in Stockholm and other universities turned to an extensive trove of data about the health of people in Sweden. © 2016 The New York Times Company

Keyword: Brain Injury/Concussion; Development of the Brain
Link ID: 22728 - Posted: 10.05.2016

Jon Hamilton Want to be smarter? More focused? Free of memory problems as you age? If so, don't count on brain games to help you. That's the conclusion of an exhaustive evaluation of the scientific literature on brain training games and programs. It was published Monday in the journal Psychological Science in the Public Interest. "It's disappointing that the evidence isn't stronger," says Daniel Simons, an author of the article and a psychology professor at the University of Illinois at Urbana-Champaign. "It would be really nice if you could play some games and have it radically change your cognitive abilities," Simons says. "But the studies don't show that on objectively measured real-world outcomes." The evaluation, done by a team of seven scientists, is a response to a very public disagreement about the effectiveness of brain games, Simons says. In October 2014, more than 70 scientists published an open letter objecting to marketing claims made by brain training companies. Pretty soon, another group, with more than 100 scientists, published a rebuttal saying brain training has a solid scientific base. "So you had two consensus statements, each signed by many, many people, that came to essentially opposite conclusions," Simons says. © 2016 npr

Keyword: Learning & Memory
Link ID: 22727 - Posted: 10.05.2016

By Rebecca Robbins In the months before his death, Robin Williams was besieged by paranoia and so confused he couldn’t remember his lines while filming a movie, as his brain was ambushed by what doctors later identified as an unusually severe case of Lewy body dementia. “Robin was losing his mind and he was aware of it. Can you imagine the pain he felt as he experienced himself disintegrating?” the actor’s widow, Susan Schneider Williams, wrote in a wrenching editorial published this week in the journal Neurology. The title of her piece: “The terrorist inside my husband’s brain.” Susan Williams addressed the editorial to neurologists, writing that she hoped her husband’s story would “help you understand your patients along with their spouses and caregivers a little more.” Susan Williams has previously blamed Lewy body dementia for her husband’s death by suicide in 2014. About 1.3 million Americans have the disease, which is caused by protein deposits in the brain. Williams was diagnosed with Parkinson’s disease a few months before he died; the telltale signs of Lewy body dementia in his brain were not discovered until an autopsy. The editorial chronicles Williams’s desperation as he sought to understand a bewildering array of symptoms that started with insomnia, constipation, and an impaired sense of smell and soon spiraled into extreme anxiety, tremors, and difficulty reasoning. © 2016 Scientific American

Keyword: Alzheimers
Link ID: 22721 - Posted: 10.02.2016

Mia Persson Dogs may look to humans for help in solving impossible tasks thanks to some genes previously linked to social disorders in people. Beagles with particular variants in a gene associated with autism were more likely to sidle up to and make physical contact with a human stranger, researchers report September 29 in Scientific Reports. That gene, SEZ6L, is one of five genes in a particular stretch of beagle DNA associated with sociability in the dogs, animal behaviorist Per Jensen and colleagues at Linköping University in Sweden say. Versions of four of those five genes have been linked to human social disorders such as autism, schizophrenia and aggression. “What we figure has been going on here is that there are genetic variants that tend to make dogs more sociable and these variants have been selected during domestication,” Jensen says. But other researchers say the results are preliminary and need to be confirmed by looking at other dog breeds. Previous genetic studies of dog domestication have not implicated these genes. But, says evolutionary geneticist Bridgett vonHoldt of Princeton University, genes that influence sociability are “not an unlikely target for domestication — as humans, we would be most interested in a protodog that was interested in spending time with humans.” © Society for Science & the Public 2000 - 2016.

Keyword: Autism; Genes & Behavior
Link ID: 22716 - Posted: 09.30.2016

By Deborah R. Glasofer, Joanna Steinglass Every day on the dot of noon, Jane* would eat her 150-calorie lunch: nonfat yogurt and a handful of berries. To eat earlier, she felt, would be “gluttonous.” To eat later would disrupt the dinner ritual. Jane's eating initially became more restrictive in adolescence, when she worried about the changes her body was undergoing in the natural course of puberty. When she first settled on her lunchtime foods and routine—using a child-size spoon to “make the yogurt last” and sipping water between each bite—she felt accomplished. Jane enjoyed her friends' compliments about her “incredible willpower.” In behavioral science terms, her actions were goal-directed, motivated by achieving a particular outcome. In relatively short order, she got the result she really wanted: weight loss. Years later Jane, now in her 30s and a newspaper reporter, continued to eat the same lunch in the same way. Huddled over her desk in the newsroom, she tried to avoid unwanted attention and feared anything that might interfere with the routine. She no longer felt proud of her behavior. Her friends stopped complimenting her “self-control” years ago, when her weight plummeted perilously low. So low that she has had to be hospitalized on more than one occasion. The longed-for weight loss did not make her feel better about herself or her appearance. Jane's curly hair, once shiny and thick, dulled and thinned; her skin and eyes lost their brightness. There were other costs as well—to her relationships, to her career. Instead of dreaming about a great romance, Jane would dream of the cupcakes she could not let herself have at her niece's birthday party. Instead of thinking about the best lead for her next story, she obsessed over calories and exercise. © 2016 Scientific American

Keyword: Anorexia & Bulimia; Attention
Link ID: 22713 - Posted: 09.30.2016

Jon Hamilton What rats can remember may help people who forget. Researchers are reporting evidence that rats possess "episodic memories," the kind of memories that allow us to go back in time and recall specific events. These memories are among the first to disappear in people who develop Alzheimer's disease. The finding, which appears Thursday in Current Biology, suggests that rats could offer a better way to test potential drugs for Alzheimer's. Right now, most of these drugs are tested in mice. "We need to have a way to study the exact type of memory that we think is impaired in Alzheimer's disease," says Bruce Lamb, a professor of medical and molecular genetics at Indiana University in Indianapolis. He was not involved in the study. The lack of an adequate animal model of Alzheimer's disease may be one reason drugs that seemed to work in mice have failed when given to people, Lamb says. Loss of episodic memories, especially recent ones, is a key sign of Alzheimer's, says Jonathon Crystal, an author of the study and director of the neuroscience program at Indiana University in Bloomington. "So if you visit your grandmother who has Alzheimer's, [she] isn't going to remember that you were visiting a couple of weeks ago and what you described about things that are going on in your life," he says. Crystal and a team of researchers thought rats might have some form of episodic memory. So they began doing studies that relied on the animals' remarkable ability to recognize a wide range of odors, like basil and banana and strawberry. © 2016 npr

Keyword: Alzheimers
Link ID: 22711 - Posted: 09.30.2016

By CATHERINE SAINT LOUIS Increasing numbers of children have high blood pressure, largely as a consequence of their obesity. A growing body of evidence suggests that high blood pressure may impair children’s cognitive skills, reducing their ability to remember, pay attention and organize facts. In the most comprehensive study to date, published on Thursday in The Journal of Pediatrics, 75 children ages 10 to 18 with untreated high blood pressure performed worse on several tests of cognitive function, compared with 75 peers who had normal blood pressure. The differences were subtle, and the new research does not prove that high blood pressure diminishes cognitive skills in children. Still, the findings set off alarm bells among some experts. “This study really shows there are some differences,” said Dr. David B. Kershaw, the director of pediatric nephrology at C. S. Mott Children’s Hospital at the University of Michigan, who was not involved with the research. “This was not just random chance.” Dr. Marc B. Lande, a professor of pediatric nephrology at the University of Rochester Medical Center, and his colleagues had children tested at four sites in three states, matching those with and without high blood pressure by age, maternal education, race, obesity levels and other factors. The researchers excluded children with learning disabilities and sleep problems, which can affect cognitive skills. Children with elevated blood pressure performed worse than their peers on tests of memory, processing speed and verbal skills, the researchers found. But all the scores were still in the normal range. Because of increased obesity, elevated blood pressure, also called hypertension, is no longer rare in children, though it is underdiagnosed. In a recent survey, about 3.5 percent of 14,187 children ages 3 to 18 had hypertension. © 2016 The New York Times Company

Keyword: ADHD; Obesity
Link ID: 22709 - Posted: 09.29.2016

Hannah Devlin Scientists have found the most definitive evidence yet that some people are destined to age quicker and die younger than others - regardless of their lifestyle. The findings could explain the seemingly random and unfair way that death is sometimes dealt out, and raise the intriguing future possibility of being able to extend the natural human lifespan. “You get people who are vegan, sleep 10 hours a day, have a low-stress job, and still end up dying young,” said Steve Horvath, a biostatistician who led the research at the University of California, Los Angeles. “We’ve shown some people have a faster innate ageing rate.” A higher biological age, regardless of actual age, was consistently linked to an earlier death, the study found. For the 5% of the population who age fastest, this translated to a roughly 50% greater than average risk of death at any age. Intriguingly, the biological changes linked to ageing are potentially reversible, raising the prospect of future treatments that could arrest the ageing process and extend the human lifespan. “The great hope is that we find anti-ageing interventions that would slow your innate ageing rate,” said Horvath. “This is an important milestone to realising this dream.” Horvath’s ageing “clock” relies on measuring subtle chemical changes, in which methyl compounds attach or detach from the genome without altering the underlying code of our DNA. © 2016 Guardian News and Media Limited

Keyword: Development of the Brain; Epigenetics
Link ID: 22708 - Posted: 09.29.2016
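The “clock” in the entry above is, at its core, a weighted sum: Horvath’s published model combines DNA-methylation levels (beta values between 0 and 1) at 353 CpG sites into an age estimate, which is then compared with chronological age. The sketch below illustrates only the idea; the site names, weights, and intercept are hypothetical placeholders, not the published coefficients, and the real model also applies a calibration transform.

```python
# Minimal sketch of an epigenetic "age clock": estimate biological age as
# a weighted sum of methylation beta values at selected CpG sites.
# All site names, weights, and the intercept below are made-up
# illustrations, NOT the coefficients of Horvath's published model.

HYPOTHETICAL_WEIGHTS = {
    "cg0001": 12.4,   # placeholder CpG site identifiers and coefficients
    "cg0002": -8.1,
    "cg0003": 5.7,
}
INTERCEPT = 35.0      # placeholder intercept

def estimate_biological_age(betas: dict) -> float:
    """Linear combination of methylation beta values plus an intercept."""
    return INTERCEPT + sum(
        w * betas[site] for site, w in HYPOTHETICAL_WEIGHTS.items()
    )

sample = {"cg0001": 0.62, "cg0002": 0.30, "cg0003": 0.45}
bio_age = estimate_biological_age(sample)          # ~42.8 for this profile
print(f"Estimated biological age: {bio_age:.1f} years")
# The gap between this estimate and chronological age ("age acceleration")
# is the quantity the study links to mortality risk.
```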

By Edd Gent A brain-inspired computing component provides the most faithful emulation yet of connections among neurons in the human brain, researchers say. The so-called memristor, an electrical component whose resistance relies on how much charge has passed through it in the past, mimics the way calcium ions behave at the junction between two neurons in the human brain, the study said. That junction is known as a synapse. The researchers said the new device could lead to significant advances in brain-inspired—or neuromorphic—computers, which could be much better at perceptual and learning tasks than traditional computers, as well as far more energy efficient. "In the past, people have used devices like transistors and capacitors to simulate synaptic dynamics, which can work, but those devices have very little resemblance to real biological systems. So it's not efficient to do it that way, and it results in a larger device area, larger energy consumption and less fidelity," said study leader Joshua Yang, a professor of electrical and computer engineering at the University of Massachusetts Amherst. Previous research has suggested that the human brain has about 100 billion neurons and approximately 1 quadrillion (1 million billion) synapses. A brain-inspired computer would ideally be designed to mimic the brain's enormous computing power and efficiency, scientists have said. © 2016 Scientific American

Keyword: Robotics; Learning & Memory
Link ID: 22705 - Posted: 09.28.2016
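The defining property the entry describes can be made concrete in a few lines: a memristor’s resistance is a function of the charge that has previously flowed through it. Below is a toy charge-controlled model in the spirit of the classic HP Labs linear-drift description, not the diffusive device in the study; the resistance range, switching charge, and pulse schedule are illustrative assumptions.

```python
# Toy charge-controlled memristor: resistance depends on accumulated charge,
# loosely analogous to a synapse strengthening with repeated use.
# Parameter values are illustrative, not taken from any real device.

R_ON, R_OFF = 100.0, 16_000.0   # fully-on / fully-off resistance (ohms)
Q_MAX = 1e-5                    # charge (coulombs) needed to switch fully on

class Memristor:
    def __init__(self):
        self.q = 0.0            # net charge passed through the device so far

    def resistance(self) -> float:
        w = min(max(self.q / Q_MAX, 0.0), 1.0)   # internal state in [0, 1]
        return R_OFF + (R_ON - R_OFF) * w        # interpolate between states

    def apply_voltage(self, v: float, dt: float) -> float:
        """Apply voltage v for dt seconds; return the current that flowed."""
        i = v / self.resistance()
        self.q += i * dt        # history accumulates, so resistance changes
        return i

m = Memristor()
print(f"R before: {m.resistance():.0f} ohms")    # 16000: high resistance
for _ in range(20):
    m.apply_voltage(1.0, 0.01)                   # repeated 10 ms pulses
print(f"R after:  {m.resistance():.0f} ohms")    # driven toward 100: "potentiated"
```

Unlike a transistor-plus-capacitor circuit, the state here lives in the device’s own conduction history, which is the resemblance to biological synapses that the researchers highlight.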

By GRETCHEN REYNOLDS Before you skip another workout, you might think about your brain. A provocative new study finds that some of the benefits of exercise for brain health may evaporate if we take to the couch and stop being active, even just for a week or so. I have frequently written about how physical activity, especially endurance exercise like running, aids our brains and minds. Studies with animals and people show that working out can lead to the creation of new neurons, blood vessels and synapses and greater overall volume in areas of the brain related to memory and higher-level thinking. Presumably as a result, people and animals that exercise tend to have sturdier memories and cognitive skills than their sedentary counterparts. Exercise prompts these changes in large part by increasing blood flow to the brain, many exercise scientists believe. Blood carries fuel and oxygen to brain cells, along with other substances that help to jump-start desirable biochemical processes there, so more blood circulating in the brain is generally a good thing. Exercise is particularly important for brain health because it appears to ramp up blood flow through the skull not only during the actual activity, but throughout the rest of the day. In past neurological studies, when sedentary people began an exercise program, they soon developed augmented blood flow to their brains, even when they were resting and not running or otherwise moving. But whether those improvements in blood flow are permanent or how long they might last was not clear. So for the new study, which was published in August in Frontiers in Aging Neuroscience, researchers from the department of kinesiology at the University of Maryland in College Park decided to ask a group of exceedingly fit older men and women to stop exercising for a while. © 2016 The New York Times Company

Keyword: Learning & Memory; Development of the Brain
Link ID: 22704 - Posted: 09.28.2016

Ramin Skibba Physiologist Ivan Pavlov conditioned dogs to associate food with the sound of a buzzer, which left them salivating. Decades later, researchers discovered such training appears to block efforts to teach the animals to link other stimuli to the same reward. Dogs trained to expect food when a buzzer sounds can then be conditioned to salivate when they are exposed to the noise and a flash of light simultaneously. But light alone will not cue them to drool. This ‘blocking effect’ is well-known in psychology, but new research suggests that the concept might not be so simple. Psychologists in Belgium failed to replicate the effect in 15 independent experiments, they report this month in the Journal of Experimental Psychology. “For a long time, you tend to think, ‘It’s me’ — I’m doing something wrong, or messing up the experiment,” says lead author Tom Beckers, a psychologist at the Catholic University of Leuven (KU Leuven) in Belgium. But after his student, co-author Elisa Maes, also could not replicate the blocking effect, and the team failed again in experiments in other labs, Beckers realized that “it can’t just be us”. The scientists do not claim that the blocking effect is not real, or that previous observations of it are wrong. Instead, Beckers thinks that psychologists do not yet know enough about the precise conditions under which it applies. © 2016 Macmillan Publishers Limited

Keyword: Learning & Memory
Link ID: 22701 - Posted: 09.27.2016
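The blocking effect at issue in the entry above is a direct prediction of the textbook Rescorla-Wagner model of associative learning, in which all cues present on a trial share a single prediction-error term. The sketch below reproduces that logic with illustrative parameters (not values fit to any experiment): once the buzzer fully predicts food, the error term is near zero during compound training, so the light acquires almost no associative strength.

```python
# Rescorla-Wagner demonstration of blocking.
# Phase 1: cue A (buzzer) -> food, trained to asymptote.
# Phase 2: compound A + B (buzzer + light) -> food.
# Because A already predicts the reward, the shared prediction error is
# ~0 in phase 2 and cue B (light) learns almost nothing: blocking.

ALPHA = 0.3    # learning rate / cue salience (illustrative)
LAMBDA = 1.0   # asymptote of learning supported by the food reward

V = {"A": 0.0, "B": 0.0}   # associative strengths

def trial(cues):
    """One conditioning trial: all presented cues share one error term."""
    error = LAMBDA - sum(V[c] for c in cues)   # surprise = reward - prediction
    for c in cues:
        V[c] += ALPHA * error

for _ in range(30):          # Phase 1: buzzer alone, paired with food
    trial(["A"])
for _ in range(30):          # Phase 2: buzzer + light, paired with food
    trial(["A", "B"])

print(f"V_A = {V['A']:.2f}, V_B = {V['B']:.2f}")   # ~1.00 and ~0.00
```

The failed replications reported above are striking precisely because this shared-error account has been a cornerstone of learning theory for decades.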

By Abdul-Kareem Ahmed In the world of recreational and professional sports, many athletes—particularly in contact sports—suffer concussions. These mild traumatic brain injuries cause headaches, memory problems and confusion, but usually resolve on their own with rest. Some players, however, especially after repeated concussions, continue to experience symptoms for many months—a phenomenon termed post-concussion syndrome. A few of these players will eventually develop chronic traumatic encephalopathy (CTE), a progressive neurodegenerative disease that causes dementia symptoms similar to Alzheimer’s disease. CTE can lead to personality changes, movement problems and, sometimes, death. The disease is diagnosed only after death, because it requires postmortem examination of a player’s brain. Post-concussion syndrome, in contrast, is diagnosed based on patient symptoms. To date, doctors do not have any objective tests to determine syndrome severity or relate it to the risk of developing CTE. Now, a group of researchers from Sweden and the U.K. say they have developed such a test, reporting their findings last week in JAMA Neurology. The test measures biomarkers in the cerebrospinal fluid—the colorless liquid that supports and suspends the brain and spinal cord—that appear to provide a measure of concussion severity and CTE risk. The researchers collected cerebrospinal fluid via spinal taps from 16 professional Swedish ice hockey players and a similar number of healthy individuals. The hockey players had all experienced post-concussion syndrome, causing nine of them to retire from the game. © 2016 Scientific American

Keyword: Brain Injury/Concussion; Alzheimers
Link ID: 22700 - Posted: 09.27.2016

By KEN BELSON One of the frustrations of researchers who study chronic traumatic encephalopathy, the degenerative brain disease linked to repeated head hits, is that it can be detected only in autopsies, and not in the living. Researchers, though, have been trying to solve this problem in two primary ways: by identifying biomarkers linked to the disease that show up on imaging tests in certain locations in the brain, and by trying to locate in the blood the protein that is the hallmark of the disease. On Monday, two groups of researchers said they had made what they considered small steps in developing both methods. The announcements are small parts of much larger studies that will take years to bear fruit, if they ever do. Both methods have been questioned by detractors, some of whom say the hype is getting ahead of the science. Scientists, these critics note, have spent decades trying to find ways to accurately diagnose Alzheimer’s disease, which has some of the same characteristics as C.T.E. Still, at a medical conference in Boston on Monday, Robert Stern, a professor of neurology at Boston University, said technology developed by the company Quanterix (paid for in part with a grant from the N.F.L.) had identified elevated levels of tau proteins in blood samples of 96 former football players between 40 and 69 years old, compared with only 25 people of the same age in a control group. The results, which are part of a seven-year study and are under review for publication, are preliminary because they identify only the total amount of tau in the blood, not the amount of the specific tau linked to C.T.E. Additional tests are being done in Sweden to determine the amount of the C.T.E.-related tau in the blood samples, Stern said. Even so, Stern said, the blood samples from the 96 former players suggest that absorbing repeated head hits earlier in life can lead to higher concentrations of tau in the blood later. © 2016 The New York Times Company

Keyword: Brain Injury/Concussion; Alzheimers
Link ID: 22699 - Posted: 09.27.2016

By CONOR DOUGHERTY SAN FRANCISCO — Every now and again, when I’m feeling a little down, I go to Baseball-Reference.com and look up the San Francisco Giants’ box score from July 29, 2012. It’s an odd choice for a Giants fan. The Los Angeles Dodgers won, 4-0, completing a weekend sweep in which they outscored the Giants by 19-3 and tied them for the lead in the National League West. The Giants went on to win the World Series that year, but that’s not why I remember the July 29 game. I remember that afternoon because my mom, in the throes of Alzheimer’s, left the house she shared with my dad in the Noe Valley neighborhood, walked four or so miles and somehow ended up at AT&T Park. Then she went inside and watched her team. It took a while for me to believe this. When Mom told me she had gone to the park — my dad barely watches baseball, so the Giants have always been a thing between me and Mom — I assumed it was an old memory misplaced on a new day. But it turned out that Sunday game did overlap with the hours she had been out, and a month or so later my dad got a credit card bill with the charge for the ticket. I can’t tell you when Mom cheered or if she managed to find her seat. All I know is Clayton Kershaw struck out seven, the Giants had five hits, and even though I’ve committed these statistics to memory, I still like looking them up. On the chance that this hasn’t been clubbed into your head by now, the Giants have won the World Series in every even-numbered year this decade. And for reasons that I choose to see as cosmic, this run of baseball dominance has tracked my mom’s descent into Alzheimer’s. The disease doesn’t take people from you in a day or a week or a season. You get years of steady disappearance, with an indeterminate end. So for me and Mom and baseball, this decade has been a long goodbye. © 2016 The New York Times Company

Keyword: Alzheimers
Link ID: 22690 - Posted: 09.24.2016

By David Z. Hambrick, Fredrik Ullén, Miriam Mosing Elite-level performance can leave us awestruck. This summer, in Rio, Simone Biles appeared to defy gravity in her gymnastics routines, and Michelle Carter seemed to harness super-human strength to win gold in the shot put. Michael Phelps, meanwhile, collected 5 gold medals, bringing his career total to 23. In everyday conversation, we say that elite performers like Biles, Carter, and Phelps must be “naturals” who possess a “gift” that “can’t be taught.” What does science say? Is innate talent a myth? This question is the focus of the new book Peak: Secrets from the New Science of Expertise by Florida State University psychologist Anders Ericsson and science writer Robert Pool. Ericsson and Pool argue that, with the exception of height and body size, the idea that we are limited by genetic factors—innate talent—is a pernicious myth. “The belief that one’s abilities are limited by one’s genetically prescribed characteristics....manifests itself in all sorts of ‘I can’t’ or ‘I’m not’ statements,” Ericsson and Pool write. The key to extraordinary performance, they argue, is “thousands and thousands of hours of hard, focused work.” To make their case, Ericsson and Pool review evidence from a wide range of studies demonstrating the effects of training on performance. In one study, Ericsson and his late colleague William Chase found that, through over 230 hours of practice, a college student was able to increase his digit span—the number of random digits he could recall—from a normal 7 to nearly 80. In another study, the Japanese psychologist Ayako Sakakibara enrolled 24 children from a private Tokyo music school in a training program designed to train “perfect pitch”—the ability to name the pitch of a tone without hearing another tone for reference. With a trainer playing a piano, the children learned to identify chords using colored flags—for example, a red flag for CEG and a green flag for DGH. Then, the children were tested on their ability to identify the pitches of individual notes until they reached a criterion level of proficiency. By the end of the study, the children had seemed to acquire perfect pitch. Based on these findings, Ericsson and Pool conclude that the “clear implication is that perfect pitch, far from being a gift bestowed upon only a lucky few, is an ability that pretty much anyone can develop with the right exposure and training.” © 2016 Scientific American

Keyword: Intelligence; Genes & Behavior
Link ID: 22674 - Posted: 09.21.2016