Chapter 17. Learning and Memory



By Agata Blaszczak-Boxe Some rodents have a sweet tooth. And sometimes, you need to get crafty to reach your sugar fix. Rats have been filmed for the first time using hooked tools to get chocolate cereal – a manifestation of their critter intelligence. Akane Nagano and Kenjiro Aoyama, of Doshisha University in Kyotanabe, Japan, placed eight brown rats in a transparent box and trained them to pull small hooked tools to obtain the cereal that was otherwise beyond their reach. In one experiment they gave them two similar hooked tools, one of which worked well for the food retrieval task, and the other did not. The rats quickly learned to choose the correct tool for the job, selecting it 95 per cent of the time. The experiments showed that the rats understood the spatial arrangement between the food and the tool. The team’s study is the first to demonstrate that rats are able to use tools, says Nagano. The rats did get a little confused in the final experiment. When the team gave them a rake that looked the part but with a bottom that was too soft and flimsy to move the cereal, they still tried to use it as much as the working tool that was also available. But, says Nagano, it is possible their eyesight was simply not good enough for them to tell that the flimsy tool wasn’t up to the task. The rodents’ crafty feat places them in the ever-growing club of known tool-using animals such as chimps, bearded capuchin monkeys, New Caledonian crows, alligators and even some fish. © Copyright Reed Business Information Ltd.

Keyword: Learning & Memory; Intelligence
Link ID: 22774 - Posted: 10.22.2016

By Gareth Cook According to the American Psychiatric Association, about 5 percent of American children suffer from Attention Deficit Hyperactivity Disorder (ADHD), yet the diagnosis is given to some 15 percent of American children, many of whom are placed on powerful drugs with lifelong consequences. This is the central fact of the journalist Alan Schwarz’s new book, ADHD Nation. Explaining this fact—how it is that perhaps two thirds of the children diagnosed with ADHD do not actually suffer from the disorder—is the book’s central mystery. The result is a damning indictment of the pharmaceutical industry, and an alarming portrait of what is being done to children in the name of mental health. What prompted you to write this book? In 2011, having spent four years exposing the dangers of concussions in the National Football League and youth sports for The New York Times, I wanted another project. I had heard that high school students in my native Westchester County (just north of New York City) were snorting Adderall before the S.A.T.'s to focus during the test. I was horrified and wanted to learn more. I saw it not as a "child psychiatry" story, and not as a "drug abuse" story, but one about academic pressure and the demands our children feel they're under. When I looked deeper, it was obvious that our nationwide system of ADHD treatment was completely scattershot—basically, many doctors were merely prescribing with little thought as to whether a kid really had ADHD or not, and then the pills would be bought and sold among students who had no idea what they were messing with. I asked the ADHD and child-psychiatry establishment about this, and they denied it was happening. They denied that there were many false diagnoses. They denied that teenagers were buying and selling pills. They denied that the national diagnosis rates reported by the C.D.C.—then 9.5 percent of children aged 4-17, now 11 percent and still growing—were valid. They basically denied that anything about their world was malfunctioning at all. In the end, they doth protest too much. I wrote about 10 front-page stories for The New York Times on the subject from 2012-2014. © 2016 Scientific American

Keyword: ADHD; Drug Abuse
Link ID: 22747 - Posted: 10.12.2016

Erin Ross The teenage brain has been characterized as a risk-taking machine, looking for quick rewards and thrills instead of acting responsibly. But these behaviors could actually make teens better than adults at certain kinds of learning. "In neuroscience, we tend to think that if healthy brains act in a certain way, there should be a reason for it," says Juliet Davidow, a postdoctoral researcher at Harvard University in the Affective Neuroscience and Development Lab and the lead author of the study, which was published Wednesday in the journal Neuron. But scientists and the public often focus on the negatives of teen behavior, so she and her colleagues set out to test the hypothesis that teenagers' drive for rewards, and the risk-taking that comes from it, exist for a reason. When it comes to what drives reward-seeking in teens, fingers have always been pointed at the striatum, a lobster-claw-shape structure in the brain. When something surprising and good happens — say, you find $20 on the street — your body produces the pleasure-related hormone dopamine, and the striatum responds. "Research shows that the teenage striatum is very active," says Davidow. This suggests that teens are hard-wired to seek immediate rewards. But, she adds, it's also shown that their prefrontal cortex, which helps with impulse control, isn't fully developed. Combined, these two things have given teens their risky rep. But the striatum isn't just involved in reward-seeking. It's also involved in learning from rewards, explains Daphna Shohamy, a cognitive neuroscientist at the Zuckerman Mind Brain Behavior Institute at Columbia University who worked on the study. She wanted to see if teenagers would be better at this type of learning than adults would. © 2016 npr

Keyword: Development of the Brain; Learning & Memory
Link ID: 22738 - Posted: 10.10.2016

Richard A. Friedman There’s a reason adults don’t pick up Japanese or learn how to kite surf. It’s ridiculously hard. In stark contrast, young people can learn the most difficult things relatively easily. Polynomials, Chinese, skateboarding — no problem! Neuroplasticity — the brain’s ability to form new neural connections and be influenced by the environment — is greatest in childhood and adolescence, when the brain is still a work in progress. But this window of opportunity is finite. Eventually it slams shut. Or so we thought. Until recently, the conventional wisdom within the fields of neuroscience and psychiatry has been that development is a one-way street, and once a person has passed through his formative years, experiences and abilities are very hard, if not impossible, to change. What if we could turn back the clock in the brain and recapture its earlier plasticity? This possibility is the focus of recent research in animals and humans. The basic idea is that during critical periods of brain development, the neural circuits that help give rise to mental states and behaviors are being sculpted and are particularly sensitive to the effects of experience. If we can understand what starts and stops these periods, perhaps we can restart them. Think of the brain’s sensitive periods as blown glass: The molten glass is very malleable, but you have a relatively brief time before it cools and becomes crystalline. Put it back into the furnace, and it can once again change shape. © 2016 The New York Times Company

Keyword: Development of the Brain; Learning & Memory
Link ID: 22737 - Posted: 10.10.2016

Dean Burnett Throughout history, people have always worried about new technologies. The fear that the human brain cannot cope with the onslaught of information made possible by the latest development was first voiced in response to the printing press, back in the sixteenth century. Swap “printing press” for “internet” and you have the exact same concerns today, regularly voiced in the mainstream media, and usually focused on children. But is there any legitimacy to these claims? Or are they just needless scaremongering? There are several things to bear in mind when considering how our brains deal with the internet. The human brain is always dealing with a constant stream of rich information - that’s what the real world is. First, don’t forget that “the internet” is a very vague term, given that it contains so many things across so many formats. You could, for instance, develop a gambling addiction via online casinos or poker sites. This is an example of someone’s brain being negatively affected via the internet, but it would be difficult to argue that the internet is the main culprit, any more than a gambling addiction obtained via a real world casino can be blamed on “buildings”; it’s just the context in which the problem occurred. However, the internet does give us a far more direct, constant and wide-ranging access to information than pretty much anything else in human history. So how could, or does, this affect us and our brains? © 2016 Guardian News and Media Limited

Keyword: Learning & Memory
Link ID: 22736 - Posted: 10.10.2016

By Seth Mnookin When Henry Molaison died at a Connecticut nursing home in 2008, at the age of 82, a front-page obituary in The New York Times called him “the most important patient in the history of brain science.” It was no exaggeration: Much of what we know about how memory works is derived from experiments on Molaison, a patient with severe epilepsy who in 1953 had undergone an operation that left him without medial temporal lobes and the ability to form new memories. The operation didn’t completely stop Molaison’s seizures — the surgeon, William Beecher Scoville, had done little more than guess at the locus of his affliction — but by chance, it rendered him a near-perfect research subject. Not only could postoperative changes in his behavior be attributed to the precise area of his brain that had been removed, but the fact that he couldn’t remember what had happened 30 seconds earlier made him endlessly patient and eternally willing to endure all manner of experiments. It didn’t take long for those experiments to upend our understanding of the human brain. By the mid-1950s, studies on Molaison (known until his death only as Patient H.M.) had shown that, contrary to popular belief, memories were created not in the brain as a whole, but in specific regions — and that different types of memories were formed in different ways. Molaison remained a research subject until his death, and for the last 41 years of his life, the person who controlled access to him, and was involved in virtually all the research on him, was an MIT neuroscientist named Suzanne Corkin. Copyright 2016 Undark

Keyword: Learning & Memory
Link ID: 22729 - Posted: 10.05.2016

Jon Hamilton Want to be smarter? More focused? Free of memory problems as you age? If so, don't count on brain games to help you. That's the conclusion of an exhaustive evaluation of the scientific literature on brain training games and programs. It was published Monday in the journal Psychological Science in the Public Interest. "It's disappointing that the evidence isn't stronger," says Daniel Simons, an author of the article and a psychology professor at the University of Illinois at Urbana-Champaign. "It would be really nice if you could play some games and have it radically change your cognitive abilities," Simons says. "But the studies don't show that on objectively measured real-world outcomes." The evaluation, done by a team of seven scientists, is a response to a very public disagreement about the effectiveness of brain games, Simons says. In October 2014, more than 70 scientists published an open letter objecting to marketing claims made by brain training companies. Pretty soon, another group, with more than 100 scientists, published a rebuttal saying brain training has a solid scientific base. "So you had two consensus statements, each signed by many, many people, that came to essentially opposite conclusions," Simons says. © 2016 npr

Keyword: Learning & Memory
Link ID: 22727 - Posted: 10.05.2016

By Deborah R. Glasofer, Joanna Steinglass Every day on the dot of noon, Jane* would eat her 150-calorie lunch: nonfat yogurt and a handful of berries. To eat earlier, she felt, would be “gluttonous.” To eat later would disrupt the dinner ritual. Jane's eating initially became more restrictive in adolescence, when she worried about the changes her body was undergoing in the natural course of puberty. When she first settled on her lunchtime foods and routine—using a child-size spoon to “make the yogurt last” and sipping water between each bite—she felt accomplished. Jane enjoyed her friends' compliments about her “incredible willpower.” In behavioral science terms, her actions were goal-directed, motivated by achieving a particular outcome. In relatively short order, she got the result she really wanted: weight loss. Years later Jane, now in her 30s and a newspaper reporter, continued to eat the same lunch in the same way. Huddled over her desk in the newsroom, she tried to avoid unwanted attention and feared anything that might interfere with the routine. She no longer felt proud of her behavior. Her friends stopped complimenting her “self-control” years ago, when her weight plummeted perilously low. So low that she has had to be hospitalized on more than one occasion. The longed-for weight loss did not make her feel better about herself or her appearance. Jane's curly hair, once shiny and thick, dulled and thinned; her skin and eyes lost their brightness. There were other costs as well—to her relationships, to her career. Instead of dreaming about a great romance, Jane would dream of the cupcakes she could not let herself have at her niece's birthday party. Instead of thinking about the best lead for her next story, she obsessed over calories and exercise. © 2016 Scientific American

Keyword: Anorexia & Bulimia; Attention
Link ID: 22713 - Posted: 09.30.2016

By CATHERINE SAINT LOUIS Increasing numbers of children have high blood pressure, largely as a consequence of their obesity. A growing body of evidence suggests that high blood pressure may impair children’s cognitive skills, reducing their ability to remember, pay attention and organize facts. In the most comprehensive study to date, published on Thursday in The Journal of Pediatrics, 75 children ages 10 to 18 with untreated high blood pressure performed worse on several tests of cognitive function, compared with 75 peers who had normal blood pressure. The differences were subtle, and the new research does not prove that high blood pressure diminishes cognitive skills in children. Still, the findings set off alarm bells among some experts. “This study really shows there are some differences,” said Dr. David B. Kershaw, the director of pediatric nephrology at C. S. Mott Children’s Hospital at the University of Michigan, who was not involved with the research. “This was not just random chance.” Dr. Marc B. Lande, a professor of pediatric nephrology at the University of Rochester Medical Center, and his colleagues had children tested at four sites in three states, matching those with and without high blood pressure by age, maternal education, race, obesity levels and other factors. The researchers excluded children with learning disabilities and sleep problems, which can affect cognitive skills. Children with elevated blood pressure performed worse than their peers on tests of memory, processing speed and verbal skills, the researchers found. But all the scores were still in the normal range. Because of increased obesity, elevated blood pressure, also called hypertension, is no longer rare in children, though it is underdiagnosed. In a recent survey, about 3.5 percent of 14,187 children ages 3 to 18 had hypertension. © 2016 The New York Times Company

Keyword: ADHD; Obesity
Link ID: 22709 - Posted: 09.29.2016

By Edd Gent A brain-inspired computing component provides the most faithful emulation yet of connections among neurons in the human brain, researchers say. The so-called memristor, an electrical component whose resistance relies on how much charge has passed through it in the past, mimics the way calcium ions behave at the junction between two neurons in the human brain, the study said. That junction is known as a synapse. The researchers said the new device could lead to significant advances in brain-inspired—or neuromorphic—computers, which could be much better at perceptual and learning tasks than traditional computers, as well as far more energy efficient. "In the past, people have used devices like transistors and capacitors to simulate synaptic dynamics, which can work, but those devices have very little resemblance to real biological systems. So it's not efficient to do it that way, and it results in a larger device area, larger energy consumption and less fidelity," said study leader Joshua Yang, a professor of electrical and computer engineering at the University of Massachusetts Amherst. Previous research has suggested that the human brain has about 100 billion neurons and approximately 1 quadrillion (1 million billion) synapses. A brain-inspired computer would ideally be designed to mimic the brain's enormous computing power and efficiency, scientists have said. © 2016 Scientific American
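The study's device is more elaborate, but the basic idea of a memristor — a resistor whose value depends on the charge that has already passed through it — is easy to sketch. Below is a minimal Python simulation of the classic linear ion-drift memristor model; the component values and the drive signal are illustrative assumptions, not parameters from the paper.

```python
# A minimal sketch of the classic linear ion-drift memristor model
# (illustrative values only, not the device from the study): the
# device's resistance depends on how much charge has flowed through
# it, loosely like a synapse whose strength reflects its history.
import numpy as np

R_ON, R_OFF = 100.0, 16e3   # resistance limits in ohms (assumed)
D = 10e-9                   # device thickness in metres (assumed)
MU_V = 1e-14                # dopant mobility, m^2 s^-1 V^-1 (assumed)
DT = 1e-4                   # simulation time step in seconds

def simulate(voltage_trace, w0=0.1 * D):
    """Return the resistance at each step for an applied-voltage trace."""
    w = w0                                   # doped-region width (the state)
    resistances = []
    for v in voltage_trace:
        r = R_ON * (w / D) + R_OFF * (1.0 - w / D)
        i = v / r                            # instantaneous current
        w += MU_V * (R_ON / D) * i * DT      # state drifts with charge flow
        w = min(max(w, 0.0), D)              # keep the state physical
        resistances.append(r)
    return np.array(resistances)

t = np.arange(0.0, 1.0, DT)
drive = 1.0 * np.sin(2.0 * np.pi * 1.0 * t)  # 1 V, 1 Hz sine drive
r = simulate(drive)
print(f"resistance swept from {r.max():.0f} down to {r.min():.0f} ohms")
```

Printing the range of resistances shows the model sweeping between states as charge flows back and forth; that dependence on past current is the history effect that makes the component synapse-like.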

Keyword: Robotics; Learning & Memory
Link ID: 22705 - Posted: 09.28.2016

By GRETCHEN REYNOLDS Before you skip another workout, you might think about your brain. A provocative new study finds that some of the benefits of exercise for brain health may evaporate if we take to the couch and stop being active, even just for a week or so. I have frequently written about how physical activity, especially endurance exercise like running, aids our brains and minds. Studies with animals and people show that working out can lead to the creation of new neurons, blood vessels and synapses and greater overall volume in areas of the brain related to memory and higher-level thinking. Presumably as a result, people and animals that exercise tend to have sturdier memories and cognitive skills than their sedentary counterparts. Exercise prompts these changes in large part by increasing blood flow to the brain, many exercise scientists believe. Blood carries fuel and oxygen to brain cells, along with other substances that help to jump-start desirable biochemical processes there, so more blood circulating in the brain is generally a good thing. Exercise is particularly important for brain health because it appears to ramp up blood flow through the skull not only during the actual activity, but throughout the rest of the day. In past neurological studies, when sedentary people began an exercise program, they soon developed augmented blood flow to their brains, even when they were resting and not running or otherwise moving. But whether those improvements in blood flow are permanent or how long they might last was not clear. So for the new study, which was published in August in Frontiers in Aging Neuroscience, researchers from the department of kinesiology at the University of Maryland in College Park decided to ask a group of exceedingly fit older men and women to stop exercising for a while. © 2016 The New York Times Company

Keyword: Learning & Memory; Development of the Brain
Link ID: 22704 - Posted: 09.28.2016

Ramin Skibba Physiologist Ivan Pavlov conditioned dogs to associate food with the sound of a buzzer, which left them salivating. Decades later, researchers discovered such training appears to block efforts to teach the animals to link other stimuli to the same reward. Dogs trained to expect food when a buzzer sounds can then be conditioned to salivate when they are exposed to the noise and a flash of light simultaneously. But light alone will not cue them to drool. This ‘blocking effect’ is well-known in psychology, but new research suggests that the concept might not be so simple. Psychologists in Belgium failed to replicate the effect in 15 independent experiments, they report this month in the Journal of Experimental Psychology. “For a long time, you tend to think, ‘It’s me — I’m doing something wrong, or messing up the experiment,’” says lead author Tom Beckers, a psychologist at the Catholic University of Leuven (KU Leuven) in Belgium. But after his student, co-author Elisa Maes, also could not replicate the blocking effect, and the team failed again in experiments in other labs, Beckers realized that “it can’t just be us”. The scientists do not claim that the blocking effect is not real, or that previous observations of it are wrong. Instead, Beckers thinks that psychologists do not yet know enough about the precise conditions under which it applies. © 2016 Macmillan Publishers Limited
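The article doesn't go into the underlying theory, but the standard textbook explanation of blocking comes from the Rescorla–Wagner rule, in which a cue gains associative strength only to the extent that the outcome is surprising. Here is a minimal sketch of that account; the parameter values are illustrative assumptions, and this is the classical model, not the Belgian team's procedure.

```python
# A minimal Rescorla-Wagner sketch of the blocking effect (textbook
# account, illustrative parameters). Phase 1: cue A (the buzzer) alone
# predicts food. Phase 2: A and B (buzzer + light) together predict the
# same food. Because A already predicts the reward, the prediction
# error is near zero in phase 2 and B learns almost nothing.
ALPHA = 0.2                     # learning rate (assumed)
LAM = 1.0                       # asymptote of learning (assumed)
V = {"A": 0.0, "B": 0.0}        # associative strengths of the cues

def trial(cues):
    error = LAM - sum(V[c] for c in cues)    # how surprising the food is
    for c in cues:
        V[c] += ALPHA * error

for _ in range(50):             # Phase 1: buzzer -> food
    trial(["A"])
for _ in range(50):             # Phase 2: buzzer + light -> food
    trial(["A", "B"])

print(V)   # V["A"] ends close to 1.0, V["B"] stays near 0.0: B is "blocked"
```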

Keyword: Learning & Memory
Link ID: 22701 - Posted: 09.27.2016

By David Z. Hambrick, Fredrik Ullén, Miriam Mosing Elite-level performance can leave us awestruck. This summer, in Rio, Simone Biles appeared to defy gravity in her gymnastics routines, and Michelle Carter seemed to harness super-human strength to win gold in the shot put. Michael Phelps, meanwhile, collected 5 gold medals, bringing his career total to 23. In everyday conversation, we say that elite performers like Biles, Carter, and Phelps must be “naturals” who possess a “gift” that “can’t be taught.” What does science say? Is innate talent a myth? This question is the focus of the new book Peak: Secrets from the New Science of Expertise by Florida State University psychologist Anders Ericsson and science writer Robert Pool. Ericsson and Pool argue that, with the exception of height and body size, the idea that we are limited by genetic factors—innate talent—is a pernicious myth. “The belief that one’s abilities are limited by one’s genetically prescribed characteristics....manifests itself in all sorts of ‘I can’t’ or ‘I’m not’ statements,” Ericsson and Pool write. The key to extraordinary performance, they argue, is “thousands and thousands of hours of hard, focused work.” To make their case, Ericsson and Pool review evidence from a wide range of studies demonstrating the effects of training on performance. In one study, Ericsson and his late colleague William Chase found that, through over 230 hours of practice, a college student was able to increase his digit span—the number of random digits he could recall—from a normal 7 to nearly 80. In another study, the Japanese psychologist Ayako Sakakibara enrolled 24 children from a private Tokyo music school in a training program designed to train “perfect pitch”—the ability to name the pitch of a tone without hearing another tone for reference. With a trainer playing a piano, the children learned to identify chords using colored flags—for example, a red flag for CEG and a green flag for DGH. Then, the children were tested on their ability to identify the pitches of individual notes until they reached a criterion level of proficiency. By the end of the study, the children had seemed to acquire perfect pitch. Based on these findings, Ericsson and Pool conclude that the “clear implication is that perfect pitch, far from being a gift bestowed upon only a lucky few, is an ability that pretty much anyone can develop with the right exposure and training.” © 2016 Scientific American

Keyword: Intelligence; Genes & Behavior
Link ID: 22674 - Posted: 09.21.2016

By DAVID Z. HAMBRICK and ALEXANDER P. BURGOYNE ARE you intelligent — or rational? The question may sound redundant, but in recent years researchers have demonstrated just how distinct those two cognitive attributes actually are. It all started in the early 1970s, when the psychologists Daniel Kahneman and Amos Tversky conducted an influential series of experiments showing that all of us, even highly intelligent people, are prone to irrationality. Across a wide range of scenarios, the experiments revealed, people tend to make decisions based on intuition rather than reason. In one study, Professors Kahneman and Tversky had people read the following personality sketch for a woman named Linda: “Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.” Then they asked the subjects which was more probable: (A) Linda is a bank teller or (B) Linda is a bank teller and is active in the feminist movement. Eighty-five percent of the subjects chose B, even though logically speaking, A is more probable. (All feminist bank tellers are bank tellers, though some bank tellers may not be feminists.) In the Linda problem, we fall prey to the conjunction fallacy — the belief that the co-occurrence of two events is more likely than the occurrence of one of the events. In other cases, we ignore information about the prevalence of events when judging their likelihood. We fail to consider alternative explanations. We evaluate evidence in a manner consistent with our prior beliefs. And so on. Humans, it seems, are fundamentally irrational. But starting in the late 1990s, researchers began to add a significant wrinkle to that view. As the psychologist Keith Stanovich and others observed, even the Kahneman and Tversky data show that some people are highly rational. In other words, there are individual differences in rationality, even if we all face cognitive challenges in being rational. So who are these more rational people? Presumably, the more intelligent people, right? © 2016 The New York Times Company
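The logic the subjects miss can be written in one line: a conjunction can never be more probable than either of its conjuncts, whatever the actual numbers turn out to be.

```latex
P(\text{teller} \wedge \text{feminist})
  = P(\text{teller}) \cdot P(\text{feminist} \mid \text{teller})
  \le P(\text{teller}),
\qquad \text{since } 0 \le P(\text{feminist} \mid \text{teller}) \le 1 .
```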

Keyword: Intelligence; Attention
Link ID: 22666 - Posted: 09.19.2016

By Brian Owens It’s certainly something to crow about. New Caledonian crows are known for their ingenious use of tools to get at hard-to-reach food. Now it turns out that their Hawaiian cousins are adept tool-users as well. Christian Rutz at the University of St Andrews in the UK has spent 10 years studying the New Caledonian crow and wondered whether any other crow species are disposed to use tools. So he looked for crows that have similar features to the New Caledonian crow – a straight bill and large, mobile eyes that allow it to manipulate tools, much as archaeologists use opposable thumbs as an evolutionary signature for tool use in early humans. “The Hawaiian crow really stood out,” he says. “They look quite similar.” Hawaiian crows are extinct in the wild, but 109 birds still live in two captive breeding facilities in Hawaii. That meant Rutz was able to test pretty much every member of the species. He stuffed tasty morsels into a variety of holes and crevices in a log, and gave the birds a variety of sticks to see if they would use them to dig out the food. Almost all of them did, and most extracted the food in less than a minute, faster than the researchers themselves could. “It’s mind-blowing,” says Rutz. “They’re very good at getting the tool in the right position, and if they’re not happy with it they’ll modify it or make their own.” © Copyright Reed Business Information Ltd.

Keyword: Intelligence; Learning & Memory
Link ID: 22659 - Posted: 09.15.2016

By Julia Shaw The brain, with its 100 billion neurons, allows us to do amazing things like learn multiple languages, or build things that send people into outer space. Yet despite this astonishing capacity, we routinely can’t remember where we put our keys, we forget why we went to the grocery store, and we fail when trying to recall personal life events. This apparent contradiction in functionality opens up the question of why we forget some things but remember others. Or, more fundamentally, what causes forgetting? This week my book ‘The Memory Illusion’ drops in Canada, and as a Canadian girl I want to celebrate this by showcasing some Canadian researchers who have given us insight into precisely this question. An article published recently in Psychological Science by Talya Sadeh and colleagues at the Rotman Research Institute in Toronto addresses a long-running debate in the world of memory science: do we forget things because of decay or interference? Decay. Advocates of the decay account posit that our memories slowly disappear, fading because of a passage of time during which they have not been accessed. You can picture this much like a message written in sand, with every ocean wave that flows over the shore making the writing less legible until it eventually disappears entirely. The sand represents the web of brain cells that form a memory in the brain, and the ocean waves represent time passing. © 2016 Scientific American

Keyword: Learning & Memory
Link ID: 22651 - Posted: 09.13.2016

Laura Sanders By sneakily influencing brain activity, scientists changed people’s opinions of faces. This covert neural sculpting relied on a sophisticated brain training technique in which people learn to direct their thoughts in specific ways. The results, published September 8 in PLOS Biology, support the idea that neurofeedback methods could help reveal how the brain’s behavior gives rise to perceptions and emotions. What’s more, the technique may ultimately prove useful for easing traumatic memories and treating disorders such as depression. The research is still at an early stage, says neurofeedback researcher Michelle Hampson of Yale University, but, she notes, “I think it has great promise.” Takeo Watanabe of Brown University and colleagues used functional MRI to measure people’s brain activity in an area called the cingulate cortex as participants saw pictures of faces. After participants had rated each face, a computer algorithm sorted their brain responses into patterns that corresponded to faces they liked and faces they disliked. With this knowledge in hand, the researchers then attempted to change people’s face preferences by subtly nudging brain activity in the cingulate cortex. In step 2 of the experiment, returning to the fMRI scanner, participants saw an image of a face that they had previously rated as neutral. Just after that, they were shown a disk. The goal, the participants were told, was simple: make the disk bigger by using their brains. They had no idea that the only way to make the disk grow was to think in a very particular way. © Society for Science & the Public 2000 - 2016
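The closed loop the researchers describe — decode a pattern of activity, then reward the participant for reproducing it — can be sketched in a few lines. Everything below (the toy activity patterns, the simple template decoder, the mapping from decoder output to disk size) is an illustrative stand-in, not the study's actual fMRI pipeline.

```python
# An illustrative sketch of a decoded-neurofeedback loop (a stand-in,
# not the study's fMRI pipeline): 1) learn which activity patterns go
# with liked vs disliked faces, then 2) during feedback, score each new
# pattern against the "liked" template and turn the score into the size
# of the disk shown to the participant.
import numpy as np

rng = np.random.default_rng(0)
N_VOXELS = 50   # hypothetical number of voxels in the region of interest

# Step 1: toy training data recorded while faces were being rated.
liked = rng.normal(0.5, 1.0, size=(40, N_VOXELS))
disliked = rng.normal(-0.5, 1.0, size=(40, N_VOXELS))
template = liked.mean(axis=0) - disliked.mean(axis=0)   # crude decoder

def disk_radius(pattern, lo=10.0, hi=100.0):
    """Map how 'liked-like' a pattern looks onto a disk radius in pixels."""
    score = pattern @ template / np.linalg.norm(template)
    p = 1.0 / (1.0 + np.exp(-score))          # squash the score to (0, 1)
    return lo + p * (hi - lo)

# Step 2: a feedback trial. The participant never sees this rule, only
# the disk, and has to discover by trial and error how to make it grow.
new_pattern = rng.normal(0.4, 1.0, size=N_VOXELS)
print(f"disk radius this trial: {disk_radius(new_pattern):.0f} px")
```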

Keyword: Attention; Learning & Memory
Link ID: 22646 - Posted: 09.12.2016

By GRETCHEN REYNOLDS A busy brain can mean a hungry body. We often seek food after focused mental activity, like preparing for an exam or poring over spreadsheets. Researchers speculate that heavy bouts of thinking drain energy from the brain, whose capacity to store fuel is very limited. So the brain, sensing that it may soon require more calories to keep going, apparently stimulates bodily hunger, and even though there has been little in the way of physical movement or caloric expenditure, we eat. This process may partly account for the weight gain so commonly seen in college students. Scientists at the University of Alabama at Birmingham and another institution recently experimented with exercise to counter such post-study food binges. Gary Hunter, an exercise physiologist at U.A.B., oversaw the study, which was published this month in the journal Medicine & Science in Sports & Exercise. Hunter notes that strenuous activity both increases the amount of blood sugar and lactate — a byproduct of intense muscle contractions — circulating in the blood and augments blood flow to the head. Because the brain uses sugar and lactate as fuel, researchers wondered if the increased flow of fuel-rich blood during exercise could feed an exhausted brain and reduce the urge to overeat. Thirty-eight healthy college students were invited to U.A.B.’s exercise lab to determine their fitness and metabolic rates — and to report what their favorite pizza was. Afterward, they sat quietly for 35 minutes before being given as much of their favorite pizza as they wanted, which established a baseline measure of self-indulgence. At a later date, the volunteers returned and spent 20 minutes tackling selections from college and graduate-school entrance exams. Hunter says this work has been used in other studies “to induce mental fatigue and hunger.” Next, half the students sat quietly for 15 minutes, before being given pizza. The rest of the volunteers spent those 15 minutes doing intervals on a treadmill: two minutes of hard running followed by about one minute of walking, repeated five times. This is the sort of brief but intensive routine, Hunter says, that should prompt the release of sugar and lactate into the bloodstream. These students were then allowed to gorge on pizza, too. But by and large, they did not overeat. © 2016 The New York Times Company

Keyword: Obesity; Learning & Memory
Link ID: 22643 - Posted: 09.10.2016

By Karen Zusi At least one type of social learning, or the ability to learn from observing others’ actions, is processed by individual neurons within a region of the human brain called the rostral anterior cingulate cortex (rACC), according to a study published today (September 6) in Nature Communications. The work is the first direct analysis in humans of the neuronal activity that encodes information about others’ behavior. “The idea [is] that there could be an area that’s specialized for processing things about other people,” says Matthew Apps, a neuroscientist at the University of Oxford who was not involved with the study. “How we think about other people might use distinct processes from how we might think about ourselves.” During the social learning experiments, the University of California, Los Angeles (UCLA) and CalTech–based research team recorded the activity of individual neurons in the brains of epilepsy patients. The patients were undergoing a weeks-long procedure at the Ronald Reagan UCLA Medical Center in which their brains were implanted with electrodes to locate the origin of their epileptic seizures. Access to this patient population was key to the study. “It’s a very rare dataset,” says Apps. “It really does add a lot to the story.” With data streaming out of the patients’ brains, the researchers taught the subjects to play a card game on a laptop. Each turn, the patients could select from one of two decks of face-down cards: the cards either gave $10 or $100 in virtual winnings, or subtracted $10 or $100. In one deck, 70 percent of the cards were winning cards, while in the other only 30 percent were. The goal was to rack up the most money. © 1986-2016 The Scientist
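The card game is essentially a two-armed bandit: one deck pays off 70 percent of the time, the other 30 percent, and the player has to learn which is which from feedback. The sketch below simulates that structure with a simple prediction-error learner; the payoff mix, learning rule, and choice rule are illustrative assumptions, and it models the first-hand version of the task rather than the observational version the study focuses on.

```python
# A toy version of the two-deck card task (illustrative assumptions,
# not the study's exact payoff schedule): each deck wins with a fixed
# probability, and a simple learner updates each deck's expected value
# from the feedback it receives.
import random

random.seed(1)
WIN_PROB = {"deck_A": 0.70, "deck_B": 0.30}   # the good and the bad deck

def draw(deck):
    """Dollar outcome of turning over one card from the chosen deck."""
    amount = random.choice([10, 100])
    return amount if random.random() < WIN_PROB[deck] else -amount

value = {"deck_A": 0.0, "deck_B": 0.0}        # learned expected values
ALPHA = 0.1                                   # learning rate (assumed)
winnings = 0

for _ in range(100):
    # epsilon-greedy: mostly exploit the deck currently believed better
    if random.random() < 0.1:
        deck = random.choice(list(value))
    else:
        deck = max(value, key=value.get)
    outcome = draw(deck)
    winnings += outcome
    value[deck] += ALPHA * (outcome - value[deck])   # prediction-error update

print(value, winnings)   # deck_A typically ends up with the higher value
```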

Keyword: Learning & Memory; Attention
Link ID: 22640 - Posted: 09.10.2016

By Amy Ellis Nutt Before iPhones and thumb drives, before Google docs and gigabytes of RAM, memory was more art than artifact. It wasn’t a tool or a byproduct of being human. It was essential to our character and therefore a powerful theme in both myth and literature. At the end of Book 2 of the “Divine Comedy,” with Paradise nearly in reach, Dante is dipped into the River Lethe, where the sins of the self are washed away in the waters of forgetfulness. To be truly cleansed of his memories, however, Dante must also drink from the river of oblivion. Only then will he be truly purified and the memories of his good deeds restored to him. Before we can truly remember, according to Dante, we must forget. In “Patient H.M.: A Story of Memory, Madness, and Family Secrets,” author Luke Dittrich seems to be saying that before we can forgive, we must remember. The terrible irony is that H.M., the real-life character around whom Dittrich’s book revolves, had no memory at all. In prose both elegant and intimate, and often thrilling, “Patient H.M.” is an important book about the wages not of sin but of science. It is deeply reported and surprisingly emotional, at times poignant, at others shocking. H.M., arguably the single most important research subject in the history of neuroscience, was once Henry Molaison, an ordinary New England boy. When Henry was 9 years old, he was hit by a bicyclist as he walked across the street in his home town, Hartford, Conn. © 1996-2016 The Washington Post

Keyword: Learning & Memory
Link ID: 22604 - Posted: 08.27.2016