Chapter 17. Learning and Memory




Ramin Skibba. Physiologist Ivan Pavlov conditioned dogs to associate food with the sound of a buzzer, which left them salivating. Decades later, researchers discovered such training appears to block efforts to teach the animals to link other stimuli to the same reward. Dogs trained to expect food when a buzzer sounds can then be conditioned to salivate when they are exposed to the noise and a flash of light simultaneously. But light alone will not cue them to drool. This ‘blocking effect’ is well known in psychology, but new research suggests that the concept might not be so simple. Psychologists in Belgium failed to replicate the effect in 15 independent experiments, they report this month in the Journal of Experimental Psychology. “For a long time, you tend to think, ‘It’s me — I’m doing something wrong, or messing up the experiment,’” says lead author Tom Beckers, a psychologist at the Catholic University of Leuven (KU Leuven) in Belgium. But after his student, co-author Elisa Maes, also could not replicate the blocking effect, and the team failed again in experiments in other labs, Beckers realized that “it can’t just be us”. The scientists do not claim that the blocking effect is not real, or that previous observations of it are wrong. Instead, Beckers thinks that psychologists do not yet know enough about the precise conditions under which it applies. © 2016 Macmillan Publishers Limited.
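The standard textbook account of blocking is the Rescorla–Wagner model, in which all cues present on a trial share a single prediction error: once the buzzer fully predicts the food, no error is left over for the light to absorb. Below is a minimal simulation of that account (a sketch only; the learning rate, trial counts, and function are illustrative assumptions, not taken from the study).

```python
# Rescorla-Wagner sketch of blocking (illustrative; parameters are arbitrary).
# Update rule per trial: dV = alpha * (lamb - V_total) for each cue present.

def train(trials, cues, V, alpha=0.3, lamb=1.0):
    """Run conditioning trials on the listed cues, updating V in place."""
    for _ in range(trials):
        v_total = sum(V[c] for c in cues)  # prediction from all cues present
        error = lamb - v_total             # shared reward-prediction error
        for c in cues:
            V[c] += alpha * error

V = {"buzzer": 0.0, "light": 0.0}
train(50, ["buzzer"], V)           # phase 1: buzzer alone predicts food
train(50, ["buzzer", "light"], V)  # phase 2: buzzer + light compound
print(V)  # buzzer ~1.0, light ~0.0: the light gains almost nothing (blocking)
```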

Keyword: Learning & Memory
Link ID: 22701 - Posted: 09.27.2016

By David Z. Hambrick, Fredrik Ullén, Miriam Mosing Elite-level performance can leave us awestruck. This summer, in Rio, Simone Biles appeared to defy gravity in her gymnastics routines, and Michelle Carter seemed to harness superhuman strength to win gold in the shot put. Michael Phelps, meanwhile, collected five gold medals, bringing his career total to 23. In everyday conversation, we say that elite performers like Biles, Carter, and Phelps must be “naturals” who possess a “gift” that “can’t be taught.” What does science say? Is innate talent a myth? This question is the focus of the new book Peak: Secrets from the New Science of Expertise by Florida State University psychologist Anders Ericsson and science writer Robert Pool. Ericsson and Pool argue that, with the exception of height and body size, the idea that we are limited by genetic factors—innate talent—is a pernicious myth. “The belief that one’s abilities are limited by one’s genetically prescribed characteristics … manifests itself in all sorts of ‘I can’t’ or ‘I’m not’ statements,” Ericsson and Pool write. The key to extraordinary performance, they argue, is “thousands and thousands of hours of hard, focused work.” To make their case, Ericsson and Pool review evidence from a wide range of studies demonstrating the effects of training on performance. In one study, Ericsson and his late colleague William Chase found that, through over 230 hours of practice, a college student was able to increase his digit span—the number of random digits he could recall—from a normal 7 to nearly 80. In another study, the Japanese psychologist Ayako Sakakibara enrolled 24 children from a private Tokyo music school in a training program designed to develop “perfect pitch”—the ability to name the pitch of a tone without hearing another tone for reference. With a trainer playing a piano, the children learned to identify chords using colored flags—for example, a red flag for CEG and a green flag for DGH. Then, the children were tested on their ability to identify the pitches of individual notes until they reached a criterion level of proficiency. By the end of the study, the children seemed to have acquired perfect pitch. Based on these findings, Ericsson and Pool conclude that the “clear implication is that perfect pitch, far from being a gift bestowed upon only a lucky few, is an ability that pretty much anyone can develop with the right exposure and training.” © 2016 Scientific American

Keyword: Intelligence; Genes & Behavior
Link ID: 22674 - Posted: 09.21.2016

By DAVID Z. HAMBRICK and ALEXANDER P. BURGOYNE ARE you intelligent — or rational? The question may sound redundant, but in recent years researchers have demonstrated just how distinct those two cognitive attributes actually are. It all started in the early 1970s, when the psychologists Daniel Kahneman and Amos Tversky conducted an influential series of experiments showing that all of us, even highly intelligent people, are prone to irrationality. Across a wide range of scenarios, the experiments revealed, people tend to make decisions based on intuition rather than reason. In one study, Professors Kahneman and Tversky had people read the following personality sketch for a woman named Linda: “Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.” Then they asked the subjects which was more probable: (A) Linda is a bank teller or (B) Linda is a bank teller and is active in the feminist movement. Eighty-five percent of the subjects chose B, even though logically speaking, A is more probable. (All feminist bank tellers are bank tellers, though some bank tellers may not be feminists.) In the Linda problem, we fall prey to the conjunction fallacy — the belief that the co-occurrence of two events is more likely than the occurrence of one of the events. In other cases, we ignore information about the prevalence of events when judging their likelihood. We fail to consider alternative explanations. We evaluate evidence in a manner consistent with our prior beliefs. And so on. Humans, it seems, are fundamentally irrational. But starting in the late 1990s, researchers began to add a significant wrinkle to that view. As the psychologist Keith Stanovich and others observed, even the Kahneman and Tversky data show that some people are highly rational. In other words, there are individual differences in rationality, even if we all face cognitive challenges in being rational. So who are these more rational people? Presumably, the more intelligent people, right? © 2016 The New York Times Company
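Stated formally, the subjects’ error is the conjunction rule itself: the probability of two events occurring together can never exceed the probability of either one alone. A worked restatement, with invented numbers for illustration:

```latex
% Let T = "Linda is a bank teller" and F = "Linda is active in the feminist movement".
\[
P(T \cap F) \;=\; P(T)\,P(F \mid T) \;\le\; P(T),
\qquad \text{since } 0 \le P(F \mid T) \le 1.
\]
% With made-up numbers: if P(T) = 0.05 and P(F | T) = 0.4,
% then P(T and F) = 0.02, strictly less than P(T) = 0.05.
```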

Keyword: Intelligence; Attention
Link ID: 22666 - Posted: 09.19.2016

By Brian Owens It’s certainly something to crow about. New Caledonian crows are known for their ingenious use of tools to get at hard-to-reach food. Now it turns out that their Hawaiian cousins are adept tool-users as well. Christian Rutz at the University of St Andrews in the UK has spent 10 years studying the New Caledonian crow and wondered whether any other crow species are disposed to use tools. So he looked for crows that have similar features to the New Caledonian crow – a straight bill and large, mobile eyes that allow it to manipulate tools, much as archaeologists use opposable thumbs as an evolutionary signature for tool use in early humans. “The Hawaiian crow really stood out,” he says. “They look quite similar.” Hawaiian crows are extinct in the wild, but 109 birds still live in two captive breeding facilities in Hawaii. That meant Rutz was able to test pretty much every member of the species. He stuffed tasty morsels into a variety of holes and crevices in a log, and gave the birds a variety of sticks to see if they would use them to dig out the food. Almost all of them did, and most extracted the food in less than a minute, faster than the researchers themselves could. “It’s mind-blowing,” says Rutz. “They’re very good at getting the tool in the right position, and if they’re not happy with it they’ll modify it or make their own.” © Copyright Reed Business Information Ltd.

Keyword: Intelligence; Learning & Memory
Link ID: 22659 - Posted: 09.15.2016

By Julia Shaw The brain, with its 100 billion neurons, allows us to do amazing things like learn multiple languages or build things that send people into outer space. Yet despite this astonishing capacity, we routinely can’t remember where we put our keys, we forget why we went to the grocery store, and we fail when trying to recall personal life events. This apparent contradiction in functionality opens up the question of why we forget some things but remember others. Or, more fundamentally, what causes forgetting? This week my book ‘The Memory Illusion’ drops in Canada, and as a Canadian girl I want to celebrate this by showcasing some Canadian researchers who have given us insight into precisely this question. An article published recently in Psychological Science by Talya Sadeh and colleagues at the Rotman Research Institute in Toronto addresses a long-running debate in the world of memory science: do we forget things because of decay or interference? Decay. Advocates of the decay account posit that our memories slowly disappear, fading because of a passage of time during which they have not been accessed. You can picture this much like a message written in sand, with every ocean wave that flows over the shore making the writing less legible until it eventually disappears entirely. The sand represents the web of brain cells that form a memory in the brain, and the ocean waves represent time passing. © 2016 Scientific American

Keyword: Learning & Memory
Link ID: 22651 - Posted: 09.13.2016

Laura Sanders By sneakily influencing brain activity, scientists changed people’s opinions of faces. This covert neural sculpting relied on a sophisticated brain training technique in which people learn to direct their thoughts in specific ways. The results, published September 8 in PLOS Biology, support the idea that neurofeedback methods could help reveal how the brain’s behavior gives rise to perceptions and emotions. What’s more, the technique may ultimately prove useful for easing traumatic memories and treating disorders such as depression. The research is still at an early stage, says neurofeedback researcher Michelle Hampson of Yale University, but, she notes, “I think it has great promise.” Takeo Watanabe of Brown University and colleagues used functional MRI to measure people’s brain activity in an area called the cingulate cortex as participants saw pictures of faces. After participants had rated each face, a computer algorithm sorted their brain responses into patterns that corresponded to faces they liked and faces they disliked. With this knowledge in hand, the researchers then attempted to change people’s face preferences by subtly nudging brain activity in the cingulate cortex. In step 2 of the experiment, returning to the fMRI scanner, participants saw an image of a face that they had previously rated as neutral. Just after that, they were shown a disk. The goal, the participants were told, was simple: make the disk bigger by using their brains. They had no idea that the only way to make the disk grow was to think in a very particular way. © Society for Science & the Public 2000–2016
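In outline, the feedback loop the study describes has two stages: fit a decoder to activity patterns recorded while participants rated faces, then map the decoded “like” probability onto the size of the disk. The sketch below is an assumption-laden illustration (the logistic-regression decoder, data shapes, and function names are all invented here), not the authors’ actual pipeline:

```python
# Schematic decoded-neurofeedback loop (illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stage 1 (hypothetical data): cingulate-cortex patterns recorded while
# participants rated faces; label 1 = liked, 0 = disliked.
X_train = rng.normal(size=(200, 500))   # 200 trials x 500 voxels
y_train = rng.integers(0, 2, size=200)  # like/dislike labels

decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)

def disk_size(pattern):
    """Return the decoded probability of the 'liked' state as feedback.

    Participants only see the disk grow or shrink; they are never told
    that thinking in a particular way is what drives it.
    """
    return decoder.predict_proba(pattern.reshape(1, -1))[0, 1]

# Stage 2: on each neurofeedback trial, decode the current pattern
# and redraw the disk with a radius proportional to the output.
new_pattern = rng.normal(size=500)
print(f"disk size this trial: {disk_size(new_pattern):.2f}")
```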

Keyword: Attention; Learning & Memory
Link ID: 22646 - Posted: 09.12.2016

By GRETCHEN REYNOLDS A busy brain can mean a hungry body. We often seek food after focused mental activity, like preparing for an exam or poring over spreadsheets. Researchers speculate that heavy bouts of thinking drain energy from the brain, whose capacity to store fuel is very limited. So the brain, sensing that it may soon require more calories to keep going, apparently stimulates bodily hunger, and even though there has been little in the way of physical movement or caloric expenditure, we eat. This process may partly account for the weight gain so commonly seen in college students. Scientists at the University of Alabama at Birmingham and another institution recently experimented with exercise to counter such post-study food binges. Gary Hunter, an exercise physiologist at U.A.B., oversaw the study, which was published this month in the journal Medicine & Science in Sports & Exercise. Hunter notes that strenuous activity both increases the amount of blood sugar and lactate — a byproduct of intense muscle contractions — circulating in the blood and augments blood flow to the head. Because the brain uses sugar and lactate as fuel, researchers wondered if the increased flow of fuel-rich blood during exercise could feed an exhausted brain and reduce the urge to overeat. Thirty-eight healthy college students were invited to U.A.B.’s exercise lab to determine their fitness and metabolic rates — and to report what their favorite pizza was. Afterward, they sat quietly for 35 minutes before being given as much of their favorite pizza as they wanted, which established a baseline measure of self-indulgence. At a later date, the volunteers returned and spent 20 minutes tackling selections from college and graduate-school entrance exams. Hunter says this work has been used in other studies “to induce mental fatigue and hunger.” Next, half the students sat quietly for 15 minutes, before being given pizza. The rest of the volunteers spent those 15 minutes doing intervals on a treadmill: two minutes of hard running followed by about one minute of walking, repeated five times. This is the sort of brief but intensive routine, Hunter says, that should prompt the release of sugar and lactate into the bloodstream. These students were then allowed to gorge on pizza, too. But by and large, they did not overeat. © 2016 The New York Times Company

Keyword: Obesity; Learning & Memory
Link ID: 22643 - Posted: 09.10.2016

By Karen Zusi At least one type of social learning, or the ability to learn from observing others’ actions, is processed by individual neurons within a region of the human brain called the rostral anterior cingulate cortex (rACC), according to a study published today (September 6) in Nature Communications. The work is the first direct analysis in humans of the neuronal activity that encodes information about others’ behavior. “The idea [is] that there could be an area that’s specialized for processing things about other people,” says Matthew Apps, a neuroscientist at the University of Oxford who was not involved with the study. “How we think about other people might use distinct processes from how we might think about ourselves.” During the social learning experiments, the research team, based at the University of California, Los Angeles (UCLA), and at Caltech, recorded the activity of individual neurons in the brains of epilepsy patients. The patients were undergoing a weeks-long procedure at the Ronald Reagan UCLA Medical Center in which their brains were implanted with electrodes to locate the origin of their epileptic seizures. Access to this patient population was key to the study. “It’s a very rare dataset,” says Apps. “It really does add a lot to the story.” With data streaming out of the patients’ brains, the researchers taught the subjects to play a card game on a laptop. Each turn, the patients could select from one of two decks of face-down cards: the cards either gave $10 or $100 in virtual winnings, or subtracted $10 or $100. In one deck, 70 percent of the cards were winning cards, while in the other only 30 percent were. The goal was to rack up the most money. © 1986-2016 The Scientist
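In expected-value terms, and assuming for illustration that the $10 and $100 outcomes are equally likely within each deck (the article does not give the exact card mix), the two decks differ sharply:

```python
# Expected value per draw for each deck (illustrative assumption: within the
# winning and losing categories, the $10 and $100 cards are equally likely).
def deck_ev(p_win):
    win_mean = (10 + 100) / 2    # average winning card: +$55
    loss_mean = -(10 + 100) / 2  # average losing card: -$55
    return p_win * win_mean + (1 - p_win) * loss_mean

print(deck_ev(0.7))  # "good" deck: +$22 per draw on average
print(deck_ev(0.3))  # "bad" deck:  -$22 per draw on average
```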

Keyword: Learning & Memory; Attention
Link ID: 22640 - Posted: 09.10.2016

By Amy Ellis Nutt Before iPhones and thumb drives, before Google Docs and gigabytes of RAM, memory was more art than artifact. It wasn’t a tool or a byproduct of being human. It was essential to our character and therefore a powerful theme in both myth and literature. At the end of Book 2 of the “Divine Comedy,” with Paradise nearly in reach, Dante is dipped into the River Lethe, where the sins of the self are washed away in the waters of forgetfulness. To be truly cleansed of his memories, however, Dante must also drink from the river of oblivion. Only then will he be truly purified and the memories of his good deeds restored to him. Before we can truly remember, according to Dante, we must forget. In “Patient H.M.: A Story of Memory, Madness, and Family Secrets,” author Luke Dittrich seems to be saying that before we can forgive, we must remember. The terrible irony is that H.M., the real-life character around whom Dittrich’s book revolves, had no memory at all. In prose both elegant and intimate, and often thrilling, “Patient H.M.” is an important book about the wages not of sin but of science. It is deeply reported and surprisingly emotional, at times poignant, at others shocking. H.M., arguably the single most important research subject in the history of neuroscience, was once Henry Molaison, an ordinary New England boy. When Henry was 9 years old, he was hit by a bicyclist as he walked across the street in his home town, Hartford, Conn. © 1996-2016 The Washington Post

Keyword: Learning & Memory
Link ID: 22604 - Posted: 08.27.2016

Ian Sample Science editor For Jules Verne it was the friend who keeps us waiting. For Edgar Allan Poe so many little slices of death. But though the reason we spend a third of our lives asleep has so far resisted scientific explanation, research into the impact of sleepless nights on brain function has shed fresh light on the mystery - and also offered intriguing clues to potential treatments for depression. In a study published on Tuesday, researchers show for the first time that sleep resets the steady build-up of connectivity in the human brain which takes place in our waking hours. The process appears to be crucial for our brains to remember and learn so we can adapt to the world around us. The loss of a single night’s sleep was enough to block the brain’s natural reset mechanism, the scientists found. Deprived of rest, the brain’s neurons seemingly became over-connected and so muddled with electrical activity that new memories could not be properly laid down. But Christoph Nissen, a psychiatrist who led the study at the University of Freiburg, is also excited about the potential for helping people with mental health disorders. One radical treatment for major depression is therapeutic sleep deprivation, which Nissen believes works through changing the patient’s brain connectivity. The new research offers a deeper understanding of the phenomenon which could be adapted to produce more practical treatments. © 2016 Guardian News and Media Limited

Keyword: Sleep; Learning & Memory
Link ID: 22593 - Posted: 08.24.2016

By Emily Underwood In 2010, neurobiologist Beth Stevens had completed a remarkable rise from laboratory technician to star researcher. Then 40, she was in her second year as a principal investigator at Boston Children’s Hospital with a joint faculty position at Harvard Medical School. She had a sleek, newly built lab and a team of eager postdoctoral investigators. Her credentials were impeccable, with high-profile collaborators and her name on an impressive number of papers in well-respected journals. But like many young researchers, Stevens feared she was on the brink of scientific failure. Rather than choosing a small, manageable project, she had set her sights on tackling an ambitious, unifying hypothesis linking the brain and the immune system to explain both normal brain development and disease. Although the preliminary data she’d gathered as a postdoc at Stanford University in Palo Alto, California, were promising, their implications were still murky. “I thought, ‘What if my model is just a model, and I let all these people down?’” she says. Stevens, along with her mentor at Stanford, Ben Barres, had proposed that brain cells called microglia prune neuronal connections during embryonic and later development in response to a signal from a branch of the immune system known as the classical complement pathway. If a glitch in the complement system causes microglia to prune too many or too few connections, called synapses, they’d hypothesized, it could lead to both developmental and degenerative disorders. © 2016 American Association for the Advancement of Science.

Keyword: Development of the Brain; Glia
Link ID: 22576 - Posted: 08.20.2016

By Jessica Hamzelou Feel like you’ve read this before? Most of us have experienced the eerie familiarity of déjà vu, and now the first brain scans of this phenomenon have revealed why – it’s a sign of our brain checking its memory. Déjà vu was thought to be caused by the brain making false memories, but research by Akira O’Connor at the University of St Andrews, UK, and his team now suggests this is wrong. Exactly how déjà vu works has long been a mystery, partly because its fleeting and unpredictable nature makes it difficult to study. To get around this, O’Connor and his colleagues developed a way to trigger the sensation of déjà vu in the lab. The team’s technique uses a standard method to trigger false memories. It involves telling a person a list of related words – such as bed, pillow, night, dream – but not the key word linking them together, in this case, sleep. When the person is later quizzed on the words they’ve heard, they tend to believe they have also heard “sleep” – a false memory. To create the feeling of déjà vu, O’Connor’s team first asked people if they had heard any words beginning with the letter “s”. The volunteers replied that they hadn’t. This meant that when they were later asked if they had heard the word sleep, they were able to remember that they couldn’t have, but at the same time, the word felt familiar. “They report having this strange experience of déjà vu,” says O’Connor. © Copyright Reed Business Information Ltd.

Keyword: Attention; Learning & Memory
Link ID: 22565 - Posted: 08.17.2016

By Anna Azvolinsky Sets of neurons in the brain that behave together—firing synchronously in response to sensory or motor stimuli—are thought to be functionally and physiologically connected. These naturally occurring ensembles of neurons are one of the ways memories may be programmed in the brain. Now, in a paper published today (August 11) in Science, researchers at Columbia University and their colleagues show that it is possible to stimulate visual cortex neurons in living, awake mice and induce a new ensemble of neurons that behave as a group and maintain their concerted firing for several days. “This work takes the concept of correlated [neuronal] firing patterns in a new and important causal direction,” David Kleinfeld, a neurophysicist at the University of California, San Diego, who was not involved in the work, told The Scientist. “In a sense, [the researchers] created a memory for a visual feature that does not exist in the physical world as a proof of principle of how real visual memories are formed.” “Researchers have previously related optogenetic stimulation to behavior [in animals], but this study breaks new ground by investigating the dynamics of neural activity in relation to the ensemble to which these neurons belong,” said Sebastian Seung, a computational neuroscientist at the Princeton Neuroscience Institute in New Jersey who also was not involved in the study. Columbia’s Rafael Yuste and colleagues stimulated randomly selected sets of individual neurons in the visual cortices of living mice using two-photon stimulation while the animals ran on a treadmill. © 1986-2016 The Scientist

Keyword: Learning & Memory; Vision
Link ID: 22558 - Posted: 08.13.2016

Ed Yong At the age of seven, Henry Gustav Molaison was involved in an accident that left him with severe epilepsy. Twenty years later, a surgeon named William Scoville tried to cure him by removing parts of his brain. It worked, but the procedure left Molaison unable to make new long-term memories. Everyone he met, every conversation he had, everything that happened to him would just evaporate from his mind. These problems revolutionized our understanding of how memory works, and transformed Molaison into “Patient H.M.”—arguably the most famous and studied patient in the history of neuroscience. That’s the familiar version of the story, but the one presented in Luke Dittrich’s new book Patient H.M.: A Story of Memory, Madness, and Family Secrets is deeper and darker. As revealed through Dittrich’s extensive reporting and poetic prose, Molaison’s tale is one of ethical dilemmas that not only influenced his famous surgery but persisted well beyond his death in 2008. It’s a story about more than just the life of one man or the root of memory; it’s also about how far people are willing to go for scientific advancement, and the human cost of that progress. And Dittrich is uniquely placed to consider these issues. Scoville was his grandfather. Suzanne Corkin, the scientist who worked with Molaison most extensively after his surgery, was an old friend of his mother’s. I spoke to him about the book and the challenges of reporting a story that he was so deeply entwined in. Most of this interview was conducted on July 19th. Following a New York Times excerpt published on August 7th, and the book’s release two weeks later, many neuroscientists have expressed “outrage” at Dittrich’s portrayal of Corkin. The controversy culminated in a statement from MIT, where Corkin was based, rebutting three allegations in the book. Dittrich has himself responded to the rebuttals, and at the end of this interview, I talk to him about the debate. © 2016 by The Atlantic Monthly Group.

Keyword: Learning & Memory
Link ID: 22552 - Posted: 08.13.2016

Like many students of neuroscience, I first learned of patient HM in a college lecture. His case was so strange yet so illuminating, and I was immediately transfixed. HM was unable to form new memories, my professor explained, because a surgeon had removed a specific part of his brain. The surgery froze him in time. HM—or Henry Molaison, as his name was revealed to be after his death in 2008—might be the most famous patient in the history of brain research. He is now the subject of the new book, Patient HM: A Story of Memory, Madness, and Family Secrets. An excerpt from the book in the New York Times Magazine, which details MIT neuroscientist Sue Corkin’s custody fight over HM’s brain after his death, has since sparked a backlash. Should you wish to go down that particular rabbit hole, you can read MIT’s response, the book author’s response to the response, and summaries of the back and forth. Why HM’s brain was worth fighting over should be obvious; he was probably the most studied individual in neuroscience while alive. But in the seven years since scientists sectioned HM’s brain into 2,401 slices, it has yielded surprisingly little research. Only two papers examining his brain have come out, and so far, physical examinations have led to no major insights. HM’s scientific potential remains unfulfilled—thanks to delays from the custody fight and the limitations of current neuroscience itself. Corkin, who made her career studying HM, confronted her complicated emotions about his death in her own 2013 book. She describes being “ecstatic to see his brain removed expertly from his skull.” Corkin passed away earlier this year.

Keyword: Learning & Memory
Link ID: 22551 - Posted: 08.13.2016

Tim Radford Eight paraplegics – some of them paralysed for more than a decade by severe spinal cord injury – have been able to move their legs and feel sensation, after help from an artificial exoskeleton, sessions using virtual reality (VR) technology and a non-invasive system that links the brain with a computer. In effect, after just 10 months of what their Brazilian medical team call “brain training” they have been able to make a conscious decision to move and then get a response from muscles that have not been used for a decade. Of the octet, one has been able to leave her house and drive a car. Another has conceived and delivered a child, feeling the contractions as she did so. The extent of the improvements was unexpected. The scientists had intended to exploit advanced computing and robotic technology to help paraplegics recover a sense of control in their lives. But their patients recovered some feeling and direct command as well. The implication is that even apparently complete spinal cord injury might leave some connected nerve tissue that could be reawakened after years of inaction. The patients responded unevenly, but all have reported partial restoration of muscle movement or skin sensation. Some have even recovered visceral function and are now able to tell when they need the lavatory. And although none of them can walk unaided, one woman has been able to make walking movements with her legs, while suspended in a harness, and generate enough force to make a robot exoskeleton move. © 2016 Guardian News and Media Limited

Keyword: Regeneration; Movement Disorders
Link ID: 22549 - Posted: 08.12.2016

By Sharon Begley. The Massachusetts Institute of Technology brain sciences department and, separately, a group of some 200 neuroscientists from around the world have written letters to The New York Times claiming that a book excerpt in the newspaper’s Sunday magazine this week contains important errors, misinterpretations of scientific disputes, and unfair characterizations of an MIT neuroscientist who did groundbreaking research on human memory. In particular, the excerpt contains a 36-volley verbatim exchange between author Luke Dittrich and MIT’s Suzanne Corkin in which she says that key documents from historic experiments were “shredded.” “Most of it has gone, is in the trash, was shredded,” Corkin is quoted as telling Dittrich before she died in May, explaining, “there’s no place to preserve it.” Destroying files related to historic scientific research would raise eyebrows, but Corkin’s colleagues say it never happened. “We believe that no records were destroyed and, to the contrary, that professor Corkin worked in her final days to organize and preserve all records,” said the letter that Dr. James DiCarlo, head of the MIT Department of Brain and Cognitive Sciences, sent to the Times late Tuesday. Even as Corkin fought advanced liver cancer, he wrote, “she instructed her assistant to continue to organize, label, and maintain all records” related to the research, and “the records currently remain within our department.” © 2016 Scientific American

Keyword: Learning & Memory
Link ID: 22546 - Posted: 08.11.2016

By Virginia Morell Fourteen years ago, a bird named Betty stunned scientists with her humanlike ability to invent and use tools. Captured from the wild and shown a tiny basket of meat trapped in a plastic tube, the New Caledonian crow bent a straight piece of wire into a hook and retrieved the food. Researchers hailed the observation as evidence that these crows could invent new tools on the fly—a sign of complex, abstract thought that became regarded as one of the best demonstrations of this ability in an animal other than a human. But a new study casts doubt on at least some of Betty’s supposed intuition. Scientists have long agreed that New Caledonian crows (Corvus moneduloides), which are found only on the South Pacific island of the same name, are accomplished toolmakers. At the time of Betty’s feat, researchers knew that in the wild these crows could shape either stiff or flexible twigs into tools with a tiny, barblike hook at one end, which they used to lever grubs from rotting logs. They also make rakelike tools from the leaves of the screw pine (Pandanus) tree. But Betty appeared to take things to the next level. Not only did she fashion a hook from a material she’d never previously encountered—a behavior not observed in the wild—she seemed to know she needed this specific shape to solve her particular puzzle. © 2016 American Association for the Advancement of Science.

Keyword: Intelligence; Evolution
Link ID: 22538 - Posted: 08.10.2016

BENEDICT CAREY As a boy growing up in Massachusetts, Luke Dittrich revered his grandfather, a brain surgeon whose home was full of exotic instruments. Later, he learned that his grandfather was not only a prominent doctor but had also played a significant role in modern medical history. In 1953, at Hartford Hospital, Dr. William Scoville had removed two slivers of tissue from the brain of a 27-year-old man with severe epilepsy. The operation relieved his seizures but left the patient — Henry Molaison, a motor repairman — unable to form new memories. Known as H. M. to protect his privacy, Mr. Molaison went on to become the most famous patient in the history of neuroscience, participating in hundreds of experiments that have helped researchers understand how the brain registers and stores new experiences. By the time Mr. Dittrich was out of college — and after a year and a half in Egypt, teaching English — he had become fascinated with H. M., brain science and his grandfather’s work. He set out to write a book about the famous case but discovered something unexpected along the way. His grandfather was one of a cadre of top surgeons who had performed lobotomies and other “psycho-surgeries” on thousands of people with mental problems. This was not a story about a single operation that went wrong; it was far larger. The resulting book — “Patient H. M.: A Story of Memory, Madness, and Family Secrets,” to be published Tuesday — describes a dark era of American medicine through a historical, and deeply personal, lens. Why should scientists and the public know this particular story in more detail? The textbook story of Patient H. M. — the story I grew up with — presents the operation my grandfather performed on Henry as a sort of one-off mistake. It was not. Instead, it was the culmination of a long period of human experimentation that my grandfather and other leading doctors and researchers had been conducting in hospitals and asylums around the country. © 2016 The New York Times Company

Keyword: Learning & Memory
Link ID: 22531 - Posted: 08.09.2016

By Julia Shaw Every memory you have ever had is chock-full of errors. I would even go as far as saying that memory is largely an illusion. This is because our perception of the world is deeply imperfect, our brains only bother to remember a tiny piece of what we actually experience, and every time we remember something we have the potential to change the memory we are accessing. I often write about the ways in which our memory leads us astray, with a particular focus on ‘false memories.’ False memories are recollections that feel real but are not based on actual experience. For this particular article I invited a few top memory researchers to comment on what they wish everyone knew about their field. First up, we have Elizabeth Loftus from the University of California, Irvine, who is one of the founders of the area of false memory research, and is considered one of the most ‘eminent psychologists of the 20th century.’ Elizabeth Loftus says you need independent evidence to corroborate your memories. According to Loftus: “The one take home message that I have tried to convey in my writings, and classes, and in my TED talk is this: Just because someone tells you something with a lot of confidence and detail and emotion, it doesn't mean it actually happened. You need independent corroboration to know whether you're dealing with an authentic memory, or something that is a product of some other process.” Next up, we have memory scientist Annelies Vredeveldt from the Vrije Universiteit Amsterdam, who has done fascinating work on how well we remember when we recall things with other people. © 2016 Scientific American.

Keyword: Learning & Memory
Link ID: 22530 - Posted: 08.09.2016