Chapter 17. Learning and Memory
By Ann Gibbons We may not be raring to go on a Monday morning, but humans are the Energizer Bunnies of the primate world. That’s the conclusion of a new study that, for the first time, measures precisely how many calories humans and apes burn each day. Compared with chimpanzees and other apes, our revved-up internal engines burn calories 27% faster, according to a paper in Nature this week. This higher metabolic rate equips us to quickly fuel energy-hungry brain cells, sustaining our bigger brains. And lest we run out of gas when food is short, the study also found that humans are fatter than other primates, giving us energy stores to draw on in lean times. “The brilliant thing here is showing for the first time that we do have a higher metabolic rate, and we do use more energy,” says paleoanthropologist Leslie Aiello, president of the Wenner-Gren Foundation for Anthropological Research in New York City. “Humans during evolution have become more and more hypermetabolic,” says biological anthropologist Carel van Schaik of the University of Zurich in Switzerland. “We turned up the thermostat.” For decades, researchers assumed that “there weren’t any differences in the rate at which different species burned calories,” says biological anthropologist Herman Pontzer of Hunter College in New York City, lead author of the new study. Comparing humans and other primates, they saw little difference in basal metabolic rate, which reflects the total calories used by our organs while we are at rest. © 2016 American Association for the Advancement of Science
By Jessica Lahey Before she became a neuroscientist, Mary Helen Immordino-Yang was a seventh-grade science teacher at a school outside Boston. One year, during a period of significant racial and ethnic tension at the school, she struggled to engage her students in a unit on human evolution. After days of apathy and outright resistance to Ms. Immordino-Yang’s teaching, a student finally asked the question that altered her teaching — and her career path — forever: “Why are early hominids always shown with dark skin?” With that question, one that connected the abstract concepts of human evolution and the very concrete, personal experiences of racial tension in the school, her students’ resistance gave way to interest. As she explained the connection between the effects of equatorial sunlight, melanin and skin color and went on to explain how evolutionary change and geography result in various human characteristics, interest blossomed into engagement, and something magical happened: Her students began to learn. Dr. Immordino-Yang’s eyes light up as she recounts this story in her office at the Brain and Creativity Institute at the University of Southern California. Now an associate professor of education, psychology and neuroscience, she understands the reason behind her students’ shift from apathy to engagement and, finally, to deep, meaningful learning. Her students learned because they became emotionally engaged in material that had personal relevance to them. Emotion is essential to learning, Dr. Immordino-Yang said, and should not be underestimated or misunderstood as a trend, or as merely the “E” in “SEL,” or social-emotional learning. Emotion is where learning begins, or, as is often the case, where it ends. Put simply, “It is literally neurobiologically impossible to think deeply about things that you don’t care about,” she said. © 2016 The New York Times Company
By Jennifer Jolly Every January for the past decade, Jessica Irish of Saline, Mich., has made the same New Year’s resolution: to “cut out late night snacking and lose 30 pounds.” Like millions of Americans, Ms. Irish, 31, usually makes it about two weeks. But this year is different. “I’ve already lost 18 pounds,” she said, “and maintained my diet more consistently than ever. Even more amazing — I rarely even think about snacking at night anymore.” Ms. Irish credits a new wearable device called Pavlok for doing what years of diets, weight-loss programs, expensive gyms and her own willpower could not. Whenever she takes a bite of the foods she wants to avoid, like chocolate or Cheez-Its, she uses the Pavlok to give herself a lightning-quick electric shock. “Every time I took a bite, I zapped myself,” she said. “I did it five times on the first night, two times on the second night, and by the third day I didn’t have any cravings anymore.” As the name suggests, the $199 Pavlok, worn on the wrist, uses the classic theory of Pavlovian conditioning to create a negative association with a specific action. Next time you smoke, bite your nails or eat junk food, one tap of the device or a smartphone app will deliver a shock. The zap lasts only a fraction of a second, though the severity of the shock is up to you. It can be set between 50 volts, which feels like a strong vibration, and 450 volts, which feels like getting stung by a bee with a stinger the size of an ice pick. (By comparison, a police Taser typically releases about 50,000 volts.) Other gadgets and apps dabble in behavioral change by way of aversion therapy, such as the $49 MotivAider that is worn like a pager, or the $99 RE-vibe wristband. Both can be set to vibrate at specific intervals as a reminder of a habit to break or a goal to reach. The $80 Lumo Lift posture coach is a wearable disk that vibrates when you slouch.
The $150 Spire clip-on sensor tracks physical activity and state of mind by detecting users’ breathing patterns. If it detects you’re stressed or anxious, it vibrates or sends a notification to your smartphone to take a deep breath. © 2016 The New York Times Company
Keyword: Learning & Memory
Link ID: 22171 - Posted: 05.03.2016
By Julia Shaw In the last couple of years memory science has really upped its game. I generally write about social processes that can change our memories, but right now I can’t help but get excited that memory science is getting an incredible new toy to play with. A toy that I believe will revolutionize how we talk about, and deal with, memory. This not-so-new sounding, but totally-newly-applied, neuroscience toy is ultrasound. Ultrasound is also called sonography and is essentially a type of ‘medical sonar’. It has revolutionized medicine since the 1940s, giving us the ability to look into the body in a completely safe way (without leaving icky radiation behind, like X-rays). Beyond predicting whether your baby shower will be blue or pink, lesser known applications of ultrasound include the ability to essentially burn and destroy cells inside your body. As such, it has been successfully used to do surgery without making any cuts into the human body. This is a technique that has been used to remove cancerous cells while not affecting any of the surrounding tissue, and without any of the side-effects associated with other kinds of cancer treatment. This is referred to by scientist Yoav Medan as focused ultrasound. If you are unfamiliar with this, you need to watch this TED talk. Non-invasive procedures like this are the future of surgery. Non-invasive procedures are also the future of neuroscience. It is at this point that we find ourselves at the application of this astonishing science to memory research. © 2016 Scientific American
by Laura Sanders Some researchers believe that when memories are called to mind, they enter a fragile, wobbly state during which they are vulnerable to being weakened or changed. One way to erode old memories is to learn something new just after recalling the older memory, scientists reported in 2003 (SN: 10/11/2003, p. 228). But that result itself is wobbly, scientists report April 25 in the Proceedings of the National Academy of Sciences. In an attempt to replicate the original finding, experimental psychologist Tom Hardwicke of University College London and colleagues didn’t see any memory alterations in people who learned a new sequence of finger taps shortly after recalling an old sequence. Nor did the researchers turn up signs of this memory interference in other tests. The new study focused specifically on new learning, but the findings cast suspicion on the legitimacy of other ways to interfere with people’s memories, Hardwicke says. Approaches such as brain stimulation or drugs might also be flawed, the researchers argue. © Society for Science & the Public 2000 - 2016
Keyword: Learning & Memory
Link ID: 22141 - Posted: 04.26.2016
Eleanor Ainge Roy in Dunedin An octopus has made a brazen escape from the national aquarium in New Zealand by breaking out of its tank, slithering down a 50-metre drainpipe and disappearing into the sea. In scenes reminiscent of Finding Nemo, Inky – a common New Zealand octopus – made his dash for freedom after the lid of his tank was accidentally left slightly ajar. Staff believe that in the middle of the night, while the aquarium was deserted, Inky clambered to the top of his cage, down the side of the tank and travelled across the floor of the aquarium. Rob Yarrell, national manager of the National Aquarium of New Zealand in Napier, said: “Octopuses are famous escape artists. “But Inky really tested the waters here. I don’t think he was unhappy with us, or lonely, as octopus are solitary creatures. But he is such a curious boy. He would want to know what’s happening on the outside. That’s just his personality.” One theory is that Inky slid across the aquarium floor – a journey of three or four metres – and then, sensing freedom was at hand, into a drainpipe that led directly to the sea. The drainpipe was 50 metres long, and opened on to the waters of Hawke’s Bay, on the east coast of New Zealand’s North Island. Another possible escape route could have involved Inky squeezing into an open pipe at the top of his tank, which led under the floor to the drain. © 2016 Guardian News and Media Limited
By Gareth Cook What are the most intelligent creatures on the planet? Humans come first. (Though there are days when we have to wonder.) After Homo sapiens, most people might answer chimpanzees, and then maybe dogs and dolphins. But what of birds? The science writer Jennifer Ackerman offers a lyrical testimony to the wonders of avian intelligence in her new book, “The Genius of Birds.” There have long been hints of bird smarts, but it’s become an active field of scientific inquiry, and Ackerman serves as tour guide. She answered questions from Mind Matters editor Gareth Cook. What drew you to birds? I’ve watched birds for most of my life. I admire all the usual things about them. Their plumage and song. Their intense way of living. Their flight. I also admire their resourcefulness and pluck. I’ve always been intrigued by their apparently smart behavior, whether learned or innate. I grew up in Washington, D.C. — the second youngest in a gaggle of five girls. My parents had precious little time for one-on-one. Especially my dad, who had a demanding government job. So when he asked me if I wanted to go birdwatching with him one spring morning when I was seven or eight, I jumped at the chance. It was magical, going out in the dark woods along the C&O canal and listening for bird song. My father had learned his calls and songs in Boy Scout camp from an expert, an elderly Greek man named Apollo, so he was pretty good at identifying birds, even the shy woodland species. Eventually he gave me my own copy of Peterson’s Field Guide, along with a small pair of binoculars. I’ve loved birds ever since. My first run-in with a clever bird was on our dining room table. We had a pet parakeet, a budgerigar named Gre-Gre, who was allowed to fly around the dining room and perch on our heads or shoulders. He had a kind of social genius. He made you love him. But at breakfast, it was impossible to eat your cereal without his constant harassment.
He liked to perch on the edge of my bowl and peck at the cereal, flapping his wings frantically to keep his balance, splashing my milk. I’d build a barricade of boxes around my place setting, but he always found a way in, moving a box or popping over the top. He was a good problem-solver. © 2016 Scientific American
Sam Doernberg and Joe DiPietro It’s the first day of class, and we—a couple of instructors from Cornell—sit around a table with a few of our students as the rest trickle in. Anderson, one of the students seated across from us, smiles and says, “I’m going to get an A+ in your class.” “No,” VanAntwerp retorts, “I’m getting the A+.” You might think that this scene is typical of classes at a school like Cornell University, where driven students compete for top marks. But this didn’t happen on a college campus: It took place in a maximum-security prison. To the outside world, they are inmates, but in the classroom, they are students enrolled in the Cornell Prison Education Program, or “CPEP.” Per New York State Department of Corrections rules, we have permission to use the inmates’ last names only—which is also often how we know them best. Those who graduate from the program—taught by Cornell instructors—will receive an associate’s degree from Cayuga Community College. Before teaching neuroscience to prison inmates, we taught it to Cornell undergraduates as part of the teaching staff for Cornell’s Introduction to Neuroscience course. Most Cornell neuroscience students are high-achieving biology majors and premeds, who are well prepared to succeed in a demanding course. They generally have gone from one academic success to another, and it is no secret that they expect a similar level of success in a neuroscience class. © 2016 by The Atlantic Monthly Group
Keyword: Learning & Memory
Link ID: 22093 - Posted: 04.12.2016
By Melinda Wenner Moyer What if you could pop a pill that made you smarter? It sounds like a Hollywood movie plot, but a new systematic review suggests that the decades-long search for a safe and effective “smart drug” might have notched its first success. Researchers have found that modafinil boosts higher-order cognitive function without causing serious side effects. Modafinil, which has been prescribed in the U.S. since 1998 to treat sleep-related conditions such as narcolepsy and sleep apnea, heightens alertness much as caffeine does. A number of studies have suggested that it could provide other cognitive benefits, but results were uneven. To clear up the confusion, researchers then at the University of Oxford analyzed 24 studies published between 1990 and 2014 that specifically looked at how modafinil affects cognition. In their review, which was published last year in European Neuropsychopharmacology, they found that the methods used to evaluate modafinil strongly affected the outcomes. Research that looked at the drug's effects on the performance of simple tasks—such as pressing a particular button after seeing a certain color—did not detect many benefits. Yet studies that asked participants to do complex and difficult tasks after taking modafinil or a placebo found that those who took the drug were more accurate, which suggests that it may affect “higher cognitive functions—mainly executive functions but also attention and learning,” explains study co-author Ruairidh Battleday, now a medical doctor and Ph.D. student at the University of California, Berkeley. But don't run to the pharmacy just yet. Although many doctors very likely prescribe the drug off-label to help people concentrate—indeed, a 2008 survey by the journal Nature found that one in five of its readers had taken brain-boosting drugs, and half those people had used modafinil—trials have not yet been done on modafinil's long-term effectiveness or safety. © 2016 Scientific American
Laura Sanders NEW YORK — Cells in a brain structure known as the hippocampus are known to be cartographers, drawing mental maps of physical space. But new studies show that this seahorse-shaped hook of neural tissue can also keep track of social space, auditory space and even time, deftly mapping these various types of information into their proper places. Neuroscientist Rita Tavares described details of one of these new maps April 2 at the annual meeting of the Cognitive Neuroscience Society. Brain scans had previously revealed that activity in the hippocampus was linked to movement through social space. In an experiment reported last year in Neuron, people went on a virtual quest to find a house and job by interacting with a cast of characters. Through these social interactions, the participants formed opinions about how much power each character held, and how kindly they felt toward him or her. These judgments put each character in a position on a “social space” map. Activity in the hippocampus was related to this social mapmaking, Tavares and colleagues found. It turns out that this social map depends on the traits of the person who is drawing it, says Tavares, of Icahn School of Medicine at Mount Sinai in New York City. People with more social anxiety tended to give more power to characters they interacted with. What’s more, these people's social space maps were smaller overall, suggesting that they explored social space less, Tavares says. Tying these behavioral traits to the hippocampus may lead to a greater understanding of social behavior — and how this social mapping may go awry in psychiatric conditions, Tavares said. © Society for Science & the Public 2000 - 2016.
Keyword: Learning & Memory
Link ID: 22076 - Posted: 04.06.2016
by Daniel Galef Footage from a revolutionary behavioural experiment showed non-primates making and using tools just like humans. In the video, a crow is trying to get food out of a narrow vessel, but its beak is too short for it to reach through the container. Nearby, the researchers placed a straight wire, which the crow bent against a nearby surface into a hook. Then, holding the hook in its beak, it fished the food from the bottle. Corvids—the family of birds that includes crows, ravens, rooks, jackdaws, and jays—are pretty smart overall. Although not to the level of parrots and cockatoos, ravens can also mimic human speech. They also have a highly developed system of communication and are believed to be among the most intelligent non-primate animals in existence. McGill Professor Andrew Reisner recalls meeting a graduate student studying corvid intelligence at Oxford University when these results were first published in 2015. “I had read early in the year that some crows had been observed making tools, and I mentioned this to him,” Reisner explained. “He said that he knew about that, as it had been he who had first observed it happening. Evidently the graduate students took turns watching the ‘bird box,’ […] and the tool making first occurred there on his shift.”
Laura Sanders NEW YORK — Sometimes forgetting can be harder than remembering. When people forced themselves to forget a recently seen image, select brain activity was higher than when they tried to remember that image. Forgetting is often a passive process, one in which the memory slips out of the brain, Tracy Wang of the University of Texas at Austin said April 2 at the annual meeting of the Cognitive Neuroscience Society. But in some cases, forgetting can be deliberate. Twenty adults saw images of faces, scenes and objects while an fMRI scanner recorded their brains’ reactions to the images. If instructed to forget the preceding image, people were less likely to remember that image later. Researchers used the scan data to build a computer model that could infer how strongly the brain responds to each particular kind of image. In the ventral temporal cortex, a part of the brain above the ear, brain patterns elicited by a particular image were stronger when a participant was told to forget the sight than when instructed to remember it. Of course, everyone knows that it’s easy to forget something without even trying. But these results show that intentional forgetting isn’t a passive process — the brain has to actively work to wipe out a memory on purpose. Citation: T.H. Wang et al., “Forgetting is more work than remembering,” annual meeting of the Cognitive Neuroscience Society, New York City, April 2, 2016. © Society for Science & the Public 2000 - 2016
Keyword: Learning & Memory
Link ID: 22068 - Posted: 04.05.2016
The mystery is starting to unravel. It has long been known that twisted fibres of a protein called tau collect in the brain cells of people with Alzheimer’s, but their exact role in the disease is unclear. Now a study in mice has shown how tau interferes with the strengthening of connections between neurons – the key mechanism by which we form memories. In healthy cells, the tau protein helps to stabilise microtubules that act as rails for transporting materials around the cell. In people with Alzheimer’s, these proteins become toxic, but an important unanswered question is what forms of tau are toxic: the tangles may not be the whole story. In the new study, Li Gan and her colleagues at the Gladstone Institute of Neurological Disease in San Francisco found that the brains of those with Alzheimer’s have high levels of tau with a particular modification, called acetylated tau. They then looked at what acetylated tau does in a mouse model of Alzheimer’s, finding that it accumulates at synapses – the connections between neurons. When we form memories, synapses become strengthened through extra receptors inserted into the cell membranes, and this heightens their response. But acetylated tau depletes another protein called KIBRA, which is essential for this synapse-strengthening mechanism. “We’re excited because we think we now have a handle on the link between tau and memory,” says Gan. “We’re also cautious because we know this may not be the only link. It’s still early days in understanding the mechanism.” © Copyright Reed Business Information Ltd.
By David Z. Hambrick Nearly a century after James Truslow Adams coined the phrase, the “American dream” has become a staple of presidential campaign speeches. Kicking off her 2016 campaign, Hillary Clinton told supporters that “we need to do a better job of getting our economy growing again and producing results and renewing the American dream.” Marco Rubio lamented that “too many Americans are starting to doubt” that it is still possible to achieve the American dream, and Ted Cruz asked his supporters to “imagine a legal immigration system that welcomes and celebrates those who come to achieve the American dream.” Donald Trump claimed that “the American dream is dead” and Bernie Sanders quipped that for many “the American dream has become a nightmare.” But the American dream is not just a pie-in-the-sky notion—it’s a scientifically testable proposition. The American dream, Adams wrote, “is not a dream of motor cars and high wages merely, but a dream of social order in which each man and each woman shall be able to attain to the fullest stature of which they are innately capable…regardless of the fortuitous circumstances of birth or position.” In the parlance of behavioral genetics—the scientific study of genetic influences on individual differences in behavior—Adams’ idea was that all Americans should have an equal opportunity to realize their genetic potential. A study just published in Psychological Science by psychologists Elliot Tucker-Drob and Timothy Bates reveals that this version of the American dream is in serious trouble. Tucker-Drob and Bates set out to evaluate evidence for the influence of genetic factors on IQ-type measures (aptitude and achievement) that predict success in school, work, and everyday life. Their specific question was how the contribution of genes to these measures would compare at low versus high levels of socioeconomic status (or SES), and whether the results would differ across countries. 
The results reveal, ironically, that the American dream is more of a reality for other countries than it is for America: genetic influences on IQ were uniform across levels of SES in Western Europe and Australia, but, in the United States, were much higher for the rich than for the poor. © 2016 Scientific American
Chris French The fallibility of human memory is one of the best-established findings in psychology. There have been thousands of demonstrations of the unreliability of eyewitness testimony under well-controlled conditions dating back to the very earliest years of the discipline. Relatively recently, it was discovered that some apparent memories are not just distorted memories of witnessed events: they are false memories for events that simply never took place at all. Psychologists have developed several reliable methods for implanting false memories in a sizeable proportion of experimental participants. It is only in the last few years, however, that scientists have begun to systematically investigate the phenomenon of non-believed memories. These are subjectively vivid memories of personal experiences that an individual once believed were accurate but now accepts are not based upon real events. Prior to this, there were occasional anecdotal reports of non-believed memories. One of the most famous was provided by the influential developmental psychologist Jean Piaget. He had a clear memory of almost being kidnapped at about the age of two and of his brave nurse beating off the attacker. His grateful family were so impressed with the nurse that they gave her a watch as a reward. Years later, the nurse confessed that she had made the whole story up. Even after he no longer believed that the event had taken place, Piaget still retained his vivid and detailed memory of it. © 2016 Guardian News and Media Limited
Keyword: Learning & Memory
Link ID: 22050 - Posted: 03.30.2016
Brendan Maher It took less than a minute of playing League of Legends for a homophobic slur to pop up on my screen. Actually, I hadn't even started playing. It was my first attempt to join what many agree to be the world's leading online game, and I was slow to pick a character. The messages started to pour in. “Pick one, kidd,” one nudged. Then, “Choose FA GO TT.” It was an unusual spelling, and the spaces may have been added to ease the word past the game's default vulgarity filter, but the message was clear. Online gamers have a reputation for hostility. In a largely consequence-free environment inhabited mostly by anonymous and competitive young men, the antics can be downright nasty. Players harass one another for not performing well and can cheat, sabotage games and do any number of things to intentionally ruin the experience for others — a practice that gamers refer to as griefing. Racist, sexist and homophobic language is rampant; aggressors often threaten violence or urge a player to commit suicide; and from time to time, the vitriol spills beyond the confines of the game. In the notorious 'gamergate' controversy that erupted in late 2014, several women involved in the gaming industry were subjected to a campaign of harassment, including invasions of privacy and threats of death and rape. League of Legends has 67 million players and grossed an estimated US$1.25 billion in revenue last year. But it also has a reputation for toxic in-game behaviour, which its parent company, Riot Games in Los Angeles, California, sees as an obstacle to attracting and retaining players. © 2016 Nature Publishing Group
By Patrick Monahan Yesterday, mountaineer Richard Parks set out for Kathmandu to begin some highly unusual data-gathering. As part of Project Everest Cynllun, he will climb Mount Everest without supplemental oxygen and perform—on himself—a series of blood draws, muscle biopsies, and cognitive tests. If he makes it to the summit, these will be the highest-elevation blood and tissue samples ever collected. Damian Bailey, a physiologist at the University of South Wales, Pontypridd, in the United Kingdom and the project’s lead scientist, hopes the risky experiment will yield new information about how the human body responds to low-oxygen conditions, and how similar mechanisms might drive cognitive decline with aging. As Parks began the acclimatization process with warm-up climbs on two smaller peaks, Bailey told ScienceInsider about his ambitions for the project. This interview has been edited for clarity and brevity. Q: Parks is an extreme athlete who has climbed Everest before. What can his performance tell us about regular people? A: What we’re trying to understand is, what is it about Richard’s brain that is potentially different from other people’s brains, and can that provide us with some clues to accelerated cognitive decline, which occurs with aging [and] dementia. We know that sedentary aging is associated with a progressive decline in blood flow to the brain. … And the main challenge for sedentary aging is we have to wait so long to see the changes occurring. So this is almost a snapshot, a day in the life of a patient with cognitive decline. © 2016 American Association for the Advancement of Science.
Healthy body, healthy mind. Elderly people who are physically active seem to be able to stave off memory loss – but only if they start exercising before symptoms appear. At the end of a five-year period, the brains of non-exercisers look 10 years older than those of people who did moderate exercise. That’s what Clinton Wright at the University of Miami in Florida and his colleagues found when they followed 876 people, starting at an average age of 71, for five years. At the start of the study, each participant underwent a number of memory and cognition tests, and had the health of their brain assessed during an MRI scan. Each person was also asked how much exercise they had done in recent weeks, ranging from “no/light”, such as walking or gardening, to “moderate/heavy”, which included running and swimming. Five years later, the volunteers were called back to repeat all the tests. The participants generally performed less well than they had five years earlier. But their scores were linked to their level of exercise – those who reported no or low levels of exercise scored lower in all tests, the team found. The 10 per cent of people who said they had been engaged in moderate-to-heavy exercise not only started with higher scores in the first round of tests, but showed less of a decline five years later. Those who did little or no exercise also seemed to have worse vascular health – they had higher blood pressure, and their MRI scans showed evidence of undetected strokes. © Copyright Reed Business Information Ltd.
By Natalie Angier Juan F. Masello never intended to study wild parrots. Twenty years ago, as a graduate student visiting the northernmost province of Patagonia in Argentina, he planned to write his dissertation on colony formation among seabirds. But when he asked around for flocks of, say, cormorants or storm petrels, a park warden told him he was out of luck. “He said, ‘This is the only part of Patagonia with no seabird colonies,’” recalled Dr. Masello, a principal investigator in animal ecology and systematics at Justus Liebig University in Germany. Might the young scientist be interested in seeing a large colony of parrots instead? The sight that greeted Dr. Masello was “amazing” and “incredible,” he said. “It was almost beyond words.” On a 160-foot-high sandstone cliff that stretched some seven miles along the Atlantic coast, tens of thousands of pairs of burrowing parrots had used their powerful bills to dig holes — their nests — deep into the rock face. And when breeding season began not long afterward, the sky around the cliffs erupted into a raucous carnival of parrot: 150,000 crow-size, polychromed aeronauts with olive backsides, turquoise wings, white epaulets and bright red belly patches ringed in gold. Dr. Masello was hooked. Today, Dr. Masello’s hands are covered with bite scars. He has had four operations to repair a broken knee, a broken nose — “the little accidents you get from working with parrots,” he said. Still, he has no regrets. “Their astonishing beauty and intelligence,” Dr. Masello said, “are inspirational.” © 2016 The New York Times Company
By Manuel Valdes For nearly every step of his almost 12-mile walks around Seattle, Darryl Dyer has company. Flocks of crows follow him, signaling each other, because they all know that he’s the guy with the peanuts. “They know your body type. The way you walk,” Dyer said. “They’ll take their young down and say: ‘You want to get to know this guy. He’s got the food.’ ” Scientists have known for years that crows have great memories, that they can recognize a human face and behavior, that they can pass that information on to their offspring. Researchers are trying to understand more about the crow’s brain and behavior, specifically what the birds do when they see one of their own dead. They react loudly, but the reasons aren’t entirely known. Among the guesses is that they are mourning; given that crows mate for life, losing a partner could be a significant moment for the social animals. There are anecdotes of crows placing sticks and other objects on dead birds — a funeral of sorts. Using masks with dark-haired wigs that looked creepily nonhuman, researchers showed up at Seattle parks carrying a stuffed crow and recorded the reactions. One crow signals an alarm, then dozens show up. They surround the dead crow, looking at it as they perch on trees or fly above it, a behavior called mobbing. “Crows have evolved to have these complex social relationships, and they have a big brain,” said Kaeli Swift, a University of Washington graduate student who led the study.