Chapter 13. Memory, Learning, and Development
by Moheb Costandi

Rest easy after learning a new skill. Experiments in mice suggest that a good night's sleep helps us lay down memories by promoting the growth of new connections between brain cells.

Neuroscientists believe that memory involves the modification of synapses, which connect brain cells, and numerous studies published over the past decade have shown that sleep enhances the consolidation of newly formed memories in people. But exactly how these observations were related was unclear.

To find out, Wenbiao Gan of the Skirball Institute of Biomolecular Medicine at New York University Medical School and his colleagues trained 15 mice to run backwards or forwards on a rotating rod. They allowed some of them to fall asleep afterwards for 7 hours, while the rest were kept awake. The team monitored the activity and microscopic structure of the mice's motor cortex, the part of the brain that controls movement, through a small transparent "window" in their skulls. This allowed them to watch in real time how the brain responded to learning the different tasks.

Sprouting spines

They found that learning a new task led to the formation of new dendritic spines – tiny structures that project from the end of nerve cells and help pass electric signals from one neuron to another – but only in the mice left to sleep. This happened during the non-rapid eye movement stage of sleep. Each task caused a different pattern of spines to sprout along the branches of the same motor cortex neurons. © Copyright Reed Business Information Ltd.
By Sadie Dingfelder Want to become famous in the field of neuroscience? You could go the usual route, spending decades collecting advanced degrees, slaving away in science labs and publishing your results. Or you could simply fall victim to a freak accident. The stars of local science writer Sam Kean’s new book, “The Tale of the Dueling Neurosurgeons,” (which he’ll discuss Saturday at Politics and Prose) took the latter route. Be it challenging the wrong guy to a joust, spinning out on a motorcycle, or suffering from a stroke, these folks sustained brain injuries with bizarre and fascinating results. One man, for instance, lost the ability to identify different kinds of animals but had no trouble naming plants and objects. Another man lost his short-term memory. The result? A diary filled with entries like: “I am awake for the very first time.” “Now, I’m really awake.” “Now, I’m really, completely awake.” Unfortunate mishaps like these have advanced our understanding of how the gelatinous gray mass that (usually) stays hidden inside our skulls gives rise to thoughts, feelings and ideas, Kean says. “Traditionally, every major discovery in the history of neuroscience came about this way,” he says. “We had no other way of looking at the brain for centuries and centuries, because we didn’t have things like MRI machines.” Rather than covering the case studies textbook-style, Kean provides all the gory details. Consider Phineas Gage. You may remember from Psych 101 that Gage, a railroad worker, survived having a metal rod launched through his skull. You might not know, however, that one doctor “shaved Gage’s scalp and peeled off the dried blood and gelatinous brains. He then extracted skull fragments from the wound by sticking his fingers in from both ends, Chinese-finger-trap-style,” as Kean writes in his new book. © 1996-2014 The Washington Post
By Jenny Graves The claim that homosexual men share a “gay gene” created a furor in the 1990s. But new research two decades on supports this claim – and adds another candidate gene. To an evolutionary geneticist, the idea that a person’s genetic makeup affects their mating preference is unsurprising. We see it in the animal world all the time. There are probably many genes that affect human sexual orientation. But rather than thinking of them as “gay genes,” perhaps we should consider them “male-loving genes.” They may be common because these variant genes, in a female, predispose her to mate earlier and more often and to have more children. Likewise, it would be surprising if there were not “female-loving genes” in lesbian women that, in a male, predispose him to mate earlier and have more children. We can detect genetic variants that produce differences between people by tracking traits in families that display differences. Patterns of inheritance reveal variants of genes (called “alleles”) that affect normal differences, such as hair color, or disease states, such as sickle cell anemia. Quantitative traits, such as height, are affected by many different genes, as well as environmental factors. It’s hard to use these techniques to detect genetic variants associated with male homosexuality partly because many gay men prefer not to be open about their sexuality. It is even harder because, as twin studies have shown, shared genes are only part of the story. Hormones, birth order and environment play roles, too.
Laura Spinney One day in 1991, neurologist Warren Strittmatter asked his boss to look at some bewildering data. Strittmatter was studying amyloid-β, the main component of the molecular clumps found in the brains of people with Alzheimer's disease. He was hunting for amyloid-binding proteins in the fluid that buffers the brain and spinal cord, and had fished out one called apolipoprotein E (ApoE), which had no obvious connection with the disease. Strittmatter's boss, geneticist Allen Roses of Duke University in Durham, North Carolina, immediately realized that his colleague had stumbled across something exciting. Two years earlier, the group had identified a genetic association between Alzheimer's and a region of chromosome 19. Roses knew that the gene encoding ApoE was also on chromosome 19. “It was like a lightning bolt,” he says. “It changed my life.” In humans, there are three common variants, or alleles, of the APOE gene, numbered 2, 3 and 4. The obvious step, Roses realized, was to find out whether individual APOE alleles influence the risk of developing Alzheimer's disease. The variants can be distinguished from one another using a technique called the polymerase chain reaction (PCR). But Roses had little experience with PCR, so he asked the postdocs in his team to test samples from people with the disease and healthy controls. The postdocs refused: they were busy hunting for genes underlying Alzheimer's, and APOE seemed an unlikely candidate. The feeling in the lab, recalls Roses, was that “the chief was off on one of his crazy ideas”. Roses then talked to his wife, Ann Saunders, a mouse geneticist who was skilled at PCR. She had just given birth to their daughter and was on maternity leave, so they struck a deal. “She did the experiments while I held the baby,” he says. Within three weeks, they had collected the data that would fuel a series of landmark papers showing that the APOE4 allele is associated with a greatly increased risk of Alzheimer's disease1. 
© 2014 Nature Publishing Group
By Charles Q. Choi Scientists have found a kind of brain cell in mice that can instruct stem cells to start making more neurons, according to a new study. In addition, they found that electrical signals could trigger this growth in rodents, raising the intriguing possibility that devices could one day help the human brain repair itself. The study appears in the journal Nature Neuroscience. We knew the brain can generate new neurons, a process known as neurogenesis, via neural stem cells. And neuroscientists knew these stem cells got their instructions from a variety of sources: from chemicals in the bloodstream, for instance, and from cells in the structures that hold the cerebrospinal fluid that cushions the brain. Earlier research had suggested brain cells might also be able to command these stem cells to create neurons. Neuroscientist Chay Kuo at the Duke University School of Medicine in Durham, N.C., and his colleagues have now discovered such cells in mice. "It's really cool that the brain can tell stem cells to make more neurons," Kuo says. To begin their experiments, the researchers tested how well a variety of neurotransmitters performed at spurring mouse neural stem cells to produce new neurons; they found that a compound known as acetylcholine performed best. The team then discovered a previously unknown type of neuron that produces an enzyme needed to make acetylcholine. These neurons are found in a part of the adult mouse brain known as the subventricular zone, where neurogenesis occurs. ©2014 Hearst Communications, Inc.
Ewen Callaway By controlling rats' brain cells they had genetically engineered to respond to light, researchers were able to create fearful memories of events that never happened — and then to erase those memories again. Neuroscientists can breathe a collective sigh of relief. Experiments have confirmed a long-standing theory for how memories are made and stored in the brain. Researchers have created and erased frightening associations in rats' brains using light, providing the most direct demonstration yet that the strengthening and weakening of connections between neurons is the basis for memory. “This is the best evidence so far available, period,” says Eric Kandel, a neuroscientist at Columbia University in New York. Kandel, who shared the 2000 Nobel Prize in Physiology or Medicine for his work unravelling the molecular basis of memory, was not involved in the latest study, which was published online in Nature1 on 1 June. In the 1960s and 1970s, researchers in Norway noticed a peculiar property of brain cells. Repeatedly delivering a burst of electricity to a neuron in an area of the brain known as the hippocampus seemed to boost the cell’s ability to talk to a neighbouring neuron. These communiqués occur across tiny gaps called synapses, which neurons can form with thousands of other nerve cells. The process was called long-term potentiation (LTP), and neuroscientists suspected that it was the physical basis of memory. The hippocampus, they realized, was important for forming long-term memories, and the long-lasting nature of LTP hinted that information might be stored in a neural circuit for later recall. © 2014 Nature Publishing Group,
Ian Sample, science correspondent Research on children in Denmark has found that boys with autism were more likely to have been exposed to higher levels of hormones in their mother's wombs than those who developed normally. Boys diagnosed with autism and related disorders had, on average, raised levels of testosterone, cortisol and other hormones in the womb, according to analyses of amniotic fluid that was stored after their mothers had medical tests during pregnancy. The findings add to a growing body of evidence that the biological foundations of autism are laid down well before birth and involve factors that go beyond the child's genetic make-up. The results may help scientists to unravel some of the underlying causes of autism and explain why boys are four to five times more likely to be diagnosed with the condition, which affects around one percent of the population. Amniotic fluid surrounds babies in the womb and contains hormones and other substances that they have passed through their urine. The liquid is collected for testing when some women have an amniocentesis around four months into their pregnancy. Scientists in Cambridge and Copenhagen drew on Danish medical records and biobank material to find amniotic fluid samples from 128 boys who were later diagnosed with autism. Compared to a control group, the boys with autism and related conditions had higher levels of four "sex steroid" hormones that form a biological production line in the body that starts with progesterone and ends with testosterone. "In the womb, boys produce about twice as much testosterone as girls, but compared with typical boys, the autism group has even higher levels. It's a significant difference and may have a large effect on brain development," said Simon Baron-Cohen, director of the Autism Research Centre at Cambridge University. © 2014 Guardian News and Media Limited
Jessica Morrison Bees, like birds and butterflies, use the Sun as a compass for navigation, whereas mammals typically find their way by remembering familiar landmarks on a continuous mental map. However, the latest research suggests that bees also use this type of map, despite their much smaller brain size. The work adds a new dimension to complex bee-navigation abilities that have long captivated researchers. “The surprise comes for many people that such a tiny little brain is able to form such a rich memory described as a cognitive map,” says co-author Randolf Menzel, a neurobiologist at the Free University of Berlin. The research by Menzel and his team, published today in the Proceedings of the National Academy of Sciences1, demonstrates that bees can find their way back to their hives without relying solely on the Sun. Instead, they seem to use a 'cognitive map' that is made up of memorized landscape snapshots that direct them home. The cognitive map used by mammals is thought to originate in the brain’s hippocampus. Humans employ such maps on a daily basis; for example, even in a windowless office, many people can point towards their home, orienting themselves in space based on knowledge of their location relative to the outside world. “They can point to their home generally even though they can’t see it, even along a path through a wall that they haven’t travelled,” explains Fred Dyer, a behavioural biologist at Michigan State University in East Lansing, who was not involved in the research. The study authors argue that bees can do something similar, albeit on a much more rudimentary level. © 2014 Nature Publishing Group
By DAAN HEERMA VAN VOSS

I was 25 when I lost my memory. It happened on Jan. 16, 2012. I woke up, not knowing where I was. I was lying in bed, sure, but whose bed was it? There was no one in the room, no sound that I recognized: I was alone with my body. Of course, my relationship to my body was radically different than before. My body parts seemed to belong to someone else or, rather, to something else. The vague sense of identity that I possessed was confined to the knowledge of my name, but even that felt arbitrary — a collection of random letters, crumbling. No words can accurately describe the feeling of losing your memory, your life.

Underlying the loss of facts is a deeper problem: the loss of logic and causality. A person can function, ask questions, only when he recognizes a fundamental link between circumstances and time, past and present. The links between something happening to you, leading you to do or say something, which leads to someone else responding. No act is without an act leading up to it, no word is without a word that came before. Without the sense of causality provided by memory, there is chaos. When I woke up, I had no grip on logic, and logic none on me. It was a profound not-knowing, and it was terrifying. I started hyperventilating.

What struck me has a name: Transient Global Amnesia. T.G.A., as it’s referred to, is a neurological disorder. The name sounds definitive, but in fact, it’s just a fancy way of saying: We don’t know the cause, we know only what the symptoms are. Its most defining symptom is a near total disruption of short-term memory. In many cases, there is a temporary loss of long-term memory as well. But there is a bright side. T.G.A. lasts for approximately two to 20 hours, so it’s a one-day thing. At the time, though, I didn’t know this. Two names popped into my mind: Daniel and Sophie. I didn’t know where the names came from, or to whom they belonged.
I stumbled across the room, opened a door, and discovered that I was alone in the apartment. (It was, in fact, my apartment.) I found an iPhone and, quite magically, I thought, knew how to work it. As it turns out, there was nothing magical about this: A characteristic of T.G.A. is that those afflicted with it can perform familiar tasks, even ones as difficult as driving a car. (But I wouldn’t recommend that.) Occurrence of T.G.A. is rare, with at most 10 cases per 100,000 people. It is most likely to happen when you’re between 40 and 80; the average age of a T.G.A. patient is 62 years old. But I have always been in the fast lane. © 2014 The New York Times Company
Learning a second language can have a positive effect on the brain, even if it is taken up in adulthood, a University of Edinburgh study suggests. Researchers found that reading, verbal fluency and intelligence were improved in a study of 262 people tested either aged 11 or in their seventies. A previous study suggested that being bilingual could delay the onset of dementia by several years. The study is published in Annals of Neurology. The big question in this study was whether learning a new language improved cognitive functions or whether individuals with better cognitive abilities were more likely to become bilingual. Dr Thomas Bak, from the Centre for Cognitive Ageing and Cognitive Epidemiology at the University of Edinburgh, said he believed he had found the answer. Using data from intelligence tests on 262 Edinburgh-born individuals at the age of 11, the study looked at how their cognitive abilities had changed when they were tested again in their seventies. The research was conducted between 2008 and 2010. All participants said they were able to communicate in at least one language other than English. Of that group, 195 learned the second language before the age of 18, and 65 learned it after that time. The findings indicate that those who spoke two or more languages had significantly better cognitive abilities compared to what would have been expected from their baseline test. The strongest effects were seen in general intelligence and reading. The effects were present in those who learned their second language early, as well as later in life. BBC © 2014
Elizabeth Norton It's a sad fact that children born in poverty start out at a disadvantage and continue to fall further behind kids who are more privileged as they grow up. In developing countries, chiefly in Africa and Asia, some 200 million children under age 5 won't reach the same milestones—for physical growth, school performance, and earnings later on—as children who are less deprived. But a new analysis of a long-term study in Jamaica shows that surprisingly simple ways of stimulating children’s mental development can have dramatic benefits later in life. The children were participants in the Jamaican Study, a project geared toward improving cognitive development begun in the mid-1980s by child health specialists Sally Grantham-McGregor of University College London and Susan Walker of the University of the West Indies, Mona, in Jamaica. They focused on children between the ages of 9 and 24 months whose growth was stunted, placing them in the bottom 5% of height for their age and sex (an easy-to-quantify gauge of extreme poverty). Children of normal height in the same neighborhoods were also studied for comparison. For 2 years, community health workers visited the families weekly. One group was given nutritional assistance only (a formula containing 66% of daily recommended calories, along with vitamins and minerals). One group received a mental and social stimulation program only, and one group got stimulation and nutritional assistance. A final group had no intervention and served as a control. The mental stimulation program involved giving parents simple picture books and handmade toys, and encouraging them to read and sing to their children and point out names of objects, shapes, and colors. They were also taught better ways to converse and respond to their toddlers. These everyday interactions aren't always part of the culture in low-income countries, explains Paul Gertler, an economist at the University of California, Berkeley. 
"Parents might have five or six kids and few toys. They might be working really hard and have a lot of competing demands. They might not have been taught how to talk to their children, or how important and effective it is," he says. Past research attests to the importance of everyday conversation for children’s mental development: A recent study suggests that children of affluent parents do better in life in large part because their parents talk to them more. © 2014 American Association for the Advancement of Science
Pain is a symptom of many disorders; chronic pain can present as a disease in and of itself. The economic cost of pain is estimated to be hundreds of billions of dollars annually in lost wages and productivity. “This database will provide the public and the research community with an important tool to learn more about the breadth and details of pain research supported across the federal government. They can search for individual research projects or sets of projects grouped by themes uniquely relevant to pain,” said Linda Porter, Ph.D., Policy Advisor for Pain at the National Institute of Neurological Disorders and Stroke (NINDS), part of the National Institutes of Health (NIH). “It also can be helpful in identifying potential collaborators by searching for topic areas of interest or for investigators.” Users of the database can easily search over 1,200 research projects in a multi-tiered system. In Tier 1, grants are organized as basic, translational (research that can be applied to diseases), or clinical research projects. In Tier 2, grants are sorted among 29 scientific topic areas related to pain, such as biobehavioral and psychosocial mechanisms, chronic overlapping conditions, and neurobiological mechanisms. The Tier 2 categories are also organized into nine research themes: pain mechanisms, basic to clinical, disparities, training and education, tools and instruments, risk factors and causes, surveillance and human trials, overlapping conditions, and use of services, treatments, and interventions.
By Tanya Lewis It's not every day you see a mouse with a mohawk. But that's what researchers saw while studying mice that had a genetic mutation linked to autism. The mohawks that the mice were sporting actually resulted from their "over-grooming" behavior, repeatedly licking each other's hair in the same direction. The behavior resembles the repetitive motions displayed by some people with autism, and the researchers say their experiments reveal a link between the genetic causes of autism and their effects on the brain, suggesting potential avenues for treating the disorder. "Our study tells us that to design better tools for treating a disease like autism, you have to get to the underlying genetic roots of its dysfunctional behaviors, whether it is over-grooming in mice or repetitive motor behaviors in humans," study researcher Gordon Fishell, a neuroscientist at NYU Langone Medical Center, said in a statement. Autism is a spectrum of developmental disorders that involve social impairments and communication deficits. People with autism may also engage in repetitive behaviors, such as rocking or hand flapping. In the study, detailed today (May 25) in the journal Nature, the researchers bred mice that lacked a gene for a protein called Cntnap4, which is found in brain cells called interneurons. Having low levels of this protein leads to the abnormal release of two brain-signaling molecules, known as dopamine and GABA. Dopamine is involved in sensations of pleasure; GABA (which stands for gamma-aminobutyric acid) dampens neural activity and regulates muscle tone. Mice that lacked the gene for this critical brain protein were found to obsessively groom their fellow animals' fur into mohawk-like styles, suggesting a link between genetics, brain function and autistic behaviors.
By KATE MURPHY The baseball hurtles toward the batter, and he must decide from its rotation whether it’s a fastball worth a swing or a slider about to drop out of the strike zone. Running full speed, the wide receiver tracks both the football flying through the air and the defensive back on his heels. Golfers must rapidly shift visual focus in order to drive the ball at their feet toward a green in the distance. Many athletes need excellent vision to perform well in their sports, and now many are adding something new to their practice regimens: vision training. The idea has been around for years, but only recently have studies hinted that it might really work — that it might be possible to train yourself to see better without resorting to glasses or surgery. “Vision training has been out there for a long time,” said Mark Blumenkranz, a professor of ophthalmology at Stanford University Medical School. “But it’s being made more respectable lately thanks to the attention it’s been getting from psychophysicists, vision scientists, neurologists and optometrists.” Vision training actually has little to do with improving eyesight. The techniques, a form of perceptual learning, are intended to improve the ability to process what is seen. The idea is that if visual sensory neurons are repeatedly activated, they increase their ability to send electrical signals from one cell to another across connecting synapses. If neurons are not used, over time these transmissions are weakened. “With sensory neurons, just like muscles, it’s use it or lose it,” said Dr. Bernhard Sabel, a neuroscientist at Otto von Guericke University in Magdeburg, Germany, who studies plasticity in the brain. “This applies both to athletes and the partially blind.” Vision training may involve simple strategies — for instance, focusing sequentially on beads knotted at intervals on a length of string with one end held at the tip of the nose.
This is said to improve convergence (inward turning of the eye to maintain binocular vision) and the ability to focus near and far. © 2014 The New York Times Company
Claudia M. Gold When Frank was a young boy and committed some typical toddler transgression, such as having a meltdown when it was time to leave the playground, his father would slap him across the face, hurting and humiliating him in a very public way. When I spoke with Frank over 20 years later, in the context of helping him with his own son Leo's frequent tantrums in my behavioral pediatrics practice, he did not describe this experience as "trauma." Rather, he described it in a very matter-of-fact tone. But when we explored in detail his response to his son's tantrums, we discovered that, flooded by the stress of his own memories, Frank in a sense would shut down. Normally a thoughtful and empathic person, he simply told Leo to "cut it out." As we spoke he recognized how he was emotionally absent during these moments, which were increasing in frequency. It seemed as if Leo was testing Frank, perhaps looking for a more appropriate response that would help him manage this normal behavior. Once this process was brought into awareness, Frank was able to be present with Leo, to tolerate his tantrums and understand them from his 2-year-old perspective. Soon the frequency and intensity of the tantrums returned to a level typical for Leo's developmental stage. Frank, greatly relieved, once again found himself enjoying his son. The upcoming Boston conference, Psychological Trauma: Neuroscience, Attachment, and Therapeutic Interventions, promises to offer insight into the developmental neuroscience behind this story. What Frank experienced as a young child might be termed "quotidian" or "everyday" trauma. It was not watching a relative get shot, or having his house washed away in an avalanche. It was a daily mismatch with his father: he was looking for reassurance and containment and instead got a slap across the face. It was what leading researcher Ed Tronick would term "unrepaired mismatch."
Frank, in a way that is extremely common (termed "intergenerational transmission of trauma"), was then repeating this cycle with his own child. When this dynamic was brought into awareness, he was able to "repair the mismatch," setting his relationship with his own son on a healthier path. ©2014 Boston Globe Media Partners, LLC
By Neuroskeptic

Nothing that modern neuroscience can detect, anyway. This is the message of a provocative article by Pace University psychologist Terence Hines, just published in Brain and Cognition: "Neuromythology of Einstein’s brain."

As Hines notes, the story of how Einstein’s brain was preserved is well known. When the physicist died in 1955, his wish was to be cremated, but the pathologist who performed the autopsy decided to save his brain for science. Einstein’s son Hans later gave his blessing to this fait accompli. Samples and photos of the brain were then made available to neuroscientists around the world, who hoped to discover the secret of the great man’s genius. Many have claimed to have found it. But Hines isn’t convinced.

Some researchers, for instance, have used microscopy to examine Einstein’s brain tissue on a histological (cellular) level. Most famous amongst these studies is that of Diamond et al., who in 1985 reported that Einstein’s brain had a significantly higher proportion of glial cells than those of matched, normal control brains. However, Hines points out that this ‘finding’ may have been a textbook example of the multiple-comparisons problem:

"Diamond et al. (1985) reported four different t-tests, each comparing Einstein’s brain to the brains of the controls. Only one of the four tests performed was significant at the .05 level. Although only the results of the neuron to glial cell ratios were reported by Diamond et al. (1985), the paper makes it clear that at least six other dependent measures were examined: (1) number of neurons, (2) total number of glial cells, (3) number of astrocytes, (4) number of oligodendrocytes, (5) neuron to astrocyte ratio and (6) neuron to oligodendrocyte ratio. Thus a total of seven different dependent measures were examined in four different brain areas for a total of 28 comparisons… one p less than 0.05 result out of 28 is not surprising."
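Hines's multiple-comparisons arithmetic is easy to check directly. If the 28 comparisons were independent and each tested at a significance level of 0.05, the chance of getting at least one spuriously "significant" result is roughly 76 percent. A minimal sketch (the independence assumption is an illustrative simplification, not a claim from Hines's paper):

```python
def familywise_error_rate(m: int, alpha: float = 0.05) -> float:
    """Probability of at least one false positive across m
    independent significance tests, each run at level alpha."""
    return 1 - (1 - alpha) ** m

# 28 comparisons, per Hines's count for Diamond et al. (1985):
fwer = familywise_error_rate(28)
print(f"P(at least one p < 0.05 by chance) = {fwer:.3f}")

# A Bonferroni correction would instead require p < alpha / m
# from any single comparison before calling it significant:
print(f"Bonferroni-corrected threshold: {0.05 / 28:.5f}")
```

Under these assumptions a lone p < 0.05 out of 28 tests is close to the expected outcome of pure chance, which is exactly Hines's objection.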
Other histological studies followed from other researchers, but Hines says that they do not present a coherent picture of clear differences.
By BRUCE WEBER

[Photo: Dr. Gerald M. Edelman at Rockefeller University in 1972, in front of a gamma globulin model. Credit: Don Hogan Charles/The New York Times]

Dr. Gerald M. Edelman, who shared a 1972 Nobel Prize for a breakthrough in immunology and went on to contribute key findings in neuroscience and other fields, becoming a leading if contentious theorist on the workings of the brain, died on Saturday at his home in the La Jolla section of San Diego. He was 84. The precise cause was uncertain, but Dr. Edelman had Parkinson’s disease and prostate cancer, his son David said.

Dr. Edelman was known as a problem solver, a man of relentless intellectual energy who asked big questions and attacked big projects. What interested him, he said, were “dark areas” where mystery reigned. “Anybody in science, if there are enough anybodies, can find the answer,” he said in a 1994 interview in The New Yorker. “It’s an Easter egg hunt. That isn’t the idea. The idea is: Can you ask the question in such a way as to facilitate the answer? And I think the great scientists do that.”

His Nobel Prize in Physiology or Medicine came in 1972 after more than a decade of work on the process by which antibodies, the foot soldiers of the immune system, mount their defense against infection and disease. He shared the prize with Rodney R. Porter, a British scientist who worked independent of Dr. Edelman. The Nobel committee cited them for their separate approaches in deciphering the chemical structure of antibodies, also known as immunoglobulins. Dr. Edelman discovered that antibodies were not constructed in the shape of one long peptide chain, as thought, but of two different ones — one light, one heavy — that were linked. © 2014 The New York Times Company
By John Horgan

Biologist Gerald Edelman, one of the truly great scientific characters I've encountered, and whose work raised profound questions about the limits of science, has died. I interviewed Edelman in June 1992 at Rockefeller University in New York. Edelman subsequently left Rockefeller to head a center for neuroscience at the Scripps Institute in California. Edelman, 84, died at his home in La Jolla. The following is an edited version of my profile of Edelman in my 1996 book The End of Science.

Gerald Edelman, who sought to solve the riddle of consciousness, had "the brain of an empiricist and the heart of a romantic." His career, like that of his rival Francis Crick, has been eclectic and highly successful. While still a graduate student, Edelman helped to determine the structure of a protein molecule crucial to the body's immune response. In 1972 he shared a Nobel Prize for that work. Edelman then moved on to developmental biology, the study of how a single fertilized cell becomes a full-fledged organism. He found a class of proteins, called cell adhesion molecules, thought to play an important role in embryonic development.

All this was merely prelude, however, to Edelman's grand project of creating a theory of mind. He set forth his theory in three books: Neural Darwinism, The Remembered Present and Bright Air, Brilliant Fire. The gist of the theory is that just as environmental stresses select the fittest members of a species, so do inputs to the brain select groups of neurons, corresponding to useful memories, for example, by strengthening the connections between them.

© 2014 Scientific American
By BENEDICT CAREY

SAN DIEGO – The last match of the tournament had all the elements of a classic showdown, pitting style versus stealth, quickness versus deliberation, and the world's foremost card virtuoso against its premier numbers wizard. If not quite Ali-Frazier or Williams-Sharapova, the duel was all the audience of about 100 could ask for. They had come to the first Extreme Memory Tournament, or XMT, to see a fast-paced, digitally enhanced memory contest, and that's what they got.

The contest, an unusual collaboration between industry and academic scientists, featured one-minute matches between 16 world-class "memory athletes" from all over the world as they met in a World Cup-like elimination format. The grand prize was $20,000; the potential scientific payoff was large, too. One of the tournament's sponsors, the company Dart NeuroScience, is working to develop drugs for improved cognition. The other, Washington University in St. Louis, sent a research team with a battery of cognitive tests to determine what, if anything, sets memory athletes apart. Previous research was sparse and inconclusive.

Yet as the two finalists, both Germans, prepared to face off (Simon Reinhard, 35, a lawyer who holds the world record in card memorization, a deck in 21.19 seconds, and Johannes Mallow, 32, a teacher with the record for memorizing digits, 501 in five minutes), the Washington group had one preliminary finding that wasn't obvious. "We found that one of the biggest differences between memory athletes and the rest of us," said Henry L. Roediger III, the psychologist who led the research team, "is in a cognitive ability that's not a direct measure of memory at all but of attention."

People have been performing feats of memory for ages, reeling off pi to hundreds of digits, phenomenally long verses, or lists of word pairs. Most store the studied material in a so-called memory palace, associating the numbers, words or cards with specific images they have already memorized; then they mentally place the associated pairs in a familiar location, like the rooms of a childhood home or the stops on a subway line. The Greek poet Simonides of Ceos is credited with first describing the method, in the fifth century B.C., and it has been vividly described in popular books, most recently "Moonwalking With Einstein," by Joshua Foer.

© 2014 The New York Times Company
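At bottom, the memory-palace method is an ordered association: each item to be remembered is bound to a pre-memorized image and "placed" at a fixed location along a familiar route, and recall means walking the route in order. A toy Python sketch of that idea (the route and items here are invented for illustration, not taken from the tournament):

```python
# A familiar route of locations, memorized once and reused forever.
route = ["front door", "hallway", "kitchen", "bedroom"]

# The new material to memorize, in presentation order.
items = ["7", "queen of hearts", "42", "ace of spades"]

# "Placing" the items: bind each item to the next locus on the route.
palace = {locus: item for locus, item in zip(route, items)}

# Recall: walk the route in its fixed order and read off the items.
recalled = [palace[locus] for locus in route]
print(recalled)
```

The design point the passage makes is that the route's order is already overlearned, so it carries the sequencing for free; the athlete only has to form vivid locus-to-image bindings quickly, which is why attention, not raw memory, appears to be the bottleneck.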
By David Grimm

A shaggy brown terrier approaches a large chocolate Labrador in a city park. When the terrier gets close, he adopts a yogalike pose, crouching on his forepaws and hiking his butt into the air. The Lab gives an excited bark, and soon the two dogs are somersaulting and tugging on each other's ears. Then the terrier takes off and the Lab gives chase, his tail wagging wildly. When the two meet once more, the whole thing begins again.

Watch a couple of dogs play, and you'll probably see seemingly random gestures, lots of frenetic activity and a whole lot of energy being expended. But decades of research suggest that beneath this apparently frivolous fun lies a hidden language of honesty and deceit, empathy and perhaps even a humanlike morality.

Take those two dogs. That yogalike pose is known as a "play bow," and in the language of play it's one of the most commonly used words. It's an instigation and a clarification, a warning and an apology. Dogs often adopt this stance as an invitation to play right before they lunge at another dog; they also bow before they nip ("I'm going to bite you, but I'm just fooling around") or after some particularly aggressive roughhousing ("Sorry I knocked you over; I didn't mean it").

All of this suggests that dogs have a kind of moral code, one long hidden to humans until a cognitive ethologist named Marc Bekoff began to crack it. A wiry 68-year-old with reddish-gray hair tied back in a long ponytail, Bekoff is a professor emeritus at the University of Colorado at Boulder, where he taught for 32 years. He began studying animal behavior in the early 1970s, spending four years videotaping groups of dogs, wolves and coyotes in large enclosures and slowly playing back the tapes, jotting down every nip, yip and lick. "Twenty minutes of film could take a week to analyze," he says.

© 1996-2014 The Washington Post