Chapter 13. Memory, Learning, and Development

By Linda Marsa Helen Epstein felt deeply isolated and alone. Haunted by her parents’ harrowing experiences in Nazi concentration camps in World War II, she was troubled as a child by images of piles of skeletons and barbed wire, and, in her words, “a floating sense of danger and incipient harm.” But her Czech-born parents’ defense against the horrific memories was to detach. “Their survival strategy in the war was denial and dissociation, and that carried into their behavior afterward,” recalls Epstein, who was born shortly after the war and grew up in Manhattan. “They believed in action over reflection. Introspection was not encouraged, but a full schedule of activities was.” It was only when she was a student at Israel’s Hebrew University in the late 1960s that she realized she was part of a community that shared a cultural and historical legacy that included both pain and fear. “I met dozens of kids of survivors,” she says, “one after the other who shared certain characteristics: preoccupation with a family past and Israel, and who spoke several middle European languages — just like me.” Epstein’s 1979 book about her observations, Children of the Holocaust, gave voice to that sense of alienation and free-floating anxiety. In the years since, mental health professionals have largely attributed the second generation’s moodiness, hypervigilance and depression to learned behavior. It is only now, more than three decades later, that science has the tools to see how this legacy of trauma may become chemically marked on our DNA — a process known as epigenetics, in which environmental factors alter the way genes are expressed, and those changes may be passed on just as surely as blue eyes and crooked smiles.

Keyword: Sexual Behavior; Epigenetics
Link ID: 22307 - Posted: 06.09.2016

By Esther Landhuis About 100 times rarer than Parkinson’s, and often mistaken for it, progressive supranuclear palsy afflicts fewer than 20,000 people in the U.S.—and two-thirds do not even know they have it. Yet this little-known brain disorder that killed comic actor Dudley Moore in 2002 is quietly becoming a gateway for research that could lead to powerful therapies for a range of intractable neurodegenerative conditions including Alzheimer’s and chronic traumatic encephalopathy, a disorder linked to concussions and head trauma. All these diseases share a common feature: abnormal buildup of a protein called tau in the brains of patients. Progressive supranuclear palsy has no cure and is hard to diagnose. Although doctors may have heard of the disease, many know little about it. It was not described in the medical literature until 1964, but some experts believe one of the earliest accounts of the debilitating illness appeared in an 1857 short story by Charles Dickens and his friend Wilkie Collins: “A cadaverous man of measured speech. A man who seemed as unable to wink, as if his eyelids had been nailed to his forehead. A man whose eyes—two spots of fire—had no more motion than if they had been connected with the back of his skull by screws driven through them, and riveted and bolted outside among his gray hair. He had come in and shut the door, and he now sat down. He did not bend himself to sit as other people do, but seemed to sink bolt upright, as if in water, until the chair stopped him.” © 2016 Scientific American

Keyword: Parkinsons; Alzheimers
Link ID: 22304 - Posted: 06.09.2016

Most available antidepressants do not help children and teenagers with serious mental health problems and some may be unsafe, experts have warned. A review of clinical trial evidence found that of 14 antidepressant drugs, only one, fluoxetine – marketed as Prozac – was better than a placebo at relieving the symptoms of young people with major depression. Another drug, venlafaxine, was associated with an increased risk of suicidal thoughts and suicide attempts. But the authors stressed that the true effectiveness and safety of antidepressants taken by children and teenagers remained unclear because of the poor design and selective reporting of trials, which were mostly funded by drug companies. They recommended close monitoring of young people on antidepressants, regardless of what drugs they were prescribed, especially at the start of treatment. Professor Peng Xie, a member of the team from Chongqing Medical University in China, said: “The balance of risks and benefits of antidepressants for the treatment of major depression does not seem to offer a clear advantage in children and teenagers, with probably only the exception of fluoxetine.” Major depressive disorder affects around 3% of children aged six to 12 and 6% of teenagers aged 13 to 18. In 2004 the US Food and Drug Administration (FDA) issued a warning against the use of antidepressants in young people up to the age of 24 because of concerns about suicide risk. Yet the number of young people taking the drugs increased between 2005 and 2012, both in the US and UK, said the study authors writing in the Lancet medical journal. In the UK the proportion of children and teenagers aged 19 and under taking antidepressants rose from 0.7% to 1.1%. © 2016 Guardian News and Media Limited

Keyword: Depression; Development of the Brain
Link ID: 22303 - Posted: 06.09.2016

By Amina Zafar When Susan Robertson's fingers and left arm felt funny while she was Christmas shopping, they were signs of a stroke she experienced at age 36. The stroke survivor is now concerned about her increased risk of dementia. The link between stroke and dementia is stronger than many Canadians realize, the Heart and Stroke Foundation says. The group's annual report, released Thursday, is titled "Mind the connection: preventing stroke and dementia." Stroke happens when blood stops flowing to parts of the brain. Robertson, 41, of Windsor, Ont., said her short-term memory, word-finding and organizational skills were impaired after her 2011 stroke. She's extremely grateful to have recovered the ability to speak and walk after doctors found clots had damaged her brain's left parietal lobe. "I knew what was happening, but I couldn't say it," the occupational nurse recalled. A stroke more than doubles the risk of dementia, said Dr. Rick Swartz, a spokesman for the foundation and a stroke neurologist in Toronto. Raising awareness about the link is not meant to scare people, but to show how controlling blood pressure, not smoking or quitting if you do, eating a balanced diet and being physically active reduce the risk to individuals and could make a difference at a societal level, Swartz said. While aging is a common risk factor in stroke and dementia, evidence in Canada and other developed countries shows younger people are also increasingly affected. ©2016 CBC/Radio-Canada.

Keyword: Stroke; Alzheimers
Link ID: 22302 - Posted: 06.09.2016

By BENEDICT CAREY Jerome S. Bruner, whose theories about perception, child development and learning informed education policy for generations and helped launch the modern study of the mind known as the cognitive revolution, died on Sunday at his home in Manhattan. He was 100. His death was confirmed by his partner, Eleanor M. Fox. Dr. Bruner was a researcher at Harvard in the 1940s when he became impatient with behaviorism, then a widely held theory, which viewed learning in terms of stimulus and response: the chime of a bell before mealtime and salivation, in Ivan Pavlov’s famous dog experiments. Dr. Bruner believed that behaviorism, rooted in animal experiments, ignored many dimensions of human mental experience. In one 1947 experiment, he found that children from low-income households perceived a coin to be larger than it actually was — their desires apparently shaping not only their thinking but also the physical dimensions of what they saw. In subsequent work, he argued that the mind is not a passive learner — not a stimulus-response machine — but an active one, bringing a full complement of motives, instincts and intentions to shape comprehension, as well as perception. His writings — in particular the book “A Study of Thinking” (1956), written with Jacqueline J. Goodnow and George A. Austin — inspired a generation of psychologists and helped break the hold of behaviorism on the field. To build a more complete theory, he and the experimentalist George A. Miller, a Harvard colleague, founded the Center for Cognitive Studies, which supported investigation into the inner workings of human thought. Much later, this shift in focus from behavior to information processing came to be known as the cognitive revolution. © 2016 The New York Times Company

Keyword: Development of the Brain
Link ID: 22300 - Posted: 06.09.2016

By Sandra G. Boodman Richard McGhee and his family believed the worst was behind them. McGhee, a retired case officer at the Defense Intelligence Agency who lives near Annapolis, had spent six months battling leukemia as part of a clinical trial at MD Anderson Cancer Center in Houston. The experimental chemotherapy regimen he was given had worked spectacularly, driving his blood cancer into a complete remission. But less than nine months after his treatment ended, McGhee abruptly fell apart. He became moody, confused and delusional — even childish — a jarring contrast with the even-keeled, highly competent person he had been. He developed tremors in his arms, had trouble walking and became incontinent. “I was really a mess,” he recalled. Doctors suspected he had developed a rapidly progressive and fatal dementia, possibly a particularly aggressive form of Alzheimer’s disease. If that was the case, his family was told, his life span would be measured in months. Luckily, the cause of McGhee’s precipitous decline proved to be much more treatable — and prosaic — than doctors initially feared. “It’s really a pleasure to see somebody get better so rapidly,” said Michael A. Williams, a professor of neurology and neurosurgery at the University of Washington School of Medicine in Seattle. Until recently, Williams was affiliated with Baltimore’s Sinai Hospital, where he treated McGhee in 2010. “This was a diagnosis waiting to be found.”

Keyword: Alzheimers; Neuroimmunology
Link ID: 22293 - Posted: 06.07.2016

By Clare Wilson We’ve all been there: after a tough mental slog your brain feels as knackered as your body does after a hard workout. Now we may have pinpointed one of the brain regions worn out by a mentally taxing day – and it seems to also affect our willpower, so perhaps we should avoid making important decisions when mentally fatigued. Several previous studies have suggested that our willpower is a finite resource, and if it gets depleted in one way – like finishing a difficult task – we find it harder to make other good choices, like resisting a slice of cake. In a small trial, Bastien Blain at INSERM in Paris and his colleagues asked volunteers to spend six hours doing tricky memory tasks, while periodically choosing either a small sum of cash now, or a larger amount after a delay. As the day progressed, people became more likely to act on impulse and to pick an immediate reward. This didn’t happen in the groups that spent time doing easier memory tasks, reading or gaming. For those engaged in difficult work, fMRI brain scans showed a decrease in activity in the middle frontal gyrus, a brain area involved in decision-making. “That suggests this region is becoming less excitable, which could be impairing people’s ability to resist temptation,” says Blain. It’s involved in decisions like ‘Shall I have a beer with my friends tonight, or shall I save money to buy a bike next month?’ he says. Previous research has shown that children with more willpower in a similar type of choice test involving marshmallows end up as more successful adults, by some measures. “Better impulse control predicts your eventual wealth and health,” says Blain. The idea that willpower can be depleted is contentious, as some researchers have failed to replicate others’ findings. © Copyright Reed Business Information Ltd.

Keyword: Attention; Learning & Memory
Link ID: 22292 - Posted: 06.07.2016

By Jordana Cepelewicz Colors exist on a seamless spectrum, yet we assign hues to discrete categories such as “red” and “orange.” Past studies have found that a person's native language can influence the way colors are categorized and even perceived. In Russian, for example, light blue and dark blue are named as different colors, and studies find that Russian speakers can more readily distinguish between the shades. Yet scientists have wondered about the extent of such verbal influence. Are color categories purely a construct of language, or is there a physiological basis for the distinction between green and blue? A new study in infants suggests that even before acquiring language, our brains already sort colors into familiar groups. A team of researchers in Japan tracked neural activity in 12 prelinguistic infants as they looked at a series of geometric figures. When the shapes' color switched between green and blue, activity increased in the occipitotemporal region of the brain, an area known to process visual stimuli. When the color changed within a category, such as between two shades of green, brain activity remained steady. The team found the same pattern in six adult participants. The infants used both brain hemispheres to process color changes. Language areas are usually in the left hemisphere, so the finding provides further evidence that color categorization is not entirely dependent on language. At some point as a child grows, language must start playing a role—just ask a Russian whether a cloudless sky is the same color as the deep sea. The researchers hope to study that developmental process next. “Our results imply that the categorical color distinctions arise before the development of linguistic abilities,” says Jiale Yang, a psychologist at Chuo University and lead author of the study, published in February in PNAS. “But maybe they are later shaped by language learning.” © 2016 Scientific American

Keyword: Vision; Development of the Brain
Link ID: 22291 - Posted: 06.07.2016

James Gorman Fruit flies are far from human, but not as far as you might think. They do many of the same things people do, like seek food, fight and woo mates. And their brains, although tiny and not set up like those of humans or other mammals, do many of the same things that all brains do — make and use memories, integrate information from the senses, and allow the creature to navigate both the physical and the social world. Consequently, scientists who study how all brains work like to use flies because it’s easier for them to do invasive research that isn’t allowed on humans. The technology of neuroscience is sophisticated enough to genetically engineer fly brains, and to then use fluorescent chemicals to indicate which neurons are active. But there are some remaining problems, like how to watch the brain of a fly that is moving around freely. It is one thing to record what is going on in a fly’s brain if the insect’s movement is restricted, but quite another to try to catch the light flash of brain cells from a fly that is walking around. Takeo Katsuki, an assistant project scientist at the Kavli Institute at the University of California, San Diego, is interested in courtship. And, he said, fruit flies simply won’t engage in courtship when they are tethered. So he and Dhruv Grover, another assistant project scientist, and Ralph J. Greenspan, in whose lab they both work, set out to develop a method for recording the brain activity of a walking fly. One challenge was to track the fly as it moved. They solved that problem with three cameras to follow the fly and a laser to activate the fluorescent chemicals in the brain. © 2016 The New York Times Company

Keyword: Development of the Brain; Genes & Behavior
Link ID: 22290 - Posted: 06.06.2016

By Julia Shaw A cure for almost every memory ailment seems to be just around the corner. Alzheimer’s-affected brains can have their memories restored, we can create hippocampal implants to give us better memory, and we can effectively implant false memories with light. Except that we can’t really do any of these things, at least not in humans. We sometimes forget that developments in memory science need to go through a series of stages in order to come to fruition, each of which requires tremendous knowledge and skill. From coming up with a new idea, to designing an appropriate methodology, obtaining ethical approval, getting research funding, recruiting research assistants and test subjects, conducting the experiment(s), completing complex statistical analysis for which computer code is often required, writing a manuscript, surviving the peer review process, and finally effectively distributing the findings, each part of the process is incredibly complex and takes a long time. On top of it all, this process, which can take decades to complete, typically results in incremental rather than monumental change. Rather than creating massive leaps in technology, in the vast majority of instances, studies add a teeny tiny bit of insight to the greater body of knowledge. These incremental achievements in science are often blown out of proportion by the media. As John Oliver recently said, “…[Science] deserves better than to be twisted out of proportion and be turned into morning show gossip.” Moving from science fiction to science fact is harder than the media makes it seem. © 2016 Scientific American

Keyword: Learning & Memory; Robotics
Link ID: 22289 - Posted: 06.06.2016

By Andy Coghlan People left dependent on wheelchairs by a stroke are walking again after receiving injections of stem cells into their brains. Participants in the small trial also saw improvements in their speech and arm movements. “One 71-year-old woman could only move her left thumb at the start of the trial,” says Gary Steinberg, a neurosurgeon at Stanford University who performed the procedure on some of the 18 participants. “She can now walk and lift her arm above her head.” Run by SanBio of Mountain View, California, this trial is the second to test whether stem cell injections into patients’ brains can help ease disabilities resulting from stroke. Patients in the first, carried out by UK company ReNeuron, also showed measurable reductions in disability a year after receiving their injections and beyond. All patients in the latest trial showed improvements. Their scores on a 100-point scale for evaluating mobility – with 100 being completely mobile – improved on average by 11.4 points, a margin considered to be clinically meaningful for patients. “The most dramatic improvements were in strength, coordination, ability to walk, the ability to use hands and the ability to communicate, especially in those whose speech had been damaged by the stroke,” says Steinberg. In both trials, patients’ recovery had plateaued before treatment; their strokes had occurred between six months and three years earlier. © Copyright Reed Business Information Ltd

Keyword: Stroke; Stem Cells
Link ID: 22281 - Posted: 06.04.2016

By NICHOLAS ST. FLEUR Nine scientists have won this year’s Kavli Prizes for work that detected the echoes of colliding black holes, revealed how adaptable the nervous system is, and created a technique for sculpting structures on the nanoscale. The announcement was made on Thursday by the Norwegian Academy of Science and Letters in Oslo, and was live-streamed to a watching party in New York as a part of the World Science Festival. The three prizes, each worth $1 million and split among the recipients, are awarded in astrophysics, nanoscience and neuroscience every two years. They are named for Fred Kavli, a Norwegian-American inventor, businessman and philanthropist who started the awards in 2008 and died in 2013. Eve Marder of Brandeis University, Michael M. Merzenich of the University of California, San Francisco, and Carla J. Shatz of Stanford won the neuroscience prize. Dr. Marder illuminated the flexibility and stability of the nervous system through her work studying crabs and lobsters and the neurons that control their digestion. Dr. Merzenich was a pioneer in the study of neural plasticity, demonstrating that parts of the adult brain, like those of children, can be reorganized by experience. Dr. Shatz showed that “neurons that fire together wire together,” by investigating how patterns of activity sculpt the synapses in the developing brain. The winners will receive their prizes in September at a ceremony in Oslo. © 2016 The New York Times Company

Keyword: Development of the Brain
Link ID: 22279 - Posted: 06.04.2016

By Simon Makin Other species are capable of displaying dazzling feats of intelligence. Crows can solve multistep problems. Apes display numerical skills and empathy. Yet neither species has the capacity to conduct scientific investigations into other species' cognitive abilities. This type of behavior provides solid evidence that humans are by far the smartest species on the planet. Besides just elevated IQs, however, humans set themselves apart in another way: Their offspring are among the most helpless of any species. A new study, published recently in Proceedings of the National Academy of Sciences (PNAS), draws a link between human smarts and an infant’s dependency, suggesting one thing led to the other in a spiraling evolutionary feedback loop. The study, from psychologists Celeste Kidd and Steven Piantadosi at the University of Rochester, represents a new theory about how humans came to possess such extraordinary smarts. Like a lot of evolutionary theories, this one can be couched in the form of a story—and like a lot of evolutionary stories, this one is contested by some scientists. Kidd and Piantadosi note that, according to a previous theory, early humans faced selection pressures for both large brains and the capacity to walk upright as they moved from forest to grassland. Larger brains require a wider pelvis to give birth whereas being bipedal limits the size of the pelvis. These opposing pressures—biological anthropologists call them the “obstetric dilemma”—could have led to giving birth earlier when infants’ skulls were still small. Thus, newborns arrive more immature and helpless than those of most other species. Kidd and Piantadosi propose that, as a consequence, the cognitive demands of child care increased and created evolutionary pressure to develop higher intelligence. © 2016 Scientific American

Keyword: Development of the Brain; Evolution
Link ID: 22277 - Posted: 06.02.2016

By David Z. Hambrick If you’re a true dog lover, you take it as one of life’s simple truths that all dogs are good, and you have no patience for scientific debate over whether dogs really love people. Of course they do. What else could explain the fact that your dog runs wildly in circles when you get home from work, and, as your neighbors report, howls inconsolably for hours on end when you leave? What else could explain the fact that your dog insists on sleeping in your bed, under the covers—in between you and your partner? At the same time, there’s no denying that some dogs are smarter than others. Not all dogs can, like a border collie mix named Jumpy, do a back flip, ride a skateboard, and weave through pylons on his front legs. A study published in the journal Intelligence by British psychologists Rosalind Arden and Mark Adams confirms as much. Consistent with over a century of research on human intelligence, Arden and Adams found that a dog that excels in one test of cognitive ability will likely excel in other tests of cognitive ability. In more technical terms, the study reveals that there is a general factor of intelligence in dogs—a canine “g” factor. For their study, Arden and Adams devised a battery of canine cognitive ability tests. All of the tests revolved around—you guessed it—getting a treat. In the detour test, the dog’s objective was to navigate around barriers arranged in different configurations to get to a treat. In the point-following test, a researcher pointed to one of two inverted beakers concealing a treat, and recorded whether the dog went to that beaker or the other one. Finally, the quantity discrimination test required the dog to choose between a small treat (a glob of peanut butter) and a larger one (the “correct” answer). Arden and Adams administered the battery to 68 border collies from Wales; all had been bred and trained to do herding work on a farm, and thus had similar backgrounds. © 2016 Scientific American
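
For readers curious what a “g” factor means in statistical terms: when scores on several tasks are positively correlated, the first principal component of the score matrix captures the variance the tasks share, and each dog’s position on that component is an estimate of its general ability. The Python sketch below is a toy illustration only, not Arden and Adams’s actual analysis — the scores are synthetic and every number in it is invented for demonstration.

import numpy as np

rng = np.random.default_rng(42)

# Synthetic scores for 68 dogs on three tasks (detour, point-following,
# quantity discrimination). A single latent ability g drives all three,
# plus task-specific noise; the coefficients are made up for illustration.
n_dogs = 68
g = rng.normal(0, 1, n_dogs)
scores = np.column_stack([
    0.7 * g + rng.normal(0, 0.7, n_dogs),
    0.6 * g + rng.normal(0, 0.8, n_dogs),
    0.5 * g + rng.normal(0, 0.9, n_dogs),
])

# Standardize each task, then take the first principal component.
z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(z, rowvar=False))  # ascending order
pc1 = eigvecs[:, -1]
pc1 *= np.sign(pc1.sum())  # eigenvector sign is arbitrary; make loadings positive
g_hat = z @ pc1            # each dog's estimated general ability

print("share of variance on PC1:", eigvals[-1] / eigvals.sum())
print("correlation with true g: ", np.corrcoef(g_hat, g)[0, 1])

In this synthetic setup the first component absorbs well over half the variance and correlates strongly with the latent ability — the same signature, a single dominant shared factor, that the real dog data showed.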

Keyword: Intelligence; Evolution
Link ID: 22272 - Posted: 06.01.2016

By Gretchen Reynolds A weekly routine of yoga and meditation may strengthen thinking skills and help to stave off aging-related mental decline, according to a new study of older adults with early signs of memory problems. Most of us past the age of 40 are aware that our minds and, in particular, memories begin to sputter as the years pass. Familiar names and words no longer spring readily to mind, and car keys acquire the power to teleport into jacket pockets where we could not possibly have left them. Some weakening in mental function appears to be inevitable as we age. But emerging science suggests that we might be able to slow and mitigate the decline by how we live and, in particular, whether and how we move our bodies. Past studies have found that people who run, weight train, dance, practice tai chi, or regularly garden have a lower risk of developing dementia than people who are not physically active at all. There also is growing evidence that combining physical activity with meditation might intensify the benefits of both pursuits. In an interesting study that I wrote about recently, for example, people with depression who meditated before they went for a run showed greater improvements in their mood than people who did either of those activities alone. But many people do not have the physical capacity or taste for running or other similarly vigorous activities. So for the new study, which was published in April in the Journal of Alzheimer’s Disease, researchers at the University of California, Los Angeles, and other institutions decided to test whether yoga, a relatively mild, meditative activity, could alter people’s brains and fortify their ability to think. © 2016 The New York Times Company

Keyword: Learning & Memory
Link ID: 22270 - Posted: 06.01.2016

By Gary Stix Scientists will never find a single gene for depression—nor two, nor 20. But among the 20,000 human genes and the hundreds of thousands of proteins and molecules that switch on those genes or regulate their activity in some way, there are clues that point to the roots of depression. Tools to identify biological pathways that are instrumental in either inducing depression or protecting against it have recently debuted—and hold the promise of providing leads for new drug therapies for psychiatric and neurological diseases. A recent paper in the journal Neuron illustrates both the dazzling complexity of this approach and the ability of these techniques to pinpoint key genes that may play a role in governing depression. Scientific American talked with the senior author on the paper—neuroscientist Eric Nestler from the Icahn School of Medicine at Mount Sinai in New York. Nestler spoke about the potential of this research to break the logjam in pharmaceutical research that has impeded development of drugs to treat brain disorders. Scientific American: The first years in the war on cancer met with a tremendous amount of frustration. Things look like they're improving somewhat now for cancer. Do you anticipate a similar trajectory may occur in neuroscience for psychiatric disorders? Eric Nestler: I do. I just think it will take longer. I was in medical school 35 years ago when the idea of identifying a person's specific pathophysiology as a means of directing treatment of cancer was put forward. We're now three decades later finally seeing the day when that’s happening. I definitely think the same will occur for major brain disorders. The brain is just more complicated and the disorders are more complicated so it will take longer. © 2016 Scientific American

Keyword: Depression; Genes & Behavior
Link ID: 22267 - Posted: 05.31.2016

By Anil Ananthaswamy and Alice Klein Our brain’s defence against invading microbes could cause Alzheimer’s disease – which suggests that vaccination could prevent the condition. Alzheimer’s disease has long been linked to the accumulation of sticky plaques of beta-amyloid proteins in the brain, but the function of plaques has remained unclear. “Does it play a role in the brain, or is it just garbage that accumulates?” asks Rudolph Tanzi of Harvard Medical School. Now he has shown that these plaques could be defences for trapping invading pathogens. Working with Robert Moir at the Massachusetts General Hospital in Boston, Tanzi’s team has shown that beta-amyloid can act as an anti-microbial compound, and may form part of our immune system. To test whether beta-amyloid defends us against microbes that manage to get into the brain, the team injected bacteria into the brains of mice that had been bred to develop plaques like humans do. Plaques formed straight away. “When you look in the plaques, each one had a single bacterium in it,” says Tanzi. “A single bacterium can induce an entire plaque overnight.” This suggests that infections could be triggering the formation of plaques. These sticky plaques may trap and kill bacteria, viruses or other pathogens, but if they aren’t cleared away fast enough, they may lead to inflammation and tangles of another protein, called tau, causing neurons to die and driving progression towards Alzheimer’s. © Copyright Reed Business Information Ltd.

Keyword: Alzheimers; Neuroimmunology
Link ID: 22265 - Posted: 05.31.2016

Robert Plomin, Scientists have investigated this question for more than a century, and the answer is clear: the differences between people on intelligence tests are substantially the result of genetic differences. But let's unpack that sentence. We are talking about average differences among people and not about individuals. Any one person's intelligence might be blown off course from its genetic potential by, for example, an illness in childhood. By genetic, we mean differences passed from one generation to the next via DNA. But we all share 99.5 percent of our three billion DNA base pairs, so only 15 million DNA differences separate us genetically. And we should note that intelligence tests include diverse examinations of cognitive ability and skills learned in school. Intelligence, more appropriately called general cognitive ability, reflects someone's performance across a broad range of varying tests. Genes make a substantial difference, but they are not the whole story. They account for about half of all differences in intelligence among people, so half is not caused by genetic differences, which provides strong support for the importance of environmental factors. This estimate of 50 percent reflects the results of twin, adoption and DNA studies. From them, we know, for example, that later in life, children adopted away from their biological parents at birth are just as similar to their biological parents as are children reared by their biological parents. Similarly, we know that adoptive parents and their adopted children do not typically resemble one another in intelligence. © 2016 Scientific American
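
For readers who want the arithmetic behind the “about half” figure: to a first approximation, twin studies estimate heritability with Falconer’s formula, which doubles the gap between identical-twin and fraternal-twin correlations. The formula is standard in behavioral genetics, though Plomin does not spell it out here:

h^2 = 2(r_MZ − r_DZ)

With illustrative values of r_MZ = 0.85 and r_DZ = 0.60 — numbers chosen for demonstration, though they are in the range reported for adult IQ — the estimate is h^2 = 2(0.85 − 0.60) = 0.50, in line with the 50 percent figure cited above.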

Keyword: Intelligence; Genes & Behavior
Link ID: 22264 - Posted: 05.31.2016

By Roland Pease BBC Radio Science Unit Researchers have invented a DNA "tape recorder" that can trace the family history of every cell in an organism. The technique is being hailed as a breakthrough in understanding how the trillions of complex cells in a body are descended from a single egg. "It has the potential to provide profound insights into how normal, diseased or damaged tissues are constructed and maintained," one UK biologist told the BBC. The work appears in the journal Science. The human body has around 40 trillion cells, each with a highly specialised function. Yet each can trace its history back to the same starting point - a fertilised egg. Developmental biology is the business of unravelling how the genetic code unfolds at each cycle of cell division, how the body plan develops, and how tissues become specialised. But much of what it has revealed has depended on inference rather than a complete cell-by-cell history. "I actually started working on this problem as a graduate student in 2000," confessed Jay Shendure, lead researcher on the new scientific paper. "Could we find a way to record these relationships between cells in some compact form we could later read out in adult organisms?" The project failed then because there was no mechanism to record events in a cell's history. That changed with recent developments in so-called CRISPR gene editing, a technique that allows researchers to make much more precise alterations to the DNA in living organisms. The molecular tape recorder developed by Prof Shendure's team at the University of Washington in Seattle, US, is a length of DNA inserted into the genome that contains a series of edit points which can be changed throughout an organism's life. © 2016 BBC.
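
The recorder's logic is easy to sketch in code. The toy Python simulation below is purely illustrative — the names, the ten-site barcode and the edit rate are all invented here, and this is a sketch of the idea rather than the actual method in the Science paper. Each cell carries a barcode of editable sites; daughters inherit their mother's edits and occasionally pick up a new, uniquely identifiable one; shared edits then reveal shared ancestry.

import random

SITES = 10  # editable positions in the synthetic barcode; 0 means unedited

def divide(barcode, edit_prob=0.8):
    # A daughter inherits the mother's edits and, with some probability
    # (set high for this small demo), acquires one new irreversible edit
    # at a previously unedited site.
    child = list(barcode)
    unedited = [i for i, s in enumerate(child) if s == 0]
    if unedited and random.random() < edit_prob:
        child[random.choice(unedited)] = divide.next_id  # unique "scar" id
        divide.next_id += 1
    return tuple(child)

divide.next_id = 1

def grow(generations=5):
    # Simulate a full binary lineage; return (branch-path, barcode) leaves.
    cells = [((), tuple([0] * SITES))]
    for _ in range(generations):
        cells = [(path + (b,), divide(bc))
                 for path, bc in cells for b in (0, 1)]
    return cells

def shared_edits(a, b):
    # Identical edit ids at identical sites mark shared ancestry.
    return sum(1 for x, y in zip(a, b) if x == y and x != 0)

random.seed(1)
leaves = grow()
(_, sis1), (_, sis2) = leaves[0], leaves[1]  # sisters: same mother cell
(_, far) = leaves[-1]                        # opposite side of the tree
print("edits shared by sisters:      ", shared_edits(sis1, sis2))
print("edits shared by distant cells:", shared_edits(sis1, far))

Sister cells typically share several edits, while cells from opposite sides of the tree share none — the asymmetry from which a lineage tree can be reconstructed. The real system must also cope with overlapping deletions and sequencing noise, which is part of what makes the published reconstruction nontrivial.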

Keyword: Development of the Brain; Neurogenesis
Link ID: 22259 - Posted: 05.28.2016

By BENEDICT CAREY Suzanne Corkin, whose painstaking work with a famous amnesiac known as H.M. helped clarify the biology of memory and its disorders, died on Tuesday in Danvers, Mass. She was 79. Her daughter, Jocelyn Corkin, said the cause was liver cancer. Dr. Corkin met the man who would become a lifelong subject and collaborator in 1964, when she was a graduate student in Montreal at the McGill University laboratory of the neuroscientist Brenda Milner. Henry Molaison — known in published reports as H.M., to protect his privacy — was a modest, middle-aged former motor repairman who had lost the ability to form new memories after having two slivers of his brain removed to treat severe seizures when he was 27. In a series of experiments, Dr. Milner had shown that a part of the brain called the hippocampus was critical to the consolidation of long-term memories. Most scientists had previously thought that memory was not dependent on any one cortical area. Mr. Molaison lived in Hartford, and Dr. Milner had to take the train down to Boston and drive from there to Connecticut to see him. It was a long trip, and transporting him to Montreal proved to be so complicated, largely because of his condition, that Dr. Milner did it just once. Yet rigorous study of H.M., she knew, would require proximity and a devoted facility — with hospital beds — to accommodate extended experiments. The psychology department at the Massachusetts Institute of Technology offered both, and with her mentor’s help, Dr. Corkin landed a position there. Thus began a decades-long collaboration between Dr. Corkin and Mr. Molaison that would extend the work of Dr. Milner, focus intense interest on the hippocampus, and make H.M. the most famous patient in the history of modern brain science. © 2016 The New York Times Company

Keyword: Learning & Memory
Link ID: 22258 - Posted: 05.28.2016