Chapter 13. Memory, Learning, and Development
By NICHOLAS WADE

After decades of disappointingly slow progress, researchers have taken a substantial step toward a possible treatment for Duchenne muscular dystrophy with the help of a powerful new gene-editing technique.

Duchenne muscular dystrophy is a progressive muscle-wasting disease that affects boys, putting them in wheelchairs by age 10, followed by an early death from heart failure or breathing difficulties. The disease is caused by defects in a gene that encodes a protein called dystrophin, which is essential for proper muscle function.

Because the disease is devastating and incurable, and common for a hereditary illness, it has long been a target for gene therapy, though without success. An alternative treatment, drugs based on chemicals known as antisense oligonucleotides, is in clinical trials. But gene therapy — the idea of curing a genetic disease by inserting the correct gene into damaged cells — is making a comeback.

A new technique, known as Crispr-Cas9, lets researchers cut the DNA of chromosomes at selected sites to remove or insert segments. Three research groups, working independently of one another, reported in the journal Science on Thursday that they had used the Crispr-Cas9 technique to treat mice with a defective dystrophin gene.

Each group loaded the DNA-cutting system onto a virus that infected the mice’s muscle cells, and excised from the gene a defective stretch of DNA known as an exon. Without the defective exon, the muscle cells made a shortened dystrophin protein that was nonetheless functional, giving all of the mice more strength. The teams were led by Charles A. Gersbach of Duke University, Eric N. Olson of the University of Texas Southwestern Medical Center and Amy J. Wagers of Harvard University.

© 2016 The New York Times Company
By Elizabeth Pennisi

Whether foraging for food, caring for young, or defending the nest, the worker castes of carpenter ants toil selflessly for their queen and colony. Now, biologists have figured out how to make some of those worker ants labor even harder, or change their very jobs in ant society, all by making small chemical modifications to their DNA. The finding calls attention to a new source of behavioral flexibility, and drives home the idea that so-called epigenetic modifications can connect genes to the environment, linking nature to nurture.

The work is “a pioneering study establishing a causal link between epigenetics and complex social behavior,” says Ehab Abouheif, an evolutionary developmental biologist at McGill University in Montreal, Canada. “These mechanisms may extend far beyond ants to other organisms with social behavior.”

Insect biologists have long debated whether the division of labor in these sophisticated species with castes is driven by colony needs or is innate. Evidence in honey bees had pointed toward a genetic difference between queens and workers. In the past several years, however, work in both honey bees and ants had indicated that epigenetic modifications—changes to DNA other than to its sequence of bases (or DNA “letters”)—influence caste choices, indicating that environmental factors can be pivotal. But subsequent research on one type of change, methylation, led to contradictory conclusions.

© 2016 American Association for the Advancement of Science.
By R. Douglas Fields

We all heard the warning as kids: “That TV will rot your brain!” You may even find yourself repeating the threat when you see young eyes glued to the tube instead of exploring the real world. The parental scolding dates back to the black-and-white days of I Love Lucy, and today concern is growing amid a flood of video streaming on portable devices. But are young minds really being harmed?

With brain imaging, the effects of regular TV viewing on a child's neural circuits are plain to see. Studies suggest watching television for prolonged periods changes the anatomical structure of a child's brain and lowers verbal abilities. Behaviorally, even more detrimental effects may exist: although a cause-and-effect relation is hard to prove, higher rates of antisocial behavior, obesity and mental health problems correlate with hours in front of the set.

Now a new study hits the pause button on this line of thinking. The researchers conclude that the entire body of research up to now has overlooked an important confounding variable, heredity, that could call into question the conventional wisdom that TV is bad for the brain. Further study will be needed to evaluate this claim, but the combined evidence suggests we need a more nuanced attitude toward our viewing habits.

To understand the argument against television, we should rewind to 2013, when a team of researchers at Tohoku University in Japan, led by neuroscientist Hikaru Takeuchi, first published findings from a study in which the brains of 290 children between the ages of five and 18 were imaged. The kids' TV viewing habits, ranging from zero to four hours each day, were also taken into account.

© 2016 Scientific American
By Mitch Leslie

Male mice bequeath an unexpected legacy to their progeny. Two studies published online this week in Science reveal that sperm from the rodents carry pieces of RNAs that alter the metabolism of their offspring. The RNAs spotlighted by the studies normally help synthesize proteins, so the findings point to an unconventional form of inheritance. The results are “exciting and surprising, but not impossible,” says geneticist Joseph Nadeau of the Pacific Northwest Diabetes Research Institute in Seattle, Washington.

“Impossible” is exactly how biologists once described so-called epigenetic inheritance, in which something other than a DNA sequence passes a trait between generations. In recent years, however, researchers have found many examples. A male mouse’s diet and stress level, for instance, can tweak offspring metabolism. Researchers are still trying to determine how offspring inherit a father’s metabolic attributes and physiological condition. Some evidence implicates chemical modification of DNA. Other work, by neuroscientist Tracy Bale of the University of Pennsylvania Perelman School of Medicine in Philadelphia and colleagues, has found that mammalian sperm pack gene-regulating molecules called microRNAs.

The new work highlights a different class of RNAs, transfer RNAs (tRNAs). In one study, genomicist Oliver Rando of the University of Massachusetts Medical School in Worcester and colleagues delved into a case of epigenetic inheritance in which the progeny of mice fed a low-protein diet show elevated activity of genes involved in cholesterol and lipid metabolism. When Rando’s group analyzed sperm from the protein-deprived males, they uncovered an increased abundance of fragments from several kinds of tRNAs. The researchers concluded the sperm acquired most of these fragments while passing through the epididymis, a duct from the testicle where the cells mature.

© 2016 American Association for the Advancement of Science
Link ID: 21741 - Posted: 01.02.2016
By Gary Stix

A lingering question asked by neuroscientists has to do with what, if anything, makes the male and female brain distinctive, whether in mice or (wo)men. There is still no concise answer. The best evidence from the most recent research suggests that both males and females share the same neural circuitry, but use it differently. Catherine Dulac, a professor of molecular and cellular biology at Harvard and an investigator at the Howard Hughes Medical Institute, is a pioneer in exploring these questions. I talked to her briefly about her research, which also extends far beyond just the neurobiology of gender.

Can you tell me in broad overview what you study?

I'm interested in understanding how the brain engages in instinctive social behaviors. There are a lot of instinctive behaviors, such as eating and sleeping, that are essential in animals and humans, but social behavior is a very distinctive and particularly interesting set of instinctive behaviors that we would like to understand at the neuronal level. What we would like to understand in mechanistic terms is how an individual recognizes other animals of its own species: for example, how an animal identifies a male, a female, or an infant, and how the brain processes these signals in order to trigger appropriate social behaviors such as mating, aggression or parenting.

Can you tell me a little bit about your work of the last few years that relates to gender identification?

© 2015 Scientific American
By Katrina Schwartz

It has become a cultural cliché that raising adolescents is the most difficult part of parenting. It’s common to joke that when kids are in their teens they are sullen, uncommunicative, more interested in their phones than in their parents and generally hard to take. But this negative trope about adolescents misses the incredible opportunity to positively shape a kid’s brain and future life course during this period of development.

“[Adolescence is] a stage of life when we can really thrive, but we need to take advantage of the opportunity,” said Temple University neuroscientist Laurence Steinberg at a Learning and the Brain conference in Boston. Steinberg has spent his career studying how the adolescent brain develops and believes there is a fundamental disconnect between the popular characterizations of adolescents and what’s really going on in their brains.

Because the brain is still developing during adolescence, it has incredible plasticity. It’s akin to the first five years of life, when a child’s brain is growing and developing new pathways all the time in response to experiences. Adult brains are somewhat plastic as well — otherwise they wouldn’t be able to learn new things — but “brain plasticity in adulthood involves minor changes to existing circuits, not the wholesale development of new ones or elimination of others,” Steinberg said. Adolescence is the last time in a person’s life that the brain can be so dramatically overhauled.

© 2015 KQED Inc.
By Karen Weintraub

Mild cognitive impairment, or M.C.I., is not a disease in itself. Rather, it is a clinical description based on performance on a test of memory and thinking skills. Depending on its cause, mild cognitive impairment is potentially reversible. Poor performance on a cognitive test could be caused by certain medications, sleep apnea, depression or other problems, said Dr. Alvaro Pascual-Leone, a professor of neurology at Harvard Medical School and Beth Israel Deaconess Medical Center. In those cases, when the underlying disease is treated, cognitive abilities can bounce back.

But in about half of people with M.C.I. — doctors are not sure of the exact number — memory problems are the first sign of impending Alzheimer’s disease. If M.C.I. progresses to Alzheimer’s, there is no recovery. Alzheimer’s is marked by an inexorable decline that is always fatal, although the path from the first signs of cognitive impairment to death may take three to 15 years, said Dr. David Knopman, a professor of neurology at the Mayo Clinic in Rochester, Minn.

As many as 20 percent to 30 percent of those with M.C.I. who score below but near the cutoff for normal can cross back above it in a subsequent cognitive test — perhaps because they are having a better day, he said. But someone whose score is borderline is at higher risk of developing Alzheimer’s than someone who scores higher, said Dr. Knopman, also vice chair of the medical and scientific advisory council of the Alzheimer’s Association.

Doctors may be hesitant to label someone with early Alzheimer’s, which can be difficult to diagnose in the early stages, so they often call it mild cognitive impairment instead, said Dr. John C. Morris, a professor of neurology and the director of the Knight Alzheimer's Disease Research Center at Washington University School of Medicine in St. Louis.

© 2015 The New York Times Company
Need to remember something important? Take a break. A proper one – no TV or flicking through your phone messages. It seems that resting in a quiet room for 10 minutes without stimulation can boost our ability to remember new information. The effect is particularly strong in people with amnesia, suggesting that they may not have lost the ability to form new memories after all.

“A lot of people think the brain is a muscle that needs to be continually stimulated, but perhaps that’s not the best way,” says Michaela Dewar at Heriot-Watt University in Edinburgh, UK.

New memories are fragile. They need to be consolidated before being committed to long-term storage, a process thought to happen while we sleep. But at least some consolidation may occur while we’re awake, says Dewar – all you need is a timeout.

In 2012, Dewar’s team showed that having a rest helps a person to remember what they were told a few minutes earlier. And the effect seems to last. People who had a 10-minute rest after hearing a story remembered 10 per cent more of it a week later than those who played a spot-the-difference game immediately afterwards. “We dim the lights and ask them to sit in an empty, quiet room, with no mobile phones,” says Dewar. When asked what they had been thinking about afterwards, most volunteers said they had let their minds wander.

Now Dewar, along with Michael Craig at the University of Edinburgh and their colleagues, has found that spatial memories can also be consolidated when we rest.

© Copyright Reed Business Information Ltd.
Tim Radford

British scientists believe they have made a huge step forward in the understanding of the mechanisms of human intelligence. That genetic inheritance must play some part has never been disputed. Despite occasional claims later dismissed, no one has yet produced a single gene that controls intelligence.

But Michael Johnson of Imperial College London, a consultant neurologist, and colleagues report in Nature Neuroscience that they may have discovered a very different answer: two networks of genes, perhaps controlled by some master regulatory system, lie behind the human gift for lateral thinking, mental arithmetic, pub quizzes, strategic planning, cryptic crosswords and the ability to laugh at limericks.

As usual, such research raises potentially politically loaded questions about the nature of intelligence. “Intelligence is a composite measure of different cognitive abilities and how they are distributed in a population. It doesn’t measure any one thing. But it is measurable,” Dr Johnson said.

About 40% of the variation in intelligence is explained by inheritance. The other factors are not yet certain. But the scientists raise the distant possibility that, armed with the new information, they may be able to devise ways to modify human intelligence.

“The idea of ultimately using drugs to affect cognitive performance is not in any way new. We all drink coffee to improve our cognitive performance,” Dr Johnson said. “It’s about understanding the pathways that are related to cognitive ability both in health and disease, especially disease, so one day we could help people with learning disabilities fulfill their potential. That is very important.”

© 2015 Guardian News and Media Limited
Rae Ellen Bichell

Ever notice the catnaps that older relatives take in the middle of the day? Or how grandparents tend to be early risers? You're not alone. Colleen McClung did, too. A neuroscientist at the University of Pittsburgh Medical Center, McClung wanted to know what was going on in the brain that changes people's daily rhythms as they age.

We all have a set of so-called clock genes that keep us on a 24-hour cycle. In the morning they wind us up, and at night they help us wind down. A study out Monday in Proceedings of the National Academy of Sciences found that those genes might beat to a different rhythm in older folks. "When you think about the early bird dinner specials, it sort of fits in with their natural shift in circadian rhythms," says McClung.

"There is a core set of genes that has been described in every animal — every plant all the way down from fungus to humans — and they're pretty much the same set of genes." The genes are the master controllers of a bunch of other genes that control processes ranging from metabolism to sleep. When you woke up this morning, the timekeeping genes told a gland in your brain to give a jolt of the stress hormone cortisol to wake you up. Tonight, they'll tell a gland to spit out melatonin, a hormone that makes you sleepy. "You can think of them as sort of the conductor of an orchestra," says McClung. They make sure all the other genes keep time.

© 2015 npr
A study of mice shows how proteasomes, a cell’s waste disposal system, may break down during Alzheimer’s disease, creating a cycle in which increased levels of damaged proteins become toxic, clog proteasomes, and kill neurons. The study, published in Nature Medicine and supported by the National Institutes of Health, suggests that enhancing proteasome activity with drugs during the early stages of Alzheimer’s may prevent dementia and reduce damage to the brain.

“This exciting research advances our understanding of the role of the proteasomes in neurodegeneration and provides a potential way to alleviate symptoms of neurodegenerative disorders,” said Roderick Corriveau, Ph.D., program director at the NIH’s National Institute of Neurological Disorders and Stroke (NINDS), which provided funding for the study.

The proteasome is a hollow, cylindrical structure that chews up defective proteins into smaller pieces that can be recycled into new proteins needed by a cell. To understand how neurodegenerative disorders affect proteasomes, Natura Myeku, Ph.D., a postdoctoral fellow working with Karen E. Duff, Ph.D., professor of pathology and cell biology at Columbia University, New York City, focused on tau, a structural protein that accumulates into clumps called tangles in the brain cells of patients with Alzheimer’s disease and several other neurodegenerative disorders known as tauopathies. Using a genetically engineered mouse model of tauopathy, as well as looking at cells in a dish, the scientists discovered that as levels of abnormal tau increased, proteasome activity slowed down.
By John Bohannon

In July 1984, a man broke into the apartment of Jennifer Thompson, a 22-year-old in North Carolina, and threatened her with a knife. She negotiated, convincing him not to kill her. Instead, he raped her and fled. Just hours later, a sketch artist worked with Thompson to create an image of the assailant's face. Then the police showed her a series of mug shots of similar-looking men. Thompson picked out 22-year-old Ronald Cotton, whose photograph was on file because of a robbery committed in his youth. When word reached Cotton that the police were looking for him, he walked into a precinct voluntarily. He was eventually sentenced to life in prison based on Thompson's testimony. Eleven years later, after DNA sequencing technology caught up, samples taken from Thompson's body matched a different man, who finally confessed. Cotton was set free.

When Thompson first identified Cotton by photo, she was not convinced of her choice. "I think this is the guy," she told the police after several minutes of hesitation. As time went on, she grew surer. By the time Thompson faced Cotton in court a year later, her doubts were gone. She confidently pointed to him as the man who raped her.

Because of examples like these, the U.S. justice system has been changing how eyewitnesses are used in criminal cases. Juries are told to discount the value of eyewitness testimony and ignore how confident the witnesses may be about whom they think they saw. Now, a new study of robbery investigations suggests that these changes may be doing more harm than good.

© 2015 American Association for the Advancement of Science
Megan Scudellari

In 1997, physicians in southwest Korea began to offer ultrasound screening for early detection of thyroid cancer. News of the programme spread, and soon physicians around the region began to offer the service. Eventually it went nationwide, piggybacking on a government initiative to screen for other cancers. Hundreds of thousands took the test for just US$30–50.

Across the country, detection of thyroid cancer soared, from 5 cases per 100,000 people in 1999 to 70 per 100,000 in 2011. Two-thirds of those diagnosed had their thyroid glands removed and were placed on lifelong drug regimens, both of which carry risks.

Such a costly and extensive public-health programme might be expected to save lives. But this one did not. Thyroid cancer is now the most common type of cancer diagnosed in South Korea, but the number of people who die from it has remained exactly the same — about 1 per 100,000. Even when some physicians in Korea realized this, and suggested that thyroid screening be stopped in 2014, the Korean Thyroid Association, a professional society of endocrinologists and thyroid surgeons, argued that screening and treatment were basic human rights.

© 2015 Nature Publishing Group
Human memory is about to get supercharged. A memory prosthesis being trialled next year could not only restore long-term recall but may eventually be used to upload new skills directly to the brain – just like in the film The Matrix.

The first trials will involve people with epilepsy. Seizures can sometimes damage the hippocampus, causing the brain to lose its ability to form long-term memories. To repair this ability, Theodore Berger at the University of Southern California and his colleagues used electrodes already implanted in people’s brains as part of epilepsy treatment to record electrical activity associated with memory. The team then developed an algorithm that could predict the neural activity thought to occur when a short-term memory becomes a long-term memory, as it passes through the hippocampus.

Early next year, Berger’s team will use this algorithm to instruct the electrodes to predict and then mimic the activity that should occur when long-term memories are formed. “Hopefully, it will repair their long-term memory,” says Berger. Previous studies using animals suggest that the prosthesis might even give people a better memory than they could expect naturally.

A similar approach could eventually be used to implant new memories into the brain. Berger’s team recorded brain activity in a rat that had been trained to perform a specific task. The memory prosthesis then replicated that activity in a rat that hadn’t been trained. The second rat was able to learn the task much faster than the first rat – as if it already had some memory of the task.

© Copyright Reed Business Information Ltd.
Jon Hamilton

Taking antidepressants during the second or third trimester of pregnancy may increase the risk of having a child with autism spectrum disorder, according to a study of Canadian mothers and children published Monday in JAMA Pediatrics. But scientists not involved in the research say the results are hard to interpret and don't settle the long-running debate about whether expectant mothers with depression should take antidepressants.

"This study doesn't answer the question," says Bryan King, program director of the autism center at Seattle Children's Hospital and a professor of psychiatry and behavioral sciences at the University of Washington. "My biggest concern is that it will be over-interpreted," says King, who wrote an editorial that accompanied the study.

"It kind of leaves you more confused," says Alan Brown, a professor of psychiatry and epidemiology at Columbia University who studies risk factors for autism. "Mothers shouldn't get super worried about it," he says.

One reason it's confusing is that there's strong evidence that mothers with depression are more likely than other women to have a child with autism, whether or not they take antidepressants during pregnancy. King and Brown say that makes it very hard to disentangle the effects of depression itself from those of the drugs used to treat it.

© 2015 npr
By Elizabeth Pennisi

Imagine trying to train wild sea lions—without them ever seeing you. That was Peter Cook's challenge 8 years ago when he was trying to figure out whether poisonous algae were irrevocably damaging the animals’ brains. With a lot of patience and some luck, the comparative neuroscientist from Emory University in Atlanta has succeeded, and the news isn't good. Toxins from the algae mangle a key memory center, likely making it difficult for sick animals to hunt or navigate effectively, Cook and his colleagues report today.

"Sea lions can be seen as sentinels of human health," says Kathi Lefebvre, a research biologist at the Northwest Fisheries Science Center in Seattle, Washington, who was not involved with the work. As oceans warm, toxic algae proliferate and cause so-called red tides, because the water looks reddish. So "understanding these toxins in wild animals is going to become more important," she says.

Red tides are produced by algae called diatoms. They make a toxin called domoic acid, which is consumed by other plankton that in turn become food for fish and other organisms. Predators such as anchovies, sardines, and other schooling fish accumulate this toxin in their bodies. So when algal populations explode, say, because of warming water, domoic acid concentrations increase in these animals to a point that they affect the sea lions that feast on them.

Scientists first recognized this problem in 1998, after hundreds of sea lions were found stranded or disoriented along California's coast. Since then, researchers have studied sick and dead sea lions and documented that the toxin causes seizures and damages the brain, sometimes killing the animal.

© 2015 American Association for the Advancement of Science.
The clock is ticking for Ronald Cohn. He wants to use CRISPR gene editing to correct the genes of his friend’s 13-year-old son. The boy, Gavriel, has Duchenne muscular dystrophy, a genetic disease in which muscles degenerate. Breathing and heart problems often start by the time people with the condition are in their early twenties. Life expectancy is about 25 years.

By the standards of science, the field of CRISPR gene editing is moving at a lightning fast pace. Although the technique was only invented a few years ago, it is already being used for research by thousands of labs worldwide to make extremely precise changes to DNA. A handful of people have already been treated using therapies enabled by the technology, and last week an international summit effectively endorsed the idea of gene editing embryos. It is too soon to try the technique out, but the summit concluded that basic research on embryos should be permitted, alongside a debate on how we should use the technology.

But for people like Cohn, progress can’t come fast enough. Gavriel was diagnosed at age 4. He has already lost the use of his legs but still has some movement in his upper body, and uses a manual wheelchair. Cohn, a clinician at the Hospital for Sick Children in Toronto, estimates that he has three years to develop and test a CRISPR-based treatment if he is to help Gavriel.

Muscular dystrophy is caused by a faulty gene for the protein dystrophin, which holds our muscles together. Gavriel has a duplicated version of the gene. This week, Cohn’s team published a paper describing how they grew Gavriel’s cells in a dish and used CRISPR gene-editing techniques to snip out the duplication. With the duplication removed, his cells produced normal dystrophin protein.

© Copyright Reed Business Information Ltd.
By ALAN SCHWARZ

Andrew Rios’s seizures began when he was 5 months old and only got worse. At 18 months, when an epilepsy medication resulted in violent behavior, he was prescribed the antipsychotic Risperdal, a drug typically used to treat schizophrenia and bipolar disorder in adults, and rarely used for children as young as 5 years.

When Andrew screamed in his sleep and seemed to interact with people and objects that were not there, his frightened mother researched Risperdal and discovered that the drug was not approved, and had never even been studied, in children anywhere near as young as Andrew. “It was just ‘Take this, no big deal,’ like they were Tic Tacs,” said Genesis Rios, a mother of five in Rancho Dominguez, Calif. “He was just a baby.”

Cases like that of Andrew Rios, in which children age 2 or younger are prescribed psychiatric medications to address alarmingly violent or withdrawn behavior, are rising rapidly, data shows. Many doctors worry that these drugs, designed for adults and only warily accepted for certain school-age youngsters, are being used to treat children still in cribs despite no published research into their effectiveness and potential health risks for children so young.

Almost 20,000 prescriptions for risperidone (commonly known as Risperdal), quetiapine (Seroquel) and other antipsychotic medications were written in 2014 for children 2 and younger, a 50 percent jump from 13,000 just one year before, according to the prescription data company IMS Health. Prescriptions for the antidepressant fluoxetine (Prozac) rose 23 percent in one year for that age group, to about 83,000. The company’s data does not indicate how many children received these prescriptions (many children receive several prescriptions a year), but previous studies suggest that the number is at least 10,000. IMS Health researched the data at the request of The New York Times.

© 2015 The New York Times Company
By Andrea Anderson

Mom's ovaries could hold clues to some autism cases, new research suggests—and this time it's not because of genetic vulnerabilities carried in her eggs. A new, large-scale study out of Sweden suggests that women with polycystic ovarian syndrome (PCOS)—an endocrine disorder that affects 5 to 10 percent of women of childbearing age—have an increased risk of giving birth to children with autism spectrum disorder (ASD).

The Karolinska Institute's Renee Gardner, along with colleagues from Sweden and the U.S., tapped into a Swedish national population health database to look at potential ties between PCOS and ASD. As they reported online December 8 in Molecular Psychiatry, the team looked at 23,748 individuals with ASD and nearly 209,000 unaffected individuals, all born in Sweden between 1984 and 2007. Although identifying information about the individuals was removed, the researchers had access to information about their relationships to others in the database as well as documented diagnoses and use of health care services.

The group found that ASD was 59 percent more prevalent in children born to women with PCOS—a relationship that was independent of PCOS complications such as increased neonatal distress or C-section delivery. This risk level is roughly comparable with that of having a father over age 50 (estimated to be 66 percent) but lower than it is in those with certain rare genetic syndromes or mutations. The authors of the analysis believe PCOS increases ASD risk in offspring to a greater extent than maternal infection, one of many factors previously implicated in autism.

© 2015 Scientific American
Laura Sanders

You can thank your parents for your funny-looking hippocampus. Genes influence the three-dimensional shape of certain brain structures, scientists report in a paper posted online December 1 at bioRxiv.org. Showing a new way that genes help sculpt the brain opens up more ways to explore how the brain develops and operates.

Earlier work linked genes to simple measurements of brain structures, such as overall volume or length. The new work goes beyond that by mathematically analyzing complex 3-D shapes and tying those shapes to a particular genetic makeup.

A team led by researchers at Massachusetts General Hospital and Harvard Medical School analyzed MRI brain scans and genome data from 1,317 healthy young adults. Particular genetic profiles influenced the 3-D shape of structures including the hippocampus, caudate and cerebellum, the scientists found. In some brains, for instance, genes played a role in making the seahorse-shaped right hippocampus skinnier on the top and wider on the bottom. Genes also influenced whether the tail of the caudate was short or long.

Quirks of brain structure shapes might play a role in disorders such as schizophrenia, autism spectrum disorder and bipolar disorder, which are known to be influenced by genes, the authors write.

Citation: T. Ge et al. Heritability of neuroanatomical shape. bioRxiv.org. Posted December 1, 2015. doi: 10.1101/033407.

© Society for Science & the Public 2000 - 2015