Chapter 13. Memory, Learning, and Development
By Lisa Rapaport (Reuters Health) - Attention deficit hyperactivity disorder (ADHD), usually diagnosed in children, may show up for the first time in adulthood, two recent studies suggest. And not only can ADHD appear for the first time after childhood, but the symptoms for adult-onset ADHD may be different from symptoms experienced by kids, the researchers found. “Although the nature of symptoms differs somewhat between children and adults, all age groups show impairments in multiple domains – school, family and friendships for kids and school, occupation, marriage and driving for adults,” said Stephen Faraone, a psychiatry researcher at SUNY Upstate Medical University in Syracuse, New York and author of an editorial accompanying the two studies in JAMA Psychiatry. Faraone cautions, however, that some newly diagnosed adults might have had undetected ADHD as children. Support from parents and teachers or high intelligence, for example, might prevent ADHD symptoms from emerging earlier in life. It’s not clear whether study participants “were completely free of psychopathology prior to adulthood,” Faraone said in an email. One of the studies, from Brazil, tracked more than 5,200 people born in 1993 until they were 18 or 19 years old. © 2016 Scientific American
By Diana Kwon More than one in 10 Americans older than 12 takes antidepressants, according to a 2011 report by the National Center for Health Statistics. A significant but unknown number of children younger than 12 take them, too. Although most such drugs are not approved for young children, doctors have prescribed them off-label for years because they have been thought to have relatively mild side effects. Yet recent reports have revealed that important data about the safety of these drugs—especially their risks for children and adolescents—have been withheld from the medical community and the public. In the latest and most comprehensive analysis, published in January in the BMJ, researchers at the Nordic Cochrane Center in Copenhagen showed that pharmaceutical companies have not been revealing the full extent of serious harm in clinical study reports, which are detailed documents sent to regulatory authorities such as the U.S. Food and Drug Administration and the European Medicines Agency (EMA) when applying for approval of a new drug. The researchers examined reports from 70 double-blind, placebo-controlled trials of two common categories of antidepressants—selective serotonin reuptake inhibitors (SSRIs) and serotonin and norepinephrine reuptake inhibitors (SNRIs)—and found that the occurrence of suicidal thoughts and aggressive behavior doubled in children and adolescents who used these drugs. The investigators discovered that some of the most revealing information was buried in appendices where individual patient outcomes are listed. For example, they found clear instances of suicidal thinking that had been passed off as “emotional lability” or “worsening depression” in the report itself. This information, however, was available for only 32 out of the 70 trials. “We found that a lot of the appendices were often only available on request to the authorities, and the authorities had never requested them,” says Tarang Sharma, a Ph.D. 
student at Cochrane and lead author of the study. “I'm actually kind of scared about how bad the actual situation would be if we had the complete data.” © 2016 Scientific American
Laura Sanders In mice, a long course of antibiotics that wiped out gut bacteria slowed the birth of new brain cells and impaired memory, scientists write May 19 in Cell Reports. The results reinforce evidence for a powerful connection between bacteria in the gut and the brain (SN: 4/2/16, p. 23). After seven weeks of drinking water spiked with a cocktail of antibiotics, mice had fewer newborn nerve cells in a part of the hippocampus, a brain structure important for memory. The mice’s ability to remember previously seen objects also suffered. Further experiments revealed one way bacteria can influence brain cell growth and memory. Injections of immune cells called Ly6Chi monocytes boosted the number of new nerve cells. The monocytes appear to carry messages from gut to brain, Susanne Wolf of the Max Delbrück Center for Molecular Medicine in Berlin and colleagues found. Exercise and probiotic treatment with eight types of live bacteria also increased the number of newborn nerve cells and improved memory in mice treated with antibiotics. The results help clarify the toll of prolonged antibiotic treatment, and hint at ways to fight back, the authors write. L. Möhle et al. Ly6Chi monocytes provide a link between antibiotic-induced changes in gut microbiota and adult hippocampal neurogenesis. Cell Reports. Vol. 15, May 31, 2016. doi: 10.1016/j.celrep.2016.04.074. © Society for Science & the Public 2000 - 2016
By Sarah Kaplan You probably wouldn't be surprised if a scientist told you that your genes influence when you hit puberty, how tall you are, what your BMI will be and whether you're likely to develop male pattern baldness. But what if he said that the same gene could hold sway over all four things? That finding comes from a study published Monday in the journal Nature Genetics. Using data from dozens of genome-wide association studies (big scans of complete sets of DNA from many thousands of people), researchers at the New York Genome Center and the genetic analysis company 23andMe found examples of single "multitasking" genes that influence diverse and sometimes seemingly disparate traits. The scientists say that the links they uncovered could help researchers understand how certain genes work, and figure out better ways of treating some of the health problems they might control. "Most studies tend to go one disease at a time," said Joseph Pickrell, a professor at Columbia University and the New York Genome Center's lead investigator on the project. "But if we can try to make these sorts of connections between what you might think of as unrelated traits ... that gives us another angle of attack to understand the connections between these different diseases." To start, Pickrell and his team sought out genome-wide association studies (GWAS) identifying particular genetic variants associated with 42 different traits. Many had to do with diseases (for example, studies that linked certain genes to the risk of developing Alzheimer's or type 2 diabetes) and other personal health traits (body mass index, blood type, cholesterol levels).
By JONATHAN BALCOMBE Washington — In March, two marine biologists published a study of giant manta rays responding to their reflections in a large mirror installed in their aquarium in the Bahamas. The two captive rays circled in front of the mirror, blew bubbles and performed unusual body movements as if checking their reflection. They made no obvious attempt to interact socially with their reflections, suggesting that they did not mistake what they saw as other rays. The scientists concluded that the mantas seemed to be recognizing their reflections as themselves. Mirror self-recognition is a big deal. It indicates self-awareness, a mental attribute previously known only among creatures of noted intelligence like great apes, dolphins, elephants and magpies. We don’t usually think of fishes as smart, let alone self-aware. As a biologist who specializes in animal behavior and emotions, I’ve spent the past four years exploring the science on the inner lives of fishes. What I’ve uncovered indicates that we grossly underestimate these fabulously diverse marine vertebrates. The accumulating evidence leads to an inescapable conclusion: Fishes think and feel. Because fishes inhabit vast, obscure habitats, science has only begun to explore below the surface of their private lives. They are not instinct-driven or machinelike. Their minds respond flexibly to different situations. They are not just things; they are sentient beings with lives that matter to them. A fish has a biography, not just a biology. Those giant manta rays have the largest brains of any fish, and their relative brain-to-body size is comparable to that of some mammals. So, an exception? Then you haven’t met the frillfin goby. © 2016 The New York Times Company
By Julia Shaw You see a crime take place. You are interviewed about it. You give a statement about what you saw. Do you think that at a later date you would be able to detect whether someone had tampered with your statement? Or re-written parts of it? This is currently a hot topic in the UK, where a very recently published inquiry into the so-called Hillsborough disaster, in which 96 people were crushed to death during a soccer match in 1989, found that testimonies had been deliberately altered by police. Research published earlier this year by the false memory dream team at the University of California looked directly into the implications of such police (mis)conduct. They found that it is possible that changed statements can go unnoticed by the person who gave the original testimony, and may even develop into a false memory that accommodates the false account. To describe this effect, the researchers came up with the term "memory blindness"—the phenomenon of failing to recognize our own memories. The term was intended to mirror the ‘choice blindness’ literature. Choice blindness is forgetting choices that we have made. The researchers wanted to know “Can choice blindness have lasting effects on eyewitness memory?” To examine this, PhD student Kevin Cochran and his colleagues conducted two experiments. © 2016 Scientific American
Bret Stetka Last year, in an operating room at the University of Toronto, a 63-year-old woman with Alzheimer's disease experienced something she hadn't for 55 years: a memory of her 8-year-old self playing with her siblings on their family farm in Scotland. The woman is a patient of Dr. Andres Lozano, a neurosurgeon who is among a growing number of researchers studying the potential of deep brain stimulation to treat Alzheimer's and other forms of dementia. If the approach pans out, it could provide options for patients with fading cognition and retrieve vanished memories. Right now, deep brain stimulation is used primarily to treat Parkinson's disease and tremor, for which it's approved by the Food and Drug Administration. DBS involves delivering electrical impulses to specific areas of the brain through implanted electrodes. The technique is also approved for obsessive-compulsive disorder and is being looked at for a number of other brain disorders, including depression, chronic pain and, as in Lozano's work, dementia. In 2008 Lozano's group published a study in which an obese patient was treated with deep brain stimulation of the hypothalamus. Though no bigger than a pea, the hypothalamus is a crucial bit of brain involved in appetite regulation and other bodily essentials such as temperature control, sleep and circadian rhythms. It seemed like a reasonable target in trying to suppress excessive hunger. To the researchers' surprise, following stimulation the patient reported a sensation of deja vu. He also perceived feeling 20 years younger and recalled a memory of being in a park with friends, including an old girlfriend. With increasing voltages his memories became more vivid. He remembered their clothes. © 2016 npr
Laura Sanders Brain waves during REM sleep solidify memories in mice, scientists report in the May 13 Science. Scientists suspected that the eye-twitchy, dream-packed slumber known as rapid eye movement sleep was important for memory. But REM sleep’s influence on memory has been hard to study, in part because scientists often resorted to waking people or animals up — a stressful experience that might influence memory in different ways. Richard Boyce of McGill University in Montreal and colleagues interrupted REM sleep in mice in a more delicate way. Using a technique called optogenetics, the researchers blocked a brain oscillation called theta waves in the hippocampus, a brain structure involved in memory, during REM sleep. This light touch meant that the mice stayed asleep but had fewer REM-related theta waves in their hippocampi. Usually, post-learning sleep helps strengthen memories. But mice with disturbed REM sleep had memory trouble, the researchers found. Curious mice will spend more time checking out an object that’s been moved to a new spot than an unmoved object. But after the sleep treatment, the mice seemed to not remember objects’ earlier positions, spending equal time exploring an unmoved object as one in a new place. These mice also showed fewer signs of fear in a place where they had previously suffered shocks. Interfering with theta waves during other stages of sleep didn’t seem to cause memory trouble, suggesting that something special happens during REM sleep. R. Boyce et al. Causal evidence for the role of REM sleep theta rhythm in contextual memory consolidation. Science. Vol. 352, p. 812, May 13, 2016. doi: 10.1126/science.aad5252. © Society for Science & the Public 2000 - 2016.
By Emily Underwood One of the telltale signs of Alzheimer’s disease (AD) is sticky plaques of β-amyloid protein, which form around neurons and are thought by a large number of scientists to bog down information processing and kill cells. For more than a decade, however, other researchers have fingered a second protein called tau, found inside brain cells, as a possible culprit. Now, a new imaging study of 10 people with mild AD suggests that tau deposits—not amyloid—are closely linked to symptoms such as memory loss and dementia. Although this evidence won’t itself resolve the amyloid-tau debate, the finding could spur more research into new, tau-targeting treatments and lead to better diagnostic tools, researchers say. Scientists have long used an imaging technique called positron emission tomography (PET) to visualize β-amyloid deposits marked by radioactive chemical tags in the brains of people with AD. Combined with postmortem analyses of brain tissue, these studies have demonstrated that people with AD have far more β-amyloid plaques in their brains than healthy people, at least as a general rule. But they have also revealed a puzzle: Roughly 30% of people without any signs of dementia have brains “chock-full” of β-amyloid at autopsy, says neurologist Beau Ances at Washington University in St. Louis in Missouri. That mystery has inspired many in the AD field to ask whether a second misfolded protein, tau, is the real driver of the condition’s neurodegeneration and symptoms, or at least an important accomplice. Until recently, the only ways to test that hypothesis were to measure tau in brain tissue after a person died, or in a sample of cerebrospinal fluid (CSF) extracted from a living person by needle. But in the past several years, researchers have developed PET imaging agents that can harmlessly bind to tau in the living brain. 
The more tau deposits found in the temporal lobe, a brain region associated with memory, the more likely a person was to show deficits on a battery of memory and attention tests, the team reports today in Science Translational Medicine. © 2016 American Association for the Advancement of Science.
Erika Check Hayden The largest-ever genetics study in the social sciences has turned up dozens of DNA markers that are linked to the number of years of formal education an individual completes. The work, reported this week in Nature, analysed genetic material from around 300,000 people. “This is good news,” says Stephen Hsu, a theoretical physicist at Michigan State University in East Lansing, who studies the genetics of intelligence. “It shows that if you have enough statistical power you can find genetic variants that are associated with cognitive ability.” Yet the study’s authors estimate that the 74 genetic markers they uncovered account for just 0.43% of the total genetic contribution to educational achievement (A. Okbay et al. Nature http://dx.doi.org/10.1038/nature17671; 2016). By themselves, the markers cannot predict a person’s performance at school. And because the work examined only people of European ancestry, it is unclear whether the results apply to those with roots in other regions, such as Africa or Asia. The findings have proved divisive. Some researchers hope that the work will aid studies of biology, medicine and social policy, but others say that the emphasis on genetics obscures factors that have a much larger impact on individual attainment, such as health, parenting and quality of schooling. © 2016 Nature Publishing Group
Sara Reardon As a medical student in Paris in the 1980s, Eric Vilain found himself pondering the differences between men and women. What causes them to develop differently, and what happens when the process goes awry? At the time, he was encountering babies that defied simple classification as a boy or girl. Born with disorders of sex development (DSDs), many had intermediate genitalia — an overlarge clitoris, an undersized penis or features of both sexes. Then, as now, the usual practice was to operate. And the decision of whether a child would be left with male or female genitalia was often made not on scientific evidence, says Vilain, but on practicality: an oft-repeated, if insensitive, line has it that “it's easier to dig a hole than build a pole”. Vilain found the approach disturbing. “I was fascinated and shocked by how the medical team was making decisions.” Vilain has spent the better part of his career studying the ambiguities of sex. Now a paediatrician and geneticist at the University of California, Los Angeles (UCLA), he is one of the world's foremost experts on the genetic determinants of DSDs. He has worked closely with intersex advocacy groups that campaign for recognition and better medical treatment — a movement that has recently gained momentum. And in 2011, he established a major longitudinal study to track the psychological and medical well-being of hundreds of children with DSDs. © 2016 Nature Publishing Group
By Hazem Zohny Here is a picture of the nine-dot problem. The task seems simple enough: connect all nine dots with four straight lines, but, do so without lifting the pen from the paper or retracing any line. If you don’t already know the solution, give it a try – although your chances of figuring it out within a few minutes hover around 0 percent. In fact, even if I were to give you a hint like “think outside of the box,” you are unlikely to crack this deceptively (and annoyingly!) simple puzzle. And yet, if we were to pass a weak electric current through your brain (specifically your anterior temporal lobe, which sits somewhere between the top of your ear and temple), your chances of solving it may increase substantially. That, at least, was the finding from a study where 40 percent of people who couldn’t initially solve this problem managed to crack it after 10 minutes of transcranial direct current stimulation (tDCS) – a technique for delivering a weak, painless electric current to the brain through electrodes on the scalp. How to explain this? It is an instance of the alleged power of tDCS and similar neurostimulation techniques. These are increasingly touted as methods that can “overclock” the brain in order to boost cognition, improve our moods, make us stronger, and even alter our moral dispositions. The claims are not completely unfounded: there is evidence that some people become slightly better at holding and manipulating information in their minds after a bout of tDCS. It also appears to reduce some people’s likelihood of formulating false memories, and seems to have a lasting improvement on some people’s ability to work with numbers. It can even appear to boost creativity, enhancing the ability of some to make abstract connections between words to come up with creative analogies. But it goes further, with some evidence that it can help people control their urges as well as improve their mood. 
And beyond these psychological effects, tDCS of the part of the brain responsible for movement seems to improve muscular endurance and reduce fatigue. © 2016 Scientific American
By PAM BELLUCK BALTIMORE — Leave it to the youngest person in the lab to think of the Big Idea. Xuyu Qian, 23, a third-year graduate student at Johns Hopkins, was chatting in late January with Hongjun Song, a neurologist. Dr. Song was wondering how to test their three-dimensional model of a brain — well, not a brain, exactly, but an “organoid,” essentially a tiny ball of brain cells, grown from stem cells and mimicking early brain development. “We need a disease,” Dr. Song said. Mr. Qian tossed out something he’d seen in the headlines: “Why don’t we check out this Zika virus?” Within a few weeks — a nanosecond compared with typical scientific research time — that suggestion led to one of the most significant findings in efforts to answer a central question: How does the Zika virus cause brain damage, including the abnormally small heads in babies born to infected mothers? The answer could spur discoveries to prevent such devastating neurological problems. And time is of the essence. One year after the virus was first confirmed in Latin America, with the raging crisis likely to reach the United States this summer, no treatment or vaccine exists. “We can’t wait,” said Dr. Song, at the university’s Institute for Cell Engineering, where he and his wife and research partner, Dr. Guo-Li Ming, provided a pipette-and-petri-dish-level tour. “To translate our work for the clinic, to the public, normally it takes years. This is a case where we can make a difference right away.” The laboratory’s initial breakthrough, published in March with researchers at two other universities, showed that the Zika virus attacked and killed so-called neural progenitor cells, which form early in fetal development and generate neurons in the brain. © 2016 The New York Times Company
By Geraldine Dawson There’s a popular saying in the autism community: “If you’ve met one person with autism, you’ve met one person with autism.” Although this phrase is meant to convey the remarkable variation in abilities and disabilities among people with autism spectrum disorder (ASD), we’re learning that it also applies to the extraordinary variability in how ASD develops. When I first began doing research on autism decades ago, we thought of it as one condition and aimed to discover its “cause.” Now we know ASD is actually a group of lifelong conditions that can arise from a complex combination of multiple genetic and environmental factors. In the same way that each person with ASD has a unique personality and profile of talents and disabilities, each also has a distinct developmental history shaped by a specific combination of genetic and environmental factors. More evidence of this extraordinary variety will be presented this week in Baltimore, where nearly 2,000 of the world’s leading autism researchers will gather for the International Meeting for Autism Research (IMFAR). As president of the International Society for Autism Research, which sponsors the conference, I am more impressed than ever with the progress we are making. New findings being presented at the conference will highlight the importance of the prenatal period in understanding how various environmental factors such as exposure to alcohol, smoking and certain chemical compounds can increase risk for ASD. The impact of many environmental factors depends, however, on an individual’s genetic background and the timing of the exposure. Other research links inflammation—detected in blood spot tests taken at birth—with a higher likelihood of an ASD diagnosis later on. Researchers suggest that maternal infection and other factors during pregnancy may influence an infant’s immune system and contribute to risk. 
As our knowledge of these risk factors grows, so do the opportunities for promoting healthy pregnancies and better outcomes. © 2016 Scientific American
Chris Woolston A story about epigenetics in the 2 May issue of The New Yorker has been sharply criticized for inaccurately describing how genes are regulated. The article by Siddhartha Mukherjee — a physician, cancer researcher and award-winning author at Columbia University in New York — examines how environmental factors can change the activity of genes without altering the DNA sequence. Jerry Coyne, an evolutionary ecologist at the University of Chicago in Illinois, posted two widely discussed blog posts calling the piece “superficial and misleading”, largely because it ignored key aspects of gene regulation. Other researchers quoted in the blog posts called the piece “horribly damaging” and “a truly painful read”. Mukherjee responded by publishing a point-by-point rebuttal online. Speaking to Nature, he says he now realizes that he erred by omitting key areas of the science, but that he didn’t mean to mislead. “I sincerely thought that I had done it justice,” he says. Mukherjee’s article, ‘Same But Different’, takes a personal view of epigenetics — a term whose definition is highly contentious in the field. The story features his mother and aunt, identical twins who have distinct personalities. Mukherjee, who won a Pulitzer Prize in 2011 for his best-selling book The Emperor of All Maladies: A Biography of Cancer (Scribner, 2010), writes that identical twins differ because: “Chance events — injuries, infections, infatuations; the haunting trill of that particular nocturne — impinge on one twin and not on the other. Genes are turned on and off in response to these events, as epigenetic marks are gradually layered above genes, etching the genome with its own scars, calluses, and freckles.” The article is drawn from a book by Mukherjee that is due out later this month, called The Gene: An Intimate History (Scribner, 2016). © 2016 Nature Publishing Group
By DAN BARRY IDIOT. Imbecile. Cretin. Feebleminded. Moron. Retarded. Offensive now but once quite acceptable, these terms figured in the research for a lengthy article I wrote in 2014 about 32 men who spent decades eviscerating turkeys in a meat-processing plant in Iowa — all for $65 a month, along with food and lodging in an ancient former schoolhouse on a hill. These were men with intellectual disability, which meant they had significant limitations in reasoning, learning and problem solving, as well as in adaptive behavior. But even though “intellectual disability” has been the preferred term for more than a decade, it gave my editors and me pause. We wondered whether readers would instantly understand what the phrase meant. What’s more, advocates and academicians were recommending that I suppress my journalistic instinct to tighten the language. I was told that it was improper to call these men “intellectually disabled,” instead of “men with intellectual disability.” Their disability does not define them; they are human beings with a disability. This linguistic preference is part of society’s long struggle to find the proper terminology for people with intellectual disability, and reflects the discomfort the subject creates among many in the so-called non-disabled world. It speaks to a continuing sense of otherness; to perceptions of what is normal, and not. “It often doesn’t matter what the word is,” said Michael Wehmeyer, the director and senior scientist at the Beach Center on Disability at the University of Kansas. “It’s that people associate that word with what their perceptions of these people are — as broken, or as defective, or as something else.” For many years, the preferred term was, simply, idiot. When Massachusetts established a commission on idiocy in the mid-1840s, it appointed Dr. Samuel G. Howe, an abolitionist and early disability rights advocate, as its chairman. 
The commission argued for the establishment of schools to help this segment of society, but made clear that it regarded idiocy “as an outward sign of an inward malady.” © 2016 The New York Times Company
By Aleszu Bajak In its May 2 issue, The New Yorker magazine published a report titled “Same But Different,” with the subhead: “How epigenetics can blur the line between nature and nurture.” The piece was written by Siddhartha Mukherjee, a physician and author of the Pulitzer Prize-winning book “The Emperor of all Maladies: A Biography of Cancer.” In his New Yorker story, Mukherjee, with deft language and colorful anecdotes, examines a topic that is very much du jour in science writing: epigenetics. Google defines epigenetics as “the study of changes in organisms caused by modification of gene expression, rather than alteration of the genetic code itself.” Merriam-Webster’s definition is similar — but not exactly the same: “The study of heritable changes in gene function that do not involve changes in DNA sequence.” The slight variation in definition is telling in itself — and it’s really that “heritable” part that has sparked intense interest not just among scientists, but in the popular mind. It’s the idea that external factors like diet, or stress or even lifestyle choices can impact not just your own genes, but the genetic information you pass down to all of your descendants. Spend your life smoking cigarettes and eating fatty foods, the thinking goes, and you’ll not just make yourself sick, you’ll predispose your offspring — and their offspring, and their offspring — to associated diseases as well. It’s heady stuff, but much of it remains speculative and poorly supported, which is where Mukherjee may have run into trouble. Steven Henikoff, a molecular biologist at the Fred Hutchinson Cancer Research Center in Seattle, called Mukherjee’s lyrical take on epigenetics “baloney.” The publication of his story — an excerpt from his forthcoming book “The Gene: An Intimate History” — was met with swift criticism from biologists working in epigenetics and the broader field of gene regulation. 
They argue that Mukherjee played fast and loose with his description of epigenetic processes and misled readers by casting aside decades of research into how genes are regulated during development. Copyright 2016 Undark
By Jocelyn Kaiser Gene therapy is living up to its promise of halting a rare, deadly brain disease in young boys. In a new study presented in Washington, D.C., yesterday at the annual meeting of the American Society of Gene and Cell Therapy, all but one of 17 boys with adrenoleukodystrophy (ALD) remained relatively healthy for up to 2 years after having an engineered virus deliver into their cells a gene to replenish a missing protein needed by the brain. The results, which expand on an earlier pilot study, bring this ALD therapy one step closer to the clinic. About one in 21,000 boys are born with ALD, which is caused by a flaw in a gene on the X chromosome that prevents cells from making a protein that the cells need to process certain fats—females have a backup copy of the gene on their second X chromosome. Without that protein, the fats build up and gradually destroy myelin sheaths that protect nerves in the brain. In the cerebral form of ALD, which begins in childhood, patients quickly lose vision and mobility, usually dying by age 12. The disease achieved some degree of fame with the 1992 film Lorenzo’s Oil, inspired by a family’s struggle to prolong their son’s life with a homemade remedy. The only currently approved treatment for ALD is a bone marrow transplant—white blood cells in the marrow go to the brain and turn into glial cells that produce normal ALD proteins. But bone marrow transplants carry many risks, including immune rejection, and matching donors can’t always be found. As an alternative, in the late 2000s, French researchers treated the bone cells of two boys with a modified virus carrying the ALD gene. They reported in Science in 2009 that this halted progression of the disease. © 2016 American Association for the Advancement of Science
By John Elder Robison Manipulating your brain with magnetic fields sounds like science fiction. But the technique is real, and it’s here. Called transcranial magnetic stimulation (TMS), it is approved as a therapy for depression in the US and UK. More controversially, it is being studied as a way to treat classic symptoms of autism, such as emotional disconnection. With interest and hopes rising, it’s under the spotlight at the International Meeting for Autism Research in Baltimore, Maryland, next week. I can bear witness to the power of TMS, which induces small electrical currents in neurons. As someone with Asperger’s, I tried it for medical research, and described its impact in my book Switched On. After TMS, I could see emotional cues in other people – signals I had always been blind to, but that many non-autistic people pick up with ease. That sounds great, so why the need for debate? Relieving depression isn’t controversial, because there is no question people suffer as a result of it. I too felt that I suffered – from emotional disconnection. But changing “emotional intelligence” to relieve that comes closer to changing the essence of how we think. Yes, emerging brain therapies like TMS have great potential. Several of the volunteers who went into the TMS lab at Harvard Medical School emerged with new self-awareness, and lasting changes. While I can’t speak with certainty for the others, I believe some of us have a degree of emotional insight that we didn’t have before. I certainly feel better able to fit in. As fellow participant Michael Wilcox put it, we have more emotional reactions to things we see or read. © Copyright Reed Business Information Ltd.
By Ann Gibbons We may not be raring to go on a Monday morning, but humans are the Energizer Bunnies of the primate world. That’s the conclusion of a new study that, for the first time, measures precisely how many calories humans and apes burn each day. Compared with chimpanzees and other apes, our revved-up internal engines burn calories 27% faster, according to a paper in Nature this week. This higher metabolic rate equips us to quickly fuel energy-hungry brain cells, sustaining our bigger brains. And lest we run out of gas when food is short, the study also found that humans are fatter than other primates, giving us energy stores to draw on in lean times. “The brilliant thing here is showing for the first time that we do have a higher metabolic rate, and we do use more energy,” says paleoanthropologist Leslie Aiello, president of the Wenner-Gren Foundation for Anthropological Research in New York City. “Humans during evolution have become more and more hypermetabolic,” says biological anthropologist Carel van Schaik of the University of Zurich in Switzerland. “We turned up the thermostat.” For decades, researchers assumed that “there weren’t any differences in the rate at which different species burned calories,” says biological anthropologist Herman Pontzer of Hunter College in New York City, lead author of the new study. Comparing humans and other primates, they saw little difference in basal metabolic rate, which reflects the total calories used by our organs while we are at rest. © 2016 American Association for the Advancement of Science