Chapter 17. Learning and Memory
By Andy Coghlan Don’t go to bed angry. Now there’s evidence for this proverb: it’s harder to suppress bad memories if you sleep on them. The discovery could reveal new ways to treat people who suffer from conditions like post-traumatic stress disorder, and reinforces an earlier idea that it is possible to suppress bad memories through sleep deprivation. “The results are of major interest for treating the frequent clinical problem of unwanted memories, memories of traumatic events being the most prominent example,” says Christoph Nissen at the University of Freiburg Medical Center in Germany, who was not involved in the work. In the study, 73 male students memorised 26 mugshots, each paired with a disturbing image, such as a mutilated body, corpse or crying child. The next day they were asked to recall the images associated with half the mugshots and actively try to exclude memories of the rest of the associated images. The group were then directed to memorise another 26 pairs of mugshots and nasty images. Half an hour later they again thought about half the associated images and actively suppressed memories of the rest. Finally, they were asked to describe the image associated with each of the 52 mugshots. The idea was to see if trying to suppress a bad memory works better before or after sleep. © Copyright Reed Business Information Ltd.
By Virginia Morell At last, scientists may have an answer to a question every dog owner asks: Does your pet remember the things you do together? For people, at least, the ability to consciously recall personal experiences and events is thought to be linked to self-awareness. It shapes how we think about the past—and how we predict the future. Now, a new study suggests that dogs also have this type of memory, indicating that the talent may be more common in other animals than previously recognized. The study “is a creative approach to trying to capture what’s on a dog’s mind,” says Alexandra Horowitz, a dog cognition scientist at Barnard College in New York City who was not involved in the research. The idea that nonhuman animals can consciously remember things they’ve done or witnessed in the past, called episodic memory, is controversial—largely because it’s thought that these animals aren’t self-aware. But scientists have shown that species like Western scrub jays, hummingbirds, rats, and the great apes—those that have to recall complex sequences of information in order to survive—have “episodiclike” memory. For instance, the jays remember what food they’ve hidden, where they stashed it, when they did so, and who was watching while they did it. But what about recalling things that aren’t strictly necessary for survival, or someone else’s actions? To find out whether dogs can remember such details, scientists asked 17 owners to teach their pets a trick called “do as I do.” The dogs learned, for instance, that after watching their owner jump in the air, they should do the same when commanded to “do it!” © 2016 American Association for the Advancement of Science.
Keyword: Learning & Memory
Link ID: 22906 - Posted: 11.25.2016
Ian Sample Science editor A leading psychologist whose research on human memory exposed her to death threats, lawsuits, personal abuse and a campaign to have her sacked has won a prestigious prize for her courage in standing up for science. Professor Elizabeth Loftus endured a torrent of abuse from critics who objected to her work on the unreliable nature of eyewitness testimonies, and her defining research on how people can develop rich memories for events that never happened. The work propelled Loftus into the heart of the 1990s “memory wars”, when scores of people who had gone into therapy with depression, eating disorders and other common psychological problems came out believing they had recovered repressed memories of traumatic events, often involving childhood abuse. Loftus, now a professor of law and cognitive science at the University of California, Irvine, performed a series of experiments that showed how exposure to inaccurate information and leading questions could corrupt eyewitness testimonies. More controversially, she demonstrated how therapy and hypnosis could plant completely false childhood memories in patients. She went on to become an expert witness or consultant for hundreds of court cases. In the 1990s, thousands of repressed memory cases came to light, with affected patients taking legal action against family members, former neighbours, doctors, dentists and teachers. The accusations tore many families apart. As an expert witness in such cases, Loftus came under sustained attack from therapists and patients who were convinced the new-found memories were accurate. The abuse marked a distinct shift away from the good-natured debates she was used to having in academic journals. © 2016 Guardian News and Media Limited
Keyword: Learning & Memory
Link ID: 22890 - Posted: 11.19.2016
By Jessica Hamzelou You’ve got a spare hour before a big exam. How should you spend it? It seems napping is just as effective as revising, and could even have a longer-lasting impact. Repeatedly revising information to learn it makes sense. “Any kind of reactivation of a memory trace will lead to it being strengthened and reconsolidated,” says James Cousins at the Duke-NUS Medical School in Singapore. “With any memory, the more you recall it, the stronger the memory trace.” However, sleep is also thought to be vital for memory. A good night’s sleep seems to help our brains consolidate what we’ve learned in the day, and learning anything when you’re not well rested is tricky. Many people swear by a quick afternoon kip. So if you’ve got an hour free, is it better to nap or revise? Cousins, along with Michael Chee and other colleagues at Duke-NUS Medical School, set out to compare the two options. The team mocked up a real student experience, and had 72 volunteers sit through presentations on about 12 different species of ants and crabs. The participants were asked to learn all about these animals, including their diets and habitats, for example. After 80 minutes of this, the students were given an hour to either watch a film, have a nap, or revise what they had just learned. After this hour, they had another 80 minutes of learning. Then they had to sit an exam in which they were asked 360 questions about the ants and the crabs. “The napping group got the best scores,” says Cousins, whose work was presented at the Society for Neuroscience annual meeting in San Diego, California on Tuesday. © Copyright Reed Business Information Ltd.
Kathleen Taylor The global rise in dementia should surprise no one. The figures — such as the 9.9 million new diagnoses each year — have been known for decades. Just as we are slow to accept such vast changes on a personal, societal and political level, so research has been slow to uncover why our brains become fragile with age. Neuroscientist and writer Kathleen Taylor's The Fragile Brain is about that research. But it is much more than a simple reflection on the best published hypotheses. Taylor has crafted a personal, astonishingly coherent review of our current state of knowledge about the causes of Alzheimer's disease and dementia, as well as possible solutions, from lifestyle adjustments to drug developments. Filled with elegant metaphors, her study covers the detail of molecular biology and larger-scale analysis, including epidemiological observations and clinical studies. It extends to dementia due to multiple sclerosis, stroke and encephalitis. For instance, some 5–30% of people who have a first stroke develop dementia. But the book's focus is Alzheimer's disease, and rightly so: it is the diagnosis given to up to 80% of people with dementia. Taylor begins with a shocking juxtaposition, setting the costs of age-related disorders and of dementia alongside the scarcity in funding. In Britain, Australia and the United States, for example, funding for dementia research is a fraction of that for cancer — in the United States, just 18%. She contextualizes with reflections on the history of dementia research, deftly unravelling the roles of pioneering scientists Alois Alzheimer, Franz Nissl and Emil Kraepelin in describing the condition. © 2016 Macmillan Publishers Limited,
By Marian Vidal-Fernandez and Ana Nuevo-Chiquero The title of this article might trigger self-satisfied smiles among first-borns, and some concerns among the rest of us. Many studies show children born earlier in the family enjoy better wages and more education, but until now we didn’t really know why. Our recently published findings are the first to suggest that the advantages of first-born siblings start very early in life—between the ages of zero and three. We observe parents changing their behaviour as new children are born, and offering less cognitive stimulation to children of higher birth order. It now seems clear that for those born and raised in high-income countries such as the United States, the UK and Norway, earlier-born children enjoy higher wages and education as adults—known as the “birth order effect”. Comparing two siblings, the greater the difference in their birth order, the greater the relative benefit to the older child. However, to date we’ve had no evidence that explains where such differences come from. We know it’s not an effect of family size, because the effect remains when comparing siblings within the same family and families with the same number of children. While it makes sense that parents earn more money and gain experience as they get older and have more children, they also need to divide their economic resources and attention among any children that arrive after the first born. We wondered where in childhood these differences began, and what the cause or causes might be. © 2016 Scientific American,
Laura Sanders A protein that can switch shapes and accumulate inside brain cells helps fruit flies form and retrieve memories, a new study finds. Such shape-shifting is the hallmark move of prions — proteins that can alternate between two forms and aggregate under certain conditions. In fruit flies’ brain cells, clumps of the prionlike protein called Orb2 store long-lasting memories, report scientists from the Stowers Institute for Medical Research in Kansas City, Mo. Figuring out how the brain forms and calls up memories may ultimately help scientists devise ways to restore that process in people with diseases such as Alzheimer’s. The new finding, described online November 3 in Current Biology, is “absolutely superb,” says neuroscientist Eric Kandel of Columbia University. “It fills in a lot of missing pieces.” People possess a version of the Orb2 protein called CPEB, a commonality that suggests memory might work in a similar way in people, Kandel says. It’s not yet known whether people rely on the prion to store long-term memories. “We can’t be sure, but it’s very suggestive,” Kandel says. When neuroscientist Kausik Si and colleagues used a genetic trick to inactivate Orb2 protein, male flies were worse at remembering rejection. These lovesick males continued to woo a nonreceptive female long past when they should have learned that courtship was futile. In different tests, these flies also had trouble remembering that a certain odor was tied to food. © Society for Science & the Public 2000 - 2016. All rights reserved.
By Virginia Morell Human hunters may be making birds smarter by inadvertently shooting those with smaller brains. That’s the conclusion of a new study, which finds that hunting may be exerting a powerful evolutionary force on bird populations in Denmark, and likely wherever birds are hunted. But the work also raises a red flag for some researchers who question whether the evolution of brain size can ever be tied to a single factor. The new work “broadens an emerging view that smarts really do matter in the natural, and increasingly human-dominated, world,” says John Marzluff, a wildlife biologist and expert on crow cognition at the University of Washington in Seattle who was not involved with the work. Hunting and fishing are known to affect many animal populations. For instance, the pike-perch in the Finnish Archipelago Sea has become smaller over time thanks to fishing, which typically removes the largest individuals from a population. This pressure also causes fish to reach sexual maturity earlier. On land, natural predators like arctic foxes and polar bears can also drive their prey species to become smarter because predators are most likely to catch those with smaller brains. For instance, a recent study showed that common eiders (maritime ducks) that raise the most chicks also have the largest heads and are better at forming protective neighborhood alliances than ducks with smaller heads—and presumably, brains. © 2016 American Association for the Advancement of Science
Bruce Bower Many preschoolers take a surprisingly long and bumpy mental path to the realization that people can have mistaken beliefs — say, thinking that a ball is in a basket when it has secretly been moved to a toy box. Traditional learning curves, in which kids gradually move from knowing nothing to complete understanding, don’t apply to this landmark social achievement and probably to many other types of learning, a new study concludes. Kids ranging in age from 3 to 5 often go back and forth between passing and failing false-belief tests for several months to more than one year, say psychologist Sara Baker of the University of Cambridge and her colleagues. A small minority of youngsters jump quickly from always failing to always passing these tests, the scientists report October 20 in Cognitive Psychology. “If these results are replicated, it will surprise a lot of researchers that there is such a low level of sudden insight into false beliefs,” says psychologist Malinda Carpenter, currently at the Max Planck Institute for Evolutionary Anthropology in Leipzig. Early childhood researchers generally assume that preschoolers either pass or fail false-belief tests, with a brief transition between the two, explains Carpenter, who did not participate in the new study. Grasping that others sometimes have mistaken beliefs is a key step in social thinking. False-belief understanding may start out as something that can be indicated nonverbally but not described. Human 2-year-olds and even chimpanzees tend to look toward spots where a person would expect to find a hidden item that only the children or apes have seen moved elsewhere (SN Online: 10/6/16). © Society for Science & the Public 2000 - 2016
By Ruth Williams Newly made cells in the brains of mice adopt a more complex morphology and connectivity when the animals encounter an unusual environment than if their experiences are run-of-the-mill. Researchers have now figured out just how that happens. According to a study published today (October 27) in Science, a particular type of cell—called an interneuron—in the hippocampus processes the animals’ experiences and subsequently shapes the newly formed neurons. “We knew that experience shapes the maturation of these new neurons, but what this paper does is it lays out the entire circuit through which that happens,” said Heather Cameron, a neuroscientist at the National Institute of Mental Health in Bethesda who was not involved with the work. “It’s a really nicely done piece of work because they go step-by-step and show all of the cells that are involved and how they’re connected.” Most of the cells in the adult mammalian brain are mature and don’t divide, but in a few regions, including an area of the hippocampus called the dentate gyrus, neurogenesis occurs. The dentate gyrus is thought to be involved in the formation of new memories. In mice, for instance, exploring novel surroundings electrically activates the dentate gyrus and can affect the production, maturation, and survival of the newly born cells. Now, Alejandro Schinder and his team at the Leloir Institute in Buenos Aires, Argentina, have investigated the process in detail. © 1986-2016 The Scientist
By Catherine Caruso Babies and children undergo massive brain restructuring as they mature, and for good reason—they have a whole world of information to absorb during their sprint toward adulthood. This mental renovation doesn’t stop there, however. Adult brains continue to produce new cells and restructure themselves throughout life, and a new study in mice reveals more about the details of this process and the important role environmental experience plays. Through a series of experiments, researchers at the Leloir Institute in Buenos Aires showed that when adult mice are exposed to stimulating environments, their brains are able to more quickly integrate new brain cells into existing neural networks through a process that involves new and old cells connecting to one another via special helper cells called interneurons. The adult mammalian brain, long believed to lack the capacity to make new cells, has two main areas that continuously produce new neurons throughout life. One of these areas, the hippocampus (which is involved in memory, navigation, mood regulation and stress response) produces new neurons in a specialized region called the dentate gyrus. Many previous studies have focused on how the dentate gyrus produces new neurons and what happens to these neurons as they mature, but Alejandro Schinder and his colleagues at Leloir wanted to go one step further and understand how new neurons produced by the dentate gyrus are incorporated into the existing neural networks of the brain, and whether environment affects this process. © 2016 Scientific American
By Steven C. Pan A good night’s sleep can be transformative. Among its benefits are improved energy and mood, better immune system functioning and blood sugar regulation, and greater alertness and ability to concentrate. Given all of these benefits, the fact that a third of the human lifespan is spent sleeping makes evolutionary sense. However, sleep appears to have another important function: helping us learn. Across a plethora of memory tasks—involving word lists, maze locations, auditory tones, and more—going to sleep after training yields better performance than remaining awake. This has prompted many sleep researchers to reach a provocative conclusion: beyond merely supporting learning, sleep is vital, and perhaps even directly responsible, for learning itself. Recent discoveries from neuroscience provide insights into that possibility. Sleep appears to be important for long-term potentiation, a strengthening of signals between neurons that is widely regarded as a mechanism of learning and memory. Certain memories acquired during the day appear to be reactivated and “replayed” in the brain during sleep, which may help make them longer lasting. In some instances the amount of improvement that occurs on memory tasks positively correlates with the length of time spent in certain stages of sleep. These and other findings are generating great excitement among sleep researchers, as well as prompting heated debates about the degree to which sleep may or may not be involved in learning. To date, most sleep and learning research has focused on recall, which is the capacity to remember information. However, new research by Stéphanie Mazza and colleagues at the University of Lyon, recently published in the journal Psychological Science, suggests another potential benefit of sleep: improved relearning. © 2016 Scientific American
By Agata Blaszczak-Boxe Some rodents have a sweet tooth. And sometimes, you need to get crafty to reach your sugar fix. Rats have been filmed for the first time using hooked tools to get chocolate cereal – a manifestation of their critter intelligence. Akane Nagano and Kenjiro Aoyama, of Doshisha University in Kyotanabe, Japan, placed eight brown rats in a transparent box and trained them to pull small hooked tools to obtain the cereal that was otherwise beyond their reach. In one experiment they gave them two similar hooked tools, one of which worked well for the food retrieval task, and the other did not. The rats quickly learned to choose the correct tool for the job, selecting it 95 per cent of the time. The experiments showed that the rats understood the spatial arrangement between the food and the tool. The team’s study is the first to demonstrate that rats are able to use tools, says Nagano. The rats did get a little confused in the final experiment. When the team gave them a rake that looked the part but whose bottom was too soft and flimsy to move the cereal, they still tried to use it as much as the working tool that was also available. But, says Nagano, it is possible their eyesight was simply not good enough for them to tell that the flimsy tool wasn’t up to the task. The rodents’ crafty feat places them in the ever-growing club of known tool-using animals such as chimps, bearded capuchin monkeys, New Caledonian crows, alligators and even some fish. © Copyright Reed Business Information Ltd.
By Gareth Cook According to the American Psychiatric Association, about 5 percent of American children suffer from Attention Deficit Hyperactivity Disorder (ADHD), yet the diagnosis is given to some 15 percent of American children, many of whom are placed on powerful drugs with lifelong consequences. This is the central fact of the journalist Alan Schwarz’s new book, ADHD Nation. Explaining this fact—how it is that perhaps two thirds of the children diagnosed with ADHD do not actually suffer from the disorder—is the book’s central mystery. The result is a damning indictment of the pharmaceutical industry, and an alarming portrait of what is being done to children in the name of mental health. What prompted you to write this book? In 2011, having spent four years exposing the dangers of concussions in the National Football League and youth sports for The New York Times, I wanted another project. I had heard that high school students in my native Westchester County (just north of New York City) were snorting Adderall before the S.A.T.'s to focus during the test. I was horrified and wanted to learn more. I saw it not as a "child psychiatry" story, and not as a "drug abuse" story, but one about academic pressure and the demands our children feel they're under. When I looked deeper, it was obvious that our nationwide system of ADHD treatment was completely scattershot—basically, many doctors were prescribing with little thought as to whether a kid really had ADHD, and then the pills would be bought and sold among students who had no idea what they were messing with. I asked the ADHD and child-psychiatry establishment about this, and they denied it was happening. They denied that there were many false diagnoses. They denied that teenagers were buying and selling pills. They denied that the national diagnosis rates reported by the C.D.C.—then 9.5 percent of children aged 4-17, now 11 percent and still growing—were valid. 
They basically denied that anything about their world was malfunctioning at all. In the end, they doth protest too much. I wrote about 10 front-page stories for The New York Times on the subject from 2012-2014. © 2016 Scientific American,
Erin Ross The teenage brain has been characterized as a risk-taking machine, looking for quick rewards and thrills instead of acting responsibly. But these behaviors could actually make teens better than adults at certain kinds of learning. "In neuroscience, we tend to think that if healthy brains act in a certain way, there should be a reason for it," says Juliet Davidow, a postdoctoral researcher at Harvard University in the Affective Neuroscience and Development Lab and the lead author of the study, which was published Wednesday in the journal Neuron. But scientists and the public often focus on the negatives of teen behavior, so she and her colleagues set out to test the hypothesis that teenagers' drive for rewards, and the risk-taking that comes from it, exist for a reason. When it comes to what drives reward-seeking in teens, fingers have always been pointed at the striatum, a lobster-claw-shaped structure in the brain. When something surprising and good happens — say, you find $20 on the street — your brain releases the pleasure-related chemical dopamine, and the striatum responds. "Research shows that the teenage striatum is very active," says Davidow. This suggests that teens are hard-wired to seek immediate rewards. But, she adds, it's also shown that their prefrontal cortex, which helps with impulse control, isn't fully developed. Combined, these two things have given teens their risky rep. But the striatum isn't just involved in reward-seeking. It's also involved in learning from rewards, explains Daphna Shohamy, a cognitive neuroscientist at the Zuckerman Mind Brain Behavior Institute at Columbia University who worked on the study. She wanted to see if teenagers would be better at this type of learning than adults would. © 2016 npr
Richard A. Friedman There’s a reason adults don’t pick up Japanese or learn how to kite surf. It’s ridiculously hard. In stark contrast, young people can learn the most difficult things relatively easily. Polynomials, Chinese, skateboarding — no problem! Neuroplasticity — the brain’s ability to form new neural connections and be influenced by the environment — is greatest in childhood and adolescence, when the brain is still a work in progress. But this window of opportunity is finite. Eventually it slams shut. Or so we thought. Until recently, the conventional wisdom within the fields of neuroscience and psychiatry has been that development is a one-way street, and once a person has passed through his formative years, experiences and abilities are very hard, if not impossible, to change. What if we could turn back the clock in the brain and recapture its earlier plasticity? This possibility is the focus of recent research in animals and humans. The basic idea is that during critical periods of brain development, the neural circuits that help give rise to mental states and behaviors are being sculpted and are particularly sensitive to the effects of experience. If we can understand what starts and stops these periods, perhaps we can restart them. Think of the brain’s sensitive periods as blown glass: The molten glass is very malleable, but you have a relatively brief time before it cools and becomes crystalline. Put it back into the furnace, and it can once again change shape. © 2016 The New York Times Company
Dean Burnett Throughout history, people have always worried about new technologies. The fear that the human brain cannot cope with the onslaught of information made possible by the latest development was first voiced in response to the printing press, back in the sixteenth century. Swap “printing press” for “internet” and you have the exact same concerns today, regularly voiced in the mainstream media, and usually focused on children. But is there any legitimacy to these claims? Or are they just needless scaremongering? There are several things to bear in mind when considering how our brains deal with the internet. First, don’t forget that “the internet” is a very vague term, given that it contains so many things across so many formats. You could, for instance, develop a gambling addiction via online casinos or poker sites. This is an example of someone’s brain being negatively affected via the internet, but it would be difficult to argue that the internet is the main culprit, any more than a gambling addiction obtained via a real-world casino can be blamed on “buildings”; it’s just the context in which the problem occurred. However, the internet does give us a far more direct, constant and wide-ranging access to information than pretty much anything else in human history. So how could, or does, this affect us and our brains? © 2016 Guardian News and Media Limited
Keyword: Learning & Memory
Link ID: 22736 - Posted: 10.10.2016
By Seth Mnookin When Henry Molaison died at a Connecticut nursing home in 2008, at the age of 82, a front-page obituary in The New York Times called him “the most important patient in the history of brain science.” It was no exaggeration: Much of what we know about how memory works is derived from experiments on Molaison, a patient with severe epilepsy who in 1953 had undergone an operation that left him without his medial temporal lobes and unable to form new memories. The operation didn’t completely stop Molaison’s seizures — the surgeon, William Beecher Scoville, had done little more than guess at the locus of his affliction — but by chance, it rendered him a near-perfect research subject. Not only could postoperative changes in his behavior be attributed to the precise area of his brain that had been removed, but the fact that he couldn’t remember what had happened 30 seconds earlier made him endlessly patient and eternally willing to endure all manner of experiments. It didn’t take long for those experiments to upend our understanding of the human brain. By the mid-1950s, studies on Molaison (known until his death only as Patient H.M.) had shown that, contrary to popular belief, memories were created not in the brain as a whole, but in specific regions — and that different types of memories were formed in different ways. Molaison remained a research subject until his death, and for the last 41 years of his life, the person who controlled access to him, and was involved in virtually all the research on him, was an MIT neuroscientist named Suzanne Corkin. Copyright 2016 Undark
Keyword: Learning & Memory
Link ID: 22729 - Posted: 10.05.2016
Jon Hamilton Want to be smarter? More focused? Free of memory problems as you age? If so, don't count on brain games to help you. That's the conclusion of an exhaustive evaluation of the scientific literature on brain training games and programs. It was published Monday in the journal Psychological Science in the Public Interest. "It's disappointing that the evidence isn't stronger," says Daniel Simons, an author of the article and a psychology professor at the University of Illinois at Urbana-Champaign. "It would be really nice if you could play some games and have it radically change your cognitive abilities," Simons says. "But the studies don't show that on objectively measured real-world outcomes." The evaluation, done by a team of seven scientists, is a response to a very public disagreement about the effectiveness of brain games, Simons says. In October 2014, more than 70 scientists published an open letter objecting to marketing claims made by brain training companies. Pretty soon, another group, with more than 100 scientists, published a rebuttal saying brain training has a solid scientific base. "So you had two consensus statements, each signed by many, many people, that came to essentially opposite conclusions," Simons says. © 2016 npr
Keyword: Learning & Memory
Link ID: 22727 - Posted: 10.05.2016
By Deborah R. Glasofer, Joanna Steinglass Every day on the dot of noon, Jane* would eat her 150-calorie lunch: nonfat yogurt and a handful of berries. To eat earlier, she felt, would be “gluttonous.” To eat later would disrupt the dinner ritual. Jane's eating initially became more restrictive in adolescence, when she worried about the changes her body was undergoing in the natural course of puberty. When she first settled on her lunchtime foods and routine—using a child-size spoon to “make the yogurt last” and sipping water between each bite—she felt accomplished. Jane enjoyed her friends' compliments about her “incredible willpower.” In behavioral science terms, her actions were goal-directed, motivated by achieving a particular outcome. In relatively short order, she got the result she really wanted: weight loss. Years later Jane, now in her 30s and a newspaper reporter, continued to eat the same lunch in the same way. Huddled over her desk in the newsroom, she tried to avoid unwanted attention and feared anything that might interfere with the routine. She no longer felt proud of her behavior. Her friends stopped complimenting her “self-control” years ago, when her weight plummeted perilously low. So low that she has had to be hospitalized on more than one occasion. The longed-for weight loss did not make her feel better about herself or her appearance. Jane's curly hair, once shiny and thick, dulled and thinned; her skin and eyes lost their brightness. There were other costs as well—to her relationships, to her career. Instead of dreaming about a great romance, Jane would dream of the cupcakes she could not let herself have at her niece's birthday party. Instead of thinking about the best lead for her next story, she obsessed over calories and exercise. © 2016 Scientific American