Chapter 9. Homeostasis: Active Regulation of the Internal Environment
By Dina Fine Maron Whenever the fictional character Popeye the Sailor Man managed to down a can of spinach, the results were almost instantaneous: he gained superhuman strength. Devouring any solid object similarly did the trick for one of the X-Men. As we age and begin to struggle with memory problems, many of us would love to reach for an edible mental fix. Sadly, such supernatural effects remain fantastical. Yet making the right food choices may well yield more modest gains. A growing body of evidence suggests that adopting the Mediterranean diet, or one much like it, can help slow memory loss as people age. The diet's hallmarks include lots of fruits and vegetables and whole grains (as opposed to ultrarefined ones) and a moderate intake of fish, poultry and red wine. Dining mainly on single ingredients, such as pumpkin seeds or blueberries, however, will not do the trick. What is more, this diet approach appears to reap brain benefits even when adopted later in life—sometimes aiding cognition in as little as two years. “You will not be Superman or Superwoman,” says Miguel A. Martínez González, chair of the department of preventive medicine at the University of Navarra in Pamplona, Spain. “You can keep your cognitive abilities or even improve them slightly, but diet is not magic.” Those small gains, however, can be meaningful in day-to-day life. Scientists long believed that altering diet could not improve memory. But evidence to the contrary started to emerge about 10 years ago. © 2015 Scientific American
Link ID: 21350 - Posted: 08.28.2015
Dan Charles Ah, sugar — we love the sweetness, but not the calories. For more than a century, food technologists have been on a quest for the perfect, guilt-free substitute. There's a new candidate in the century-old quest for perfect, guiltless sweetness. I encountered it at the annual meeting of the Institute of Food Technologists, a combination of Super Bowl, Mecca, and Disneyland for the folks who put the processing in processed food. It was right in the middle of the vast exhibition hall, at the Tate & Lyle booth. This is the company that introduced the British Empire to the sugar cube, back in 1875. A century later, it invented sucralose, aka Splenda. "We have a deep understanding of sweetening," says Michael Harrison, Tate & Lyle's vice president of new product development. This year, his company launched its latest gift to your sweet tooth. It's called allulose. "This is a rare sugar. A sugar that's found in nature," Harrison explains. Chemically speaking, it's almost identical to ordinary sugar. It has the same chemical formula as fructose and glucose, but the atoms of hydrogen and oxygen are arranged slightly differently. © 2015 NPR
Aaron E. Carroll If there is one health myth that will not die, it is this: You should drink eight glasses of water a day. It’s just not true. There is no science behind it. And yet every summer we are inundated with news media reports warning that dehydration is dangerous and also ubiquitous. These reports work up a fear that otherwise healthy adults and children are walking around dehydrated, even that dehydration has reached epidemic proportions. Let’s put these claims under scrutiny. I was a co-author of a paper back in 2007 in the BMJ on medical myths. The first myth was that people should drink at least eight 8-ounce glasses of water a day. This paper got more media attention (even in The Times) than pretty much any other research I’ve ever done. It made no difference. When, two years later, we published a book on medical myths that once again debunked the idea that we need eight glasses of water a day, I thought it would persuade people to stop worrying. I was wrong again. Many people believe that the source of this myth was a 1945 Food and Nutrition Board recommendation that said people need about 2.5 liters of water a day. But they ignored the sentence that followed closely behind. It read, “Most of this quantity is contained in prepared foods.” Water is present in fruits and vegetables. It’s in juice, it’s in beer, it’s even in tea and coffee. Before anyone writes me to tell me that coffee is going to dehydrate you, research shows that’s not true either. Although I recommended water as the best beverage to consume, it’s certainly not your only source of hydration. You don’t have to consume all the water you need through drinks. You also don’t need to worry so much about never feeling thirsty. The human body is finely tuned to signal you to drink long before you are actually dehydrated. © 2015 The New York Times Company
Link ID: 21335 - Posted: 08.25.2015
By Gretchen Vogel Researchers may have finally explained how an obesity-promoting gene variant induces some people to put on the pounds. Using state-of-the-art DNA editing tools, they have identified a genetic switch that helps govern the body’s metabolism. The switch controls whether common fat cells burn energy or store it as fat. The finding suggests the tantalizing prospect that doctors might someday offer a gene therapy to melt extra fat away. Along with calories and exercise, genes influence a person’s tendency to gain—and keep—extra pounds. One of the genes with the strongest link to obesity is called FTO. People with certain versions of the gene are several kilos heavier on average and significantly more likely to be obese. Despite years of study, no one had been able to figure out what the gene does in cells or how it influences weight. There was some evidence FTO helped control other genes, but it was unclear which ones. Some researchers had looked for activity of FTO in various tissues, without finding any clear signals. Melina Claussnitzer, Manolis Kellis, and their colleagues at Harvard University, Massachusetts Institute of Technology, and the Broad Institute in Cambridge, turned to data from the Roadmap Epigenomics Project, an 8-year effort that identified the chemical tags on DNA that influence the function of genes. The researchers used those epigenetic tags to look at whether FTO was turned on or off in 127 cell types. The gene seemed to be active in developing fat cells called adipocyte progenitor cells. © 2015 American Association for the Advancement of Science
Tina Hesman Saey Researchers have discovered a “genetic switch” that determines whether people will burn extra calories or save them as fat. A genetic variant tightly linked to obesity causes fat-producing cells to become energy-storing white fat cells instead of energy-burning beige fat, researchers report online August 19 in the New England Journal of Medicine. Previously scientists thought that the variant, in a gene known as FTO (originally called fatso), worked in the brain to increase appetite. The new work shows that the FTO gene itself has nothing to do with obesity, says coauthor Manolis Kellis, a computational biologist at MIT and the Broad Institute. But the work may point to a new way to control body fat. In humans and many other organisms, genes are interrupted by stretches of DNA known as introns. Kellis and Melina Claussnitzer of Harvard Medical School and colleagues discovered that a genetic variant linked to increased risk of obesity affects one of the introns in the FTO gene. It does not change the protein produced from the FTO gene or change the gene’s activity. Instead, the variant doubles the activity of two genes, IRX3 and IRX5, which are involved in determining which kind of fat cells will be produced. FTO’s intron is an enhancer, a stretch of DNA needed to control activity of far-away genes, the researchers discovered. Normally, a protein called ARID5B squats on the enhancer and prevents it from dialing up activity of the fat-determining genes. In fat cells of people who have the obesity-risk variant, ARID5B can’t do its job and the IRX genes crank up production of energy-storing white fat. © Society for Science & the Public 2000 - 2015.
By Gretchen Reynolds Sticking to a diet requires self-control and a willingness to forgo present pleasures for future benefits. Not surprisingly, almost everyone yields to temptation at least sometimes, opting for the cookie instead of the apple. Wondering why we so often override our resolve, scientists at the Laboratory for Social and Neural Systems Research at the University of Zurich recently considered the role of stress, which is linked to a variety of health problems, including weight gain. (There’s something to the rom-com cliché of the jilted lover eating ice cream directly from the carton.) But just how stress might drive us to sweets has not been altogether clear. It turns out that even mild stress may immediately alter the workings of our brains in ways that undermine willpower. For their study, published this month in Neuron, researchers recruited 51 young men who said they were trying to maintain a healthy diet and lifestyle. The men were divided into two groups, one of which served as a control, and then all were asked to skim through images of different kinds of food on a computer screen, rating them for taste and healthfulness. Next, the men in the experimental group were told to plunge a hand into a bowl of icy water for as long as they could, a test known to induce mild physiological and psychological stress. Relative to the control group, the men developed higher levels of cortisol, a stress hormone. After that, men from each group sat in a brain-scanning machine and watched pictures of paired foods flash across a screen. Generally, one of the two foods was more healthful than the other. The subjects were asked to click rapidly on which food they would choose to eat, knowing that at the end of the test they would actually be expected to eat one of these picks (chosen at random from all of their choices). © 2015 The New York Times Company
By Mitch Leslie Some microbes that naturally dwell in our intestines might be bad for our eyes, triggering autoimmune uveitis, one of the leading causes of blindness. A new study suggests that certain gut residents produce proteins that enable destructive immune cells to enter the eyes. The idea that gut microbes might promote autoimmune uveitis “has been there in the back of our minds,” says ocular immunologist Andrew Taylor of the Boston University School of Medicine, who wasn’t connected to the research. “This is the first time that it’s been shown that the gut flora seems to be part of the process.” As many as 400,000 people in the United States have autoimmune uveitis, in which T cells—the commanders of the immune system—invade the eye and damage its middle layer. All T cells are triggered by specific molecules called antigens, and for T cells that cause autoimmune uveitis, certain eye proteins are the antigens. Even healthy people carry these T cells, yet they don't usually swarm the eyes and unleash the disease. That's because they first have to be triggered by their matching antigen. However, those proteins don't normally leave the eye. So what could stimulate the T cells? One possible explanation is microbes in the gut. In the new study, immunologist Rachel Caspi of the National Eye Institute in Bethesda, Maryland, and colleagues genetically engineered mice so their T cells recognized one of the same eye proteins targeted in autoimmune uveitis. The rodents developed the disease around the time they were weaned. But dosing the animals with four antibiotics that killed off most of their gut microbes delayed the onset and reduced the severity of the disease. © 2015 American Association for the Advancement of Science.
Carl Zimmer You are what you eat, and so were your ancient ancestors. But figuring out what they actually dined on has been no easy task. There are no Pleistocene cookbooks to consult. Instead, scientists must sift through an assortment of clues, from the chemical traces in fossilized bones to the scratch marks on prehistoric digging sticks. Scientists have long recognized that the diets of our ancestors went through a profound shift with the addition of meat. But in the September issue of The Quarterly Review of Biology, researchers argue that another item added to the menu was just as important: carbohydrates, bane of today’s paleo diet enthusiasts. In fact, the scientists propose, by incorporating cooked starches into their diet, our ancestors were able to fuel the evolution of our oversize brains. Roughly seven million years ago, our ancestors split off from the apes. As far as scientists can tell, those so-called hominins ate a diet that included a lot of raw, fiber-rich plants. After several million years, hominins started eating meat. The oldest clues to this shift are 3.3-million-year-old stone tools and 3.4-million-year-old mammal bones scarred with cut marks. The evidence suggests that hominins began by scavenging meat and marrow from dead animals. At some point hominins began to cook meat, but exactly when they invented fire is a question that inspires a lot of debate. Humans were definitely making fires by 300,000 years ago, but some researchers claim to have found campfires dating back as far as 1.8 million years. Cooked meat provided increased protein, fat and energy, helping hominins grow and thrive. But Mark G. Thomas, an evolutionary geneticist at University College London, and his colleagues argue that there was another important food sizzling on the ancient hearth: tubers and other starchy plants. © 2015 The New York Times Company
By James Gallagher Health editor, BBC News website Cutting fat from your diet leads to more fat loss than reducing carbohydrates, a US health study shows. Scientists intensively analysed people on controlled diets by inspecting every morsel of food, minute of exercise and breath taken. Both diets, analysed by the National Institutes of Health, led to fat loss when calories were cut, but people lost more when they reduced fat intake. Experts say the most effective diet is one people can stick to. It has been argued that restricting carbs is the best way to get rid of a "spare tyre" as it alters the body's metabolism. The theory goes that fewer carbohydrates lead to lower levels of insulin, which in turn lead to fat being released from the body's stores. "All of those things do happen with carb reduction and you do lose body fat, but not as much as when you cut out the fat," said lead researcher Dr Kevin Hall, from the US-based National Institute of Diabetes and Digestive and Kidney Diseases. In the study, 19 obese people were initially given 2,700 calories a day. Then, over a period of two weeks they tried diets which cut their calorie intake by a third, either by reducing carbohydrates or fat. The team analysed the amount of oxygen and carbon dioxide being breathed out and the amount of nitrogen in participants' urine to calculate precisely the chemical processes taking place inside the body. The results published in Cell Metabolism showed that after six days on each diet, those reducing fat intake lost an average 463g of body fat - 89% more than those cutting down on carbs, whose average loss was 245g. © 2015 BBC.
Link ID: 21296 - Posted: 08.15.2015
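The gas-exchange bookkeeping the BBC piece describes is standard indirect calorimetry. As a rough illustration only (this is not the study's own code, and the input values below are made-up resting figures), Frayn's widely used stoichiometric equations convert oxygen uptake, carbon dioxide output and urinary nitrogen into substrate oxidation rates:

```python
def fuel_oxidation(vo2, vco2, urinary_n):
    """Estimate whole-body substrate oxidation (g/min) from respiratory gases.

    vo2, vco2 -- oxygen uptake and CO2 output in litres/min (STPD)
    urinary_n -- urinary nitrogen excretion in g/min (marker of protein use)
    Coefficients are Frayn's (1983) stoichiometric constants.
    """
    carb = 4.55 * vco2 - 3.21 * vo2 - 2.87 * urinary_n
    fat = 1.67 * vo2 - 1.67 * vco2 - 1.92 * urinary_n
    protein = 6.25 * urinary_n  # roughly 6.25 g protein oxidised per g urinary N
    return carb, fat, protein

# Illustrative resting values: VO2 = 0.25 L/min, VCO2 = 0.20 L/min (RER 0.8)
carb, fat, protein = fuel_oxidation(0.25, 0.20, 0.008)
print(f"carb {carb:.3f} g/min, fat {fat:.3f} g/min, protein {protein:.3f} g/min")
```

Accumulating such minute-by-minute rates over days of continuous measurement is what lets a metabolic-ward study report cumulative fat loss in grams rather than inferring it from body weight.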
By Dina Fine Maron Don’t stress too much about cutting calories if you want to shed pounds—focus on getting more exercise. That’s the controversial message beverage giant Coca-Cola is backing in its new campaign to curb obesity. Coke is pushing this idea via a new Coke-backed nonprofit called Global Energy Balance Network, The New York Times reported on August 9. Money from Coke, the Times reported, is also financing studies that support the notion that exercise trumps diet. But is there any merit to such a stance? Not much, says Rutgers University–based diet and behavior expert Charlotte Markey. She is the author of an upcoming cover story in Scientific American MIND on this topic, and spoke about the Coke claims with Scientific American on Monday. In your fall Scientific American MIND feature you write “study after study shows that working out is not terribly effective for weight loss on its own.” Why is that? Exercise increases appetite, and most people just make up for whatever they exercised off. There’s a lot of wonderful reasons to exercise and I always suggest it to people who are trying to lose weight—some sort of exercise regimen keeps them focused on their health and doing what is good for them, and it’s psychologically healthy. But in and of itself it won’t usually help people lose weight. Two years ago there was a review study in Frontiers in Psychology that concluded dieting often actually led to weight gain. Why would that happen? When people try to diet, they try to restrict themselves, which often leads to overeating. They cut out food groups which make those food groups more desirable to them. They think too much about short-term goals and don’t think about sustainable changes. But if you are going to lose weight, you have to change your behaviors for the rest of your life or otherwise you gain it back. That’s not a sexy message because it seems daunting. © 2015 Scientific American
Link ID: 21285 - Posted: 08.12.2015
By Anahad O’Connor Coca-Cola, the world’s largest producer of sugary beverages, is backing a new “science-based” solution to the obesity crisis: To maintain a healthy weight, get more exercise and worry less about cutting calories. The beverage giant has teamed up with influential scientists who are advancing this message in medical journals, at conferences and through social media. To help the scientists get the word out, Coke has provided financial and logistical support to a new nonprofit organization called the Global Energy Balance Network, which promotes the argument that weight-conscious Americans are overly fixated on how much they eat and drink while not paying enough attention to exercise. “Most of the focus in the popular media and in the scientific press is, ‘Oh they’re eating too much, eating too much, eating too much’ — blaming fast food, blaming sugary drinks and so on,” the group’s vice president, Steven N. Blair, an exercise scientist, says in a recent video announcing the new organization. “And there’s really virtually no compelling evidence that that, in fact, is the cause.” Health experts say this message is misleading and part of an effort by Coke to deflect criticism about the role sugary drinks have played in the spread of obesity and Type 2 diabetes. They contend that the company is using the new group to convince the public that physical activity can offset a bad diet despite evidence that exercise has only minimal impact on weight compared with what people consume. This clash over the science of obesity comes in a period of rising efforts to tax sugary drinks, remove them from schools and stop companies from marketing them to children. In the last two decades, consumption of full-calorie sodas by the average American has dropped by 25 percent. © 2015 The New York Times Company
Link ID: 21283 - Posted: 08.10.2015
By Ariana Eunjung Cha Everyone knows that a diet full of white bread, pasta and rice is bad for your waistline. Now scientists say these types of refined carbs could also impact your mind — putting post-menopausal women at higher risk for depression. In a new study published in The American Journal of Clinical Nutrition, researchers looked at data from more than 70,000 women who participated in the National Institutes of Health's Women's Health Initiative between 1994 and 1998. They found that the more women consumed added sugars and refined grains and the higher their score on the glycemic index (GI) — a measure of the rate carbohydrates are broken down and absorbed by the body — the more they were at risk of new-onset depression. Those who had a different sort of diet — one with more dietary fiber, whole grains, vegetables and non-juice fruits — had a decreased risk. "This suggests that dietary interventions could serve as treatments and preventive measures for depression," wrote James Gangwisch, an assistant professor of psychiatry at Columbia University Medical Center, and his co-authors. The researchers explained that refined foods trigger a hormonal response in the body to reduce blood sugar levels. That is believed to lead to the "sugar high" and subsequent "crash" some people say they feel after eating such foods. This can lead to mood changes, fatigue and other symptoms of depression.
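The glycemic index mentioned above is itself a simple area calculation: a food's blood-glucose curve is integrated above the fasting baseline and expressed relative to a reference food (pure glucose = 100). A minimal sketch, using the trapezoidal rule and entirely hypothetical glucose curves for illustration:

```python
def incremental_auc(times, glucose):
    """Area above the fasting baseline under a blood-glucose curve
    (trapezoidal rule; dips below baseline are ignored, per standard GI method)."""
    baseline = glucose[0]
    excursion = [max(g - baseline, 0.0) for g in glucose]
    area = 0.0
    for i in range(1, len(times)):
        area += (excursion[i - 1] + excursion[i]) / 2 * (times[i] - times[i - 1])
    return area

def glycemic_index(times, test_curve, reference_curve):
    """GI of a test food relative to a reference food (glucose = 100)."""
    return 100.0 * incremental_auc(times, test_curve) / incremental_auc(times, reference_curve)

# Hypothetical two-hour curves (mmol/L) sampled every 30 minutes
t = [0, 30, 60, 90, 120]
test_food = [5.0, 8.0, 7.0, 6.0, 5.2]
glucose_ref = [5.0, 9.0, 7.5, 6.0, 5.0]
print(round(glycemic_index(t, test_food, glucose_ref)))  # → 81
```

In practice the index for a food is the mean of this ratio across many subjects, which is why published GI tables report averages rather than exact values.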
By Mitch Leslie If you need to lose a lot of weight, surgeons have a drastic option: They can reroute and sometimes remove parts of your stomach, making it smaller. But instead of limiting the amount of food you can eat, the surgery may work by triggering long-term changes in the types of microbes that inhabit your intestines, a new study suggests. If so, altering the kinds of microbes that live in your gut may be a simpler—and safer—route to weight loss. The research provides “some of the best evidence in humans so far” that bariatric surgery works “in part by changing the bacteria in your gut,” says David Cummings, an endocrinologist at the University of Washington, Seattle, who was not involved with the work. Weight loss isn’t the only benefit of so-called bariatric surgery. If a patient has diabetes, for instance, it will usually disappear. The surgery alters metabolism and digestive system functions in several ways, and researchers are still trying to pin down why it’s effective. “This is not about making your stomach small,” says Randy Seeley, an obesity and diabetes researcher at the University of Michigan, Ann Arbor, who wasn’t connected to the study. One way that bariatric surgery might trigger its effects is through its influence on the microbiota, the swarms of microbes that dwell in our intestines and help us digest food. Studies have found that bariatric surgery dramatically alters the microbiota’s makeup in mice and humans. Two years ago, scientists put mice through a Roux-en-Y gastric bypass—a type of bariatric surgery that involves reducing the stomach to a small pouch and stitching it to the middle part of the small intestine—and then transplanted microbes from the slimmed down animals into mice that lacked intestinal bacteria. The recipient rodents lost 5% of their body weight in 2 weeks. But these studies only checked for short-term changes. © 2015 American Association for the Advancement of Science
Link ID: 21269 - Posted: 08.05.2015
Richard Harris One of the frequent trials of parenthood is dealing with a picky eater. About 20 percent of children ages 2 to 6 have such a narrow idea of what they want to eat that it can make mealtime a battleground. A study published Monday in the journal Pediatrics shows that, in extreme cases, picky eating can be associated with deeper trouble, such as depression or social anxiety. The study followed a broad spectrum of children who had come to Duke University for routine medical care. Most kids dislike some foods (broccoli is a common villain), but the researchers counted a child as a severely picky eater if his or her food choices were so limited that it made meals at home difficult, and meals out all but impossible. Those extreme cases were rare — just 3 percent of all kids. But, as a group, they were twice as likely as the children who weren't picky to have a diagnosis of depression, and seven times as likely to have been diagnosed with social anxiety, according to the study. Nancy Zucker, director of the Duke Center for Eating Disorders, says parents of children who are extremely finicky may find it useful to seek help, because the kids may not simply outgrow the behavior on their own. And even if they eventually do, it can be disruptive to child and family alike in the meantime. A big question is what to do about less extreme cases, which in the Duke study made up 17 percent of all children. These children have a list of foods that they are reluctant to stray beyond. © 2015 NPR
By Sophia Kercher As Kathleen Emmets was undergoing cancer treatment in New York over the past few years, her weight began to drop. Even though she was often nauseated and paralyzed by chemotherapy-induced neuropathy, she joked that thinness was the “bonus of cancer,” and found herself looking in the mirror and admiring her deep and hollow collarbone. Ms. Emmets, now 39, filled her closet with extra-small size clothes. At night she pressed her fingers against her protruding bones, saying to herself, “I’m finally skinny.” But it was only when her cancer treatment changed that it became clear that the body-image issues she had been grappling with since her early 20s — when she would eat next to nothing and walk for six hours a day to deal with stress — had begun to resurface. When the new treatment didn’t make her sick, her appetite returned, and she began to gain weight. But instead of celebrating this sign of improving health, Ms. Emmets says she missed her size 2 jeans and was appalled by her round belly and full breasts. Her husband watched with concern as her body appeared stronger but she began imposing her own food restrictions and started shrinking again. “During your cancer treatment, you have no control over your body — you give up your body to your doctor,” said Ms. Emmets, who wrote about her experiences on the website The Manifest-Station. “You are willing to do it because you want to live. Food restriction is the one thing that you can do to have some sense of control when everything is chaotic.” While it isn’t known how often cancer triggers or reawakens an eating disorder, doctors and nutrition experts who work with cancer patients share anecdotal reports of patients who emerge from a difficult round of cancer treatment and weight loss only to begin struggling with a serious eating disorder that threatens their postcancer health. © 2015 The New York Times Company
Keyword: Anorexia & Bulimia
Link ID: 21236 - Posted: 07.30.2015
Steve Connor Anxiety and depression could be linked to the presence of bacteria in the intestines, scientists have found. A study on laboratory mice has shown that anxious and depressive behaviour brought on by exposure to stress in early life appears only to be triggered if microbes are present in the gut. The study, published in Nature Communications, demonstrates a clear link between gut microbiota – the microbes living naturally in the intestines – and the triggering of the behavioural signs of stress. “We have shown for the first time in an established mouse model of anxiety and depression that bacteria play a crucial role in inducing this abnormal behaviour,” said Premysl Bercik of McMaster University in Hamilton, Canada, the lead author of the study. The scientists called for further research to see if the conclusions applied to humans, and whether therapies that target intestinal microbes can benefit patients with psychiatric disorders. Previous research on mice has indicated that gut microbes play an important role in behaviour. For instance, mice with no gut bacteria – called “germ-free” mice – are less likely to show anxiety-like behaviour than normal mice. The latest study looked at mice that had been exposed to a stressful experience in early life, such as being separated from their mothers. When these mice grow up they display anxiety and depression-like behaviour and have abnormal levels of the stress hormone corticosterone in their blood, as well as suffering from gut dysfunction linked to the release of the neurotransmitter acetylcholine.
Link ID: 21232 - Posted: 07.29.2015
By Roni Caryn Rabin “Fat” cartoon characters may lead children to eat more junk food, new research suggests, but there are ways to counter this effect. The findings underscore how cartoon characters, ubiquitous in children’s books, movies, television, video games, fast-food menus and graphic novels, may influence children’s behavior in unforeseen ways, especially when it comes to eating. Researchers first randomly showed 60 eighth graders a svelte jelly-bean-like cartoon character or a similar rotund character and asked them to comment on the images. Then they thanked them and gestured toward bowls of Starburst candies and Hershey’s Kisses, saying, “You can take some candy.” Children who had seen the rotund cartoon character helped themselves to more than double the number of candies as children shown the lean character, taking 3.8 candies on average, compared with 1.7 taken by children shown the lean bean character. (Children in a comparison group shown an image of a coffee mug took 1.5 candies on average.) But activating children’s existing health knowledge can counter these effects, the researchers discovered. In a separate experiment, they showed 167 elementary school children two red Gumby-like cartoon characters, one fat and one thin, and then asked them to “taste test” some cookies. But they also asked the children to “think about things that make you healthy,” such as getting enough sleep versus watching TV, or drinking soda versus milk. Some children were asked the health questions before being given the cookie taste test, while others were asked the questions after the taste test. Remarkably, the children who were asked about healthy habits before doing the taste test ate fewer cookies — even if they had first been exposed to the rotund cartoon character. 
Those who were shown the rotund figure ate 4.2 cookies on average if they were asked about healthy habits after eating the cookies, compared to three cookies if they were asked about healthy habits before doing the taste test. Children who saw the normal weight character and who were asked about healthy habits after the taste test also ate about three cookies. © 2015 The New York Times Company
By THE ASSOCIATED PRESS WASHINGTON — Move over sweet and salty: Researchers say we have a distinct and basic taste for fat, too. But it's nowhere near as delicious as it sounds. They propose expanding our taste palate to include fat along with sweet, salty, bitter, sour and relative newcomer umami. A research team at Purdue University tested look-alike mixtures with different tastes. More than half of the 28 special tasters could distinguish fatty acids from the other tastes, according to a study published in the journal Chemical Senses. Past research showed fat had a distinct feel in the mouth, but scientists removed texture and smell clues and people could still tell the difference. "The fatty acid part of taste is very unpleasant," study author Richard Mattes, a Purdue nutrition science professor, said Thursday. "I haven't met anybody who likes it alone. You usually get a gag reflex." Stinky cheese has high levels of the fat taste and so does food that goes rancid, Mattes said. Yet we like it because it mixes well and brings out the best of other flavors, just like the bitter in coffee or chocolate, he added. To qualify as a basic taste, a flavor has to have a unique chemical signature, have specific receptors in our bodies for the taste, and people have to distinguish it from other tastes. Scientists had found the chemical signature and two specific receptors for fat, but showing that people could distinguish it was the sticking point. Initially Mattes found that people couldn't quite tell fat tastes when given a broad array of flavors. But when just given yucky tastes — bitter, umami, sour — they could find the fat. © 2015 The New York Times Company
By Andrea Alfano Unexpectedly losing a loved one launched 18-year-old Debra* into an episode of major depression, triggering dangerous delusions that landed her in a hospital. Her doctor immediately started her on an antidepressant and on risperidone (Risperdal), an antipsychotic. In little more than a month, her weight shot up by 15 pounds. “Gaining weight made it even more difficult for me to want to leave my house because I felt self-conscious,” Debra says. In the medical community, antipsychotics are well known to cause significant weight gain. Gains of 20 to 35 pounds or more over the course of a year or two are not unusual. Debra's doctor never warned her, though, leaving her feeling like she was losing herself both mentally and physically. The situation is not uncommon, according to psychiatrist Matthew Rudorfer, chief of the somatic treatments program at the National Institute of Mental Health, who points out that although the U.S. Food and Drug Administration carefully tracks acute side effects such as seizures, it pays less attention to longer-term complications such as weight change. Perhaps taking their cue from the FDA, doctors tend to downplay weight-related risks that accompany many psychiatric drugs, Rudorfer says. But for Debra and many others, these side effects are not trivial. The three types of psychiatric drugs that can seriously affect body weight are reviewed below. According to a 2014 review of eight studies, as many as 55 percent of patients who take modern antipsychotics experience weight gain—a side effect that appears to be caused by a disruption of the chemical signals controlling appetite. Olanzapine (Zyprexa) and clozapine (Clozaril) are the top two offenders; studies have shown that on average these drugs cause patients to gain more than eight pounds in just 10 weeks. 
These two drugs also bear the highest risk of metabolic syndrome, which encompasses weight gain and other related disorders, including type 2 diabetes, according to a 2011 study of 90 people with schizophrenia. Although most antipsychotics are associated with weight gain, aripiprazole (Abilify) and ziprasidone (Geodon) stand out for their lower risk. © 2015 Scientific American
by Bethany Brookshire For some of us, a weekly case of the Mondays isn’t just because of traffic, work pileups or our soulless office space. It’s because we had to get up early, and sleeping in on the weekend was so incredibly glorious. Besides, because we slept in on Sunday, we didn’t get to the gym until the afternoon, we cooked a late dinner for a friend and then we couldn’t fall asleep at all and so stayed up playing around on the Internet. OK, maybe that’s just me. But you get the general idea. Our obligations — work, family and friends — often don’t line up with when our bodies want to sleep. Scientists call this phenomenon social jetlag. And it may make for more than just miserable Mondays. Social jetlag may also be associated with wider waistlines. As we learn more about how our body clocks work, it might help to think about how our own schedules can shift. Some of us love late nights and can’t help glaring at those who hop out of bed for a 5 a.m. workout (again, maybe that’s just me). But in fact our chronotypes aren’t a result of willpower. Instead they fall in a natural curve. About two-thirds of people are neutral, but a few fall at each end of the spectrum, rising extra early, or staying up until the wee hours. But even those in the middle are still getting up a little bit too early and staying up a little bit too late. We try to make up for it on days off, sleeping in or falling asleep early for a few extra hours of rest. But the result of that shift in sleep schedule? Jetlag. “It’s the equivalent of taking a flight one direction every Friday and going back every Sunday,” says Michael Parsons, a behavioral geneticist at the Medical Research Council Harwell in England. © Society for Science & the Public 2000 - 2015