Chapter 9. Homeostasis: Active Regulation of the Internal Environment
Susan Gaidos CHICAGO — Eating a high-fat diet as a youngster can affect learning and memory during adulthood, studies have shown. But new findings suggest such diets may not have long-lasting effects. Rats fed a high-fat diet for nearly a year recovered their ability to navigate their surroundings. University of Texas at Dallas neuroscientist Erica Underwood tested spatial memory in rats fed a high-fat diet for either 12 weeks or 52 weeks, beginning immediately after weaning. After rats placed in a multi-chambered box containing Lego-like toys became familiar with the box, the researchers moved the toys to new chambers. Later, when placed in the box, rats that ate high-fat foods for 12 weeks appeared confused and had difficulty finding the toys. But rats that ate high-fat foods for nearly a year performed as well as those fed a normal diet. Underwood repeated the experiment, posing additional spatial memory tests to new groups of rats. The findings were the same: Over the long term, rats on high-fat diets recovered their ability to learn and remember. Studies of brain cells revealed that rats on the long-term high-fat diet showed reduced excitability in nerve cells from the hippocampus, the same detrimental effects seen in rats on the short-term high-fat diet. “The physiology that should create a dumber animal is there, but not the behavior,” said Lucien Thompson of UT Dallas, who oversaw the study. Underwood and Thompson speculate that some other part of the brain may be compensating for this reduction in neural response. © Society for Science & the Public 2000 - 2015.
Peter Andrey Smith Nearly a year has passed since Rebecca Knickmeyer first met the participants in her latest study on brain development. Knickmeyer, a neuroscientist at the University of North Carolina School of Medicine in Chapel Hill, expects to see how 30 newborns have grown into crawling, inquisitive one-year-olds, using a battery of behavioural and temperament tests. In one test, a child's mother might disappear from the testing suite and then reappear with a stranger. Another ratchets up the weirdness with some Halloween masks. Then, if all goes well, the kids should nap peacefully as a noisy magnetic resonance imaging machine scans their brains. “We try to be prepared for everything,” Knickmeyer says. “We know exactly what to do if kids make a break for the door.” Knickmeyer is excited to see something else from the children — their faecal microbiota, the array of bacteria, viruses and other microbes that inhabit their guts. Her project (affectionately known as 'the poop study') is part of a small but growing effort by neuroscientists to see whether the microbes that colonize the gut in infancy can alter brain development. The project comes at a crucial juncture. A growing body of data, mostly from animals raised in sterile, germ-free conditions, shows that microbes in the gut influence behaviour and can alter brain physiology and neurochemistry. © 2015 Nature Publishing Group
By ERICA GOODE Women who suffer from anorexia are often thought of as having an extraordinary degree of self-control, even if that discipline is used self-destructively. But a new study suggests that the extreme dieting characteristic of anorexia may instead be a well-entrenched habit — behavior governed by brain processes that, once set in motion, are inflexible and slow to change. The study’s findings may help explain why the eating disorder, which has the highest mortality rate of any mental illness, is so stubbornly difficult to treat. But they also add to increasing evidence that the brain circuits involved in habitual behavior play a role in disorders where people persist in making self-destructive choices no matter the consequences, like cocaine addiction or compulsive gambling. In the case of anorexia, therapists often feel helpless to interrupt the relentless dieting that anorexic patients pursue. Even when patients say they want to recover, they often continue to eat only low-fat, low-calorie foods. Neither psychiatric medications nor talk therapies that are used successfully for other eating disorders are much help in most cases. And research suggests that 50 percent or more of hospitalized anorexic patients who are discharged at a normal weight will relapse within a year. “The thing about people with anorexia nervosa is that they can’t stop,” said Dr. Joanna E. Steinglass, an associate professor in clinical psychiatry at the New York State Psychiatric Institute at Columbia University Medical Center and a co-author of the new study, which appears in the journal Nature Neuroscience. “They come into treatment saying they want to get better, and they can’t do it,” Dr. Steinglass added. Karin Foerde, a research scientist at the psychiatric institute and Columbia, was the lead author on the study. © 2015 The New York Times Company
By Nicholas Bakalar There may be a link between later bedtimes and weight gain, new research suggests. Researchers studied 3,342 adolescents starting in 1996, following them through 2009. At three points over the years, all reported their normal bedtimes, as well as information on fast food consumption, exercise and television time. The scientists calculated body mass index at each interview. After controlling for age, sex, race, ethnicity and socioeconomic status, the researchers found that each hour of later bedtime during the school or workweek was associated with about a two-point increase in B.M.I. The effect was apparent even among people who got a full eight hours of sleep, and neither TV time nor exercise contributed to the effect. But fast food consumption did. The study, in the October issue of Sleep, raises questions, said the lead author, Lauren D. Asarnow, a graduate student at the University of California, Berkeley. “First, what is driving this relationship?” she said. “Is it metabolic changes that happen when you stay up late? And second, if we change sleep patterns, can we change eating behavior and the course of weight change?” The scientists acknowledge that their study had limitations. Their sleep data depended on self-reports, and they did not have complete diet information. Also, they had no data on waist circumference, which, unlike B.M.I., can help distinguish between lean muscle and abdominal fat. © 2015 The New York Times Company
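For reference, B.M.I. is simply weight in kilograms divided by the square of height in meters. The short sketch below (illustrative only; the function and the 1.75 m example are not from the study) shows the calculation and why a two-point increase is a substantial amount of body weight:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

# For a 1.75 m adult, the roughly two-point BMI increase the study
# associates with each hour of later bedtime corresponds to about
# 6 kg of body weight:
extra_kg_per_two_bmi_points = 2 * 1.75 ** 2  # = 6.125 kg

print(round(bmi(70, 1.75), 1))  # 22.9
```

As the article notes, B.M.I. says nothing about body composition: two people with the same value can carry very different proportions of lean muscle and abdominal fat, which is why the authors flag the missing waist-circumference data as a limitation.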
By Kelly Servick Children born to obese mothers arrive already predisposed to obesity and other health problems themselves. Exactly what happens in the uterus to transmit this risk still isn’t clear, but a new study on mice points to the placenta as a key actor. The study shows that a hormone acting on the placenta can protect the offspring of obese mice from being born overweight. It suggests ways to break the cycle of obesity in humans—although other researchers caution there's a long way to go. Researchers discovered decades ago that conditions in the uterus can “program” a fetus to be more susceptible to certain health problems. People conceived during the 1944 famine in the Netherlands, for example, suffered higher rates of cardiovascular disease, diabetes, cancer, and other problems later in life. Recent animal studies suggest that malnourishment in the womb changes the expression of DNA in ways that can be passed down for generations. But researchers are now increasingly concerned with the opposite problem. Obese women tend to give birth to larger babies with more body fat, and these children are more likely to develop metabolic syndrome—the cluster of conditions including obesity and high blood sugar that can lead to diabetes and heart disease. To probe the roots of fetal “overgrowth,” developmental biologists at the University of Colorado, Denver, looked to the placenta—the whoopee cushion–shaped organ wedged between the fetus and the wall of the uterus, where branching arteries from the umbilical cord take up oxygen and nutrients from the mother’s blood vessels. The placenta “has always been viewed as a passive organ—whatever happens to the mother is translated toward the fetus,” says lead author Irving Aye, now at the University of Cambridge in the United Kingdom. However, recent research has shown that the placenta is less an indiscriminate drainpipe than a subtle gatekeeper. © 2015 American Association for the Advancement of Science.
by Bethany Brookshire Last weekend, I ran the Navy-Air Force half-marathon. After pounding pavement for an hour or so, my legs began to feel light. Slightly numb. I felt fantastic. I had to remind myself to run, not to stop and dance, and that singing along to my candy-pop workout music — even at mile 10 — is not socially acceptable. It’s the hope of this euphoria — this runner’s high — that keeps me running. We’re not totally sure what’s responsible for this incredible high. Some studies call out our body’s endorphins. Others point to cannabinoids — chemicals related to the active compound in marijuana. A new study suggests that the appetite hormone leptin may play a role in getting us going. And from an evolutionary perspective, it makes good sense. When our dinner might make a quick getaway, it’s important to link our drive to run with our need to feed. But it’s probably not the whole story. Like many other neurobiological events, the exact recipe for runner’s high is complex and hazy. It takes a whole suite of chemicals to help us get started and to make sure we want to go the distance. Those who get runner’s high know it when they feel it. But a clinical definition is a little more slippery. “I remember someone saying the runner’s high was the moment when the body was disconnected from the brain,” says Francis Chaouloff, who studies running and motivation in mice at the French Institute of Health and Medical Research in Bordeaux. This sense of extreme euphoria, he says, is generally limited to people running or exercising for long periods of time, over many miles or hours. © Society for Science & the Public 2000 - 2015.
By Sarah C. P. Williams When the human body needs extra energy, the brain tells fat cells to release their stores. Now, for the first time, researchers have visualized the nerves that carry those messages from brain to fat tissue. The activation of these nerves in mice, they found, helps the rodents lose weight—an observation that could lead to new slimming treatments for obese people. “The methods used here are really novel and exciting,” says neuroendocrinologist Heike Muenzberg-Gruening of Louisiana State University’s Pennington Biomedical Research Center in Baton Rouge, who was not involved in the new study. “Their work has implications for obesity research and also for studying these nerves in other tissues.” Diagrams of the chatter between the brain and fat tissues have long included two-way arrows: Fat cells produce the hormone leptin, which travels to the brain to lower appetite and boost metabolism. In turn, the brain sends signals to the fat cells when it’s time to break down their deposits of fatty molecules, such as lipids, into energy. Researchers hypothesized that there must be a set of nerve cells that hook up to traditional fat tissue to carry these messages, but they’d never been able to indisputably see or characterize them. Now they have. Thanks to two forms of microscopy, neurobiologist Ana Domingos, of the Instituto Gulbenkian de Ciência in Oeiras, Portugal, produced images showing bundles of nerves clearly enveloping fat cells in mice. She and her colleagues went on to show, using various stains, that the nerves were a type belonging to the sympathetic nervous system that stretches outward from the spinal cord and keeps the body’s systems in balance. © 2015 American Association for the Advancement of Science
Link ID: 21448 - Posted: 09.26.2015
By Sarah C. P. Williams Immune cells are usually described as soldiers fighting invading viruses and bacteria. But they may also be waging another battle: the war against fat. When mice lack a specific type of immune cell, researchers have discovered, they become obese and show signs of high blood pressure, high cholesterol, and diabetes. The findings have yet to be replicated in humans, but they are already helping scientists understand the triggers of metabolic syndrome, a cluster of conditions associated with obesity. The new study “definitely moves the field forward,” says immunologist Vishwa Deep Dixit of the Yale School of Medicine, who was not involved in the work. “The data seem really solid.” Scientists already know that there is a correlation between inflammation—a heightened immune response—and obesity. But because fat cells themselves can produce inflammatory molecules, distinguishing whether the inflammation causes weight gain or is just a side effect has been tricky. When he stumbled on this new cellular link between obesity and the immune system, immunologist Yair Reisner of the Weizmann Institute of Science in Rehovot, Israel, was studying something completely different: autoimmune diseases. An immune molecule called perforin had already been shown to kill diseased cells by boring a hole in their outer membrane. Reisner’s group suspected that dendritic cells containing perforin might also be destroying the body’s own cells in some autoimmune diseases. To test the idea, Reisner and his colleagues engineered mice to lack perforin-wielding dendritic cells, and then waited to see whether they developed any autoimmune conditions. © 2015 American Association for the Advancement of Science
Eating two and a half times more than you should will leave you overweight and prone to type 2 diabetes, although no one is entirely sure why. Now a team that fed volunteers a whopping 6000 calories a day has found some clues. Obesity is only one problem caused by eating too much. An overly large food intake can also increase a person’s risk of diabetes, heart disease and some cancers, but no one is sure why this should be the case. Resistance to the hormone insulin seems to play a role. When a healthy person eats a meal, their blood glucose levels rise, and the body responds by making insulin. This hormone prompts the body to store unneeded glucose, but people who develop insulin resistance are not able to absorb excess glucose in the same way. This means that, after eating, their blood glucose levels remain high, and over time, this can damage the kidneys, nervous system and heart, for example. Guenther Boden and Salim Merali at Temple University, Philadelphia, and their team set out to investigate how overeating might lead to insulin resistance. They fed six healthy male volunteers 6000 calories’ worth of food every day for a week – around two and a half times what they should have been eating. “It was a regular, American diet, composed of pizzas, hamburgers and that sort of thing,” says Merali. Each volunteer stayed at a hospital for the duration of the experiment, where they were bed-bound, carefully monitored and prevented from doing any sort of exercise. © Copyright Reed Business Information Ltd.
By Nicholas Bakalar Being obese at age 50 may be tied to an increased risk of developing Alzheimer’s disease at a younger age. Previous studies have shown that being overweight at midlife is associated with an increased risk of developing Alzheimer’s. Now researchers have found that it also predicts occurrence at a younger age. Scientists studied 1,394 cognitively normal people, average age around 60, following them for an average of 14 years. During the study, 142 developed Alzheimer’s. After controlling for age, race, level of education and cardiovascular risk factors, the researchers found that each unit increase in B.M.I., or body mass index, at age 50 was associated with a 6.7-month decrease in the age of onset of Alzheimer’s. The study, in Molecular Psychiatry, also found an association of higher B.M.I. with greater amounts of neurofibrillary tangles at autopsy, one of the characteristics of brain damage in Alzheimer’s disease. “Age of onset is not as well studied as risk,” said the senior author, Dr. Madhav Thambisetty, a neurologist at the National Institute on Aging. “As we try to cure Alzheimer’s disease, we also want to delay the onset of symptoms. Until we know what factors accelerate onset, we won’t be able to test any potential interventions. And that is perhaps as important as the search for treatment.” © 2015 The New York Times Company
by Bethany Brookshire You’ve already had a muffin. And a half. You know you’re full. But there they are, fluffy and delicious, waiting for the passersby in the office. Just thinking about them makes your mouth water. Maybe if you just slice one into quarters. I mean, that barely counts… And then we give in, our brains overriding our body’s better judgment. When I catch myself once again polishing off a whole plate of baked goods, I wish that there was something I could do, some little pill I could take that would make that last delicious bite look — and taste — a little less appealing. But the more scientists learn about the human body, the more they come to understand that there is no one set of hormones for hunger, with a separate set that kicks off your ice cream binge. Instead, our guts and their hormones are firmly entwined with our feelings of reward and motivation. That close relationship shows just how important it is to our bodies to keep us fed, and how hard it is to stop us from overeating. Researchers have long divided our feeding behavior into two distinct categories. One, the homeostatic portion, is primarily concerned with making sure we’ve got enough energy to keep going and is localized to the lateral hypothalamus in the brain. The reward-related, or “hedonic,” component is centralized in the mesolimbic dopamine system, areas of the brain usually referenced when we talk about the effects of sex, drugs and rock ’n’ roll. © Society for Science & the Public 2000 - 2015
By Dina Fine Maron Whenever the fictional character Popeye the Sailor Man managed to down a can of spinach, the results were almost instantaneous: he gained superhuman strength. Devouring any solid object similarly did the trick for one of the X-Men. As we age and begin to struggle with memory problems, many of us would love to reach for an edible mental fix. Sadly, such supernatural effects remain fantastical. Yet making the right food choices may well yield more modest gains. A growing body of evidence suggests that adopting the Mediterranean diet, or one much like it, can help slow memory loss as people age. The diet's hallmarks include lots of fruits and vegetables and whole grains (as opposed to ultrarefined ones) and a moderate intake of fish, poultry and red wine. Dining mainly on single ingredients, such as pumpkin seeds or blueberries, however, will not do the trick. What is more, this diet approach appears to reap brain benefits even when adopted later in life—sometimes aiding cognition in as little as two years. “You will not be Superman or Superwoman,” says Miguel A. Martínez González, chair of the department of preventive medicine at the University of Navarra in Pamplona, Spain. “You can keep your cognitive abilities or even improve them slightly, but diet is not magic.” Those small gains, however, can be meaningful in day-to-day life. Scientists long believed that altering diet could not improve memory. But evidence to the contrary started to emerge about 10 years ago. © 2015 Scientific American
Dan Charles Ah, sugar — we love the sweetness, but not the calories. For more than a century, food technologists have been on a quest for the perfect, guilt-free substitute. There's a new candidate in the century-old quest for perfect, guiltless sweetness. I encountered it at the annual meeting of the Institute of Food Technologists, a combination of Super Bowl, Mecca, and Disneyland for the folks who put the processing in processed food. It was right in the middle of the vast exhibition hall, at the Tate & Lyle booth. This is the company that introduced the British Empire to the sugar cube, back in 1875. A century later, it invented sucralose, aka Splenda. "We have a deep understanding of sweetening," says Michael Harrison, Tate & Lyle's vice president of new product development. This year, his company launched its latest gift to your sweet tooth. It's called allulose. "This is a rare sugar. A sugar that's found in nature," Harrison explains. Chemically speaking, it's almost identical to ordinary sugar. It has the same chemical formula as fructose and glucose, but the atoms of hydrogen and oxygen are arranged slightly differently. © 2015 NPR
Aaron E. Carroll If there is one health myth that will not die, it is this: You should drink eight glasses of water a day. It’s just not true. There is no science behind it. And yet every summer we are inundated with news media reports warning that dehydration is dangerous and also ubiquitous. These reports work up a fear that otherwise healthy adults and children are walking around dehydrated, even that dehydration has reached epidemic proportions. Let’s put these claims under scrutiny. I was a co-author of a paper back in 2007 in the BMJ on medical myths. The first myth was that people should drink at least eight 8-ounce glasses of water a day. This paper got more media attention (even in The Times) than pretty much any other research I’ve ever done. It made no difference. When, two years later, we published a book on medical myths that once again debunked the idea that we need eight glasses of water a day, I thought it would persuade people to stop worrying. I was wrong again. Many people believe that the source of this myth was a 1945 Food and Nutrition Board recommendation that said people need about 2.5 liters of water a day. But they ignored the sentence that followed closely behind. It read, “Most of this quantity is contained in prepared foods.” Water is present in fruits and vegetables. It’s in juice, it’s in beer, it’s even in tea and coffee. Before anyone writes me to tell me that coffee is going to dehydrate you, research shows that’s not true either. Although I recommended water as the best beverage to consume, it’s certainly not your only source of hydration. You don’t have to consume all the water you need through drinks. You also don’t need to worry so much about never feeling thirsty. The human body is finely tuned to signal you to drink long before you are actually dehydrated. © 2015 The New York Times Company
By Gretchen Vogel Researchers may have finally explained how an obesity-promoting gene variant induces some people to put on the pounds. Using state-of-the-art DNA editing tools, they have identified a genetic switch that helps govern the body’s metabolism. The switch controls whether common fat cells burn energy rather than store it as fat. The finding suggests the tantalizing prospect that doctors might someday offer a gene therapy to melt extra fat away. Along with calories and exercise, genes influence a person’s tendency to gain—and keep—extra pounds. One of the genes with the strongest link to obesity is called FTO. People with certain versions of the gene are several kilos heavier on average and significantly more likely to be obese. Despite years of study, no one had been able to figure out what the gene does in cells or how it influences weight. There was some evidence FTO helped control other genes, but it was unclear which ones. Some researchers had looked for activity of FTO in various tissues, without finding any clear signals. Melina Claussnitzer, Manolis Kellis, and their colleagues at Harvard University, Massachusetts Institute of Technology, and the Broad Institute in Cambridge, turned to data from the Roadmap Epigenomics Project, an 8-year effort that identified the chemical tags on DNA that influence the function of genes. The researchers used those epigenetic tags to look at whether FTO was turned on or off in 127 cell types. The gene seemed to be active in developing fat cells called adipocyte progenitor cells. © 2015 American Association for the Advancement of Science
Tina Hesman Saey Researchers have discovered a “genetic switch” that determines whether people will burn extra calories or save them as fat. A genetic variant tightly linked to obesity causes fat-producing cells to become energy-storing white fat cells instead of energy-burning beige fat, researchers report online August 19 in the New England Journal of Medicine. Previously scientists thought that the variant, in a gene known as FTO (originally called fatso), worked in the brain to increase appetite. The new work shows that the FTO gene itself has nothing to do with obesity, says coauthor Manolis Kellis, a computational biologist at MIT and the Broad Institute. But the work may point to a new way to control body fat. In humans and many other organisms, genes are interrupted by stretches of DNA known as introns. Kellis and Melina Claussnitzer of Harvard Medical School and colleagues discovered that a genetic variant linked to increased risk of obesity affects one of the introns in the FTO gene. It does not change the protein produced from the FTO gene or change the gene’s activity. Instead, the variant doubles the activity of two genes, IRX3 and IRX5, which are involved in determining which kind of fat cells will be produced. FTO’s intron is an enhancer, a stretch of DNA needed to control activity of far-away genes, the researchers discovered. Normally, a protein called ARID5B squats on the enhancer and prevents it from dialing up activity of the fat-determining genes. In fat cells of people who have the obesity-risk variant, ARID5B can’t do its job and the IRX genes crank up production of energy-storing white fat. © Society for Science & the Public 2000 - 2015.
By Gretchen Reynolds Sticking to a diet requires self-control and a willingness to forgo present pleasures for future benefits. Not surprisingly, almost everyone yields to temptation at least sometimes, opting for the cookie instead of the apple. Wondering why we so often override our resolve, scientists at the Laboratory for Social and Neural Systems Research at the University of Zurich recently considered the role of stress, which is linked to a variety of health problems, including weight gain. (There’s something to the rom-com cliché of the jilted lover eating ice cream directly from the carton.) But just how stress might drive us to sweets has not been altogether clear. It turns out that even mild stress may immediately alter the workings of our brains in ways that undermine willpower. For their study, published this month in Neuron, researchers recruited 51 young men who said they were trying to maintain a healthy diet and lifestyle. The men were divided into two groups, one of which served as a control, and then all were asked to skim through images of different kinds of food on a computer screen, rating them for taste and healthfulness. Next, the men in the experimental group were told to plunge a hand into a bowl of icy water for as long as they could, a test known to induce mild physiological and psychological stress. Relative to the control group, the men developed higher levels of cortisol, a stress hormone. After that, men from each group sat in a brain-scanning machine and watched pictures of paired foods flash across a screen. Generally, one of the two foods was more healthful than the other. The subjects were asked to click rapidly on which food they would choose to eat, knowing that at the end of the test they would actually be expected to eat one of these picks (chosen at random from all of their choices). © 2015 The New York Times Company
By Mitch Leslie Some microbes that naturally dwell in our intestines might be bad for our eyes, triggering autoimmune uveitis, one of the leading causes of blindness. A new study suggests that certain gut residents produce proteins that enable destructive immune cells to enter the eyes. The idea that gut microbes might promote autoimmune uveitis “has been there in the back of our minds,” says ocular immunologist Andrew Taylor of the Boston University School of Medicine, who wasn’t connected to the research. “This is the first time that it’s been shown that the gut flora seems to be part of the process.” As many as 400,000 people in the United States have autoimmune uveitis, in which T cells—the commanders of the immune system—invade the eye and damage its middle layer. All T cells are triggered by specific molecules called antigens, and for T cells that cause autoimmune uveitis, certain eye proteins are the antigens. Even healthy people carry these T cells, yet they don't usually swarm the eyes and unleash the disease. That's because they first have to be triggered by their matching antigen. However, those proteins don't normally leave the eye. So what could stimulate the T cells? One possible explanation is microbes in the gut. In the new study, immunologist Rachel Caspi of the National Eye Institute in Bethesda, Maryland, and colleagues genetically engineered mice so their T cells recognized one of the same eye proteins targeted in autoimmune uveitis. The rodents developed the disease around the time they were weaned. But dosing the animals with four antibiotics that killed off most of their gut microbes delayed the onset and reduced the severity of the disease. © 2015 American Association for the Advancement of Science.
Carl Zimmer You are what you eat, and so were your ancient ancestors. But figuring out what they actually dined on has been no easy task. There are no Pleistocene cookbooks to consult. Instead, scientists must sift through an assortment of clues, from the chemical traces in fossilized bones to the scratch marks on prehistoric digging sticks. Scientists have long recognized that the diets of our ancestors went through a profound shift with the addition of meat. But in the September issue of The Quarterly Review of Biology, researchers argue that another item added to the menu was just as important: carbohydrates, bane of today’s paleo diet enthusiasts. In fact, the scientists propose, by incorporating cooked starches into their diet, our ancestors were able to fuel the evolution of our oversize brains. Roughly seven million years ago, our ancestors split off from the apes. As far as scientists can tell, those so-called hominins ate a diet that included a lot of raw, fiber-rich plants. After several million years, hominins started eating meat. The oldest clues to this shift are 3.3-million-year-old stone tools and 3.4-million-year-old mammal bones scarred with cut marks. The evidence suggests that hominins began by scavenging meat and marrow from dead animals. At some point hominins began to cook meat, but exactly when they invented fire is a question that inspires a lot of debate. Humans were definitely making fires by 300,000 years ago, but some researchers claim to have found campfires dating back as far as 1.8 million years. Cooked meat provided increased protein, fat and energy, helping hominins grow and thrive. But Mark G. Thomas, an evolutionary geneticist at University College London, and his colleagues argue that there was another important food sizzling on the ancient hearth: tubers and other starchy plants. © 2015 The New York Times Company
By James Gallagher Health editor, BBC News website Cutting fat from your diet leads to more fat loss than reducing carbohydrates, a US health study shows. Scientists intensely analysed people on controlled diets by inspecting every morsel of food, minute of exercise and breath taken. Both diets, analysed by the National Institutes of Health, led to fat loss when calories were cut, but people lost more when they reduced fat intake. Experts say the most effective diet is one people can stick to. It has been argued that restricting carbs is the best way to get rid of a "spare tyre" as it alters the body's metabolism. The theory goes that fewer carbohydrates lead to lower levels of insulin, which in turn lead to fat being released from the body's stores. "All of those things do happen with carb reduction and you do lose body fat, but not as much as when you cut out the fat," said lead researcher Dr. Kevin Hall, from the US-based National Institute of Diabetes and Digestive and Kidney Diseases. In the study, 19 obese people were initially given 2,700 calories a day. Then, over a period of two weeks they tried diets which cut their calorie intake by a third, either by reducing carbohydrates or fat. The team analysed the amount of oxygen and carbon dioxide being breathed out and the amount of nitrogen in participants' urine to calculate precisely the chemical processes taking place inside the body. The results, published in Cell Metabolism, showed that after six days on each diet, those reducing fat intake lost an average 463g of body fat - almost 90% more than those cutting down on carbs, whose average loss was 245g. © 2015 BBC.
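A back-of-envelope check puts those fat-loss figures in context. Assuming the standard textbook value of roughly 9 kcal per gram of fat (the function and numbers below are illustrative, not taken from the paper itself):

```python
KCAL_PER_G_FAT = 9.0  # approximate energy density of pure fat

def implied_daily_deficit_kcal(fat_lost_g: float, days: int) -> float:
    """Average daily energy deficit accounted for by measured fat loss."""
    return fat_lost_g * KCAL_PER_G_FAT / days

# Six days on each diet in the study:
fat_cut_deficit = implied_daily_deficit_kcal(463, 6)   # 694.5 kcal/day
carb_cut_deficit = implied_daily_deficit_kcal(245, 6)  # 367.5 kcal/day
```

Both figures fall well short of the roughly 900-kcal/day cut from the 2,700-calorie baseline, consistent with the body also drawing on glycogen, water and lean tissue rather than burning fat alone.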