Chapter 13. Homeostasis: Active Regulation of the Internal Environment




By Lisa Sanders, M.D. The early-morning light woke the middle-aged man on a Saturday in 2003. He felt his 51-year-old wife move behind him and turned to see her whole body jerking erratically. He was a physician, a psychiatrist, and knew immediately that she was having a seizure. He grabbed his phone and dialed 911. His healthy, active wife had never had a seizure before. But this was only the most recent strange episode she had been through over the past 18 months. A year and a half earlier, the man returned to his suburban Pittsburgh home after a day of seeing patients and found his wife sitting in the kitchen, her hair soaking wet. He asked if she had just taken a shower. No, she answered vaguely, without offering anything more. Before he could ask her why she was so sweaty, their teenage son voiced his own observations. Earlier that day, the boy reported, “She wasn’t making any sense.” That wasn’t like her. Weeks later, his daughter reported that when she arrived home from school, she heard a banging sound in a room in the attic. She found her mother under a futon bed, trying to sit up and hitting her head on the wooden slats underneath. Her mother said she was looking for something, but she was obviously confused. The daughter helped her mother up and brought her some juice, which seemed to help. With both episodes, the children reported that their mother didn’t seem upset or distressed. The woman, who had trained as a psychiatrist before giving up her practice to stay with the kids, had no recollection of these odd events.
The Problem Is Sugar
Her husband persuaded her to see her primary-care doctor. Upon hearing about these strange spells, the physician said she suspected that her patient was having episodes of hypoglycemia. Very low blood sugar sends the body into a panicked mode of profuse sweating, shaking, weakness and, in severe cases, confusion. The doctor referred her to a local endocrinologist. © 2020 The New York Times Company

Keyword: Epilepsy
Link ID: 27341 - Posted: 07.02.2020

By Abigail Zuger, M.D. Do I dare to eat a Cheeto? I do not; I can’t even let one into the house. The same goes for its delectably plump twin, the Cheez Doodle; its tasty rotund cousin, the Cheez Ball; and its heavenly brother by another mother, that sandwich of two Cheezy crackers glued together with peanut butter. I dare not even walk down the supermarket aisle where this neon orange family lives, for while others may succumb to chocolate or pastry, my Waterloo is this cheesy goodness — let’s call it Cheez. One Cheez Doodle would lead to a bag, then to more bags, and then to the certain catastrophe of a larger, sicker me. I know these delicacies are terrible for a person’s health. How exactly do I know that? It’s not because I’m a medical professional, that’s for sure; there were zero discussions of Cheez in our pre- or post-graduate training. I know because I just know, is all. Overprocessed chemical-laden stuff is bad for you; it’s pure malevolent junk. Everyone knows that. George Zaidan, an MIT-trained chemist of contrarian bent, knows it too. That is, he knows it to be piously reiterated received wisdom, and thus legitimate fodder for dissection, examination, refutation, and cheerfully self-indulgent obscenity-laden riffs. Further, he has chosen this junk food truth as an excellent starting point for “Ingredients: The Strange Chemistry of What We Put in Us and On Us,” an entertaining and enlightening jaunt around the perimeters of exactly what we can ever hope science can teach us about stuff that is good and bad for us. And it all begins with a single Cheeto, the putative first brick on the winding golden road to nutritional hell.

Keyword: Obesity
Link ID: 27329 - Posted: 06.27.2020

By Susan Burton “Diet” is a strange word, used to describe both a deviation from the norm and the norm itself: the foods that make up a day, a week, a lifetime. From the beginning, my diet was a big part of my story, even the one that others told about me. “All babies like rice cereal,” my mother will say. “But you didn’t.” In the high chair, I would tighten my lips and turn away. When I was two, at the first preschool parent-teacher conference, they told my mother, “Susan never eats snack.” Recalling encounters with foods I disliked as a small child raises an old alarm in me. A sip of a soda at the zoo one afternoon, the prickling shock of the bubbles. It would be more than a decade before I would try something fizzy again. Melba toast at a white-tablecloth restaurant in Chicago. The next day, I vomited. The bright yellow worm of mustard on a hot dog at a public beach. The jagged chopped nuts on a hot-fudge sundae, even though I’d asked for it plain. In any choice related to food, I always preferred plain. I went through primary school never eating a salad or a single bite of fruit. The term “picky eater” didn’t apply to me. Picky eaters had to be reminded to pay attention to their plates. But I never forgot about food, in the way you never forget about anything you fear. I was scared of feeling sick. I was scared of not liking tastes. I was scared of something getting in me that I could never get out. I was scared of something happening to my body that would make me not me. In many ways, my adolescence was stereotypical. I was an awkward middle-schooler who transformed herself with the help of Seventeen magazine. I stood in bleachers at Friday-night football games. I read Sylvia Plath and wrote furiously in my journal. I learned to smoke cigarettes on a weekday afternoon in a wood-panelled car. I signed the notes I passed in class “Love, Susan.” I tried to be the perfect teen-age girl. But I was also a troubled one, and the dark part of my adolescence became its heart. © 2020 Condé Nast.

Keyword: Anorexia & Bulimia
Link ID: 27306 - Posted: 06.17.2020

It’s not surprising that the fruit fly larva in the laboratory of Jimena Berni crawls across its large plate of agar in search of food. “A Drosophila larva is either eating or not eating, and if it’s not eating, it wants to eat,” she said. The surprise is that this larva can search for food at all. Owing to a suite of genetic tricks performed by Berni, it has no functional brain. In fact, the systems that normally relay sensations of touch and feedback from its muscles have also been shut down. Berni, an Argentinian neuroscientist whose investigations of fruit fly nervous systems recently earned her a group leader position at the University of Sussex, is learning what the tiny cluster of neurons that directly controls the larva’s muscles does when it’s allowed to run free, entirely without input from the brain or senses. How does the animal forage when it’s cut off from information about the outside world? The answer is that it moves according to a very particular pattern of random movements, a finding that thrilled Berni and her collaborator David Sims, a professor of marine ecology at the Marine Biological Association in Plymouth, U.K. For in its prowl for food, this insensate maggot behaves exactly like an animal Sims has studied for more than 25 years — a shark. In neuroscience, the usual schema for considering behavior has it that the brain receives inputs, combines them with stored information, then decides what to do next. This corresponds to our own intuitions and experiences, because we humans are almost always responding to what we sense and remember. But for many creatures, useful information isn’t always available, and for them something else may also be going on. When searching their environment, sharks and a diverse array of other species, now including fruit fly larvae, sometimes default to the same pattern of movement, a specific type of random motion called a Lévy walk. All Rights Reserved © 2020
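
The Lévy walk mentioned above can be illustrated with a short simulation. The Python sketch below is not the researchers' analysis code; it simply contrasts a two-dimensional Lévy walk, whose step lengths follow a heavy-tailed power law, with an ordinary Brownian walk. The power-law exponent mu = 2 and the minimum step length of 1 are illustrative assumptions.

import numpy as np

def levy_walk(n_steps, mu=2.0, rng=None):
    # Headings are uniform; step lengths follow p(l) ~ l**(-mu) for l >= 1,
    # drawn by inverse-transform sampling from a Pareto distribution.
    rng = np.random.default_rng() if rng is None else rng
    lengths = (1.0 - rng.random(n_steps)) ** (-1.0 / (mu - 1.0))
    angles = rng.uniform(0.0, 2.0 * np.pi, n_steps)
    steps = np.column_stack((lengths * np.cos(angles), lengths * np.sin(angles)))
    return np.vstack(([0.0, 0.0], np.cumsum(steps, axis=0)))

def brownian_walk(n_steps, rng=None):
    # Ordinary random walk with unit-scale Gaussian steps, for contrast.
    rng = np.random.default_rng() if rng is None else rng
    return np.vstack(([0.0, 0.0], np.cumsum(rng.normal(0.0, 1.0, (n_steps, 2)), axis=0)))

if __name__ == "__main__":
    levy, brown = levy_walk(1000), brownian_walk(1000)
    # A Levy walker's occasional very long steps separate clusters of short,
    # local search steps, so it typically ranges far wider than a Brownian walker.
    print("Levy net displacement:    ", round(float(np.linalg.norm(levy[-1])), 1))
    print("Brownian net displacement:", round(float(np.linalg.norm(brown[-1])), 1))

In the foraging literature, exponents near mu = 2 are the ones usually described as efficient for finding sparse, randomly located food, which is the pattern the larvae and sharks are reported to share.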

Keyword: Learning & Memory; Aggression
Link ID: 27301 - Posted: 06.13.2020

Ruth Williams Research teams in the US and Japan have each discovered independently and by unrelated routes a population of hypothalamic neurons in mice that induce the low body temperature, reduced metabolism, and inactivity characteristic of hibernation and torpor. The two papers are published today (June 11) in Nature. “Trying to pin down which neurons are involved with initiating torpor and hibernation . . . is certainly something that biologists have been interested in for several years now,” says biologist Steven Swoap of Williams College who was not involved in the research. “Both of [the teams] come at it from a different angle and almost end up in the same place, so they complement each other in that way, which is pretty nice,” he adds. Hibernation and daily torpor are both forms of mammalian suspended animation and share a number of features. Both involve significant, but regulated, drops in body temperature, metabolism, heart rate, breathing rate, and activity, and both are thought to be ways of preserving energy when food is scarce. While hibernation lasts for weeks or months, however, daily torpor lasts several hours each day. Why some mammals such as bears and certain primates and rodents have the ability to enter periods of dormancy while others don’t is unknown. But the diversity of hibernator species suggests that the biological mechanisms controlling such states may also be preserved, albeit unused, in non-hibernating species. This tantalizing possibility sparks ideas of sending dormant astronauts on extended space journeys as well as more down-to-earth notions of temporarily lowering body temperature and metabolism to preserve tissues in patients with, for example, traumatic injuries. © 1986–2020 The Scientist.

Keyword: Sleep
Link ID: 27300 - Posted: 06.13.2020

By David Templeton For much of the 20th century, most people thought that stress caused stomach ulcers. But that belief was largely dismissed 38 years ago when a study, which led to a Nobel Prize in 2005, described the bacterium that generates inflammation in the gastrointestinal tract and causes peptic ulcers and gastritis. “The history of the idea that stress causes ulcers took a side step with the discovery of Helicobacter pylori,” said Dr. David Levinthal, director of the University of Pittsburgh Neurogastroenterology & Motility Center. “For the longest time — most of the 20th century — the dominant idea was that stress was the cause of ulcers until the early 1980s with discovery of Helicobacter pylori that was tightly linked to the risk of ulcers. That discovery was critical but maybe over-generalized as the only cause of ulcers.” Now, in an important world first, a study co-authored by Levinthal and Peter Strick, both from the Pitt School of Medicine, has explained which parts of the brain’s cerebral cortex influence stomach function and how that influence can affect health. “Our study shows that the activity of neurons in the cerebral cortex, the site of conscious mental function, can impact the ability of bacteria to colonize the stomach and make the person more sensitive to it or more likely to harbor the bacteria,” Levinthal said. The study goes far beyond ulcers by also providing evidence against the longstanding belief that the brain’s influence on the stomach was more reflexive and with limited, if any, involvement of the thinking brain. And for the first time, the study also provides a general blueprint of neural wiring that controls the gastrointestinal tract. © 2020 StarTribune.

Keyword: Obesity
Link ID: 27286 - Posted: 06.06.2020

In a nationwide study, NIH-funded researchers found that the presence of abnormal bundles of brittle blood vessels in the brain or spinal cord, called cavernous angiomas (CA), is linked to the composition of a person’s gut bacteria. Also known as cerebral cavernous malformations, these lesions, which contain slow-moving or stagnant blood, can often cause hemorrhagic strokes, seizures, or headaches. Current treatment involves surgical removal of lesions when it is safe to do so. Previous studies in mice and a small number of patients suggested a link between CA and gut bacteria. This study is the first to examine the role the gut microbiome may play in a larger population of CA patients. Led by scientists at the University of Chicago, the researchers used advanced genomic analysis techniques to compare stool samples from 122 people who had at least one CA, as seen on brain scans, with those from age- and sex-matched non-CA control participants, including samples collected through the American Gut Project. Initially, they found that on average the CA patients had more gram-negative bacteria whereas the controls had more gram-positive bacteria, and that the relative abundance of three gut bacterial species distinguished CA patients from controls regardless of a person’s sex, geographic location, or genetic predisposition to the disease. Moreover, gut bacteria from the CA patients appeared to produce more lipopolysaccharide molecules, which have been shown to drive CA formation in mice. According to the authors, these results provided the first demonstration in humans of a “permissive microbiome” associated with the formation of neurovascular lesions in the brain.
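
As a purely illustrative aside, "relative abundance" here simply means each bacterial taxon's share of the total sequencing reads in a sample. The toy Python sketch below is not the study's pipeline, and the taxon names and counts are invented; it only shows the kind of per-sample normalization the comparison rests on.

def relative_abundance(counts):
    # Convert raw read counts per taxon into fractions of the sample total.
    total = sum(counts.values())
    return {taxon: n / total for taxon, n in counts.items()}

# Invented example samples (read counts per taxon); real data would span
# hundreds of taxa per participant.
ca_patient = {"taxon_A": 120, "taxon_B": 300, "taxon_C": 580}
control = {"taxon_A": 40, "taxon_B": 520, "taxon_C": 440}

for label, sample in [("CA patient", ca_patient), ("control", control)]:
    fractions = relative_abundance(sample)
    print(label, {taxon: round(f, 2) for taxon, f in fractions.items()})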

Keyword: Stroke
Link ID: 27280 - Posted: 06.04.2020

Ruth Williams Experiments in mice and observations in humans have suggested the bone protein osteocalcin acts as a hormone regulating, among other things, metabolism, fertility, exercise capacity and acute stress. That interpretation is now partially in doubt. Two independent papers published yesterday (May 28) in PLOS Genetics, each of which presents a new osteocalcin knockout mouse strain, report that glucose metabolism and fertility were unaffected in the animals. While some researchers praise the studies, others highlight weaknesses. “I thought they were very good papers. I think the authors should be congratulated for very comprehensive studies of both skeletal and extraskeletal functions of osteocalcin,” says emeritus bone researcher Caren Gundberg of Yale School of Medicine who was not involved in the research. Skeletal biologist Gerard Karsenty of Columbia University disagrees. “There have been 25 laboratories in the world . . . that have shown osteocalcin is a hormone,” says Karsenty. These two papers “do not affect the work of [those] groups,” he adds, “because they are . . . technically flawed.” This tiny protein, one of the most abundant in the body, is produced and secreted by bone-forming osteoblast cells. In the 40 or so years since osteocalcin’s discovery, its precise function, or functions—whether in the bone or endocrine system—have not been fully pinned down. Studies from Karsenty’s lab more than 10 years ago were the first to indicate that osteocalcin could act as a hormone, regulating glucose metabolism. But the suggested hormonal function has been questioned for its relevance to humans. For example, while studies in people have shown that levels of osteocalcin in the blood are correlated with diabetes, whether this is a cause or effect is unclear. © 1986–2020 The Scientist.

Keyword: Hormones & Behavior; Obesity
Link ID: 27275 - Posted: 06.03.2020

By Nicholas Bakalar Eating foods high in flavonoids — a group of nutrients found in many fruits and vegetables — may lower your risk for dementia, researchers report. The study, in the American Journal of Clinical Nutrition, looked at 2,801 men and women who were 50 and older and free of dementia at the start. Over an average of 20 years of follow-up, researchers gathered diet information at five periodic health examinations; during that time, 193 of the participants developed Alzheimer’s disease or other forms of dementia. Compared with those in the 15th percentile or lower for flavonoid intake, those in the 60th or higher had a 42 to 68 percent lower risk for dementia, depending on the type of flavonoid consumed. Intake of one type of flavonoid, anthocyanins, abundant in blueberries, strawberries and red wine, had the strongest association with lowered risk. Apples, pears, oranges, bananas and tea also contributed. The study controlled for many health and behavioral characteristics, including how strongly participants adhered to the government’s Dietary Guidelines for Americans, which in addition to fruits and vegetables emphasize whole grains, lean meats and other heart-healthy foods. The senior author, Paul F. Jacques, a scientist with the Jean Mayer USDA Human Nutrition Research Center on Aging at Tufts University, said that the amount consumed by those who benefited the most was not large. Their monthly average was about seven half-cup servings of strawberries or blueberries, eight apples or pears, and 17 cups of tea. “It doesn’t take much,” he said. “A couple of servings of berries a week, maybe an apple or two.” © 2020 The New York Times Company

Keyword: Alzheimers
Link ID: 27255 - Posted: 05.20.2020

By Susan Burton I ordered heritage flour from Minnesota and made a loaf of bread with a crackling crust. Those are facts. But what is the tone of that sentence? Am I bragging about my baking prowess, my ingredient sourcing, and the privilege that allows me to spend the pandemic in the kitchen? Or is the sentence a setup to a tear-down of entitlement? Or the beginning of an essay about an activity that brings many, including me, comfort amid uncertainty? All of these; none of them. Really I am writing that sentence the way I have always written any sentence about food: As someone with an eating disorder, someone who is working toward recovery but is not yet recovered. Stay-at-home orders present special challenges for people with eating disorders. The kitchen is always there: You can’t get away from it. You can’t get away from food online, either, where it’s more present than ever: Sourdough starters and bean shortages and the ease with which people with healthier, typical relationships with food joke about these things, or fill their Instagrams with photos of family meals. I don’t begrudge others that ease; I long for it. Eating disorders are isolating. They are often misunderstood, perceived as the kind of thing you could get over if you just got a grip. Right now, many in our country are suffering profoundly, facing death and loss of livelihoods. Being able to afford food is a marker of privilege. Shouldn’t our primary relationship with food be one of gratitude for it? It’s not that simple for people with eating disorders. For someone with an active eating disorder, food can be an agent of destruction. For someone in recovery, isolation can prompt a shift to old coping mechanisms. Eating disorder outreach has risen online: On Instagram, @covid19eatingsupport provides “meal support” — somebody to eat with. The National Eating Disorders Association offers video sessions that explore subjects such as family dynamics during quarantine and eating disorders during midlife. © 2020 The New York Times Company

Keyword: Anorexia & Bulimia
Link ID: 27246 - Posted: 05.14.2020

African Americans with severe sleep apnea and other adverse sleep patterns are much more likely to have high blood glucose levels — a risk factor for diabetes — than those without these patterns, according to a new study funded in part by the National Heart, Lung, and Blood Institute (NHLBI), part of the National Institutes of Health. The findings suggest that better sleep habits may lead to better blood glucose control and prove beneficial for type 2 diabetes prevention and diabetes management in African Americans, who are at higher risk for type 2 diabetes than other groups. They also point to the importance of screening for sleep apnea to help fight the potential for uncontrolled blood sugar in this high-risk group, the researchers said. Previous studies have linked disturbed sleep patterns, including sleep apnea, to increased blood glucose levels in white and Asian populations. But this new study is one of the few to use objective measurements to link these disturbed sleep patterns to increased blood glucose levels in black men and women, the researchers said. Their findings appear online on April 28 in the Journal of the American Heart Association. “The study underscores the importance of developing interventions to promote regular sleep schedules, particularly in those with diabetes,” said Yuichiro Yano, M.D., Ph.D., the lead study author and a researcher in the Department of Family Medicine and Community Health at Duke University. “It also reaffirms the need to improve the screening and diagnosis of sleep apnea, both in African Americans and other groups.”

Keyword: Sleep
Link ID: 27219 - Posted: 04.29.2020

By Roni Caryn Rabin Obesity may be one of the most important predictors of severe coronavirus illness, new studies say. It’s an alarming finding for the United States, which has one of the highest obesity rates in the world. Though people with obesity frequently have other medical problems, the new studies point to the condition in and of itself as the most significant risk factor, after only older age, for being hospitalized with Covid-19, the illness caused by the coronavirus. Young adults with obesity appear to be at particular risk, studies show. The research is preliminary, and not peer reviewed, but it buttresses anecdotal reports from doctors who say they have been struck by how many seriously ill younger patients of theirs with obesity are otherwise healthy. No one knows why obesity makes Covid-19 worse, but hypotheses abound. Some coronavirus patients with obesity may already have compromised respiratory function that preceded the infection. Abdominal obesity, more prominent in men, can cause compression of the diaphragm, lungs and chest capacity. Obesity is known to cause chronic, low-grade inflammation and an increase in circulating, pro-inflammatory cytokines, which may play a role in the worst Covid-19 outcomes. Some 42 percent of American adults — more than 100 million people — live with obesity. That is a prevalence rate far exceeding those of other countries hit hard by the coronavirus, like China and Italy. The new findings about obesity risks are bad news for all Americans, but particularly for African-Americans and other people of color, who have higher rates of obesity and are already bearing a disproportionate burden of Covid-19 deaths. High rates of obesity are also prevalent among low-income white Americans, who may also be adversely affected, experts say. More than half of Covid-19 deaths in the United States so far have been in New York and New Jersey, but the new findings mean the coronavirus could exact a steep toll in regions like the South and the Midwest, where obesity is more prevalent than in the Northeast. © 2020 The New York Times Company

Keyword: Obesity; Neuroimmunology
Link ID: 27205 - Posted: 04.17.2020

According to a recent analysis of data from two major eye disease studies, adherence to the Mediterranean diet – high in vegetables, whole grains, fish, and olive oil – correlates with higher cognitive function. Dietary factors also seem to play a role in slowing cognitive decline. Researchers at the National Eye Institute (NEI), part of the National Institutes of Health, led the analysis of data from the Age-Related Eye Disease Study (AREDS) and AREDS2. They published their results today in Alzheimer’s and Dementia: the Journal of the Alzheimer’s Association. “We do not always pay attention to our diets. We need to explore how nutrition affects the brain and the eye,” said Emily Chew, M.D., director of the NEI Division of Epidemiology and Clinical Applications and lead author of the studies. The researchers examined the effects of nine components of the Mediterranean diet on cognition. The diet emphasizes consumption of whole fruits, vegetables, whole grains, nuts, legumes, fish, and olive oil, as well as reduced consumption of red meat and alcohol. Over a period of years, AREDS and AREDS2 assessed the effect of vitamins on age-related macular degeneration (AMD), which damages the light-sensitive retina. AREDS included about 4,000 participants with and without AMD, and AREDS2 included about 4,000 participants with AMD. The researchers assessed AREDS and AREDS2 participants for diet at the start of the studies. The AREDS study tested participants’ cognitive function at five years, while AREDS2 tested cognitive function in participants at baseline and again two, four, and 10 years later. The researchers used standardized tests based on the Modified Mini-Mental State Examination, as well as other tests, to evaluate cognitive function. They assessed diet with a questionnaire that asked participants their average consumption of each Mediterranean diet component over the previous year.
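
The article lists the nine diet components but not the exact scoring rule. A common convention in Mediterranean-diet studies is to award one point per component, with beneficial components scored at or above a cohort median and red meat and alcohol scored at or below it; the Python sketch below assumes that style of scoring purely for illustration, and all thresholds and intakes are invented.

BENEFICIAL = ["fruit", "vegetables", "whole_grains", "nuts", "legumes", "fish", "olive_oil"]

def med_diet_score(intake, cohort_medians):
    # One point for each beneficial component at or above the cohort median,
    # plus one point each for red meat and alcohol at or below the median
    # ("reduced consumption"), giving a 0-9 adherence score.
    score = sum(intake[c] >= cohort_medians[c] for c in BENEFICIAL)
    score += intake["red_meat"] <= cohort_medians["red_meat"]
    score += intake["alcohol"] <= cohort_medians["alcohol"]
    return int(score)

# Invented weekly-servings example.
participant = {"fruit": 14, "vegetables": 21, "whole_grains": 10, "nuts": 4,
               "legumes": 3, "fish": 2, "olive_oil": 7, "red_meat": 2, "alcohol": 4}
medians = {"fruit": 10, "vegetables": 14, "whole_grains": 7, "nuts": 3,
           "legumes": 2, "fish": 2, "olive_oil": 5, "red_meat": 3, "alcohol": 3}
print(med_diet_score(participant, medians))  # 8 for this invented participant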

Keyword: Alzheimers; Obesity
Link ID: 27192 - Posted: 04.15.2020

By Jennifer Couzin-Frankel In college in the 1990s, Alix Timko wondered why she and her friends didn’t have eating disorders. “We were all in our late teens, early 20s, all vaguely dissatisfied with how we looked,” says Timko, now a psychologist at Children’s Hospital of Philadelphia. Her crowd of friends matched the profile she had seen in TV dramas—overachievers who exercised regularly and whose eating was erratic, hours of fasting followed by “a huge pizza.” “My friends and I should have had eating disorders,” she says. “And we didn’t.” It was an early clue that her understanding of eating disorders was off the mark, especially for the direst diagnosis of all: anorexia nervosa. Anorexia is estimated to affect just under 1% of the U.S. population, with many more who may go undiagnosed. The illness manifests as self-starvation and weight loss so extreme that it can send the body into a state resembling hibernation. Although the disorder also affects boys and men, those who have it are most often female, and about 10% of those affected die. That’s the highest mortality rate of any psychiatric condition after substance abuse, on par with that of childhood leukemia. With current treatments, about half of adolescents recover, and another 20% to 30% are helped. As a young adult, Timko shared the prevailing view of the disease: that it develops when girls, motivated by a culture that worships thinness, exert extreme willpower to stop themselves from eating. Often, the idea went, the behavior arises in reaction to parents who are unloving, controlling, or worse. But when Timko began to treat teens with anorexia and their families, that narrative crumbled—and so did her certainties about who is at risk. Many of those young people “don’t have body dissatisfaction, they weren’t on a diet, it’s not about control,” she found. “Their mom and dad are fabulous and would move heaven and Earth to get them better.” © 2020 American Association for the Advancement of Science

Keyword: Anorexia & Bulimia
Link ID: 27181 - Posted: 04.10.2020

Stephanie Preston The media is replete with COVID-19 stories about people clearing supermarket shelves – and the backlash against them. Have people gone mad? How can one individual be overfilling his own cart, while shaming others who are doing the same? As a behavioral neuroscientist who has studied hoarding behavior for 25 years, I can tell you that this is all normal and expected. People are acting the way evolution has wired them. The word “hoarding” might bring to mind relatives or neighbors whose houses are overfilled with junk. A small percentage of people do suffer from what psychologists call “hoarding disorder,” keeping excessive goods to the point of distress and impairment. But hoarding is actually a totally normal and adaptive behavior that kicks in any time there is an uneven supply of resources. Everyone hoards, even during the best of times, without even thinking about it. People like to have beans in the pantry, money in savings and chocolates hidden from the children. These are all hoards. Most Americans have had so much, for so long. People forget that, not so long ago, survival often depended on working tirelessly all year to fill root cellars so a family could last through a long, cold winter – and still many died. Similarly, squirrels work all fall to hide nuts to eat for the rest of the year. Kangaroo rats in the desert hide seeds the few times it rains and then remember where they put them to dig them back up later. A Clark’s nutcracker can hoard over 10,000 pine seeds per fall – and even remember where it put them. © 2010–2020, The Conversation US, Inc.

Keyword: Obesity; Attention
Link ID: 27149 - Posted: 03.30.2020

By Jane E. Brody Many people who have struggled for years with excess weight know that the hardest and often the most frustrating job is not getting it off but keeping it off. Recent decades have seen countless popular diet schemes that promised to help people shed unwanted pounds, and as each of these diets failed in the long run, they spawned their successors. A diet, after all, is something people go on to go off. Most people think of a diet as a means to an end, and few who go on a food-restricted diet to lose weight expect to have to eat that way indefinitely. And therein lies the rub, with the current unchecked epidemic of obesity as the sorry result. We live in a land of incredible excess. Rich or poor, most of us are surrounded by calorie-rich vittles, many of them tasty but deficient in ingredients that nourish healthy bodies. “We can’t go two minutes without being assaulted by a food cue,” said Suzanne Phelan, lead author of an encouraging new study in the journal Obesity. Even the most diligent dieters can find it hard to constantly resist temptation. And once people fall off the diet wagon, they often stay off, and their hard-lost pounds reappear a lot faster than it took to shed them. But these facts need not discourage anyone from achieving lasting weight loss. Researchers have identified the strategies and thought processes that have enabled many thousands of people to lose a significant amount of weight and keep it off for many years, myself among them. The new study led by Dr. Phelan, professor of kinesiology and public health at California Polytechnic State University, identified habits and strategies that can be keys to success for millions. Yes, like most sensible weight-loss plans, they involve healthful eating and regular physical activity. But they also include important self-monitoring practices and nonpunitive coping measures that can be crucial to long-term weight management. © 2020 The New York Times Company

Keyword: Obesity
Link ID: 27119 - Posted: 03.16.2020

Laura Reiley A study published in the journal Cell Metabolism by a group of Yale researchers found that the consumption of the common artificial sweetener sucralose (which is found in Splenda, Zerocal, Sukrana, SucraPlus and other brands) in combination with carbohydrates can swiftly turn a healthy person into one with high blood sugar. From whole grain English muffins to reduced-sugar ketchup, sucralose is found in thousands of baked goods, condiments, syrups and other consumer packaged goods — almost all of them containing carbs. The finding, which researchers noted has yet to be replicated in other studies, raises new questions about the use of artificial sweeteners and their effects on weight gain and overall health. In the Yale study, researchers took 60 healthy-weight individuals and separated them into three groups: A group that consumed a regular-size beverage containing the equivalent of two packets of sucralose sweetener, a second group that consumed a beverage sweetened with table sugar at the equivalent sweetness, and a third control group that had a beverage with the artificial sweetener as well as a carbohydrate called maltodextrin. The molecules of maltodextrin don’t bind to taste receptors in the mouth and are impossible to detect. While the third group’s beverage tasted identical to the Splenda-only group’s, only this group exhibited significant adverse health effects. The artificial sweetener by itself seemed to be fine, the researchers discovered, but that changed when combined with a carbohydrate. After seven such beverages over two weeks, the previously healthy people in this group became glucose intolerant, a metabolic condition that results in elevated blood glucose levels and puts people at an increased risk for diabetes.

Keyword: Obesity; Chemical Senses (Smell & Taste)
Link ID: 27113 - Posted: 03.12.2020

By Susana Martinez-Conde Parents tend to be just a bit biased about their children’s looks (not me though—my kids are objectively beautiful), but as it turns out, this type of self-deception is not as benign as one might think. According to recent research, many parents appear to suffer from a sort of denial concerning their kids’ weights, which poses a considerable obstacle to remediating childhood obesity by way of promoting healthy eating habits at home. The latest of such studies was published last month in the American Journal of Human Biology, and conducted by a team of scientists at the University of Coimbra in Portugal. Daniela Rodrigues and her collaborators, Aristides Machado-Rodrigues and Cristina Padez, recruited hundreds of parents and children for their research. All the participating children were between 6 and 10 years old and attended elementary school in Portugal. A total of 834 parents completed questionnaires that included a variety of questions, such as whether they thought that their children’s weight was a bit too little, a bit too much, way too much, or just fine. In turn, the team collected the weights and heights of the 793 participating children, at their respective schools. The results were in line with the researchers’ predictions, but nonetheless remarkable. Of the 33% of parents who misperceived their children’s weight, 93% underestimated it. Moreover, parents who underestimated their kids’ weights were 10 to 20 times more likely to have an obese child. Several factors were associated with the parental weight underestimation, including a higher BMI (body mass index) for the mothers, younger ages for the children, lower household income (for girls) and urban living (for boys). However, such associations did not explain why parents underestimated their children’s weights to begin with. © 2020 Scientific American
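
For readers unfamiliar with the measurement side of such studies: BMI is weight in kilograms divided by height in meters squared, and a child's BMI is then compared against age- and sex-specific reference cutoffs to assign a weight category that can be set against the parent's answer. The Python sketch below is only an illustration; the cutoff numbers and the example child are invented and stand in for a single age and sex group.

def bmi(weight_kg, height_m):
    # Body mass index: weight divided by height squared (kg per square meter).
    return weight_kg / height_m ** 2

def weight_category(bmi_value, cutoffs=(14.0, 18.0, 21.6)):
    # Hypothetical cutoffs for one age/sex group; real studies look these up
    # in reference tables by the child's age and sex.
    underweight, overweight, obese = cutoffs
    if bmi_value < underweight:
        return "underweight"
    if bmi_value < overweight:
        return "normal weight"
    if bmi_value < obese:
        return "overweight"
    return "obese"

# Invented example: a child measured at school versus the parent's report.
measured = weight_category(bmi(38.0, 1.30))  # about 22.5 kg/m^2 -> "obese"
parent_says = "just fine"
underestimated = parent_says == "just fine" and measured in ("overweight", "obese")
print(measured, underestimated)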

Keyword: Obesity; Attention
Link ID: 27106 - Posted: 03.09.2020

Amelia Hill A low-carbohydrate diet may prevent and even reverse age-related damage to the brain, research has found. By examining brain scans, researchers found that brain pathways begin to deteriorate in our late 40s – earlier than was believed. “Neurobiological changes associated with ageing can be seen at a much younger age than would be expected, in the late 40s,” said Lilianne R Mujica-Parodi, a professor in the department of biomedical engineering at Stony Brook University in New York. “However, the study also suggests that this process may be prevented or reversed based on dietary changes that involve minimising the consumption of simple carbohydrates,” added Mujica-Parodi. To better understand how diet influences brain ageing, researchers concentrated on young people whose brains showed no signs of ageing. This is the period during which prevention may be most effective. Using brain scans of nearly 1,000 individuals between the ages of 18 and 88, researchers found that the damage to neural pathways accelerated depending on where the brain was getting its energy from. Glucose, they found, decreased the stability of the brain’s networks while ketones – produced by the liver during periods of carbohydrate-restrictive diets – made the networks more stable. “What we found with these experiments involves both bad and good news,” said Mujica-Parodi, “The bad news is that we see the first signs of brain ageing much earlier than was previously thought. “However, the good news is that we may be able to prevent or reverse these effects with diet … by exchanging glucose for ketones as fuel for neurons,” she added in the study, which is published in PNAS. © 2020 Guardian News & Media Limited

Keyword: Alzheimers; Obesity
Link ID: 27103 - Posted: 03.07.2020

By Gretchen Reynolds Taking up exercise could alter our feelings about food in surprising and beneficial ways, according to a compelling new study of exercise and eating. The study finds that novice exercisers start to experience less desire for fattening foods, a change that could have long-term implications for weight control. The study also shows, though, that different people respond quite differently to the same exercise routine and the same foods, underscoring the complexities of the relationship between exercise, eating and fat loss. I frequently write about exercise and weight, in part because weight control is a pressing motivation for so many of us to work out, myself included. But the effects of physical activity on waistlines are not straightforward and coherent. They are, in fact, distressingly messy. Both personal experience and extensive scientific studies tell us that a few people will lose considerable body fat when they start exercising; others will gain; and most will drop a few pounds, though much less than would be expected given how many calories they are burning during their workouts. At the same time, physical activity seems to be essential for minimizing weight gain as we age and maintaining weight loss if we do manage to shed pounds. Precisely how exercise influences weight in this topsy-turvy fashion is uncertain. On the one hand, most types of exercise increase appetite in most people, studies show, tempting us to replace calories, blunting any potential fat loss and even initiating weight creep. But other evidence suggests that physical fitness may affect people’s everyday responses to food, which could play a role in weight maintenance. In some past studies, active people of normal weight displayed less interest in high-fat, calorie-dense foods than inactive people who were obese. © 2020 The New York Times Company

Keyword: Obesity
Link ID: 27077 - Posted: 02.27.2020