Chapter 16
By Susan Milius In nighttime flying duels, Mexican free-tailed bats make short, wavering sirenlike waaoo-waaoo sounds that jam each other’s sonar. These “amazing aerial battles” mark the first examples of echolocating animals routinely sabotaging the sonar signals of their own kind, says Aaron Corcoran of Wake Forest University in Winston-Salem, N.C. Many bats, like dolphins, several cave-dwelling birds and some other animals, locate prey and landscape features by pinging out sounds and listening for echoes. Some prey, such as tiger moths, detect an incoming attack and make frenzied noises that can jam bat echolocation, Corcoran and his colleagues showed in 2009 (SN: 1/31/09, p. 10). And hawkmoths under attack make squeaks with their genitals in what also may be defensive jamming (SN Online: 7/3/13). But Corcoran didn’t expect bat-on-bat ultrasonic warfare. He was studying moths dodging bats in Arizona’s Chiricahua Mountains when his equipment picked up a feeding buzz high in the night sky. A free-tailed bat was sending faster and faster echolocation calls to refine the target position during the final second of an attack. (Bats, the only mammals known with superfast muscles, can emit more than 150 sounds a second.) Then another free-tailed bat gave a slip-sliding call. Corcoran, in a grad student frenzy of seeing his thesis topic as relevant to everything, thought the call would be a fine way to jam a buzz. “Then I totally told myself that’s impossible — that’s too good to be true.” Five years later he concluded he wasn’t just hearing things. He and William Conner, also of Wake Forest, report in the Nov. 7 Science that the up-and-down call can cut capture success by about 70 percent. Using multiple microphones, he found that one bat jams another, swoops toward the moth and gets jammed itself. © Society for Science & the Public 2000 - 2014
Link ID: 20435 - Posted: 12.20.2014
by Andy Coghlan If quitting smoking is one of your New Year's resolutions, we might have just the thing. Cytisine, a plant extract commonly used in eastern Europe to wean people off cigarettes, appears to be much better at the task than nicotine replacement patches and gums. Not to be confused with the DNA building block cytosine, cytisine is an alkaloid extract from the laburnum or golden rain tree (Laburnum anagyroides), which grows all over Europe. It works by blocking nicotine's access to the brain's pleasure receptors. Like nicotine, cytisine is toxic when ingested in large amounts but is safe at low doses. It is produced commercially mainly in Bulgaria and Poland, and has been used as a quitting aid in eastern European countries since the 1960s but is largely unknown elsewhere. Clinical trials carried out in the 60s and 70s did not meet US and European standards so did not lead to wider adoption. Researchers in New Zealand have now carried out a fresh trial of cytisine. They recruited 1310 smokers who intended to quit and gave exactly half of them cytisine as a course of tablets, taken daily in diminishing doses for 25 days. The other half received standard nicotine replacement therapy (NRT) – either as patches, gums or lozenges – for two months. The researchers noted the number of people who managed to abstain from smoking at one week, one month, two months and six months into the trial. Throughout, they found that people taking cytisine were less likely to have smoked than those using NRT. Six months in, 143 of the 655 cytisine recipients were still not smoking compared with 100 in the NRT group. © Copyright Reed Business Information Ltd.
Keyword: Drug Abuse
Link ID: 20434 - Posted: 12.20.2014
By Will Dunham WASHINGTON (Reuters) - You might want to be careful about who you call a birdbrain. Some of our feathered friends exhibit powers of perception that put humans to shame. Scientists said on Thursday that little songbirds known as golden-winged warblers fled their nesting grounds in Tennessee up to two days before the arrival of a fierce storm system that unleashed 84 tornadoes in southern U.S. states in April. The researchers said the birds were apparently alerted to the danger by sounds at frequencies below the range of human hearing. The storm killed 35 people, wrecked many homes, toppled trees and tossed vehicles around like toys, but the warblers were already long gone, flying up to 930 miles (1,500 km) to avoid the storm and reaching points as far away as Florida and Cuba, the researchers said. Local weather conditions were normal when the birds took flight from their breeding ground in the Cumberland Mountains of eastern Tennessee, with no significant changes in factors like barometric pressure, temperature or wind speeds. And the storm, already spawning tornadoes, was still hundreds of miles away. "This suggests that these birds can detect severe weather at great distances," said wildlife biologist David Andersen of the U.S. Geological Survey and the University of Minnesota, one of the researchers in the study published in the journal Current Biology. "We hypothesize that the birds were detecting infrasound from tornadoes that were already occurring when the storm was still quite distant from our study site," Andersen added.
|By Joshua A. Krisch There is a mystery on Tiwai Island. A large wildlife sanctuary in Sierra Leone, the island is home to pygmy hippopotamuses, hundreds of bird species and several species of primates, including Campbell’s monkeys. These monkeys communicate via an advanced language that primatologists and linguists have been studying for decades. Over time, experts nearly cracked the code behind monkey vocabulary. And then came krak. In the Ivory Coast’s Tai Forest, Campbell’s monkeys (Cercopithecus campbelli) use the term krak to indicate that a leopard is nearby and the term hok to warn of an eagle circling overhead. Primatologists indexed their monkey lexicon accordingly. But on Tiwai Island they found that those same monkeys used krak as a general alarm call—one that, occasionally, even referred to eagles. “Why on Earth were they producing krak when they heard an eagle?” asks co-author Philippe Schlenker, a linguist at France’s National Center for Scientific Research and professor at New York University. “For some reason krak, which is a leopard in the Tai Forest, seems to be recycled as a general alarm call on Tiwai Island.” In a paper published in the November 28 Linguistics and Philosophy Schlenker and his team applied logic and human linguistics to crack the krak code. Their findings imply that some monkey dialects can be just as sophisticated as human language. In 2009 a team of scientists travelled to Tai Forest with one mission—to terrify Campbell’s monkeys. Prior studies had collected monkey calls and then parsed vague meanings based on events that were already happening on the forest floor. But these primatologists set up realistic model leopards and played recordings of eagle screeches over loudspeakers. Their field experiments resulted in some of the best data available about how monkeys verbally respond to predators. © 2014 Scientific American
By Smitha Mundasad Health reporter, BBC News The precise part of the brain that gives people a sense of direction has been pinpointed by scientists. People with stronger nerve signals in their "internal compass" tended to be better navigators. The study, published in the journal Current Biology, suggested people get lost when their compass cannot keep up. The researchers in London hope the discovery will help explain why direction sense can deteriorate in conditions such as Alzheimer's disease. Scientists have long believed that such a signal existed within the brain, but until now it had been pure speculation. Volunteers were asked to navigate towards certain objects placed in the four corners of a virtual room. They were then asked to navigate the area, from memory alone, while their brains were being scanned by an MRI machine. The scans revealed a part of the brain - known as the entorhinal region - fired up consistently during the tasks. The stronger the signal in the region, the better the volunteers were at finding their way around correctly. Dr Hugo Spiers, who led the study, said: "Studies on London cab drivers have shown that the first thing they do when they work out a route is calculate which direction they need to head in. "We now know the entorhinal cortex is responsible for such calculations and the quality of the signals from this region seem to determine how good someone's navigational skills will be." Dr Martin Chadwick, who was also involved in the study, explained: "Our results provide evidence to support the idea that your internal compass readjusts as you move through the environment." BBC © 2014
Keyword: Animal Migration
Link ID: 20431 - Posted: 12.20.2014
By ADAM FRANK In the endless public wars between science and religion, Buddhism has mostly been given a pass. The genesis of this cultural tolerance began with the idea, popular in the 1970s, that Buddhism was somehow in harmony with the frontiers of quantum physics. While the silliness of “quantum spirituality” is apparent enough these days, the possibility that Eastern traditions might have something to say to science did not disappear. Instead, a more natural locus for that encounter was found in the study of the mind. Spurred by the Dalai Lama’s remarkable engagement with scientists, interest in Buddhist attitudes toward the study of the mind has grown steadily. But within the Dalai Lama’s cheerful embrace lies a quandary whose resolution could shake either tradition to its core: the true relationship between our material brains and our decidedly nonmaterial minds. More than evolution, more than inexhaustible arguments over God’s existence, the real fault line between science and religion runs through the nature of consciousness. Carefully unpacking that contentious question, and exploring what Buddhism offers its investigation, is the subject of Evan Thompson’s new book, “Waking, Dreaming, Being.” A professor of philosophy at the University of British Columbia, Thompson is in a unique position to take up the challenge. In addition to a career built studying cognitive science’s approach to the mind, he is intimate with the long history of Buddhist and Vedic commentary on the mind too. He also happens to be the son of the maverick cultural historian William Irwin Thompson, whose Lindisfarne Association proposed the “study and realization of a new planetary culture” (a goal that reveals a lot about its strengths and weaknesses). Growing up in this environment, the younger Thompson managed to pick up an enthusiasm for non-Western philosophical traditions and a healthy skepticism for their spiritualist assumptions. © 2014 The New York Times Company
Link ID: 20430 - Posted: 12.20.2014
By GINA KOLATA After three decades of failure, researchers have found a treatment that greatly improves the prognosis for people having the most severe and disabling strokes. By directly removing large blood clots blocking blood vessels in the brain, they can save brain tissue that would have otherwise died, enabling many to return to an independent life. The study, published online Wednesday in The New England Journal of Medicine and conducted by researchers in the Netherlands, is being met with an outpouring of excitement. One reason the treatment worked, researchers suspect, is that doctors used a new type of snare to grab the clots. It is a stent, basically a small wire cage, on the end of a catheter that is inserted in the groin and threaded through an artery to the brain. When the tip of the catheter reaches the clot, the stent is opened and pushed into the clot. It snags the clot, allowing the doctor to withdraw the catheter and pull out the stent with the clot attached. About 630,000 Americans each year have strokes caused by clots blocking blood vessels in the brain. In about a third to half, the clot is in a large vessel, which has potentially devastating consequences. People with smaller clots are helped by the lifesaving drug tPA, which dissolves them. But for those with big clots, tPA often does not help. Until now, no other treatments had been shown to work. One in five patients who had tPA alone recovered enough to return to living independently. But one in three who also had their clot removed directly were able to take care of themselves after their stroke. And that, said Dr. Larry B. Goldstein, director of the Duke Stroke Center, is “a significant and meaningful improvement in what people are able to do.” © 2014 The New York Times Company
Link ID: 20429 - Posted: 12.18.2014
By James Gallagher Health editor, BBC News website A link between autism and air pollution exposure during pregnancy has been suggested by scientists. The Harvard School of Public Health team said high levels of pollution had been linked to a doubling of autism in their study of 1,767 children. They said tiny particulate matter, which can pass from the lungs to the bloodstream, may be to blame. Experts said pregnant women should minimise their exposure, although the link had still to be proven. Air pollution is definitely damaging. The World Health Organization estimates it causes 3.7 million deaths each year. The study, published in Environmental Health Perspectives, investigated any possible link with autism. It analysed 245 children with autism and 1,522 without. By looking at estimated pollution exposure during pregnancy, based on the mother's home address, the scientists concluded high levels of pollution were more common in children with autism. The strongest link was with fine particulate matter - invisible specks of mineral dust, carbon and other chemicals - that enter the bloodstream and cause damage throughout the body. Yet, the research is unable to conclusively say that pollution causes autism as there could be other factors that were not accounted for in the study. Consistent pattern There is a large inherited component to autism, but lead researcher Dr Marc Weisskopf said there was mounting evidence that air pollution may play a role too. BBC © 2014
Jason G Goldman We humans don’t typically agree on all that much, but there is at least one thing that an impressive amount of us accept: which hand is easiest to control. If you use one hand for writing, you probably use the same one for eating as well, and most of us – around 85% of our species – prefer our right hands. In fact, "there has never been any report of a human population in which left-handed individuals predominate", according to archaeologist Natalie Uomini at the University of Liverpool in the UK. Lateralisation of limb use – that is, a bias towards one side or the other – usually begins in the brain. We know that some tasks are largely controlled by brain activity in the left hemisphere, while the right hemisphere governs other tasks. Confusingly, there is some crossing of nerves between the body and the brain, which means it’s actually the left side of the brain that has more control over the right side of the body and vice versa. In other words, the brain’s left hemisphere helps control the operation of the right hand, eye, leg and so on. Some argue that this division of neurological labour has been a feature of animals for half a billion years. Perhaps it evolved because it is more efficient to allow the two hemispheres to carry out different computations at the same time. The left side of the brain, for instance, might have evolved to carry out routine operations – things like foraging for food – while the right side was kept free to detect and react rapidly to unexpected challenges in the environment – an approaching predator, for instance. This can be seen in various fish, toads and birds, which are all more likely to attack prey seen in the right eye. © 2014 BBC.
By Candy Schulman My mother’s greatest fear was Alzheimer’s. She got Lewy body dementia, or LBD, instead. This little known, oddly named, debilitating illness afflicts an estimated 1.3 million Americans, the actor and comedian Robin Williams possibly among them. It is often misdiagnosed because its signs, such as hallucinations and body rigidity, do not seem like those of dementia, but in the end it robs people of themselves even more painfully. I first noticed my mother’s cognitive difficulties when she was 88. Until then, she’d led an extraordinarily active life: She was a competitive golfer with a bureau full of trophies, a painter and a sculptor. Every Hanukkah she hosted a lively feast for her eight grandchildren and nine great-grandchildren. This time, though, she needed my help planning, shopping and cooking. She was having difficulty with the guest list, trying to write every family member’s name on a piece of paper, adding up the numbers to see how many potatoes to buy for latkes. Her concentration became frayed and she kept ripping it up and starting again, close to tears. Several months before that, she had sent me a Mother’s Day card that was illustrated with childlike prose, colorful illustrations and glitter hearts. The poem on the cover was printed in a playful purple font: “For you, Mom. For kissing my boo-boos, for wiping my face. . . . For calming my fears with your loving embrace.” On Mother’s Day and the rest of the year, Mom added in a shaky script, “thanks.”
Link ID: 20422 - Posted: 12.16.2014
|By Emilie Reas If you carried a gene that doubled your likelihood of getting Alzheimer's disease, would you want to know? What if there was a simple lifestyle change that virtually abolished that elevated risk? People with a gene known as APOE e4 have a higher risk of cognitive impairment and dementia in old age. Even before behavioral symptoms appear, their brains show reduced metabolism, altered activity and more deterioration than those without the high-risk gene. Yet accumulating research is showing that carrying this gene is not necessarily a sentence for memory loss and confusion—if you know how to work it to your advantage with exercise. Scientists have long known that exercise can help stave off cognitive decline. Over the past decade evidence has mounted suggesting that this benefit is even greater for those at higher genetic risk for Alzheimer's. For example, two studies by a team in Finland and Sweden found that exercising at least twice a week in midlife lowers one's chance of getting dementia more than 20 years later, and this protective effect is stronger in people with the APOE e4 gene. Several others reported that frequent exercise—at least three times a week in some studies; up to more than an hour a day in others—can slow cognitive decline only in those carrying the high-risk gene. Furthermore, for those who carry the gene, being sedentary is associated with increased brain accumulation of the toxic protein beta-amyloid, a hallmark of Alzheimer's. More recent studies, including a 2012 paper published in Alzheimer's & Dementia and a 2011 paper in NeuroImage, found that high-risk individuals who exercise have greater brain activity and glucose uptake during a memory task compared with their less active counterparts or with those at low genetic risk. © 2014 Scientific American
|By Ingrid Wickelgren Confusion is one symptom of a concussion. But confusion may also characterize decisions about how soon to let an athlete play after taking a hit to the head. Sizing up symptoms such as dizziness and nausea is subjective, after all. Now a study suggests that a blood test could objectively determine whether or not the damage is bad enough to put a player on the bench. The work is in the Journal of Neurotrauma. [Robert Siman et al, Serum SNTF Increases in Concussed Professional Ice Hockey Players and Relates to the Severity of Post Concussion Symptoms] A strong blow to the head causes chemical changes within nerve cells that damage their structural proteins. Among the debris is a protein fragment called SNTF—which in more severe cases, spills into the bloodstream. The new study followed 20 professional hockey players who got concussions with symptoms that lasted six days or more. And blood levels of SNTF were much higher one hour to six days later than were levels of the protein fragment in eight other athletes who had gotten concussions that cleared up within five days. Levels were also low in 45 non-concussed players tested during the pre-season. A blood test for SNTF might thus forecast recovery time from a head injury. Combined with other neurological tests, levels of this molecule could help doctors tell athletes when it’s safe to suit up again. © 2014 Scientific American
Keyword: Brain Injury/Concussion
Link ID: 20419 - Posted: 12.16.2014
By Bruce Bower In the movie Roxanne, Steve Martin plays a lovesick guy who mocks his own huge schnoz by declaring: “It’s not the size of a nose that’s important. It’s what’s in it that matters.” Scientists demonstrated the surprising truth behind that joke this year: People can whiff an average of more than 1 trillion different odors, regardless of nose size (SN: 4/19/14, p. 6). No one had systematically probed how many scents people can actually tell apart. So a team led by Leslie Vosshall of Rockefeller University in New York City asked 26 men and women to discriminate between pairs of scents created from mixes of 128 odor molecules. Volunteers easily discriminated between smells that shared as much as 51 percent of their odor molecules. Errors gradually rose as pairs of scents became chemically more alike. Vosshall’s group calculated that an average participant could tell apart a minimum of more than 1 trillion smells made up of different combinations of 30 odor molecules. Really good smellers could have detected way more than 1 trillion odor mixtures, the scientists said. Smell lags behind sight and hearing as a sense that people need to find food, avoid dangers and otherwise succeed at surviving. Still, detecting the faint odor of spoiled food and other olfactory feats must have contributed to the success of Homo sapiens over the last 200,000 years. Perhaps many animals can whiff the difference between a trillion or more smells. For now, odor-detection studies modeled on Vosshall’s approach have been conducted only with humans. © Society for Science & the Public 2000 - 2014.
Keyword: Chemical Senses (Smell & Taste)
Link ID: 20417 - Posted: 12.16.2014
by Andy Coghlan It may not sound very appetising, but an edible powder made from waste excreted by bacteria in our guts may help people to avoid gaining weight. Stabilising a person's weight could have a major health impact, says Gary Frost of Imperial College London, because as people on Western diets grow older, they tend to put on between 0.3 and 0.8 kilograms per year on average. A fatty acid called propionate is released when the bacteria in our gut digest fibre. Propionate makes people feel full by activating cells in the large intestine that produce the satiety hormones GLP-1 and PYY: these tell the brain that it's time to stop eating. But to trigger a big enough dose of this appetite-suppressing signal from gut bacteria alone, people would have to eat extremely large amounts of fibre. To get around that, Frost and his team made the molecule in a concentrated form called inulin-propionate ester (IPE). "That gives you eight times the amount of someone following a typical Western diet," he says. To test its appetite-stemming properties, the team gave powdered IPE, mixed in with fruit juice or a milkshake, to a group of overweight volunteers every day for six months. A type of ordinary fibre was given to another set of people, who acted as controls. Only one of the 25 volunteers taking IPE put on more than 3 per cent of their body weight over that time, compared with six of the 24 controls. One reason for this might be that the IPE recipients ate around 9 per cent less over the six months. © Copyright Reed Business Information Ltd.
Link ID: 20416 - Posted: 12.13.2014
|By Lindsey Konkel For 28 years, Bill Gilmore lived in a New Hampshire beach town, where he surfed and kayaked. “I’ve been in water my whole life,” he said. “Before the ocean, it was lakes. I’ve been a water rat since I was four.” Now Gilmore can no longer swim, fish or surf, let alone button a shirt or lift a fork to his mouth. Earlier this year, he was diagnosed with amyotrophic lateral sclerosis (ALS), or Lou Gehrig’s disease. In New England, medical researchers are now uncovering clues that appear to link some cases of the lethal neurological disease to people’s proximity to lakes and coastal waters. About five years ago, doctors at a New Hampshire hospital noticed a pattern in their ALS patients—many of them, like Gilmore, lived near water. Since then, researchers at Dartmouth-Hitchcock Medical Center have identified several ALS hot spots in lake and coastal communities in New England, and they suspect that toxic blooms of blue-green algae—which are becoming more common worldwide—may play a role. Now scientists are investigating whether breathing a neurotoxin produced by the algae may raise the risk of the disease. They have a long way to go, however: While the toxin does seem to kill nerve cells, no research, even in animals, has confirmed the link to ALS. As with all ALS patients, no one knows what caused Bill Gilmore’s disease. He was a big, strong guy—a carpenter by profession. One morning in 2011, his arms felt weak. “I couldn’t pick up my tools. I thought I had injured myself,” said Gilmore, 59, who lived half his life in Hampton and now lives in Rochester, N.H. © 2014 Scientific American
by Colin Barras It's not just great minds that think alike. Dozens of the genes involved in the vocal learning that underpins human speech are also active in some songbirds. And knowing this suggests that birds could become a standard model for investigating the genetics of speech production – and speech disorders. Complex language is a uniquely human trait, but vocal learning – the ability to pick up new sounds by imitating others – is not. Some mammals, including whales, dolphins and elephants, share our ability to learn new vocalisations. So do three groups of birds: the songbirds, parrots and hummingbirds. The similarities between vocal learning in humans and birds are not just superficial. We know, for instance, that songbirds have specialised vocal learning brain circuits that are similar to those that mediate human speech. What's more, a decade ago we learned that FOXP2, a gene known to be involved in human language, is also active in "area X" of the songbird brain – one of the brain regions involved in those specialised vocal learning circuits. Andreas Pfenning at the Massachusetts Institute of Technology and his colleagues have now built on these discoveries. They compared maps of genetic activity – transcriptomes – in brain tissue taken from the zebra finch, budgerigar and Anna's hummingbird, representing the three groups of vocal-learning birds. © Copyright Reed Business Information Ltd.
|By Claudia Wallis Touch a hot frying pan and the searing message of pain sprints up to your brain and back down to your hand so fast that the impulse to withdraw your fingers seems instantaneous. That rapid-fire signal begins in a heat-sensing molecule called a TRPV1 channel. This specialized protein is abundant on the surface of sensory nerve cells in our fingers and elsewhere and is a shape-shifter that can take an open or closed configuration. Heat opens a central pore in the molecule, as do certain spider toxins and capsaicin—the substance that gives chili peppers their burn. Once the pore is open, charged ions of sodium and calcium flow into the nerve cell, triggering the pain signal. Ouch! As neuroscientist-journalist Stephani Sutherland explains in “Pain that Won’t Quit,” in the December Scientific American, researchers have long been interested in finding ways to moderate the action of this channel—and other ion channels—in patients who suffer from chronic pain. Shutting down the TRPV1 channel completely, however, is not an option because it plays a vital role in regulating body temperature. In two papers published in Nature in December 2013 investigators at the University of California, San Francisco, gave pain researchers a big leg up in understanding TRPV1. They revealed, in exquisite atomic detail, the structure of the channel molecule (from a rat) using an electron cryomicroscope, an instrument designed to explore the 3-D structure of molecules at very low temperatures. One of those investigators, Yifan Cheng, also created this colorful animation, showing how the molecule looks when the channel is open. © 2014 Scientific American
Keyword: Pain & Touch
Link ID: 20411 - Posted: 12.13.2014
By Gary Stix Our site recently ran a great story about how brain training really doesn’t endow you instantly with genius IQ. The games you play just make you better at playing those same games. They aren’t a direct route to a Mensa membership. Just a few days before that story came out, the Proceedings of the National Academy of Sciences published a report suggesting that playing action video games—Call of Duty: Black Ops II and the like—actually lets gamers learn the essentials of a particular visual task (the orientation of a Gabor signal—don’t ask) more rapidly than non-gamers, a skill that has real-world relevance beyond the confines of the artificial reality of the game itself. As psychologists say, it has “transfer effects.” Gamers appear to have learned how to do stuff like home in quickly on a target or multitask better than those who inhabit the non-gaming world. Their skills might, in theory, make them great pilots or laparoscopic surgeons, not just high scorers among their peers. Action video games are not billed as brain training, but Call of Duty and nominally accredited training programs like Lumosity are both structured as computer games. So that leads to the question: what’s going on here? Every new finding that brain training is B.S. appears to be contradicted by another that points to the promise of cognitive exercise, if that’s what you call a session with Call of Duty. It may boil down to a realization that the whole story about exercising your neurons to keep the brain supple may be a lot less simple than proponents make it out to be. © 2014 Scientific American
Keyword: Learning & Memory
Link ID: 20409 - Posted: 12.13.2014
By Nsikan Akpan Gut surgery is often the only option for life-threatening obesity and diabetes, but what if doctors could cut the pounds without using a knife? Scientists have engineered an antiobesity drug that rivals the dramatic benefits seen with surgery, dropping excess body weight by a third. Though the work was done only in rodents, the drug is the first to influence three obesity-related hormones in the gut at once. Bariatric surgery, including gastric bypass, typically involves limiting food intake by removing part of the stomach or intestines. Yet it does more than shrink the size of a patient’s stomach or intestines. It also changes the release of multiple gut-related hormones, explains clinical endocrinologist Stephen O'Rahilly of the University of Cambridge in the United Kingdom, who wasn’t involved with the study. That’s important, because years of eating a diet high in fat and sugar can throw a person’s metabolism into disarray. Cells undergo genetic reprogramming that negatively impacts how they process sugar and store fat, locking in obesity. This pattern makes it harder and harder to lose weight, even if a person changes their diet and begins exercising. Bariatric surgery interrupts that cycle by stimulating the production of several hormones that reduce blood sugar, burn fat, and curb appetite. (It may also change the composition of the gut’s microbes.) Three of these hormones are called glucagon-like peptide-1 (GLP-1), gastric inhibitory peptide (GIP), and glucagon. Cells in your gut release GLP-1 and GIP after a meal to keep your body’s blood sugar levels in a normal range. GLP-1 also curbs appetite, signaling to your brain that you are full. In type 2 diabetes, the body stops responding to GLP-1 and GIP, which contributes to hyperglycemia, or too much blood sugar. Hyperglycemia causes the devastating hallmarks of diabetes, such as kidney injury, cardiovascular disease, and nerve damage. © 2014 American Association for the Advancement of Science.
Link ID: 20408 - Posted: 12.10.2014
By Tina Rosenberg When Ebola ends, the people who have suffered, who have lost loved ones, will need many things. They will need ways to rebuild their livelihoods. They will need a functioning health system, which can ensure that future outbreaks do not become catastrophes. And they will need mental health care. Depression is the most important thief of productive life for women around the world, and the second-most important for men. We sometimes imagine it is a first-world problem, but depression is just as widespread, if not more so, in poor countries, where there is a good deal more to be depressed about. And it is more debilitating, as a vast majority of sufferers have no safety net. Health care for all must include mental health care. It’s hard to believe but both Liberia and Sierra Leone have only a single psychiatrist. The Ebola crisis has exposed these countries’ malignant neglect of their health systems. People can’t get care for diarrhea and malaria. How will these countries take care of an epidemic of depression? This isn’t really a medical question. We know how to treat depression. What we don’t know yet is how to make effective treatment cheap, culturally appropriate, convenient and non-stigmatizing — all needed to get treatment out to millions and millions of people. But some researchers are finding out. They are doing so despite the fact that growing attention to this issue hasn’t been accompanied by money. The U.S. National Institute of Mental Health last year provided just $24.5 million for global mental health efforts, and the Canadian government’s Grand Challenges Canada, which is said to have the largest portfolio of mental health innovation in developing countries, has spent only $28 million on them since it began in 2010. © 2014 The New York Times Company
Link ID: 20404 - Posted: 12.08.2014