Chapter 16.
Richard A. Friedman

Thanks to Caitlyn Jenner, and the military’s changing policies, transgender people are gaining acceptance — and living in a bigger, more understanding spotlight than at any previous time. We’re learning to be more accepting of transgender individuals. And we’re learning more about gender identity, too.

The prevailing narrative seems to be that gender is a social construct and that people can move between genders to arrive at their true identity. But if gender were nothing more than a social convention, why was it necessary for Caitlyn Jenner to undergo facial surgeries, take hormones and remove her body hair? The fact that some transgender individuals use hormone treatment and surgery to switch gender speaks to the inescapable biology at the heart of gender identity.

This is not to suggest that gender identity is simply binary — male or female — or that gender identity is inflexible for everyone. Nor does it mean that conventional gender roles always feel right; the sheer number of people who experience varying degrees of mismatch between their preferred gender and their body makes this very clear. In fact, recent neuroscience research suggests that gender identity may exist on a spectrum and that gender dysphoria fits well within the range of human biological variation. For example, Georg S. Kranz at the Medical University of Vienna and colleagues elsewhere reported in a 2014 study in The Journal of Neuroscience that individuals who identified as transsexuals — those who wanted sex reassignment — had structural differences in their brains that were between their desired gender and their genetic sex. © 2015 The New York Times Company
Keyword: Sexual Behavior
Link ID: 21329 - Posted: 08.24.2015
By Kazi Stastna The U.S. approval of a pill to treat low libido in women has whipped up a whirlwind of debate and raised questions about whether the so-called female Viagra addresses the real reasons for lack of sexual desire. The U.S. Food and Drug Administration last week approved flibanserin, to be sold under the name Addyi starting in October, for the treatment of hypoactive sexual desire disorder (HSDD) among premenopausal women — some two decades after Viagra was approved for the treatment of male erectile dysfunction. Sprout Pharmaceuticals pitched flibanserin as a drug that would finally give women with sexual dysfunction similar treatment options to men and bused dozens of women to FDA hearings in Maryland to attest to its benefits and plead for its approval in what some saw as a heavy-handed and misleading public relations campaign. The FDA gave flibanserin the OK after twice rejecting it and despite concerns about its risks and modest efficacy because it said women suffering distress from low libido have an "unmet medical need." Days after it did, Canadian pharmaceutical company Valeant offered to buy Sprout for $1 billion US and said it will apply to get flibanserin approved in Canada and other countries. Although often likened to Viagra, flibanserin was created as an antidepressant and works on the brain while erectile dysfunction medications stimulate blood flow to the penis. Critics argue it's an ineffectual pharmacological solution for a problem better treated with relationship counselling, sex therapy and behavioural changes. 
"Their suffering is real, but the women who testified had a lot of different stories, and some of those stories were very good reasons for having low libido, including having six children, having a one-year-old, having had breast cancer treatment …," says Adriane Fugh-Berman, associate professor of pharmacology and physiology at Georgetown University in Washington, D.C., and director of PharmedOut, a pharmaceutical marketing watchdog group. ©2015 CBC/Radio-Canada.
Keyword: Sexual Behavior
Link ID: 21328 - Posted: 08.24.2015
Mo Costandi The human brain can be compared to something like a big, bustling city. It has workers, the neurons and glial cells which co-operate with each other to process information; it has offices, the clusters of cells that work together to achieve specific tasks; it has highways, the fibre bundles that transfer information across long distances; and it has centralised hubs, the densely interconnected nodes that integrate information from its distributed networks. Like any big city, the brain also produces large amounts of waste products, which have to be cleared away so that they do not clog up its delicate moving parts. Until very recently, though, we knew very little about how this happens. The brain’s waste disposal system has now been identified. We now know that it operates while we sleep at night, just like the waste collectors in most big cities, and the latest research suggests that certain sleeping positions might make it more efficient. Waste from the rest of the body is cleared away by the lymphatic system, which makes and transports a fluid called lymph. The lymphatic system is an important component of the immune system. Lymph contains white blood cells that can kill microbes and mop up their remains and other cellular debris. It is carried in branching vessels to every organ and body part, and passes through them, via the spaces between their cells, picking up waste materials. It is then drained, filtered, and recirculated. The brain was thought to lack lymphatic vessels altogether, and so its waste disposal system proved to be far more elusive. Several years ago, however, Maiken Nedergaard of the University of Rochester Medical Center and colleagues identified a system of hydraulic “pipes” running alongside blood vessels in the mouse brain. 
Using in vivo two-photon imaging to trace the movements of fluorescent markers, they showed that these vessels carry cerebrospinal fluid around the brain, and that the fluid enters inter-cellular spaces in the brain tissue, picking up waste on its way. © 2015 Guardian News and Media Limited
Link ID: 21327 - Posted: 08.22.2015
By Gretchen Vogel

Researchers may have finally explained how an obesity-promoting gene variant induces some people to put on the pounds. Using state-of-the-art DNA editing tools, they have identified a genetic switch that helps govern the body’s metabolism. The switch controls whether common fat cells burn energy or store it as fat. The finding suggests the tantalizing prospect that doctors might someday offer a gene therapy to melt extra fat away.

Along with calories and exercise, genes influence a person’s tendency to gain—and keep—extra pounds. One of the genes with the strongest link to obesity is called FTO. People with certain versions of the gene are several kilos heavier on average and significantly more likely to be obese. Despite years of study, no one had been able to figure out what the gene does in cells or how it influences weight. There was some evidence FTO helped control other genes, but it was unclear which ones. Some researchers had looked for activity of FTO in various tissues, without finding any clear signals.

Melina Claussnitzer, Manolis Kellis, and their colleagues at Harvard University, Massachusetts Institute of Technology, and the Broad Institute in Cambridge turned to data from the Roadmap Epigenomics Project, an 8-year effort that identified the chemical tags on DNA that influence the function of genes. The researchers used those epigenetic tags to look at whether FTO was turned on or off in 127 cell types. The gene seemed to be active in developing fat cells called adipocyte progenitor cells. © 2015 American Association for the Advancement of Science
“Almost fully-formed brain grown in a lab.” “Woah: Scientists grow first nearly fully-formed human brain.” “Boffins raise five-week-old fetal human brain in the lab for experimentation.” On Tuesday, all the above appeared as headlines for one particular story. What was it all about? Mini-brains 3 to 4 millimetres across have been grown in the lab before, but if a larger brain had been created – and the press release publicising the claim said it was the size of a pencil eraser – that would be a major breakthrough. New Scientist investigated the claims.

The announcement was made by Rene Anand, a neuroscientist at Ohio State University in Columbus, at a military health research meeting in Florida. Anand says he has grown a brain – complete with a cortex, midbrain and brainstem – in a dish, comparable in maturity to that of a fetus aged 5 weeks. Anand and his colleague Susan McKay started with human skin cells, which they turned into induced pluripotent stem cells (iPSCs) using a tried-and-tested method. By applying an undisclosed technique, for which a patent has been applied, the pair say they were able to encourage these stem cells to form a brain. “We are replicating normal development,” says Anand. He says they hope to be able to create miniature models of brains experiencing a range of diseases, such as Parkinson’s and Alzheimer’s.

Inconclusive evidence

But not everyone is convinced, especially as Anand hasn’t published his results. Scientists we sent Anand’s poster presentation to said that although the team has indeed grown some kind of miniature collection of cells, or “organoid”, in a dish, the structure isn’t much like a fetal brain. © Copyright Reed Business Information Ltd.
Keyword: Development of the Brain
Link ID: 21322 - Posted: 08.22.2015
Bill McQuay The natural world is abuzz with the sound of animals communicating — crickets, birds, even grunting fish. But scientists learning to decode these sounds say the secret signals of African elephants — their deepest rumblings — are among the most intriguing calls any animal makes. Katy Payne, the same biologist who recognized song in the calls of humpback whales in the 1960s, went on to help create the Elephant Listening Project in the Central African Republic in the 1980s. At the time, Payne's team was living in shacks in a dense jungle inhabited by hundreds of rare forest elephants. That's where one of us — Bill McQuay — first encountered the roar of an elephant in 2002, while reporting a story for an NPR-National Geographic collaboration called Radio Expeditions. Here's how Bill remembers that day in Africa: I was walking through this rainforest to an observation platform built up in a tree — out of the reach of the elephants. I climbed up onto the platform, a somewhat treacherous exercise with all my recording gear. Then I set up my recording equipment, put on the headphones, and started listening. That first elephant roar sounded close. But I was so focused on the settings on my recorder that I didn't bother to look around. The second roar sounded a lot closer. I thought, this is so cool! What I didn't realize was, there was this huge bull elephant standing right underneath me — pointing his trunk up at me, just a few feet away. Apparently he was making a "dominance display." © 2015 NPR
Helen Thomson

An almost fully-formed human brain has been grown in a lab for the first time, claim scientists from Ohio State University. The team behind the feat hope the brain could transform our understanding of neurological disease. Though not conscious, the miniature brain, which resembles that of a five-week-old foetus, could potentially be useful for scientists who want to study the progression of developmental diseases. It could also be used to test drugs for conditions such as Alzheimer’s and Parkinson’s, since the regions they affect are in place during an early stage of brain development.

The brain, which is about the size of a pencil eraser, is engineered from adult human skin cells and is the most complete human brain model yet developed, claimed Rene Anand of Ohio State University, Columbus, who presented the work today at the Military Health System Research Symposium in Fort Lauderdale, Florida. Previous attempts at growing whole brains have at best achieved mini-organs that resemble those of nine-week-old foetuses, although these “cerebral organoids” were not complete and only contained certain aspects of the brain. “We have grown the entire brain from the get-go,” said Anand.

Anand and his colleagues claim to have reproduced 99% of the brain’s diverse cell types and genes. They say their brain also contains a spinal cord, signalling circuitry and even a retina. The ethical concerns were non-existent, said Anand. “We don’t have any sensory stimuli entering the brain. This brain is not thinking in any way.” © 2015 Guardian News and Media Limited
Keyword: Development of the Brain
Link ID: 21316 - Posted: 08.19.2015
Daniel Cressey

In 2013, Beau Kilmer took on a pretty audacious head count. Citizens in the state of Washington had just voted to legalize marijuana for recreational use, and the state's liquor control board, which would regulate the nascent industry, was anxious to understand how many people were using the drug — and importantly, how much they were consuming.

The task was never going to be straightforward. Users of an illicit substance, particularly heavy users, often under-report the amounts they take. So Kilmer, co-director of the RAND Drug Policy Research Center in Santa Monica, California, led a team to develop a web-based survey that would ask people how often they had used cannabis in the past month and year. To help them gauge the amounts, the surveys included scaled pictures showing different quantities of weed. The survey, along with other data the team had collected, revealed a rift between perception and reality. Based on prior data, state officials had estimated use at about 85 tonnes per year; Kilmer's research suggested that it was actually double that, about 175 tonnes. The take-home message, says Kilmer, was “we're going to have to start collecting more data”.

Scientists around the world would echo that statement. Laws designed to legalize cannabis or lessen the penalties associated with it are taking effect around the world. They are sweeping the sale of the drug out of stairwells and shady alleys and into modern shopfronts under full view of the authorities. In 2013, Uruguay became the first nation to legalize marijuana trade. And several countries in Europe — Spain and Italy among them — have moved away from tough penalties for use and possession. Thirty-nine US states plus Washington DC have at least some provisions for medicinal use of the drug. Washington, Colorado, Alaska and Oregon have gone further, legalizing the drug for recreational consumption.
A handful of other states including California and Massachusetts are expected to vote on similar recreational-use measures by the end of 2016. © 2015 Nature Publishing Group
Keyword: Drug Abuse
Link ID: 21315 - Posted: 08.19.2015
By Lisa Rapaport (Reuters Health) - U.S. teens who try electronic cigarettes may be more than twice as likely to move on to smoking conventional cigarettes as those who have never tried the devices, report researchers from the University of Southern California. The findings, published August 18 in JAMA, offer some of the best evidence yet at establishing a link between e-cigarettes and smoking, said Dr. Nancy Rigotti, an expert in tobacco research at Massachusetts General Hospital and author of an editorial accompanying the study. "Adolescent brains appear to be especially susceptible to becoming addicted to nicotine when exposed," Rigotti told Reuters Health in an email. About 2 million middle- and high-school students tried e-cigarettes in 2014, triple the number of teen users in 2013, the Centers for Disease Control and Prevention reported in April. The data sparked alarm among tobacco control advocates who fear e-cigarettes will create a new generation of nicotine addicts who may eventually switch to conventional cigarettes. Big tobacco companies, including Altria Group Inc, Lorillard Tobacco Co and Reynolds American Inc, are all developing e-cigarettes. The battery-powered devices feature a glowing tip and a heating element that turns liquid nicotine and other flavorings into a cloud of vapor that users inhale. An international review of published research by the Cochrane Review in December concluded that the devices could help smokers quit but said much of the existing evidence on e-cigarettes was thin. © 2015 Scientific American
Keyword: Drug Abuse
Link ID: 21314 - Posted: 08.19.2015
By ANDREW POLLACK The first prescription drug to enhance women’s sexual drive won regulatory approval on Tuesday, clinching a victory for a lobbying campaign that had accused the Food and Drug Administration of gender bias for ignoring the sexual needs of women. The drug — Addyi from Sprout Pharmaceuticals — is actually the first drug approved to treat a flagging or absent libido for either sex. Viagra and other drugs available for men are approved to help achieve erections, or to treat certain deficiencies of the hormone testosterone, not to increase desire. Advocates who pressed for approval of Addyi, many of them part of a coalition called Even the Score, said that a drug to improve women’s sex lives was long overdue, given the many options available to men. “This is the biggest breakthrough for women’s sexual health since the pill,” said Sally Greenberg, executive director of the National Consumers League. But critics said the campaign behind Addyi had made a mockery of the system that regulates pharmaceuticals and had co-opted the women’s movement to pressure the F.D.A. into approving a drug that was at best minimally effective and could cause side effects like low blood pressure, fainting, nausea, dizziness and sleepiness. In announcing the approval, Dr. Janet Woodcock, a senior F.D.A. official, said the agency was “committed to supporting the development of safe and effective treatments for female sexual dysfunction.” The F.D.A. decision on Tuesday was not a surprise since an advisory committee of outside experts had recommended by a vote of 18 to 6 in June that the drug be approved, albeit with precautions required to try to limit the risks and ensure that it was not overused. © 2015 The New York Times Company
Keyword: Sexual Behavior
Link ID: 21311 - Posted: 08.19.2015
Alexander Christie-Miller You could say they sent the first tweets. An ancient whistling language that sounds a little like birdsong has been found to use both sides of the brain – challenging the idea that the left side is all important for communicating. The whistling language is still used by around 10,000 people in the mountains of north-east Turkey, and can carry messages as far as 5 kilometres. Researchers have now shown that this language involves the brain’s right hemisphere, which was already known to be important for understanding music. Until recently, it was thought that the task of interpreting language fell largely to the brain’s left hemisphere. Onur Güntürkün of Ruhr University Bochum in Germany wondered whether the musical melodies and frequencies of whistled Turkish might require people to use both sides of their brain to communicate. His team tested 31 fluent whistlers by playing slightly different spoken or whistled syllables into their left and right ears at the same time, and asking them to say what they heard. The left hemisphere depends slightly more on sounds received by the right ear, and vice versa for the right hemisphere. By comparing the number of times the whistlers reported the syllables that had been played into either their right or left ear, they could tell how often each side of the brain was dominant. As expected, when the syllables were spoken, the right ear and left hemisphere were dominant 75 per cent of the time. But when syllables were whistled, the split between right and left dominance was about even. © Copyright Reed Business Information Ltd.
By Kate Kelland LONDON (Reuters) - Scientists have genetically modified mice to be super-intelligent and found they are also less anxious, a discovery that may help the search for treatments for disorders such as Alzheimer's, schizophrenia and post traumatic stress disorder (PTSD). Researchers from Britain and Canada found that altering a single gene to block the phosphodiesterase-4B (PDE4B) enzyme, which is found in many organs including the brain, made mice cleverer and at the same time less fearful. "Our work using mice has identified phosphodiesterase-4B as a promising target for potential new treatments," said Steve Clapcote, a lecturer in pharmacology at Britain's Leeds University, who led the study. He said his team is now working on developing drugs that will specifically inhibit PDE4B. The drugs will be tested first in animals to see whether any of them might be suitable to go forward into clinical trials in humans. In the experiments, published on Friday in the journal Neuropsychopharmacology, the scientists ran a series of behavioral tests on the PDE4B-inhibited mice and found they tended to learn faster, remember events longer and solve complex problems better than normal mice. The "brainy" mice were better at recognizing a mouse they had seen the previous day, the researchers said, and were also quicker at learning the location of a hidden escape platform.
Dean Burnett Yesterday, an article in the Entrepreneurs section of the Guardian purported to reveal a “cloth cap that could help treat depression”. This claim has caused some alarm in the neuroscience and mental health fields, so it’s important to look a little more closely at what the manufacturers are actually claiming. The piece in question concerns a product from Neuroelectrics: a soft helmet containing electrodes and sensors. According to the company’s website, it can be used to monitor brain activity (electroencephalography, or EEG), or administer light electrical currents to different areas of the brain in order to treat certain neurological and psychiatric conditions (known as transcranial direct current stimulation or tDCS). While this would obviously be great news to the millions of people who deal with such conditions every day, such claims should be treated with a considerable amount of caution. The fields of science dedicated to researching and, hopefully, treating serious brain-based problems like depression, stroke, personality disorder etc. work hard to find new and inventive methods for doing so, or refining and improving existing ones. Sometimes they succeed, but probably not as often as they’d like. The problem is that when a new development occurs or a new approach is found, it doesn’t automatically mean it’s widely applicable or even effective for everyone. The brain is furiously complicated. There is no magic bullet for brain problems [Note: you shouldn’t use bullets, magic or otherwise, when dealing with the brain]. © 2015 Guardian News and Media Limited
Link ID: 21305 - Posted: 08.18.2015
By NICHOLAS BAKALAR “Insanity Treated By Electric Shock” read the headline of an article published on July 6, 1940, in The New York Times. The article described “a new method, introduced in Italy, of treating certain types of mental disorders by sending an electric shock through the brain.” It was the first time that what is now called electroconvulsive therapy, or ECT, had been mentioned in The Times. The electric shock, the article said, “is produced by a small portable electric box which was invented in Italy by Professor Ugo Cerletti of the Rome University Clinic.” Dr. S. Eugene Barrera, the principal researcher on the project, “emphasized that hope for any ‘miracle cure’ must not be pinned on the new method.” On April 29, 1941, the subject came up again, this time in an article about a scientific meeting at which a professor of psychiatry at Northwestern reported “ ‘very promising instantaneous results’ in the recently developed electric shock method of relieving schizophrenic patients of their malady.” The treatment entered clinical practice fairly quickly. In October 1941, The Times reported on the opening of several new buildings at Hillside Hospital in Queens (today called Zucker Hillside Hospital). “The hospital has pioneered in the use of insulin and metrazol, and also in the electric shock treatment, which has proved useful in shortening the average stay of patients,” the article read. Over the years, ECT has had its ups and downs in the public imagination and in the pages of The Times. In an article on Nov. 25, 1980, the reporter Dava Sobel seemed to relegate it to another age. © 2015 The New York Times Company
Link ID: 21304 - Posted: 08.18.2015
By CLAIRE MARTIN The eyeglass lenses that Don McPherson invented were meant for surgeons. But through serendipity he found an entirely different use for them: as a possible treatment for colorblindness. Mr. McPherson is a glass scientist and an avid Ultimate Frisbee player. He discovered that the lenses he had invented, which protect surgeons’ eyes from lasers and help them differentiate human tissue, caused the world at large to look candy-colored — including the Frisbee field. At a tournament in Santa Cruz, Calif., in 2002, while standing on a grassy field dotted with orange goal-line cones, he lent a pair of glasses with the lenses to a friend who happened to be colorblind. “He said something to the effect of, ‘Dude, these are amazing,’ ” Mr. McPherson says. “He’s like, ‘I see orange cones. I’ve never seen them before.’ ” Mr. McPherson was intrigued. He said he did not know the first thing about colorblindness, but felt compelled to figure out why the lenses were having this effect. Mr. McPherson had been inserting the lenses into glasses that he bought at stores, then selling them through Bay Glass Research, his company at the time. Mr. McPherson went on to study colorblindness, fine-tune the lens technology and start a company called EnChroma that now sells glasses for people who are colorblind. His is among a range of companies that have brought inadvertent or accidental inventions to market. Such inventions have included products as varied as Play-Doh, which started as a wallpaper cleaner, and the pacemaker, discovered through a study of hypothermia. To learn more about color vision and the feasibility of creating filters to correct colorblindness, Mr. McPherson applied for a grant from the National Institutes of Health in 2005. He worked with vision scientists and a mathematician and computer scientist named Andrew Schmeder. 
They weren’t the first to venture into this industry; the history of glassmakers claiming to improve colorblindness is long and riddled with controversy. © 2015 The New York Times Company
Link ID: 21303 - Posted: 08.17.2015
A new clinical trial is set to begin in the United Kingdom using the powerful noses of dogs to detect prostate cancer in humans. While research has been done before, these are the first trials approved by Britain's National Health Service. The trials, at the Milton Keynes University Hospital in Buckinghamshire, will use animals from a nonprofit organization called Medical Detection Dogs, co-founded in 2008 by behavioral psychologist Claire Guest. "What we've now discovered is that lots of diseases and conditions — and cancer included — that they actually have different volatile organic compounds, these smelly compounds, that are associated with them," Guest tells NPR's Rachel Martin. "And dogs can smell them."

The dogs offer an inexpensive, non-invasive method to accompany the existing blood tests for prostate cancer, which detect prostate-specific antigen, or PSA, Guest says. "It's a low false-negative but a very high false-positive, meaning that three out of four men that have a raised PSA haven't got cancer," she explains. "So the physician has a very difficult decision to make: Which of the four men does he biopsy? What we want to do is provide an additional test — not a test that stands alone but an additional test that runs alongside the current testing, which a physician can use as part of that patient's picture."

The samples come to the dogs — the dogs never go to the patient, Guest says. "At the moment, our dogs would be screening between a 0.5- to 1-ml drop of urine [or 1/10 to 1/5 teaspoon], so a very small amount. In the early days, of course, we know whether the samples have come from a patient with cancer or if the patient has another disease or condition, or is in fact healthy." © 2015 NPR
Keyword: Chemical Senses (Smell & Taste)
Link ID: 21302 - Posted: 08.17.2015
John von Radowitz, Press Association

Psychologists have confirmed that playing violent video games is linked to aggressive and callous behaviour. A review of almost a decade of studies found that exposure to violent video games was a "risk factor" for increased aggression. But the same team of experts said there was insufficient evidence to conclude that the influence of games such as Call Of Duty and Grand Theft Auto led to criminal acts. The findings have prompted a call for more parental control over violent scenes in video games from the American Psychological Association (APA).

[Image caption: The original version of Doom, released in 1993, was widely controversial for its unprecedented levels of graphic violence.]

A report from the APA task force on violent media concludes: "The research demonstrates a consistent relation between violent video game use and increases in aggressive behaviour, aggressive cognitions and aggressive affect, and decreases in pro-social behaviour, empathy and sensitivity to aggression." The report said no single influence led a person to act aggressively or violently. Rather, it was an "accumulation of risk factors" that resulted in such behaviour. It added: "The research reviewed here demonstrates that violent video game use is one such risk factor." The APA has urged game creators to increase levels of parental control over the amount of violence video games contain.
Link ID: 21301 - Posted: 08.17.2015
Carl Zimmer You are what you eat, and so were your ancient ancestors. But figuring out what they actually dined on has been no easy task. There are no Pleistocene cookbooks to consult. Instead, scientists must sift through an assortment of clues, from the chemical traces in fossilized bones to the scratch marks on prehistoric digging sticks. Scientists have long recognized that the diets of our ancestors went through a profound shift with the addition of meat. But in the September issue of The Quarterly Review of Biology, researchers argue that another item added to the menu was just as important: carbohydrates, bane of today’s paleo diet enthusiasts. In fact, the scientists propose, by incorporating cooked starches into their diet, our ancestors were able to fuel the evolution of our oversize brains. Roughly seven million years ago, our ancestors split off from the apes. As far as scientists can tell, those so-called hominins ate a diet that included a lot of raw, fiber-rich plants. After several million years, hominins started eating meat. The oldest clues to this shift are 3.3-million-year-old stone tools and 3.4-million-year-old mammal bones scarred with cut marks. The evidence suggests that hominins began by scavenging meat and marrow from dead animals. At some point hominins began to cook meat, but exactly when they invented fire is a question that inspires a lot of debate. Humans were definitely making fires by 300,000 years ago, but some researchers claim to have found campfires dating back as far as 1.8 million years. Cooked meat provided increased protein, fat and energy, helping hominins grow and thrive. But Mark G. Thomas, an evolutionary geneticist at University College London, and his colleagues argue that there was another important food sizzling on the ancient hearth: tubers and other starchy plants. © 2015 The New York Times Company
When the owl swooped, the “blind” mice ran away. This was thanks to a new type of gene therapy to reprogramme cells deep in the eye to sense light. After treatment, the mice ran for cover when played a video of an approaching owl, just like mice with normal vision. “You could say they were trying to escape, but we don’t know for sure,” says Rob Lucas of the University of Manchester, UK, co-leader of the team that developed and tested the treatment. “What we can say is that they react to the owl in the same way as sighted mice, whereas the untreated mice didn’t do anything.”

This is the team’s best evidence yet that injecting the gene for a pigment that detects light into the eyes of blind mice can help them see real objects again. This approach aims to treat all types of blindness caused by damaged or missing rods and cones, the eye’s light receptor cells. Most gene therapies for blindness so far have concentrated on replacing faulty genes in rarer, specific forms of inherited blindness, such as Leber congenital amaurosis.

Deep down

The new treatment works by enabling other cells that lie deeper within the retina to capture light. While rod and cone cells normally detect light and convert this into an electrical signal, the ganglion and bipolar cells behind them are responsible for processing these signals and sending them to the brain. By giving these cells the ability to produce their own light-detecting pigment, they can to some extent compensate for the lost receptors, or so it seems.
Link ID: 21298 - Posted: 08.15.2015
By James Gallagher, Health editor, BBC News website

Cutting fat from your diet leads to more fat loss than reducing carbohydrates, a US health study shows. Scientists intensely analysed people on controlled diets by inspecting every morsel of food, minute of exercise and breath taken. Both diets, analysed by the National Institutes of Health, led to fat loss when calories were cut, but people lost more when they reduced fat intake. Experts say the most effective diet is one people can stick to.

It has been argued that restricting carbs is the best way to get rid of a "spare tyre" as it alters the body's metabolism. The theory goes that fewer carbohydrates lead to lower levels of insulin, which in turn lead to fat being released from the body's stores. "All of those things do happen with carb reduction and you do lose body fat, but not as much as when you cut out the fat," said lead researcher Dr Kevin Hall, from the US-based National Institute of Diabetes and Digestive and Kidney Diseases.

In the study, 19 obese people were initially given 2,700 calories a day. Then, over a period of two weeks they tried diets which cut their calorie intake by a third, either by reducing carbohydrates or fat. The team analysed the amount of oxygen and carbon dioxide being breathed out and the amount of nitrogen in participants' urine to calculate precisely the chemical processes taking place inside the body. The results, published in Cell Metabolism, showed that after six days on each diet, those reducing fat intake lost an average 463g of body fat - 80% more than those cutting down on carbs, whose average loss was 245g. © 2015 BBC.
Link ID: 21296 - Posted: 08.15.2015