Most Recent Links
Aaron E. Carroll If there is one health myth that will not die, it is this: You should drink eight glasses of water a day. It’s just not true. There is no science behind it. And yet every summer we are inundated with news media reports warning that dehydration is dangerous and also ubiquitous. These reports work up a fear that otherwise healthy adults and children are walking around dehydrated, even that dehydration has reached epidemic proportions. Let’s put these claims under scrutiny. I was a co-author of a paper back in 2007 in the BMJ on medical myths. The first myth was that people should drink at least eight 8-ounce glasses of water a day. This paper got more media attention (even in The Times) than pretty much any other research I’ve ever done. It made no difference. When, two years later, we published a book on medical myths that once again debunked the idea that we need eight glasses of water a day, I thought it would persuade people to stop worrying. I was wrong again. Many people believe that the source of this myth was a 1945 Food and Nutrition Board recommendation that said people need about 2.5 liters of water a day. But they ignored the sentence that followed closely behind. It read, “Most of this quantity is contained in prepared foods.” Water is present in fruits and vegetables. It’s in juice, it’s in beer, it’s even in tea and coffee. Before anyone writes me to tell me that coffee is going to dehydrate you, research shows that’s not true either. Although I recommended water as the best beverage to consume, it’s certainly not your only source of hydration. You don’t have to consume all the water you need through drinks. You also don’t need to worry so much about never feeling thirsty. The human body is finely tuned to signal you to drink long before you are actually dehydrated. © 2015 The New York Times Company
Link ID: 21335 - Posted: 08.25.2015
By Esther Landhuis The birth of a child leaves its mark on the brain. Most investigations of these changes have focused on mothers, but scientists have recently begun looking more closely at fathers. Neural circuits that support parental behaviors appear more robust in moms a few weeks after the baby is born, whereas in dads the growth can take several months. A study in Social Neuroscience analyzed 16 dads several weeks after their baby's birth and again a few months later. At each check, the researchers administered a multiple-choice test to check for signs of depression and used MRI to image the brain. Compared with the earlier scans, MRI at three to four months postpartum showed growth in the hypothalamus, amygdala and other regions that regulate emotion, motivation and decision making. Furthermore, dads with more growth in these brain areas were less likely to show depressive symptoms, says first author Pilyoung Kim, who directs the Family and Child Neuroscience Lab at the University of Denver. Although some physiological brain changes are similar in new moms and dads, other changes seem different and could relate to the roles of each parent, says senior author James Swain, a psychiatrist at the University of Michigan. A 2014 behavioral study of expectant fathers showed that midpregnancy ultrasound imaging was a “magic moment” in the dads' emerging connection with their baby. Yet the emotional bond was different from the one that forms in expectant moms. Instead of thinking about cuddling or feeding the baby, dads-to-be focused on the future: they imagined saving money for a college fund or walking down the aisle at their daughter's wedding. © 2015 Scientific American
By JOAN RAYMOND Rita Gunther McGrath, a Columbia Business School professor, is one of those business travelers who do not care about delays, cancellations or navigating a new location. What does concern her is the seeming inability to conquer jet lag, and the accompanying symptoms that leave her groggy, unfocused and feeling, she says, “like a dishrag.” “Jet lag has always been an issue for me,” says Ms. McGrath, who has been a business traveler for more than two decades and has dealt with itineraries that take her from New York to New Zealand to Helsinki to Hong Kong all within a matter of days. She has scoured the Internet for “jet lag cures,” and has tried preventing or dealing with the misery by avoiding alcohol, limiting light exposure or blasting her body with sunlight and “doing just about anything and everything that experts tell you to do,” Ms. McGrath said. “Jet lag is not conducive to the corporate environment,” she said. “There has to be some kind of help that actually works for those of us that travel a lot, but I sure can’t find it.” Although science is closer to understanding the basic biological mechanisms that make many travelers feel so miserable when crossing time zones, research has revealed that, at least for now, there is no one-size-fits-all recommendation for preventing or dealing with the angst of jet lag. Recommendations to beat jet lag include adjusting sleep schedules, short-term use of medications to sleep or stay awake, melatonin supplements and light exposure timing, among others, said Col. Ian Wedmore, an emergency medicine specialist for the Army. © 2015 The New York Times Company
Keyword: Biological Rhythms
Link ID: 21333 - Posted: 08.25.2015
By Hanae Armitage The libido enhancement drug flibanserin (trade name Addyi) took center stage last week after winning long-sought approval from the U.S. Food and Drug Administration (FDA). The coverage from advocates and nonbelievers has run the gamut—advice, caution, and criticism likely to confuse undecided—but curious—onlookers. But exactly how Addyi drums up sex drive is still murky. The drug has a long backstory. It was originally investigated in 1995 by pharmacologist Franco Borsini and a team of researchers at Boehringer Ingelheim Italia in Milan as an antidepressant because of its ability to regulate neurotransmitters—the brain’s chemical-signaling molecules. In particular, the team suspected that the drug regulated three key neurotransmitters thought to influence mood: serotonin, dopamine, and norepinephrine. A clinical trial found it did little to alleviate depression, but did seem to have an effect on mood. It just wasn’t the mood the researchers were expecting. These early trials tipped clinicians to flibanserin’s more prominent role in sexual health, as female subjects had higher scores on the Arizona Sexual Experience Scale, a survey that asks participants to rate their satisfaction on a variety of sexual health topics, like how often participants felt sexual desire and how intense that desire was. A separate group of researchers, also at Boehringer Ingelheim, completed their first clinical trials to explore flibanserin as a libido-enhancer in 2008. They measured levels of desire through a journal-based evaluation in which subjects recorded their levels of sexual drive on a daily basis. But FDA twice concluded that the resulting increases in libido were not statistically significant, and regulators were wary of potentially dangerous side effects like dizziness, sleepiness, nausea, and fainting. © 2015 American Association for the Advancement of Science
Keyword: Sexual Behavior
Link ID: 21332 - Posted: 08.25.2015
By NINA STROHMINGER and SHAUN NICHOLS WHEN does the deterioration of your brain rob you of your identity, and when does it not? Alzheimer’s, the neurodegenerative disease that erodes old memories and the ability to form new ones, has a reputation as a ruthless plunderer of selfhood. People with the disease may no longer seem like themselves. Neurodegenerative diseases that target the motor system, like amyotrophic lateral sclerosis, can lead to equally devastating consequences: difficulty moving, walking, speaking and eventually, swallowing and breathing. Yet they do not seem to threaten the fabric of selfhood in quite the same way. Memory, it seems, is central to identity. And indeed, many philosophers and psychologists have supposed as much. This idea is intuitive enough, for what captures our personal trajectory through life better than the vault of our recollections? But maybe this conventional wisdom is wrong. After all, the array of cognitive faculties affected by neurodegenerative diseases is vast: language, emotion, visual processing, personality, intelligence, moral behavior. Perhaps some of these play a role in securing a person’s identity. The challenge in trying to determine what parts of the mind contribute to personal identity is that each neurodegenerative disease can affect many cognitive systems, with the exact constellation of symptoms manifesting differently from one patient to the next. For instance, some Alzheimer’s patients experience only memory loss, whereas others also experience personality change or impaired visual recognition. The only way to tease apart which changes render someone unrecognizable is to compare all such symptoms, across multiple diseases. And that’s just what we did, in a study published this month in Psychological Science. © 2015 The New York Times Company
Link ID: 21331 - Posted: 08.24.2015
Jon Hamilton More than 50 million adults in the U.S. have a disorder such as insomnia, restless leg syndrome or sleep apnea, according to an Institute of Medicine report. And it's now clear that a lack of sleep "not only increases the risk of errors and accidents, it also has adverse effects on the body and brain," according to Charles Czeisler, chief of the division of sleep and circadian disorders at Brigham and Women's Hospital in Boston. Research in the past couple of decades has shown that a lack of sleep increases a person's risk for cardiovascular disease, diabetes, infections, and maybe even Alzheimer's disease. Yet most sleep disorders go untreated. Michael Arnott, of Cambridge, Massachusetts, says he used to have terrible trouble staying awake on long drives. Sleep specialists discovered he has obstructive sleep apnea, though not for the most common reasons — he isn't overweight, and doesn't smoke or take sedatives. "I would get groggy and feel like I've got to keep talking, open the window," Arnott says. His wife, Mary White, says being a passenger on those drives could be scary. "All of a sudden there'd be a change in the speed and I'd look over, and his eyes would be starting to close," she remembers. White thought her husband might have sleep apnea, which interferes with breathing. But Arnott was in denial. He figured he was free of most risk factors for apnea. He wasn't overweight, he didn't smoke or take sedatives, and he has always stayed in great shape. So his wife took the initiative. "I asked him to see a doctor and he wouldn't," she says. In 2012, though, White persuaded him to take part in a sleep research study that paid for his participation, and took place at a sleep lab in Boston, not too far from the couple's home in Cambridge. © 2015 NPR
Link ID: 21330 - Posted: 08.24.2015
Richard A. Friedman THANKS to Caitlyn Jenner, and the military’s changing policies, transgender people are gaining acceptance — and living in a bigger, more understanding spotlight than at any previous time. We’re learning to be more accepting of transgender individuals. And we’re learning more about gender identity, too. The prevailing narrative seems to be that gender is a social construct and that people can move between genders to arrive at their true identity. But if gender were nothing more than a social convention, why was it necessary for Caitlyn Jenner to undergo facial surgeries, take hormones and remove her body hair? The fact that some transgender individuals use hormone treatment and surgery to switch gender speaks to the inescapable biology at the heart of gender identity. This is not to suggest that gender identity is simply binary — male or female — or that gender identity is inflexible for everyone. Nor does it mean that conventional gender roles always feel right; the sheer number of people who experience varying degrees of mismatch between their preferred gender and their body makes this very clear. In fact, recent neuroscience research suggests that gender identity may exist on a spectrum and that gender dysphoria fits well within the range of human biological variation. For example, Georg S. Kranz at the Medical University of Vienna and colleagues elsewhere reported in a 2014 study in The Journal of Neuroscience that individuals who identified as transsexuals — those who wanted sex reassignment — had brain structures intermediate between those typical of their desired gender and those typical of their genetic sex. © 2015 The New York Times Company
Keyword: Sexual Behavior
Link ID: 21329 - Posted: 08.24.2015
By Kazi Stastna The U.S. approval of a pill to treat low libido in women has whipped up a whirlwind of debate and raised questions about whether the so-called female Viagra addresses the real reasons for lack of sexual desire. The U.S. Food and Drug Administration last week approved flibanserin, to be sold under the name Addyi starting in October, for the treatment of hypoactive sexual desire disorder (HSDD) among premenopausal women — some two decades after Viagra was approved for the treatment of male erectile dysfunction. Sprout Pharmaceuticals pitched flibanserin as a drug that would finally give women with sexual dysfunction similar treatment options to men and bused dozens of women to FDA hearings in Maryland to attest to its benefits and plead for its approval in what some saw as a heavy-handed and misleading public relations campaign. The FDA gave flibanserin the OK after twice rejecting it and despite concerns about its risks and modest efficacy because it said women suffering distress from low libido have an "unmet medical need." Days after it did, Canadian pharmaceutical company Valeant offered to buy Sprout for $1 billion US and said it will apply to get flibanserin approved in Canada and other countries. Although often likened to Viagra, flibanserin was created as an antidepressant and works on the brain while erectile dysfunction medications stimulate blood flow to the penis. Critics argue it's an ineffectual pharmacological solution for a problem better treated with relationship counselling, sex therapy and behavioural changes. 
"Their suffering is real, but the women who testified had a lot of different stories, and some of those stories were very good reasons for having low libido, including having six children, having a one-year-old, having had breast cancer treatment …," says Adriane Fugh-Berman, associate professor of pharmacology and physiology at Georgetown University in Washington, D.C., and director of PharmedOut, a pharmaceutical marketing watchdog group. ©2015 CBC/Radio-Canada.
Keyword: Sexual Behavior
Link ID: 21328 - Posted: 08.24.2015
Mo Costandi The human brain can be compared to something like a big, bustling city. It has workers, the neurons and glial cells which co-operate with each other to process information; it has offices, the clusters of cells that work together to achieve specific tasks; it has highways, the fibre bundles that transfer information across long distances; and it has centralised hubs, the densely interconnected nodes that integrate information from its distributed networks. Like any big city, the brain also produces large amounts of waste products, which have to be cleared away so that they do not clog up its delicate moving parts. Until very recently, though, we knew very little about how this happens. The brain’s waste disposal system has now been identified. We now know that it operates while we sleep at night, just like the waste collectors in most big cities, and the latest research suggests that certain sleeping positions might make it more efficient. Waste from the rest of the body is cleared away by the lymphatic system, which makes and transports a fluid called lymph. The lymphatic system is an important component of the immune system. Lymph contains white blood cells that can kill microbes and mop up their remains and other cellular debris. It is carried in branching vessels to every organ and body part, and passes through them, via the spaces between their cells, picking up waste materials. It is then drained, filtered, and recirculated. The brain was thought to lack lymphatic vessels altogether, and so its waste disposal system proved to be far more elusive. Several years ago, however, Maiken Nedergaard of the University of Rochester Medical Center and colleagues identified a system of hydraulic “pipes” running alongside blood vessels in the mouse brain. 
Using in vivo two-photon imaging to trace the movements of fluorescent markers, they showed that these vessels carry cerebrospinal fluid around the brain, and that the fluid enters inter-cellular spaces in the brain tissue, picking up waste on its way. © 2015 Guardian News and Media Limited
Link ID: 21327 - Posted: 08.22.2015
By Gretchen Vogel Researchers may have finally explained how an obesity-promoting gene variant induces some people to put on the pounds. Using state-of-the-art DNA editing tools, they have identified a genetic switch that helps govern the body’s metabolism. The switch controls whether common fat cells burn energy rather than store it as fat. The finding suggests the tantalizing prospect that doctors might someday offer a gene therapy to melt extra fat away. Along with calories and exercise, genes influence a person’s tendency to gain—and keep—extra pounds. One of the genes with the strongest link to obesity is called FTO. People with certain versions of the gene are several kilos heavier on average and significantly more likely to be obese. Despite years of study, no one had been able to figure out what the gene does in cells or how it influences weight. There was some evidence FTO helped control other genes, but it was unclear which ones. Some researchers had looked for activity of FTO in various tissues, without finding any clear signals. Melina Claussnitzer, Manolis Kellis, and their colleagues at Harvard University, Massachusetts Institute of Technology, and the Broad Institute in Cambridge, turned to data from the Roadmap Epigenomics Project, an 8-year effort that identified the chemical tags on DNA that influence the function of genes. The researchers used those epigenetic tags to look at whether FTO was turned on or off in 127 cell types. The gene seemed to be active in developing fat cells called adipocyte progenitor cells. © 2015 American Association for the Advancement of Science
Helen Thomson Genetic changes stemming from the trauma suffered by Holocaust survivors are capable of being passed on to their children, the clearest sign yet that one person’s life experience can affect subsequent generations. The conclusion from a research team at New York’s Mount Sinai hospital led by Rachel Yehuda stems from the genetic study of 32 Jewish men and women who had either been interned in a Nazi concentration camp, witnessed or experienced torture or who had had to hide during the second world war. They also analysed the genes of their children, who are known to have increased likelihood of stress disorders, and compared the results with Jewish families who were living outside of Europe during the war. “The gene changes in the children could only be attributed to Holocaust exposure in the parents,” said Yehuda. Her team’s work is the clearest example in humans of the transmission of trauma to a child via what is called “epigenetic inheritance” - the idea that environmental influences such as smoking, diet and stress can affect the genes of your children and possibly even grandchildren. The idea is controversial, as scientific convention states that genes contained in DNA are the only way to transmit biological information between generations. However, our genes are modified by the environment all the time, through chemical tags that attach themselves to our DNA, switching genes on and off. Recent studies suggest that some of these tags might somehow be passed through generations, meaning our environment could have an impact on our children’s health. © 2015 Guardian News and Media Limited
By Christian Jarrett If we’re being honest, most of us have at least some selfish aims – to make money, to win a promotion at work, and so on. But importantly, we pursue these goals while at the same time conforming to basic rules of decency. For example, if somebody helps us out, we’ll reciprocate, even if doing so costs us time or cash. Yet there is a minority of people out there who don’t play by these rules. These selfish individuals consider other people as mere tools to be leveraged in the pursuit of their aims. They think nothing of betrayal or backstabbing, and they basically believe everyone else is in it for themselves too. Psychologists call these people “Machiavellians,” and there’s a questionnaire that tests for this trait (one of the so-called “dark triad” of personality traits along with narcissism and psychopathy). People high in Machiavellianism are more likely to agree with statements like “It is wise to flatter important people” and “The best way to handle people is to tell them what they want to hear.” Calling them Machiavellian is too kind. These people are basically jerks. Now a team of Hungarian researchers from the University of Pécs has scanned the brains of high scorers on Machiavellianism while they played a simple game of trust. Reporting their results in the journal Brain and Cognition, the researchers said they found that Machiavellians’ brains went into overdrive when they encountered a partner who exhibited signs of being fair and cooperative. Why? Tamas Bereczkei and his team say it’s because the Machiavellians are immediately figuring out how to exploit the situation for their own gain. The game involved four stages and the student participants — a mix of high and low scorers on Machiavellianism — played several times with different partners.
First, the participants were given roughly $5 worth of Hungarian currency and had to decide how much to “invest” in their partner. Any money they invested was always tripled as it passed to their partner. © 2015, New York Media LLC.
Link ID: 21324 - Posted: 08.22.2015
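The trust-game payoffs described in the article can be sketched in a few lines. This is a minimal illustration: the roughly $5 endowment and the tripling rule come from the article, while the `return_fraction` parameter and the example payoffs are hypothetical assumptions added for clarity, not part of the study.

```python
def trust_game(endowment, invested, return_fraction):
    """Compute (investor, trustee) payoffs for one round of a trust game.

    The amount invested is tripled in transit; the trustee then sends
    back return_fraction of what they received.
    """
    assert 0 <= invested <= endowment
    received = 3 * invested               # investment triples on the way over
    returned = return_fraction * received # trustee's voluntary repayment
    investor_payoff = endowment - invested + returned
    trustee_payoff = received - returned
    return investor_payoff, trustee_payoff

# A fully trusting investor facing a fair (50/50) partner:
print(trust_game(5.0, 5.0, 0.5))  # -> (7.5, 7.5)
# ...versus an exploitative partner who returns nothing:
print(trust_game(5.0, 5.0, 0.0))  # -> (0.0, 15.0)
```

The asymmetry the second case exposes — the trustee can keep the entire tripled pot — is exactly the exploitation opportunity that, per the study's interpretation, Machiavellian players are computing when they meet a cooperative partner.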
By Catherine Saint Louis People who work 55 hours or more per week have a 33 percent greater risk of stroke and a 13 percent greater risk of coronary heart disease than those working standard hours, researchers reported on Wednesday in The Lancet. The new analysis includes data on more than 600,000 individuals in Europe, the United States and Australia, and is the largest study thus far of the relationship between working hours and cardiovascular health. But the analysis was not designed to draw conclusions about what caused the increased risk and could not account for all relevant confounding factors. “Earlier studies have pointed to heart attacks as a risk of long working hours, but not stroke,” said Dr. Urban Janlert, a professor of public health at Umea University in Sweden, who wrote an accompanying editorial. “That’s surprising.” Mika Kivimaki, a professor of epidemiology at University College London, and his colleagues combined the results of multiple studies and tried to account for factors that might skew the results. In addition to culling data from published studies, the researchers also compiled unpublished information from public databases and asked authors of previous work for additional data. Dr. Steven Nissen, the chief of cardiovascular medicine at the Cleveland Clinic, found the methodology unconvincing. “It’s based upon exclusively observational studies, many of which were unpublished,” and some never peer-reviewed, he said. Seventeen studies of stroke included 528,908 men and women who were tracked for an average of 7.2 years. Some 1,722 nonfatal and deadly strokes were recorded. After controlling for smoking, physical activity and high blood pressure and cholesterol, the researchers found a one-third greater risk of stroke among those workers who reported logging 55 or more hours weekly, compared with those who reported working the standard 35 to 40 hours. © 2015 The New York Times Company
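The "33 percent greater risk" figure above is a relative risk: the stroke rate in long-hours workers divided by the rate in standard-hours workers. The sketch below illustrates the calculation; the event counts are made up for the example (chosen to yield RR ≈ 1.33) and are not taken from the Lancet data.

```python
def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Risk in the exposed group divided by risk in the unexposed group."""
    risk_exposed = events_exposed / n_exposed
    risk_unexposed = events_unexposed / n_unexposed
    return risk_exposed / risk_unexposed

# Hypothetical counts: 40 strokes per 10,000 long-hours workers versus
# 30 per 10,000 standard-hours workers gives a relative risk of about
# 1.33, i.e. the "one-third greater risk" reported in the meta-analysis.
print(round(relative_risk(40, 10_000, 30, 10_000), 2))  # -> 1.33
```

Note that a relative risk says nothing about absolute risk: if strokes are rare in both groups, a 33 percent relative increase can still be a small absolute difference, which is one reason observational findings like these are interpreted cautiously.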
“Almost fully-formed brain grown in a lab.” “Woah: Scientists grow first nearly fully-formed human brain.” “Boffins raise five-week-old fetal human brain in the lab for experimentation.” On Tuesday, all the above appeared as headlines for one particular story. What was it all about? Mini-brains 3 to 4 millimetres across have been grown in the lab before, but if a larger brain had been created – and the press release publicising the claim said it was the size of a pencil eraser – that would be a major breakthrough. New Scientist investigated the claims. The announcement was made by Rene Anand, a neuroscientist at Ohio State University in Columbus, at a military health research meeting in Florida. Anand says he has grown a brain – complete with a cortex, midbrain and brainstem – in a dish, comparable in maturity to that of a fetus aged 5 weeks. Anand and his colleague Susan McKay started with human skin cells, which they turned into induced pluripotent stem cells (iPSCs) using a tried-and-tested method. By applying an undisclosed technique, for which a patent has been applied, the pair say they were able to encourage these stem cells to form a brain. “We are replicating normal development,” says Anand. He says they hope to be able to create miniature models of brains experiencing a range of diseases, such as Parkinson’s and Alzheimer’s. But not everyone is convinced, especially as Anand hasn’t published his results. Scientists we sent Anand’s poster presentation to said that although the team has indeed grown some kind of miniature collection of cells, or “organoid”, in a dish, the structure isn’t much like a fetal brain. © Copyright Reed Business Information Ltd.
Keyword: Development of the Brain
Link ID: 21322 - Posted: 08.22.2015
Tina Hesman Saey Researchers have discovered a “genetic switch” that determines whether people will burn extra calories or save them as fat. A genetic variant tightly linked to obesity causes fat-producing cells to become energy-storing white fat cells instead of energy-burning beige fat, researchers report online August 19 in the New England Journal of Medicine. Previously scientists thought that the variant, in a gene known as FTO (originally called fatso), worked in the brain to increase appetite. The new work shows that the FTO gene itself has nothing to do with obesity, says coauthor Manolis Kellis, a computational biologist at MIT and the Broad Institute. But the work may point to a new way to control body fat. In humans and many other organisms, genes are interrupted by stretches of DNA known as introns. Kellis and Melina Claussnitzer of Harvard Medical School and colleagues discovered that a genetic variant linked to increased risk of obesity affects one of the introns in the FTO gene. It does not change the protein produced from the FTO gene or change the gene’s activity. Instead, the variant doubles the activity of two genes, IRX3 and IRX5, which are involved in determining which kind of fat cells will be produced. FTO’s intron is an enhancer, a stretch of DNA needed to control activity of far-away genes, the researchers discovered. Normally, a protein called ARID5B squats on the enhancer and prevents it from dialing up activity of the fat-determining genes. In fat cells of people who have the obesity-risk variant, ARID5B can’t do its job and the IRX genes crank up production of energy-storing white fat. © Society for Science & the Public 2000 - 2015.
By Gretchen Reynolds Sticking to a diet requires self-control and a willingness to forgo present pleasures for future benefits. Not surprisingly, almost everyone yields to temptation at least sometimes, opting for the cookie instead of the apple. Wondering why we so often override our resolve, scientists at the Laboratory for Social and Neural Systems Research at the University of Zurich recently considered the role of stress, which is linked to a variety of health problems, including weight gain. (There’s something to the rom-com cliché of the jilted lover eating ice cream directly from the carton.) But just how stress might drive us to sweets has not been altogether clear. It turns out that even mild stress may immediately alter the workings of our brains in ways that undermine willpower. For their study, published this month in Neuron, researchers recruited 51 young men who said they were trying to maintain a healthy diet and lifestyle. The men were divided into two groups, one of which served as a control, and then all were asked to skim through images of different kinds of food on a computer screen, rating them for taste and healthfulness. Next, the men in the experimental group were told to plunge a hand into a bowl of icy water for as long as they could, a test known to induce mild physiological and psychological stress. Relative to the control group, the men developed higher levels of cortisol, a stress hormone. After that, men from each group sat in a brain-scanning machine and watched pictures of paired foods flash across a screen. Generally, one of the two foods was more healthful than the other. The subjects were asked to click rapidly on which food they would choose to eat, knowing that at the end of the test they would actually be expected to eat one of these picks (chosen at random from all of their choices). © 2015 The New York Times Company
Bill McQuay The natural world is abuzz with the sound of animals communicating — crickets, birds, even grunting fish. But scientists learning to decode these sounds say the secret signals of African elephants — their deepest rumblings — are among the most intriguing calls any animal makes. Katy Payne, the same biologist who recognized song in the calls of humpback whales in the 1960s, went on to help create the Elephant Listening Project in the Central African Republic in the 1980s. At the time, Payne's team was living in shacks in a dense jungle inhabited by hundreds of rare forest elephants. That's where one of us — Bill McQuay — first encountered the roar of an elephant in 2002, while reporting a story for an NPR-National Geographic collaboration called Radio Expeditions. Here's how Bill remembers that day in Africa: I was walking through this rainforest to an observation platform built up in a tree — out of the reach of the elephants. I climbed up onto the platform, a somewhat treacherous exercise with all my recording gear. Then I set up my recording equipment, put on the headphones, and started listening. That first elephant roar sounded close. But I was so focused on the settings on my recorder that I didn't bother to look around. The second roar sounded a lot closer. I thought, this is so cool! What I didn't realize was, there was this huge bull elephant standing right underneath me — pointing his trunk up at me, just a few feet away. Apparently he was making a "dominance display." © 2015 NPR
Helen Thomson Modafinil is the world’s first safe “smart drug”, researchers at Harvard and Oxford universities have said, after performing a comprehensive review of the drug. They concluded that the drug, which is prescribed for narcolepsy but is increasingly taken without prescription by healthy people, can improve decision-making, problem-solving and possibly even make people think more creatively. While acknowledging that there was limited information available on the effects of long-term use, the reviewers said that the drug appeared safe to take in the short term, with few side effects and no addictive qualities. Modafinil has become increasingly common in universities across Britain and the US. Prescribed in the UK as Provigil, it was licensed in 2002 for use as a treatment for narcolepsy - a brain disorder that can cause a person to suddenly fall asleep at inappropriate times or to experience chronic pervasive sleepiness and fatigue. Used without prescription, and bought through easy-to-find websites, modafinil is what is known as a smart drug - used primarily by people wanting to improve their focus before an exam. A poll of Nature journal readers suggested that one in five have used drugs to improve focus, with 44% stating modafinil as their drug of choice. But despite its increasing popularity, there has been little consensus on the extent of modafinil’s effects in healthy, non-sleep-disordered humans. A new review of 24 of the most recent modafinil studies suggests that the drug has many positive effects in healthy people, including enhancing attention, improving learning and memory and increasing something called “fluid intelligence” - essentially our capacity to solve problems and think creatively. © 2015 Guardian News and Media Limited
By Mitch Leslie

Some microbes that naturally dwell in our intestines might be bad for our eyes, triggering autoimmune uveitis, one of the leading causes of blindness. A new study suggests that certain gut residents produce proteins that enable destructive immune cells to enter the eyes. The idea that gut microbes might promote autoimmune uveitis "has been there in the back of our minds," says ocular immunologist Andrew Taylor of the Boston University School of Medicine, who wasn't connected to the research. "This is the first time that it's been shown that the gut flora seems to be part of the process."

As many as 400,000 people in the United States have autoimmune uveitis, in which T cells—the commanders of the immune system—invade the eye and damage its middle layer. All T cells are triggered by specific molecules called antigens, and for T cells that cause autoimmune uveitis, certain eye proteins are the antigens. Even healthy people carry these T cells, yet they don't usually swarm the eyes and unleash the disease. That's because they first have to be triggered by their matching antigen. However, those proteins don't normally leave the eye. So what could stimulate the T cells? One possible explanation is microbes in the gut.

In the new study, immunologist Rachel Caspi of the National Eye Institute in Bethesda, Maryland, and colleagues genetically engineered mice so their T cells recognized one of the same eye proteins targeted in autoimmune uveitis. The rodents developed the disease around the time they were weaned. But dosing the animals with four antibiotics that killed off most of their gut microbes delayed the onset and reduced the severity of the disease.

© 2015 American Association for the Advancement of Science.
Helen Thomson

An almost fully formed human brain has been grown in a lab for the first time, claim scientists from Ohio State University. The team behind the feat hope the brain could transform our understanding of neurological disease.

Though not conscious, the miniature brain, which resembles that of a five-week-old foetus, could potentially be useful for scientists who want to study the progression of developmental diseases. It could also be used to test drugs for conditions such as Alzheimer's and Parkinson's, since the regions they affect are in place during an early stage of brain development.

The brain, which is about the size of a pencil eraser, is engineered from adult human skin cells and is the most complete human brain model yet developed, claimed Rene Anand of Ohio State University, Columbus, who presented the work today at the Military Health System Research Symposium in Fort Lauderdale, Florida. Previous attempts at growing whole brains have at best achieved mini-organs that resemble those of nine-week-old foetuses, although these "cerebral organoids" were not complete and only contained certain aspects of the brain. "We have grown the entire brain from the get-go," said Anand.

Anand and his colleagues claim to have reproduced 99% of the brain's diverse cell types and genes. They say their brain also contains a spinal cord, signalling circuitry and even a retina. The ethical concerns were non-existent, said Anand. "We don't have any sensory stimuli entering the brain. This brain is not thinking in any way."

© 2015 Guardian News and Media Limited
Keyword: Development of the Brain
Link ID: 21316 - Posted: 08.19.2015