Most Recent Links
by Anil Ananthaswamy HOLD that thought. When it comes to consciousness, the brain may be doing just that. It now seems that conscious perception requires brain activity to hold steady for hundreds of milliseconds. This signature in the pattern of brainwaves can be used to distinguish between levels of impaired consciousness in people with brain injury. The new study by Aaron Schurger at the Swiss Federal Institute of Technology in Lausanne doesn't explain the so-called "hard problem of consciousness" – how roughly a kilogram of nerve cells is responsible for the miasma of sensations, thoughts and emotions that make up our mental experience. However, it does chip away at it, and support the idea that it may one day be explained in terms of how the brain processes information. Neuroscientists think that consciousness requires neurons to fire in such a way that they produce a stable pattern of brain activity. The exact pattern will depend on what the sensory information is, but once information has been processed, the idea is that the brain should hold a pattern steady for a short period of time – almost as if it needs a moment to read out the information. In 2009, Schurger tested this theory by scanning 12 people's brains with fMRI machines. The volunteers were shown two images simultaneously, one for each eye. One eye saw a red-on-green line drawing and the other eye saw green-on-red. This confusion caused the volunteers to sometimes consciously perceive the drawing and sometimes not. © Copyright Reed Business Information Ltd.
By Ariana Eunjung Cha Autism has always been a tricky disorder to diagnose. There’s no such thing as a blood test, cheek swab or other accepted biological marker, so specialists must depend on parent and teacher reports, observations and play assessments. Figuring out a child's trajectory once he or she is diagnosed is just as challenging. The spectrum is wide: some children are destined to be on the mild end and be very talkative, sometimes almost indistinguishable in some settings from those without the disorder, while others will suffer from a more severe form and struggle to speak even basic words. Now scientists believe that they have a way to distinguish between those paths, at least in terms of language ability, in the toddler years using brain imaging. In an article published Thursday in the journal Neuron, scientists at the University of California-San Diego report that children with autism spectrum disorder, or ASD, with good language outcomes have strikingly distinct patterns of brain activation as compared to those with poor language outcomes and typically developing toddlers. "Why some toddlers with ASD get better and develop good language and others do not has been a mystery that is of the utmost importance to solve," Eric Courchesne, one of the study’s authors and co-director of the University of California-San Diego's Autism Center, said in a statement. The images of the children in the study -- MRIs of the brain -- were taken at 12 to 29 months while their language was assessed one to two years later at 30 to 48 months.
Mo Costandi In 2009, researchers at the University of California, Santa Barbara performed a curious experiment. In many ways, it was routine — they placed a subject in the brain scanner, displayed some images, and monitored how the subject's brain responded. The measured brain activity showed up on the scans as red hot spots, like many other neuroimaging studies. Except that this time, the subject was an Atlantic salmon, and it was dead. Dead fish do not normally exhibit any kind of brain activity, of course. The study was a tongue-in-cheek reminder of the problems with brain scanning studies. Those colorful images of the human brain found in virtually all news media may have captivated the imagination of the public, but they have also been the subject of controversy among scientists over the past decade or so. In fact, neuroimagers are now debating how reliable brain scanning studies actually are, and are still mostly in the dark about exactly what it means when they see some part of the brain "light up." Functional magnetic resonance imaging (fMRI) measures brain activity indirectly by detecting changes in the flow of oxygen-rich blood, or the blood oxygen-level dependent (BOLD) signal, with its powerful magnets. The assumption is that areas receiving an extra supply of blood during a task have become more active. Typically, researchers home in on one or a few "regions of interest," using 'voxels,' tiny cube-shaped chunks of brain tissue containing several million neurons, as their units of measurement.
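The dead-salmon result comes down to multiple comparisons: test tens of thousands of voxels at an uncorrected p < 0.05 and hundreds will "light up" by chance. A hypothetical simulation (not data from any study; all numbers invented) makes the point:

```python
import numpy as np

# A hypothetical illustration: scan 10,000 voxels that contain nothing
# but noise -- a "dead salmon" -- across 20 scans, and run an
# uncorrected significance test at every voxel.
rng = np.random.default_rng(0)
n_scans, n_voxels = 20, 10_000
data = rng.normal(0.0, 1.0, size=(n_scans, n_voxels))

# One-sample t statistic per voxel (is the mean signal nonzero?).
means = data.mean(axis=0)
sems = data.std(axis=0, ddof=1) / np.sqrt(n_scans)
t = means / sems

# Uncorrected threshold: |t| > 2.093 is roughly p < 0.05 at df = 19,
# so about 5% of pure-noise voxels exceed it by chance alone.
uncorrected_hits = int((np.abs(t) > 2.093).sum())
print(uncorrected_hits)   # on the order of 500 false positives

# A Bonferroni-style correction divides alpha by the number of tests;
# the per-voxel threshold becomes roughly |t| > 7 (alpha = 0.05/10,000
# at df = 19), and the chance hits all but vanish.
corrected_hits = int((np.abs(t) > 7.0).sum())
print(corrected_hits)     # almost certainly 0
```

This is exactly why the salmon study was influential: without correcting for the number of voxels tested, "activity" in a dead fish is a statistical inevitability, not a miracle.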
Keyword: Brain imaging
Link ID: 20775 - Posted: 04.10.2015
By Megan Griffith-Greene The idea of playing a game to make you sharper seems like a no-brainer. That's the thinking behind a billion-dollar industry selling brain training games and programs designed to boost cognitive ability. But an investigation by CBC's Marketplace reveals that brain training games such as Lumosity may not make your brain perform better in everyday life. Almost 15 per cent of Canadians over the age of 65 are affected by some kind of dementia. And many people of all ages are worried about maintaining their brain health and possibly preventing a decline in their mental abilities. "I don't think there's anything to say that you can train your brain to be cognitively better in the way that we know that we can train our bodies to be physically better," neuroscientist Adrian Owen told Marketplace co-host Tom Harrington. To test how effective the games are at improving cognitive function, Marketplace partnered with Owen, who holds the Canada Excellence Research Chair in Cognitive Neuroscience and Imaging at the Brain and Mind Institute at Western University. A group of 54 adults, including Harrington, did the brain training at least three times per week for 15 minutes or more over a period of between two and a half and four weeks. The group underwent a complete cognitive assessment at the beginning and end of the training to see if there had been any change as a result of the training program. ©2015 CBC/Radio-Canada.
Jordan Gaines Lewis Hodor hodor hodor. Hodor hodor? Hodor. Hodor-hodor. Hodor! Oh, um, excuse me. Did you catch what I said? Fans of the hit HBO show Game of Thrones, the fifth season of which premieres this Sunday, know what I’m referencing, anyway. Hodor is the brawny, simple-minded stableboy of the Stark family in Winterfell. His defining characteristic, of course, is that he only speaks a single word: “Hodor.” But those who read the A Song of Ice and Fire book series by George R R Martin may know something that the TV fans don’t: his name isn’t actually Hodor. According to his great-grandmother Old Nan, his real name is Walder. “No one knew where ‘Hodor’ had come from,” she says, “but when he started saying it, they started calling him by it. It was the only word he had.” Whether he intended it or not, Martin created a character who is a textbook example of someone with a neurological condition called expressive aphasia. In 1861, French physician Paul Broca was introduced to a man named Louis-Victor Leborgne. While his comprehension and mental functioning remained relatively normal, Leborgne progressively lost the ability to produce meaningful speech over a period of 20 years. Like Hodor, the man was nicknamed Tan because he only spoke a single word: “Tan.”
Link ID: 20773 - Posted: 04.10.2015
|By Gareth Cook The wait has been long, but the discipline of neuroscience has finally delivered a full-length treatment of the zombie phenomenon. In their book, Do Zombies Dream of Undead Sheep?, scientists Timothy Verstynen and Bradley Voytek cover just about everything you might want to know about the brains of the undead. It's all good fun, and if you learn some serious neuroscience along the way, well, that's fine with them, too. Voytek answered questions from contributing editor Gareth Cook. How is it that you and your co-author came to write a book about zombies? Clearly, it is an urgent public health threat, but I would not have expected a book from neuroscientists on the topic. Indeed! You think you're prepared for the zombie apocalypse and then—BAM!—it happens, and only then do you realize how poorly prepared you really were. Truly the global concern of our time. Anyway, this whole silly thing started when Tim and I would get together to watch zombie movies with our wives and friends. Turns out when you get some neuroscientists together to watch zombie movies, after a few beers they start to diagnose them and mentally dissect their brains. Back in the summer of 2010 zombie enthusiast and author—and head of the Zombie Research Society—Matt Mogk got in touch with me to see if we were interested in doing something at the intersection of zombies and neuroscience. © 2015 Scientific American
Link ID: 20772 - Posted: 04.10.2015
Cari Romm “As humans, we can identify galaxies light-years away. We can study particles smaller than an atom,” President Barack Obama said in April 2013, “But we still haven’t unlocked the mystery of the three pounds of matter that sits between our ears.” The observation was part of the president’s announcement of the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative, an effort to fast-track the development of new technology that will help scientists understand the workings of the human brain and its diseases. With progress, though, comes a whole new set of ethical questions. Can drugs used to treat conditions like ADHD, for example, also be used to make healthy people into sharper, more focused versions of themselves—and should they? Can a person with Alzheimer’s truly consent to testing that may help scientists better understand their disease? Can brain scans submitted as courtroom evidence reveal anything about a defendant’s intent? To address these questions, the Presidential Commission for the Study of Bioethical Issues, an independent advisory group, recently released the second volume of a report examining the issues that may arise as neuroscience advances. The commission outlined three areas it deemed particularly fraught: cognitive enhancement, consent, and the use of neuroscience in the legal system. © 2015 by The Atlantic Monthly Group
|By Simon Makin People can control prosthetic limbs, computer programs and even remote-controlled helicopters with their mind, all by using brain-computer interfaces. What if we could harness this technology to control things happening inside our own body? A team of bioengineers in Switzerland has taken the first step toward this cyborglike setup by combining a brain-computer interface with a synthetic biological implant, allowing a genetic switch to be operated by brain activity. It is the world's first brain-gene interface. The group started with a typical brain-computer interface, an electrode cap that can register subjects' brain activity and transmit signals to another electronic device. In this case, the device is an electromagnetic field generator; different types of brain activity cause the field to vary in strength. The next step, however, is totally new—the experimenters used the electromagnetic field to trigger protein production within human cells in an implant in mice. The implant uses a cutting-edge technology known as optogenetics. The researchers inserted bacterial genes into human kidney cells, causing them to produce light-sensitive proteins. Then they bioengineered the cells so that stimulating them with light triggers a string of molecular reactions that ultimately produces a protein called secreted alkaline phosphatase (SEAP), which is easily detectable. They then placed the human cells plus an LED light into small plastic pouches and inserted them under the skin of several mice. © 2015 Scientific American
By Jonathan Webb Science reporter, BBC News A study of blind crustaceans living in deep, dark caves has revealed that evolution is rapidly withering the visual parts of their brain. The findings catch evolution in the act of making this adjustment - as none of the critters have eyes, but some of them still have stumpy eye-stalks. Three different species were studied, each representing a different subgroup within the same class of crustaceans. The research is published in the journal BMC Neuroscience. The class of "malacostracans" also includes much better-known animals like lobsters, shrimps and wood lice, but this study focussed on three tiny and obscure examples that were only discovered in the 20th Century. It is the first investigation of these mysterious animals' brains. "We studied three species. All of them live in caves, and all of them are very rare or hardly accessible," said lead author Dr Martin Stegner, from the University of Rostock in Germany. Specifically, his colleagues retrieved the specimens from the coast of Bermuda, from Table Mountain in South Africa, and from Monte Argentario in Italy. The animals were preserved rather than living, so the team could not observe their tiny brains in action. But by looking at the physical shape of the brain, and making comparisons with what we know about how the brain works in their evolutionary relatives, the researchers were able to assign jobs to the various lobes, lumps and spindly structures they could see under the microscope. © 2015 BBC.
Tom Bawden Scientists have deciphered the secrets of gibbon “speech” – discovering that the apes are sophisticated communicators employing a range of more than 450 different calls to talk to their companions. The research is so significant that it could provide clues on the evolution of human speech and also suggests that other animal species could speak a more precise language than has been previously thought, according to lead author Dr Esther Clarke of Durham University. Her study found that gibbons produce different categories of “hoo” calls – relatively quiet sounds that are distinct from their more melodic “song” calls. These categories of call allow the animals to distinguish when their fellow gibbons are foraging for food, alerting them to distant noises or warning others about the presence of predators. In addition, Dr Clarke found that each category of “hoo” call can be broken down further, allowing gibbons to be even more specific in their communication. A warning about lurking raptor birds, for example, sounds different to one about pythons or clouded leopards – being pitched at a particularly low frequency to ensure it is too deep for the birds of prey to hear. The warning call denoting the presence of tigers and leopards is the same because they belong to the same class of big cats, the research found. © independent.co.uk
Do Alcoholics Anonymous participants do better at abstinence than nonparticipants because they are more motivated? Or is it because of something inherent in the A.A. program? How researchers answered these questions in a recent study offers insight into the challenges of evidence-based medicine and evidence-informed policy. The study, published in the journal Alcoholism: Clinical and Experimental Research, teased apart a treatment effect (improvement due to A.A. itself) and a selection effect (driven by the type of people who seek help). The investigators found that there is a genuine A.A. treatment effect. Going to an additional two A.A. meetings per week produced at least three more days of alcohol abstinence per month. Separating treatment from selection effects is a longstanding problem in social and medical science. Their entanglement is one of the fundamental ways in which evidence of correlation fails to be a sign of causation. For many years, researchers and clinicians have debated whether the association of A.A. with greater abstinence was caused by treatment or a correlation that arises from the type of people who seek it. Such confounding is often addressed with an experiment in which individuals are randomly assigned to either a treatment or a nontreatment (or control) group in order to remove the possibility of self-selection. The treatment effect is calculated by comparing outcomes obtained by participants in each group. Several studies of A.A. have applied this approach. For instance, Kimberly Walitzer, Kurt Dermen and Christopher Barrick randomized alcoholics to receive treatment that strongly encouraged and supported A.A. participation or a control group. The former exhibited a greater degree of abstinence. In an ideal randomized controlled trial (R.C.T.), everyone selected for treatment receives it and no one in the control group does. The difference in outcomes is the treatment effect, free of bias from selection. That’s the ideal. 
However, in practice, randomized controlled trials can still suffer selection problems. © 2015 The New York Times Company
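The entanglement of treatment and selection effects can be made concrete with a hypothetical simulation (all numbers invented, not from the study): suppose motivation raises abstinence on its own and also drives who chooses to attend A.A. A naive observational comparison then mixes the selection effect into the treatment effect, while random assignment recovers the true effect.

```python
import random

# Hypothetical model: each person has a latent "motivation" that adds
# to their abstinent days AND, in the observational case, determines
# whether they self-select into A.A. The assumed true treatment
# effect is +3 abstinent days per month.
random.seed(1)
N = 100_000
TRUE_EFFECT = 3.0

def mean_gap(randomized):
    """Mean abstinent days per month: attendees minus non-attendees."""
    attend, control = [], []
    for _ in range(N):
        motivation = random.gauss(0.0, 1.0)
        if randomized:
            attends = random.random() < 0.5    # coin-flip assignment
        else:
            attends = motivation > 0.0         # motivated people self-select
        days = 10.0 + 2.0 * motivation + random.gauss(0.0, 1.0)
        if attends:
            attend.append(days + TRUE_EFFECT)
        else:
            control.append(days)
    return sum(attend) / len(attend) - sum(control) / len(control)

naive_gap = mean_gap(randomized=False)  # treatment + selection, ~6 days
rct_gap = mean_gap(randomized=True)     # ~3 days: the true effect alone
print(round(naive_gap, 1), round(rct_gap, 1))
```

In this sketch the observational gap roughly doubles the true effect, because attendees were more motivated to begin with; randomization breaks the link between motivation and attendance, which is exactly what an R.C.T. buys.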
Keyword: Drug Abuse
Link ID: 20767 - Posted: 04.08.2015
By ERICA GOODE He was described, in the immediate aftermath of the Germanwings crash, as a cheerful and careful pilot, a young man who had dreamed of flying since boyhood. But in the days since, it has seemed increasingly clear that Andreas Lubitz, 27, the plane’s co-pilot, was something far more sinister: the perpetrator of one of the worst mass murder-suicides in history. If what researchers have learned about such crimes is any indication, this notoriety may have been just what Mr. Lubitz wanted. The actions now attributed to Mr. Lubitz — taking 149 unsuspecting people with him to a horrifying death — seem in some ways unfathomable, and his motives may never be fully understood. But studies over the last decades have begun to piece together characteristics that many who carry out such violence seem to share, among them a towering narcissism, a strong sense of grievance and a desire for infamy. Adam Lankford, an associate professor of criminal justice at the University of Alabama, said that in his research on mass killers who also took their own lives, he has found “a significant number of cases where they mention a desire for fame, glory or attention as a motive.” Before Adam Lanza, 20, the Sandy Hook Elementary School shooter, killed 20 children, six adults and himself in 2012, he wrote in an online forum, “Just look at how many fans you can find for all different types of mass murderers.” Robert Hawkins, 19, who committed suicide after killing eight people at a shopping mall in Omaha in 2007, left a note saying “I’m gonna be famous,” punctuating the sentence with an expletive. And Dylan Klebold, 17, of Columbine High School fame, bragged that the goal was to cause “the most deaths in U.S. history…we’re hoping. We’re hoping.” “Directors will be fighting over this story,” Mr. Klebold said in a video made before the massacre. © 2015 The New York Times Company
Arran Frood A psychedelic drink used for centuries in healing ceremonies is now attracting the attention of biomedical scientists as a possible treatment for depression. Researchers from Brazil last month published results from the first clinical test of a potential therapeutic benefit for ayahuasca, a South American plant-based brew. Although the study included just six volunteers and no placebo group, the scientists say that the drink began to reduce depression in patients within hours, and the effect was still present after three weeks. They are now conducting larger studies that they hope will shore up their findings. The work forms part of a renaissance in studying the potential therapeutic benefits of psychedelic or recreational drugs — research that was largely banned or restricted worldwide half a century ago. Ketamine, which is used medically as an anaesthetic, has shown promise as a fast-acting antidepressant; psilocybin, a hallucinogen found in ‘magic mushrooms’, can help to alleviate anxiety in patients with advanced-stage cancer; MDMA (ecstasy) can alleviate post-traumatic stress disorder; and patients who experience debilitating cluster headaches have reported that LSD eases their symptoms. Ayahuasca, a sacramental drink traditionally brewed from the bark of a jungle vine (Banisteriopsis caapi) and the leaves of a shrub (Psychotria viridis), contains ingredients that are illegal in most countries. But a booming ayahuasca industry has developed in South America, where its religious use is allowed, and where thousands of people each year head to rainforest retreats to sample its intense psychedelic insights. © 2015 Nature Publishing Group,
By Kate Galbraith Most evenings, before watching late-night comedy or reading emails on his phone, Matt Nicoletti puts on a pair of orange-colored glasses that he bought for $8 off the Internet. “My girlfriend thinks I look ridiculous in them,” he said. But Mr. Nicoletti, a 30-year-old hospitality consultant in Denver, insists that the glasses, which can block certain wavelengths of light emitted by electronic screens, make it easier to sleep. Studies have shown that such light, especially from the blue part of the spectrum, inhibits the body’s production of melatonin, a hormone that helps people fall asleep. Options are growing for blocking blue light, though experts caution that few have been adequately tested for effectiveness and the best solution remains avoiding brightly lit electronics at night. A Swiss study of 13 teenage boys, published in August in The Journal of Adolescent Health, showed that when the boys donned orange-tinted glasses, also known as blue blockers and shown to prevent melatonin suppression, in the evening for a week, they felt “significantly more sleepy” than when they wore clear glasses. The boys looked at their screens, as teenagers tend to do, for at least a few hours on average before going to bed, and were monitored in the lab. Older adults may be less affected by blue light, experts say, since the yellowing of the lens and other changes in the aging eye filter out increasing amounts of blue light. But blue light remains a problem for most people, and an earlier study of 20 adults ages 18 to 68 found that those who wore amber-tinted glasses for three hours before bed improved their sleep quality considerably relative to a control group that wore yellow-tinted lenses, which blocked only ultraviolet light. Devices such as smartphones and tablets are often illuminated by light-emitting diodes, or LEDs, that tend to emit more blue light than incandescent products. 
Televisions with LED backlighting are another source of blue light, though because they are typically viewed from much farther away than small screens like phones, they may have less of an effect, said Debra Skene, a professor of neuroendocrinology at the University of Surrey in England. © 2015 The New York Times Company
By Jan Hoffman As adults age, vision deteriorates. One common type of decline is in contrast sensitivity, the ability to distinguish gradations of light to dark, making it possible to discern where one object ends and another begins. When an older adult descends a flight of stairs, for example, she may not be able to tell the edge of one step from the next, so she stumbles. At night, an older driver may squint to see the edge of white road stripes on blacktop. Caught in the glare of headlights, he swerves. But new research suggests that contrast sensitivity can be improved with brain-training exercises. In a study published last month in Psychological Science, researchers at the University of California, Riverside, and Brown University showed that after just five sessions of behavioral exercises, the vision of 16 people in their 60s and 70s significantly improved. After the training, the adults could make out edges far better. And when given a standard eye chart, a task that differed from the one they were trained on, they could correctly identify more letters. “There’s an idea out there that everything falls apart as we get older, but even older brains are growing new cells,” said Allison B. Sekuler, a professor of psychology, neuroscience and behavior at McMaster University in Ontario, who was not involved in the new study. “You can teach an older brain new tricks.” The training improved contrast sensitivity in 16 young adults in the study as well, although the older subjects showed greater gains. That is partly because the younger ones, college students, already had reasonably healthy vision and there was not as much room for improvement. Before the training, the vision of each adult, young and older, was assessed. The exercises were fine-tuned at the beginning for each individual so researchers could measure improvements, said Dr. G. John Andersen, the project’s senior adviser and a psychology professor at the University of California, Riverside. 
© 2015 The New York Times Company
By KEN BELSON One of the limitations of studying chronic traumatic encephalopathy, or C.T.E., the degenerative brain disease linked to repeated head trauma, has been that researchers have been able to detect it only in tissue obtained posthumously. A study published Monday by Proceedings of the National Academy of Sciences, though, suggests that researchers trying to develop a test that will detect the disease in living patients have taken a small step forward. The study, conducted at U.C.L.A., included 14 retired N.F.L. players who suffered from mood swings, depression and cognitive problems associated with C.T.E. The players were given PET, or positron emission tomography, scans that revealed tau protein deposits in their brains, a signature of C.T.E. Although the results were not conclusive, the distribution of tau in their brains was consistent with those found in the autopsies of players who had C.T.E. The 14 players were compared with 24 patients with Alzheimer’s disease and 28 patients in a control group with no significant cognitive problems. The scans showed that the tau deposits in the 14 players were “distinctly different” from those in the patients with Alzheimer’s disease. “There seems to be an emerging new pattern we haven’t seen in any known forms of dementia, and it is definitely not normal,” said Dr. Julian Bailes, a coauthor of the study and the chairman of neurosurgery at NorthShore Neurological Institute in Evanston, Ill. © 2015 The New York Times Company
by Hal Hodson For a few days last summer, a handful of students walked through a park behind the University of Hannover in Germany. Each walked solo, but followed the same route as the others: made the same turns, walked the same distance. This was odd, because none of them knew where they were going. Instead, their steps were steered from a phone 10 paces behind them, which sent signals via bluetooth to electrodes attached to their legs. These stimulated the students' muscles, guiding their steps without any conscious effort. Max Pfeiffer of the University of Hannover was the driver. His project directs electrical current into the students' sartorius, the longest muscle in the human body, which runs from the inside of the knee to the top of the outer thigh. When it contracts, it pulls the leg out and away from the body. To steer his test subjects left, Pfeiffer would zap their left sartorius, opening their gait and guiding them in that direction. Pfeiffer hopes his system will free people's minds up for other things as they navigate the world, allowing them to focus on their conversation or enjoy their surroundings. Tourists could keep their eyes on the sights while being imperceptibly guided around the city. Acceptance may be the biggest problem, although it is possible that the rise of wearable computing might help. Pfeiffer says the electrode's current causes a tingling sensation that diminishes the more someone uses the system. Volunteers said they were comfortable with the system taking control of their leg muscles, but only if they felt they could take control back. © Copyright Reed Business Information Ltd
Link ID: 20761 - Posted: 04.06.2015
Drawing on the widest survey of sexual behaviour since the Kinsey Report, David Spiegelhalter, in his book Sex By Numbers, answers key questions about our private lives. Here he reveals how Kinsey’s contested claim that 10% of us are gay is actually close to the mark For a single statistic to be the primary propaganda weapon for a radical political movement is unusual. Back in 1977, the US National Gay Task Force (NGTF) was invited into the White House to meet President Jimmy Carter’s representatives – a first for gay and lesbian groups. The NGTF’s most prominent campaigning slogan was “we are everywhere”, backed up by the memorable statistical claim that one in 10 of the US population was gay – this figure was deeply and passionately contested. So where did Bruce Voeller, a scientist who was a founder and first director of the NGTF, get this nice round 10% from? To find out, we have to delve back into Alfred Kinsey’s surveys in 1940s America, which were groundbreaking at the time but are now seen as archaic in their methods: he sought out respondents in prisons and the gay underworld, made friends with them and, over a cigarette, noted down their behaviours using an obscure code. Kinsey did not believe that sexual identity was fixed and simply categorised, and perhaps his most lasting contribution was his scale, still used today, in which individuals are rated from exclusively heterosexual to exclusively homosexual on a scale of 0 to 6. Kinsey’s headline finding was that “at least 37% of the male population has some homosexual experience between the beginning of adolescence and old age”, meaning physical contact to the point of orgasm. © 2015 Guardian News and Media Limited
Keyword: Sexual Behavior
Link ID: 20760 - Posted: 04.06.2015
by Alison George Misguided notions about our sexual appetites are missing the bigger picture and making people unhappy, says Emily Nagoski

Why is there no such thing as a sex drive?
A drive is a motivational system to deal with life-or-death issues, like hunger or being too cold. You're not going to die if you don't have sex.

But biologists might say that if you don't reproduce, that is a form of death.
Yes. That's the argument that was used when desire was being added to the way sexual dysfunctions were diagnosed in the 1970s, to justify the framing of sexual desire as a drive. But when it comes to sex, there just isn't any physical evidence of a drive mechanism.

So what's going on?
If sex is a drive then desire should be spontaneous, like a hunger. When you see a sexy person or have a stray sexy thought, it activates an internal craving or urge for sex. That's called "spontaneous desire". It feels like it comes out of the blue. But there is another way of experiencing desire which is also healthy and normal, called "responsive desire", where your interest only emerges in response to arousal. So, your partner comes over and starts kissing your neck and you're like, "oh, right, sex, that's a good idea".

Do you think an absence of spontaneous desire is normal?
Yes. If our metaphor for desire is hunger, if you are never hungry for food there will be dire consequences and that's clearly a disorder, right? That's a medical problem that needs to be fixed. But not experiencing spontaneous hunger for sex doesn't have dire consequences; it is not a medical disorder. I think the reason we expect everyone to have spontaneous desire is because that's how most men experience it. © Copyright Reed Business Information Ltd
Keyword: Sexual Behavior
Link ID: 20759 - Posted: 04.06.2015
by Bethany Brookshire A new round of dietary do’s and don’ts accompanied last month’s scientific report on the latest food research, summarizing everything from aspartame to saturated fats. The report puts eggs back on the menu. High dietary cholesterol is no longer linked to blood cholesterol in most healthy people. But what grabbed the headlines? Coffee, of course. Many of us are happy to raise a mug to our legal stimulant of choice, especially with the report’s suggestion that three to five cups of joe get a pass. But where do these numbers come from? What science do nutrition experts take into account to determine whether coffee is harmful or safe? And — perhaps the most important question — what does “three to five cups” really mean? The good news for coffee comes from the 2015 Dietary Guidelines Advisory Committee, a group of experts in nutrition and health appointed by the Department of Health and Human Services and the U.S. Department of Agriculture to review the science behind what Americans should eat. The report, released February 19, is not the be-all-end-all of what should be on our plates and in our cups. Instead, it’s a scientific report intended to help the HHS and USDA make policy decisions for the next edition of the Dietary Guidelines for Americans, due out later this year. This is the first time the U.S. Dietary Guidelines have addressed coffee at all. But now, there is enough science on coffee to make a closer look worthwhile, says Tom Brenna, a food scientist at Cornell University and a member of the Committee. “There was so much evidence out there,” he says. “Instead of just five or six papers on the subject, there’s a huge number.” © Society for Science & the Public 2000 - 2015
Keyword: Drug Abuse
Link ID: 20758 - Posted: 04.06.2015