Most Recent Links



Links 61 - 80 of 19654

By Sarah C. P. Williams If you sailed through school with high grades and perfect test scores, you probably did it with traits beyond sheer smarts. A new study of more than 6000 pairs of twins finds that academic achievement is influenced by genes affecting motivation, personality, confidence, and dozens of other traits, in addition to those that shape intelligence. The results may lead to new ways to improve childhood education. “I think this is going to end up being a really classic paper in the literature,” says psychologist Lee Thompson of Case Western Reserve University in Cleveland, Ohio, who has studied the genetics of cognitive skills and who was not involved in the work. “It’s a really firm foundation from which we can build on.” Researchers have previously shown that a person’s IQ is highly influenced by genetic factors, and have even identified certain genes that play a role. They’ve also shown that performance in school has genetic factors. But it’s been unclear whether the same genes that influence IQ also influence grades and test scores. In the new study, researchers at King’s College London turned to a cohort of more than 11,000 pairs of both identical and nonidentical twins born in the United Kingdom between 1994 and 1996. Rather than focus solely on IQ, as many previous studies had, the scientists analyzed 83 different traits, which had been reported on questionnaires that the twins, at age 16, and their parents filled out. The traits ranged from measures of health and overall happiness to ratings of how much each teen liked school and how hard they worked. © 2014 American Association for the Advancement of Science
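The twin logic behind the study can be made concrete with a back-of-the-envelope calculation. The sketch below uses Falconer's classic formula, which estimates heritability from how much more alike identical (MZ) twins are than fraternal (DZ) twins; the correlation values are invented for illustration and are not figures from the King's College London study, which fit more elaborate statistical models.

```python
# Minimal sketch of twin-study variance partitioning (Falconer's formula).
# The correlations below are hypothetical, not values from the study described above.

def falconer_estimates(r_mz, r_dz):
    """Split trait variance into heritability (h2), shared environment (c2),
    and non-shared environment plus error (e2) from twin correlations."""
    h2 = 2 * (r_mz - r_dz)   # MZ twins share ~twice the additive genetic overlap of DZ twins
    c2 = r_mz - h2           # MZ similarity that genes alone do not explain
    e2 = 1 - r_mz            # whatever even identical twins do not share
    return h2, c2, e2

# Hypothetical correlations for an exam-score-like trait
h2, c2, e2 = falconer_estimates(r_mz=0.75, r_dz=0.45)
print(f"heritability ~{h2:.2f}, shared environment ~{c2:.2f}, unique environment ~{e2:.2f}")
```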

Keyword: Genes & Behavior; Aggression
Link ID: 20170 - Posted: 10.07.2014

By LAWRENCE K. ALTMAN A British-American scientist and a pair of Norwegian researchers were awarded this year’s Nobel Prize in Physiology or Medicine on Monday for discovering “an inner GPS in the brain” that enables virtually all creatures to navigate their surroundings. John O’Keefe, 75, a British-American scientist, will share the prize of $1.1 million with May-Britt Moser, 51, and Edvard I. Moser, 52, only the second married couple to win a Nobel in medicine, who will receive the other half. The three scientists’ discoveries “have solved a problem that has occupied philosophers and scientists for centuries — how does the brain create a map of the space surrounding us and how can we navigate our way through a complex environment?” said the Karolinska Institute in Sweden, which chooses the laureates. The positioning system they discovered helps us know where we are, find our way from place to place and store the information for the next time, said Goran K. Hansson, secretary of the Karolinska’s Nobel Committee. The researchers documented that certain cells are responsible for the higher cognitive function that steers the navigational system. Dr. O’Keefe began using neurophysiological methods in the late 1960s to study how the brain controls behavior and sense of direction. In 1971, he discovered the first component of the inner navigational system in rats. He identified nerve cells in the hippocampus region of the brain that were always activated when a rat was at a certain location. © 2014 The New York Times Company

Keyword: Learning & Memory
Link ID: 20169 - Posted: 10.07.2014

By Clare Wilson If you’re facing surgery, this may well be your worst nightmare: waking up while under the knife without medical staff realizing. The biggest-ever study of this phenomenon is shedding light on what such an experience feels like and is causing debate about how best to prevent it. For a one-year period starting in 2012, an anesthetist at every hospital in the United Kingdom and Ireland recorded every case where a patient told a staff member that he had been awake during surgery. Prompted by these reports, the researchers investigated 300 cases, interviewing the patient and doctors involved. One of the most striking findings, says the study’s lead author, Jaideep Pandit of Oxford University Hospitals, was that pain was not generally the worst part of the experience: It was paralysis. For some operations, paralyzing drugs are given to relax muscles and stop reflex movements. “Pain was something they understood, but very few of us have experienced what it’s like to be paralyzed,” Pandit says. “They thought they had been buried alive.” “I thought I was about to die,” says Sandra, who regained consciousness but was unable to move during a dental operation when she was 12 years old. “It felt as though nothing would ever work again — as though the anesthetist had removed everything apart from my soul.”

Keyword: Consciousness
Link ID: 20168 - Posted: 10.07.2014

Aaron E. Carroll For a drug to be approved by the Food and Drug Administration, it must prove itself better than a placebo, or fake drug. This is because of the “placebo effect,” in which patients often improve just because they think they are being treated with something. If we can’t compare a new drug with a placebo, we can’t be sure that the benefit seen from it is anything more than wishful thinking. But when it comes to medical devices and surgery, the requirements aren’t the same. Placebos aren’t required. That is probably a mistake. At the turn of this century, arthroscopic surgery for osteoarthritis of the knee was common. Basically, surgeons would clean out the knee using arthroscopic devices. Another common procedure was lavage, in which a needle would inject saline into the knee to irrigate it. The thought was that these procedures would remove fragments of cartilage and calcium phosphate crystals that were causing inflammation. A number of studies had shown that people who had these procedures improved more than people who did not. However, a growing number of people were concerned that this was really no more than a placebo effect. And in 2002, a study was published that proved it. A total of 180 patients who had osteoarthritis of the knee were randomly assigned (with their consent) to one of three groups. The first had a standard arthroscopic procedure, and the second had lavage. The third, however, had sham surgery. They had an incision, and a procedure was faked so that they didn’t know that they actually had nothing done. Then the incision was closed. The results were stunning. Those who had the actual procedures did no better than those who had the sham surgery. They all improved the same amount. The results were all in people’s heads. © 2014 The New York Times Company
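The force of the sham-surgery design is easiest to see in a toy simulation. The sketch below is hypothetical: the arm sizes roughly match the 180-patient trial described above (the per-arm split is assumed), the improvement scores are invented, and the point is simply that when a procedure works only through the placebo effect, all three arms improve by about the same amount.

```python
# Toy simulation of a three-arm sham-controlled knee-surgery trial.
# Group sizes are assumed (~60 per arm); improvement scores are invented.
import random

random.seed(0)

def simulate_arm(n, mean_improvement, sd=10.0):
    """Return simulated pain-score improvements for one trial arm."""
    return [random.gauss(mean_improvement, sd) for _ in range(n)]

arms = {
    "arthroscopic debridement": simulate_arm(60, mean_improvement=15),
    "lavage":                   simulate_arm(60, mean_improvement=15),
    "sham incision only":       simulate_arm(60, mean_improvement=15),
}

for name, scores in arms.items():
    mean = sum(scores) / len(scores)
    print(f"{name:26s} mean improvement ~{mean:.1f}")
# Roughly equal means across arms is the pattern the 2002 trial reported.
```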

Keyword: Pain & Touch
Link ID: 20167 - Posted: 10.07.2014

By Lisa Sanders, M.D. On Thursday, we challenged Well readers to solve the mystery of a 62-year-old man with severe neck pain that spread down his arm, a facial droop, and numbness on his torso. Nearly 200 of you wrote in, and 20 of you correctly diagnosed the patient. The correct diagnosis is… Lyme disease. And more precisely, the early disseminated form of Lyme disease with neurological involvement The first person with the correct answer was Dr. Arielle Hay, a pediatric rheumatologist in Miami, who nailed it just half an hour after the case was posted. Dr. Hay said that the biggest clue was the UConn letterhead. When combined with the odd neurological symptoms, this reminder of where the case took place brought Lyme disease to mind. Lyme disease is one of those diseases that hardly needs an explanation. It was first described in 1977, in a case series of 51 children and parents who had mysterious episodes of joint pain and swelling. The children were initially diagnosed with juvenile rheumatoid arthritis, but the clustering of cases eventually led the investigators, Dr. Allen Steere and Dr. Stephen Malawista, to consider an infectious disease. The illness was named after the Connecticut town where most of the initial cases were located. The disease is caused by a spirochete, a spiral shaped bacterium carried by the Ixodes tick, and usually presents first with a distinctive, expanding red rash (called erythema migrans) that appears at the site of the bite in the early, localized stage of the disease. It is thought that the rash appears in up to 80 percent of Lyme infections. © 2014 The New York Times Company

Keyword: Pain & Touch; Aggression
Link ID: 20166 - Posted: 10.07.2014

Fiona Fox Last week the UK Home Office published the findings of its investigations into allegations of animal suffering, made after undercover infiltrations at two animal research facilities. You will not find coverage of any of the conclusions in the national news media. Instead any search for media coverage will unearth the original infiltration stories under headlines such as: “Horrific video shows distress of puppies and kittens waiting to be dissected at animal testing lab”; “Graphic content: horrifying video shows puppies and kittens tested at UK laboratory”; and “Rats beheaded with scissors and kept in ‘pitiful state’.” These “shocking exposés”, brought to the newspapers by the animal rights group BUAV, include distressing images, links to videos that are difficult to watch, and quote allegedly secretly recorded researchers saying terrible things about the animals in their care. The newspapers seem in no doubt that the allegations they are carrying add up to “appalling suffering on a very large scale”, and appear to be proud of their role in bringing the abuses to light: “The Sunday Express today publishes details of an undercover investigation … that shines a light on the secret world of vivisection laboratories.” You may well see these articles as reassuring evidence that we still have public interest journalism in the UK. These animal rights supporters have done exactly what investigative journalists used to do in a time when newspapers had enough money to shine a light on the darker corners of our institutions and uncover hidden abuses. And you would be right, but for one thing: we now know that the stories were largely untrue. © 2014 Guardian News and Media Limited

Keyword: Animal Rights
Link ID: 20165 - Posted: 10.07.2014

By Tori Rodriguez The safety of football continues to be a heated topic for players and parents, with mixed evidence regarding the effect of head injuries on mental illness. Past studies on the connection have often been methodologically flawed or yielded ambiguous results. Now a paper published in April in the American Journal of Psychiatry, the largest study yet to investigate the link, finds that even a single head injury indeed increases the risk of later mental illness, especially if the injury occurs during adolescence. Using Danish medical registries, researchers led by physician Sonja Orlovska of the University of Copenhagen studied 113,906 people who had been hospitalized for head injuries over a 23-year period. They discovered that in addition to cognitive symptoms caused by structural damage to the brain (such as delirium), these people were subsequently more likely than the general population to develop several psychiatric illnesses. Risk increased by 65 percent for schizophrenia and 59 percent for depression. Risk was highest in the first year postinjury but remained significantly elevated throughout the next 15 years. After the team controlled for several potential confounders, such as accident proneness and a family history of psychiatric problems, they found the strongest injury-related predictor for later onset of schizophrenia, depression and bipolar disorder was a head trauma experienced between the ages of 11 and 15. “Previous studies have shown that head injury induces inflammation in the brain, which causes several changes—for example, an increased permeability of the blood-brain barrier,” Orlovska says. Normally the barrier protects the brain from potentially harmful contents in the bloodstream, but injury-induced inflammation may allow these substances access to the brain. “For some individuals, this might initiate damaging processes in the brain,” she says. © 2014 Scientific American
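For readers who want to see what the percentages mean in practice, the sketch below converts a relative increase into an absolute one: a "65 percent increased risk" corresponds to a risk ratio of 1.65 over the comparison population. The baseline rates used here are invented for illustration and are not figures from the Danish registry study.

```python
# What "risk increased by 65 percent" means, using made-up baseline risks.
baseline_risk = {"schizophrenia": 0.7, "depression": 15.0}   # hypothetical lifetime risk, percent
risk_ratio    = {"schizophrenia": 1.65, "depression": 1.59}  # relative increases reported above

for condition, base in baseline_risk.items():
    elevated = base * risk_ratio[condition]
    print(f"{condition:13s}: {base:5.2f}% -> {elevated:5.2f}% after head injury (illustrative)")
# A large relative increase on a rare outcome can still be a modest absolute change.
```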

Keyword: Brain Injury/Concussion; Aggression
Link ID: 20164 - Posted: 10.06.2014

By Gretchen Vogel Research on how the brain knows where it is has bagged the 2014 Nobel Prize in Physiology or Medicine, the Nobel Committee has announced from Stockholm. One half of the prize goes to John O'Keefe, director of the Sainsbury Wellcome Centre for Neural Circuits and Behaviour at University College London. The other half goes to a husband-and-wife team: May-Britt Moser, who is director of the Centre for Neural Computation in Trondheim, and Edvard Moser, director of the Kavli Institute for Systems Neuroscience in Trondheim. "In 1971, John O'Keefe discovered the first component of this positioning system," the Nobel Committee says in a statement that was just released. "He found that a type of nerve cell in an area of the brain called the hippocampus was always activated when a rat was at a certain place in a room. Other nerve cells were activated when the rat was at other places. O'Keefe concluded that these “place cells” formed a map of the room." "More than three decades later, in 2005, May-Britt and Edvard Moser discovered another key component of the brain’s positioning system," the statement goes on to explain. "They identified another type of nerve cell, which they called “grid cells”, that generate a coordinate system and allow for precise positioning and pathfinding. Their subsequent research showed how place and grid cells make it possible to determine position and to navigate." © 2014 American Association for the Advancement of Science
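The "coordinate system" that grid cells provide is often pictured with a textbook idealization: a firing map built from three cosine gratings oriented 60 degrees apart, which sums to a hexagonal lattice of firing fields. The sketch below is only that generic model; the spacing, phase, and arena size are arbitrary, and it is not the laureates' analysis code.

```python
# Idealized grid-cell firing map: three plane waves 60 degrees apart sum to a
# hexagonal pattern of firing fields. Spacing, phase, and arena size are
# arbitrary illustration values, not parameters from the prize-winning work.
import numpy as np

def grid_cell_rate(x, y, spacing=0.5, phase=(0.0, 0.0)):
    """Normalized firing rate of an idealized grid cell at position (x, y), in meters."""
    k = 4 * np.pi / (np.sqrt(3) * spacing)   # wave number giving the chosen field spacing
    angles = np.deg2rad([0, 60, 120])
    rate = sum(np.cos(k * ((x - phase[0]) * np.cos(a) + (y - phase[1]) * np.sin(a)))
               for a in angles)
    return (rate + 1.5) / 4.5                # rescale the [-1.5, 3] sum to [0, 1]

# Sample the map over a 1 m x 1 m arena; the peaks form a hexagonal grid
xs, ys = np.meshgrid(np.linspace(0, 1, 100), np.linspace(0, 1, 100))
rate_map = grid_cell_rate(xs, ys)
print(rate_map.shape, round(float(rate_map.min()), 2), round(float(rate_map.max()), 2))
```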

Keyword: Learning & Memory
Link ID: 20163 - Posted: 10.06.2014

Alison Abbott The fact that Edvard and May-Britt Moser have collaborated for 30 years — and been married for 28 — has done nothing to dull their passion for the brain. They talk about it at breakfast. They discuss its finer points at their morning lab meeting. And at a local restaurant on a recent summer evening, they are still deep into a back-and-forth about how their own brains know where they are and will guide them home. “Just to walk there, we have to understand where we are now, where we want to go, when to turn and when to stop,” says May-Britt. “It's incredible that we are not permanently lost.” If anyone knows how we navigate home, it is the Mosers. They shot to fame in 2005 with their discovery of grid cells deep in the brains of rats. These intriguing cells, which are also present in humans, work much like the Global Positioning System, allowing animals to understand their location. The Mosers have since carved out a niche studying how grid cells interact with other specialized neurons to form what may be a complete navigation system that tells animals where they are going and where they have been. Studies of grid cells could help to explain how memories are formed, and why recalling events so often involves re-envisioning a place, such as a room, street or landscape. While pursuing their studies, the two scientists have become a phenomenon. Tall and good-looking, they operate like a single brain in two athletic bodies in their generously funded lab in Trondheim, Norway — a remote corner of northern Europe just 350 kilometres south of the Arctic Circle. They publish together and receive prizes as a single unit — most recently, the Nobel Prize in Physiology or Medicine, which they won this week with their former supervisor, neuroscientist John O’Keefe at University College London. In 2007, while still only in their mid-40s, they won a competition by the Kavli Foundation of Oxnard, California, to build and direct one of only 17 Kavli Institutes around the world. The Mosers are now minor celebrities in their home country, and their institute has become a magnet for other big thinkers in neuroscience. “It is definitely intellectually stimulating to be around them,” says neurobiologist Nachum Ulanovsky from the Weizmann Institute of Science in Rehovot, Israel, who visited the Trondheim institute for the first time in September. © 2014 Nature Publishing Group

Keyword: Learning & Memory
Link ID: 20162 - Posted: 10.06.2014

By ALINA TUGEND MANY workers now feel as if they’re doing the job of three people. They are on call 24 hours a day. They rush their children from tests to tournaments to tutoring. The stress is draining, both mentally and physically. At least that is the standard story about stress. It turns out, though, that many of the common beliefs about stress don’t necessarily give the complete picture. MISCONCEPTION NO. 1 Stress is usually caused by having too much work. While being overworked can be overwhelming, research increasingly shows that being underworked can be just as challenging. In essence, boredom is stressful. “We tend to think of stress in the original engineering way, that too much pressure or too much weight on a bridge causes it to collapse,” said Paul E. Spector, a professor of psychology at the University of South Florida. “It’s more complicated than that.” Professor Spector and others say too little to do — or underload, as he calls it — can cause many of the physical discomforts we associate with being overloaded, like muscle tension, stomachaches and headaches. A study published this year in the journal Experimental Brain Research found that measurements of people’s heart rates, hormonal levels and other factors while watching a boring movie — men hanging laundry — showed greater signs of stress than those of people watching a sad movie. “We tend to think of boredom as someone lazy, as a couch potato,” said James Danckert, a professor of neuroscience at the University of Waterloo in Ontario, Canada, and a co-author of the paper. “It’s actually when someone is motivated to engage with their environment and all attempts to do so fail. It’s aggressively dissatisfying.” © 2014 The New York Times Company

Keyword: Stress; Aggression
Link ID: 20161 - Posted: 10.04.2014

by Michael Marshall When we search for the seat of humanity, are we looking at the wrong part of the brain? Most neuroscientists assume that the neocortex, the brain's distinctive folded outer layer, is the thing that makes us uniquely human. But a new study suggests that another part of the brain, the cerebellum, grew much faster in our ape ancestors. "Contrary to traditional wisdom, in the human lineage the cerebellum was the part of the brain that accelerated its expansion most rapidly, rather than the neocortex," says Rob Barton of Durham University in the UK. With Chris Venditti of the University of Reading in the UK, Barton examined how the relative sizes of different parts of the brain changed as primates evolved. During the evolution of monkeys, the neocortex and cerebellum grew in tandem, a change in one being swiftly followed by a change in the other. But starting with the first apes around 25 million years ago through to chimpanzees and humans, the cerebellum grew much faster. As a result, the cerebellums of apes and humans contain far more neurons than the cerebellum of a monkey, even if that monkey were scaled up to the size of an ape. "The difference in ape cerebellar volume, relative to a scaled monkey brain, is equal to 16 billion extra neurons," says Barton. "That's the number of neurons in the entire human neocortex." © Copyright Reed Business Information Ltd.

Keyword: Evolution; Aggression
Link ID: 20160 - Posted: 10.04.2014

By Daisy Yuhas Do we live in a holographic universe? How green is your coffee? And could drinking too much water actually kill you? Before you click those links you might consider how your knowledge-hungry brain is preparing for the answers. A new study from the University of California, Davis, suggests that when our curiosity is piqued, changes in the brain ready us to learn not only about the subject at hand, but incidental information, too. Neuroscientist Charan Ranganath and his fellow researchers asked 19 participants to review more than 100 questions, rating each in terms of how curious they were about the answer. Next, each subject revisited 112 of the questions—half of which strongly intrigued them whereas the rest they found uninteresting—while the researchers scanned their brain activity using functional magnetic resonance imaging (fMRI). During the scanning session participants would view a question then wait 14 seconds and view a photograph of a face totally unrelated to the trivia before seeing the answer. Afterward the researchers tested participants to see how well they could recall and retain both the trivia answers and the faces they had seen. Ranganath and his colleagues discovered that greater interest in a question would predict not only better memory for the answer but also for the unrelated face that had preceded it. A follow-up test one day later found the same results—people could better remember a face if it had been preceded by an intriguing question. Somehow curiosity could prepare the brain for learning and long-term memory more broadly. The findings are somewhat reminiscent of the work of U.C. Irvine neuroscientist James McGaugh, who has found that emotional arousal can bolster certain memories. But, as the researchers reveal in the October 2 Neuron, curiosity involves very different pathways. © 2014 Scientific American

Keyword: Learning & Memory; Aggression
Link ID: 20159 - Posted: 10.04.2014

By Kevin Hartnett You may have seen that deliberately annoying “View of the World from Ninth Avenue” map featured on the cover of the New Yorker a while back. It shows the distorted way geography appears to a Manhattanite: 9th and 10th avenues are the center of the world, New Jersey appears, barely, and everywhere else is just a blip if it registers at all. As it turns out, a similar kind of map exists for the human body — with at least some basis in neuroscience. In August I wrote a story for Ideas on the rise of face transplants and spoke to Michael Sims, author of the book, “Adam’s Navel: A Natural and Cultural History of the Human Form.” During our conversation Sims mentioned an odd diagram published in 1951 by a neurosurgeon named Wilder Penfield. The diagram is known as “Homunculus” (a name taken from a weird and longstanding art form that depicts small human beings); it shows the human body scaled according to the amount of brain tissue dedicated to each part, and arranged according to the locations in the brain that control them. In the diagram, the eyes, lips, nose, and tongue appear grotesquely large, indicating that we devote an outsized amount of brain tissue to operating and receiving sensation from these parts of the body. (Sims’s point was that we devote a lot of processing power to the face, and for that reason find it biologically disorienting that faces could be changeable.) The hand is quite large, too, while the toes, legs, trunk, shoulders, and arms are tiny, the equivalents of Kansas City and Russia on the New Yorker map. “Homunculus” seems like the kind of thing that would have long since been superseded by modern brain science, but it actually continues to have a surprising amount of authority, and often appears in neuroscience textbooks.

Keyword: Pain & Touch
Link ID: 20158 - Posted: 10.04.2014

By John Bohannon The victim peers across the courtroom, points at a man sitting next to a defense lawyer, and confidently says, "That's him!" Such moments have a powerful sway on jurors who decide the fate of thousands of people every day in criminal cases. But how reliable is eyewitness testimony? A new report concludes that the use of eyewitness accounts needs tighter control, and among its recommendations is a call for a more scientific approach to how eyewitnesses identify suspects during the classic police lineup. For decades, researchers have been trying to nail down what influences eyewitness testimony and how much confidence to place in it. After a year of sifting through the scientific evidence, a committee of psychologists and criminologists organized by the U.S. National Research Council (NRC) has now gingerly weighed in. "This is a serious issue with major implications for our justice system," says committee member Elizabeth Phelps, a psychologist at New York University in New York City. Their 2 October report, Identifying the Culprit: Assessing Eyewitness Identification, is likely to change the way that criminal cases are prosecuted, says Elizabeth Loftus, a psychologist at the University of California, Irvine, who was an external reviewer of the report. As Loftus puts it, "just because someone says something confidently doesn't mean it's true." Jurors can't help but find an eyewitness’s confidence compelling, even though experiments have shown that a person's confidence in their own memory is sometimes undiminished even in the face of evidence that their memory of an event is false. © 2014 American Association for the Advancement of Science.

Keyword: Learning & Memory
Link ID: 20157 - Posted: 10.04.2014

Helen Thomson You'll have heard of Pavlov's dogs, conditioned to expect food at the sound of a bell. You might not have heard that a scarier experiment – arguably one of psychology's most unethical – was once performed on a baby. In it, a 9-month-old, at first unfazed by the presence of animals, was conditioned to feel fear at the sight of a rat. The infant was presented with the animal as someone struck a metal pole with a hammer above his head. This was repeated until he cried at merely the sight of any furry object – animate or inanimate. The "Little Albert" experiment, performed in 1919 by John Watson of Johns Hopkins University Hospital in Baltimore, Maryland, was the first to show that a human could be classically conditioned. The fate of Albert B has intrigued researchers ever since. Hall Beck at the Appalachian State University in Boone, North Carolina, has been one of the most tenacious researchers on the case. Watson's papers stated that Albert B was the son of a wet nurse who worked at the hospital. Beck spent seven years exploring potential candidates and used facial analysis to conclude in 2009 that Little Albert was Douglas Merritte, son of hospital employee Arvilla. Douglas was born on the same day as Albert and several other points tallied with Watson's notes. Tragically, medical records showed that Douglas had severe neurological problems and died at an early age of hydrocephalus, or water on the brain. According to his records, this seems to have resulted in vision problems, so much so that at times he was considered blind. © Copyright Reed Business Information Ltd.
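The conditioning Watson demonstrated is usually formalized today with simple associative-learning rules. The sketch below uses the Rescorla-Wagner update, a model published decades after the Little Albert experiment and offered here only as a generic illustration of how repeated pairings of a neutral stimulus with an aversive event build up a fear association; the learning rate and trial count are arbitrary.

```python
# Rescorla-Wagner sketch of classical conditioning: a neutral stimulus (the rat)
# paired with an aversive event (the struck metal bar) gains associative strength.
# alpha (learning rate) and the number of pairings are arbitrary illustration values.

def condition(trials, alpha=0.3, lam=1.0):
    """Return the associative strength V after each CS-US pairing."""
    v, history = 0.0, []
    for _ in range(trials):
        v += alpha * (lam - v)   # prediction-error update toward the asymptote lam
        history.append(v)
    return history

for pairing, strength in enumerate(condition(7), start=1):
    print(f"pairing {pairing}: fear association ~{strength:.2f}")
```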

Keyword: Emotions; Aggression
Link ID: 20156 - Posted: 10.04.2014

By Fredrick Kunkle Years ago, many scientists assumed that a woman’s heart worked pretty much the same as a man’s. But as more women entered the male-dominated field of cardiology, many such assumptions vanished, opening the way for new approaches to research and treatment. A similar shift is underway in the study of Alzheimer’s disease. It has long been known that more women than men get the deadly neurodegenerative disease, and an emerging body of research is challenging the common wisdom as to why. Although the question is by no means settled, recent findings suggest that biological, genetic and even cultural influences may play heavy roles. Of the more than 5 million people in the United States who have been diagnosed with Alzheimer’s, the leading cause of dementia, two-thirds are women. Because advancing age is considered the biggest risk factor for the disease, researchers largely have attributed that disparity to women’s longer life spans. The average life expectancy for women is 81 years, compared with 76 for men. Yet “even after taking age into account, women are more at risk,” said Richard Lipton, a physician who heads the Einstein Aging Study at Albert Einstein College of Medicine in New York. With the number of Alzheimer’s cases in the United States expected to more than triple by 2050, some researchers are urging a greater focus on understanding the underlying reasons women are more prone to the disease and on developing gender-specific treatments.

Keyword: Alzheimers; Aggression
Link ID: 20155 - Posted: 10.04.2014

Carl Zimmer As much as we may try to deny it, Earth’s cycle of day and night rules our lives. When the sun sets, the encroaching darkness sets off a chain of molecular events spreading from our eyes to our pineal gland, which oozes a hormone called melatonin into the brain. When the melatonin latches onto neurons, it alters their electrical rhythm, nudging the brain into the realm of sleep. At dawn, sunlight snuffs out the melatonin, forcing the brain back to its wakeful pattern again. We fight these cycles each time we stay up late reading our smartphones, suppressing our nightly dose of melatonin and waking up grumpy the next day. We fly across continents as if we could instantly reset our inner clocks. But our melatonin-driven sleep cycle lags behind, leaving us drowsy in the middle of the day. Scientists have long wondered how this powerful cycle got its start. A new study on melatonin hints that it evolved some 700 million years ago. The authors of the study propose that our nightly slumbers evolved from the rise and fall of our tiny oceangoing ancestors, as they swam up to the surface of the sea at twilight and then sank in a sleepy fall through the night. To explore the evolution of sleep, scientists at the European Molecular Biology Laboratory in Germany study the activity of genes involved in making melatonin and other sleep-related molecules. Over the past few years, they’ve compared the activity of these genes in vertebrates like us with their activity in a distantly related invertebrate — a marine worm called Platynereis dumerilii. The scientists studied the worms at an early stage, when they were ball-shaped 2-day-old larvae. The ocean swarms with juvenile animals like these. Many of them spend their nights near the ocean surface, feeding on algae and other bits of food. Then they spend the day at lower depths, where they can hide from predators and the sun’s ultraviolet rays. © 2014 The New York Times Company

Keyword: Sleep; Aggression
Link ID: 20154 - Posted: 10.02.2014

By Bethany Brookshire In this sweet, sweet world we live in, losing weight can be a dull and flavorless experience. Lovely stove-popped popcorn drenched in butter gives way to dry microwaved half-burnt kernels covered in dusty yellow powder. The cookies and candy that help us get through the long afternoons are replaced with virtuous but boring apples and nuts. Even the sugar that livens up our coffee gets a skeptical eye: That’s an extra 23 calories per packet you shouldn’t be eating. What makes life sweet for those of us who are counting calories is artificial sweeteners. Diet soda gives a sweet carbonated fix. A packet of artificial sweetener in your coffee or tea makes it a delicious morning dose. But a new study, published September 17 in Nature, found that the artificial sweetener saccharin has an unintended side effect: It alters the bacterial composition of the gut in mice and humans. The new bacterial neighborhood brings with it higher blood glucose levels, putting the humans and their murine counterparts at risk for diabetes. Many people wondered if the study’s effects were real. We all knew that sugar was bad, but now the scientists are coming for our Splenda! It seems more than a little unfair. But this study was a long time coming. The scientific community has been studying artificial sweeteners and their potential hazards for a long time. And while the new study adds to the literature, there are other studies, currently ongoing and planned for the future, that will determine the extent and necessity of our artificially sweetened future. © Society for Science & the Public 2000 - 2014.

Keyword: Obesity; Aggression
Link ID: 20153 - Posted: 10.02.2014

James Hamblin Mental exercises to build (or rebuild) attention span have shown promise recently as adjuncts or alternatives to amphetamines in addressing symptoms common to Attention Deficit Hyperactivity Disorder (ADHD). Building cognitive control, to be better able to focus on just one thing, or single-task, might involve regular practice with a specialized video game that reinforces "top-down" cognitive modulation, as was the case in a popular paper in Nature last year. Cool but still notional. More insipid but also more clearly critical to addressing what's being called the ADHD epidemic is plain old physical activity. This morning the medical journal Pediatrics published research that found kids who took part in a regular physical activity program showed important enhancement of cognitive performance and brain function. The findings, according to University of Illinois professor Charles Hillman and colleagues, "demonstrate a causal effect of a physical program on executive control, and provide support for physical activity for improving childhood cognition and brain health." If it seems odd that this is something that still needs support, that's because it is odd, yes. Physical activity is clearly a high, high-yield investment for all kids, but especially those attentive or hyperactive. This brand of research is still published and written about as though it were a novel finding, in part because exercise programs for kids remain underfunded and underprioritized in many school curricula, even though exercise is clearly integral to maximizing the utility of time spent in class. The improvements in this case came in executive control, which consists of inhibition (resisting distraction, maintaining focus), working memory, and cognitive flexibility (switching between tasks). The images above show the brain activity in the group of kids who did the program as opposed to the group that didn't. It's the kind of difference that's so dramatic it's a little unsettling. The study only lasted nine months, but when you're only seven years old, nine months is a long time to be sitting in class with a blue head. © 2014 by The Atlantic Monthly Group.

Keyword: ADHD
Link ID: 20152 - Posted: 10.02.2014

By Nathan Collins Step aside, huge magnets and radioactive tracers—soon some brain activity will be revealed by simply training dozens of red lights on the scalp. A new study in Nature Photonics finds this optical technique can replicate functional MRI experiments, and it is more comfortable, more portable and less expensive. The method is an enhancement of diffuse optical tomography (DOT), in which a device shines tiny points of red light at a subject's scalp and analyzes the light that bounces back. The red light reflects off red hemoglobin in the blood but does not interact as much with tissues of other colors, which allows researchers to recover an fMRI-like image of changing blood flow in the brain at work. For years researchers attempting to use DOT have been limited by the difficulty of packing many heavy light sources and detectors into the small area around the head. They also needed better techniques for analyzing the flood of data that the detectors collected. Now researchers at Washington University in St. Louis and the University of Birmingham in England report they have solved those problems and made the first high-density DOT (HD-DOT) brain scans. The team first engineered a “double halo” structure to support the weight of 96 lights and 92 detectors, more than double the number in earlier arrays. The investigators also dealt with the computing challenges associated with that many lights—for example, they figured out how to filter out interference from blood flow in the scalp and other tissues. The team then used HD-DOT to successfully replicate fMRI studies of vision and language processing—a task impossible for other fMRI alternatives, such as functional near-infrared spectroscopy or electroencephalography, which do not cover a large enough swath of the brain or have sufficient resolution to pinpoint active brain areas. Finally, the team scanned the brains of people who have implanted electrodes for Parkinson's disease—something fMRI can never do because the machine generates electromagnetic waves that can destroy electronic devices such as pacemakers. © 2014 Scientific American
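The physics that lets reflected red light stand in for an fMRI signal is usually expressed with the modified Beer-Lambert law: changes in light attenuation at two wavelengths are solved for changes in oxy- and deoxyhemoglobin concentration. The sketch below shows only that generic single-channel relationship, with placeholder extinction coefficients and geometry; it is not the Washington University group's HD-DOT pipeline, which additionally solves a tomographic inverse problem across all 96 sources and 92 detectors.

```python
# Modified Beer-Lambert sketch for one source-detector pair: convert attenuation
# changes at two wavelengths into hemoglobin concentration changes.
# Extinction coefficients, separation, and pathlength factor are assumed placeholders.
import numpy as np

# Rows: two wavelengths (e.g. ~750 nm and ~850 nm); columns: [HbO2, Hb] extinction (arbitrary units)
extinction = np.array([[0.6, 1.5],
                       [1.1, 0.8]])
separation = 3.0   # source-detector distance, cm (assumed)
dpf = 6.0          # differential pathlength factor (assumed)

def hemoglobin_changes(delta_od):
    """Solve delta_OD = (extinction * separation * dpf) @ [dHbO2, dHb] for concentration changes."""
    return np.linalg.solve(extinction * separation * dpf, delta_od)

# Hypothetical attenuation changes recorded during a task
d_hbo2, d_hb = hemoglobin_changes(np.array([0.010, 0.015]))
print(f"delta HbO2 ~{d_hbo2:.5f}, delta Hb ~{d_hb:.5f} (arbitrary concentration units)")
```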

Keyword: Brain imaging
Link ID: 20151 - Posted: 10.02.2014