Most Recent Links
Dean Burnett Yesterday, an article in the Entrepreneurs section of the Guardian purported to reveal a “cloth cap that could help treat depression”. This claim has caused some alarm in the neuroscience and mental health fields, so it’s important to look a little more closely at what the manufacturers are actually claiming. The piece in question concerns a product from Neuroelectrics: a soft helmet containing electrodes and sensors. According to the company’s website, it can be used to monitor brain activity (electroencephalography, or EEG), or administer light electrical currents to different areas of the brain in order to treat certain neurological and psychiatric conditions (known as transcranial direct current stimulation or tDCS). While this would obviously be great news to the millions of people who deal with such conditions every day, such claims should be treated with a considerable amount of caution. The fields of science dedicated to researching and, hopefully, treating serious brain-based problems like depression, stroke, personality disorder etc. work hard to find new and inventive methods for doing so, or refining and improving existing ones. Sometimes they succeed, but probably not as often as they’d like. The problem is that when a new development occurs or a new approach is found, it doesn’t automatically mean it’s widely applicable or even effective for everyone. The brain is furiously complicated. There is no magic bullet for brain problems [Note: you shouldn’t use bullets, magic or otherwise, when dealing with the brain]. © 2015 Guardian News and Media Limited
Link ID: 21305 - Posted: 08.18.2015
By NICHOLAS BAKALAR “Insanity Treated By Electric Shock” read the headline of an article published on July 6, 1940, in The New York Times. The article described “a new method, introduced in Italy, of treating certain types of mental disorders by sending an electric shock through the brain.” It was the first time that what is now called electroconvulsive therapy, or ECT, had been mentioned in The Times. The electric shock, the article said, “is produced by a small portable electric box which was invented in Italy by Professor Ugo Cerletti of the Rome University Clinic.” Dr. S. Eugene Barrera, the principal researcher on the project, “emphasized that hope for any ‘miracle cure’ must not be pinned on the new method.” On April 29, 1941, the subject came up again, this time in an article about a scientific meeting at which a professor of psychiatry at Northwestern reported “ ‘very promising instantaneous results’ in the recently developed electric shock method of relieving schizophrenic patients of their malady.” The treatment entered clinical practice fairly quickly. In October 1941, The Times reported on the opening of several new buildings at Hillside Hospital in Queens (today called Zucker Hillside Hospital). “The hospital has pioneered in the use of insulin and metrazol, and also in the electric shock treatment, which has proved useful in shortening the average stay of patients,” the article read. Over the years, ECT has had its ups and downs in the public imagination and in the pages of The Times. In an article on Nov. 25, 1980, the reporter Dava Sobel seemed to relegate it to another age. © 2015 The New York Times Company
Link ID: 21304 - Posted: 08.18.2015
By CLAIRE MARTIN The eyeglass lenses that Don McPherson invented were meant for surgeons. But through serendipity he found an entirely different use for them: as a possible treatment for colorblindness. Mr. McPherson is a glass scientist and an avid Ultimate Frisbee player. He discovered that the lenses he had invented, which protect surgeons’ eyes from lasers and help them differentiate human tissue, caused the world at large to look candy-colored — including the Frisbee field. At a tournament in Santa Cruz, Calif., in 2002, while standing on a grassy field dotted with orange goal-line cones, he lent a pair of glasses with the lenses to a friend who happened to be colorblind. “He said something to the effect of, ‘Dude, these are amazing,’ ” Mr. McPherson says. “He’s like, ‘I see orange cones. I’ve never seen them before.’ ” Mr. McPherson was intrigued. He said he did not know the first thing about colorblindness, but felt compelled to figure out why the lenses were having this effect. Mr. McPherson had been inserting the lenses into glasses that he bought at stores, then selling them through Bay Glass Research, his company at the time. Mr. McPherson went on to study colorblindness, fine-tune the lens technology and start a company called EnChroma that now sells glasses for people who are colorblind. His is among a range of companies that have brought inadvertent or accidental inventions to market. Such inventions have included products as varied as Play-Doh, which started as a wallpaper cleaner, and the pacemaker, discovered through a study of hypothermia. To learn more about color vision and the feasibility of creating filters to correct colorblindness, Mr. McPherson applied for a grant from the National Institutes of Health in 2005. He worked with vision scientists and a mathematician and computer scientist named Andrew Schmeder. 
They weren’t the first to venture into this industry; the history of glassmakers claiming to improve colorblindness is long and riddled with controversy. © 2015 The New York Times Company
Link ID: 21303 - Posted: 08.17.2015
A new clinical trial is set to begin in the United Kingdom using the powerful noses of dogs to detect prostate cancer in humans. While research has been done before, these are the first trials approved by Britain's National Health Service. The trials, at the Milton Keynes University Hospital in Buckinghamshire, will use animals from a nonprofit organization called Medical Detection Dogs, co-founded in 2008 by behavioral psychologist Claire Guest. "What we've now discovered is that lots of diseases and conditions — and cancer included — that they actually have different volatile organic compounds, these smelly compounds, that are associated with them," Guest tells NPR's Rachel Martin. "And dogs can smell them." The dogs offer an inexpensive, non-invasive method to accompany the existing blood tests for prostate cancer, which detect prostate-specific antigen, or PSA, Guest says. "It's a low false-negative but a very high false-positive, meaning that three out of four men that have a raised PSA haven't got cancer," she explains. "So the physician has a very difficult decision to make: Which of the four men does he biopsy? What we want to do is provide an additional test — not a test that stands alone but an additional test that runs alongside the current testing, which a physician can use as part of that patient's picture." "The samples come to the dogs — the dogs never go to the patient," she adds. "At the moment, our dogs would be screening between a 0.5- and 1-ml drop of urine [about 1/10 to 1/5 of a teaspoon], so a very small amount. In the early days, of course, we know whether the samples have come from a patient with cancer or if the patient has another disease or condition, or is in fact healthy." © 2015 NPR
Keyword: Chemical Senses (Smell & Taste)
Link ID: 21302 - Posted: 08.17.2015
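Guest's "three out of four" figure is a statement about the PSA test's positive predictive value, and the arithmetic is worth making explicit. A minimal sketch, assuming an illustrative cohort of 100 raised-PSA men (only the 3-in-4 figure comes from the article; the counts are hypothetical):

```python
# Sketch: what "three out of four men with a raised PSA haven't got cancer"
# means as a positive predictive value (PPV).
# The 3-in-4 figure is from the article; the cohort size is illustrative.

def ppv(true_positives: int, false_positives: int) -> float:
    """Fraction of positive tests that reflect real disease."""
    return true_positives / (true_positives + false_positives)

# Illustrative cohort: of 100 men with a raised PSA, 25 have cancer.
with_cancer = 25
without_cancer = 75

print(ppv(with_cancer, without_cancer))  # 0.25: only 1 raised-PSA man in 4 has cancer
```

This is why a raised PSA alone forces the "which of the four men does he biopsy?" dilemma: the test flags far more healthy men than sick ones, and any additional independent signal (such as the dogs) raises the predictive value of the combined result.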
John von Radowitz, Press Association Psychologists have confirmed that playing violent video games is linked to aggressive and callous behaviour. A review of almost a decade of studies found that exposure to violent video games was a "risk factor" for increased aggression. But the same team of experts said there was insufficient evidence to conclude that the influence of games such as Call of Duty and Grand Theft Auto led to criminal acts. The findings have prompted a call for more parental control over violent scenes in video games from the American Psychological Association (APA). (The original version of Doom, released in 1993, was widely controversial for its unprecedented levels of graphic violence.) A report from the APA task force on violent media concludes: "The research demonstrates a consistent relation between violent video game use and increases in aggressive behaviour, aggressive cognitions and aggressive affect, and decreases in pro-social behaviour, empathy and sensitivity to aggression." The report said no single influence led a person to act aggressively or violently. Rather, it was an "accumulation of risk factors" that resulted in such behaviour. It added: "The research reviewed here demonstrates that violent video game use is one such risk factor." The APA has urged game creators to increase levels of parental control over the amount of violence video games contain.
Link ID: 21301 - Posted: 08.17.2015
Geoff Brumfiel Learning to make sounds by listening to others is a skill that helps make us human. But research now suggests a species of monkey may have evolved similar abilities. Marmosets have the capacity to learn calls from their parents, according to research published Thursday in the journal Science. The results mean that studying marmosets might provide insights into developmental disorders found in humans. It also suggests that vocal learning may be more widespread than many researchers thought. Many animals can link sounds with meaning. Dogs respond to simple calls; chimpanzees can even communicate with people using sign language. But the ability to hear a sound and mimic it is possessed by only a small number of species: primarily song birds and humans. "We didn't think that mammals and primates in particular — besides us — had any type of vocal learning," says Asif Ghazanfar, a neuroscientist at Princeton University who led the new study. Enter the small, adorable common marmoset. These fuzzy South American primates look more like squirrels than monkeys. "They're cute, and they smell. They wash themselves in their own urine," Ghazanfar says. "I'm not sure why they do that." But once you get over the stink, these little guys are interesting. Marmoset mothers always give birth to twins and need help rearing them. So, unlike many mammal species, fathers lend a hand, along with siblings and other community members. Ghazanfar thinks all that child care is what gives marmosets another special trait: They're super talkative. "They're chattering nonstop," he says. "That is also very different from our close relatives the chimpanzees." © 2015 NPR
Carl Zimmer You are what you eat, and so were your ancient ancestors. But figuring out what they actually dined on has been no easy task. There are no Pleistocene cookbooks to consult. Instead, scientists must sift through an assortment of clues, from the chemical traces in fossilized bones to the scratch marks on prehistoric digging sticks. Scientists have long recognized that the diets of our ancestors went through a profound shift with the addition of meat. But in the September issue of The Quarterly Review of Biology, researchers argue that another item added to the menu was just as important: carbohydrates, bane of today’s paleo diet enthusiasts. In fact, the scientists propose, by incorporating cooked starches into their diet, our ancestors were able to fuel the evolution of our oversize brains. Roughly seven million years ago, our ancestors split off from the apes. As far as scientists can tell, those so-called hominins ate a diet that included a lot of raw, fiber-rich plants. After several million years, hominins started eating meat. The oldest clues to this shift are 3.3-million-year-old stone tools and 3.4-million-year-old mammal bones scarred with cut marks. The evidence suggests that hominins began by scavenging meat and marrow from dead animals. At some point hominins began to cook meat, but exactly when they invented fire is a question that inspires a lot of debate. Humans were definitely making fires by 300,000 years ago, but some researchers claim to have found campfires dating back as far as 1.8 million years. Cooked meat provided increased protein, fat and energy, helping hominins grow and thrive. But Mark G. Thomas, an evolutionary geneticist at University College London, and his colleagues argue that there was another important food sizzling on the ancient hearth: tubers and other starchy plants. © 2015 The New York Times Company
When the owl swooped, the “blind” mice ran away. This was thanks to a new type of gene therapy to reprogramme cells deep in the eye to sense light. After treatment, the mice ran for cover when played a video of an approaching owl, just like mice with normal vision. “You could say they were trying to escape, but we don’t know for sure,” says Rob Lucas of the University of Manchester, UK, co-leader of the team that developed and tested the treatment. “What we can say is that they react to the owl in the same way as sighted mice, whereas the untreated mice didn’t do anything.” This is the team’s best evidence yet that injecting the gene for a pigment that detects light into the eyes of blind mice can help them see real objects again. This approach aims to treat all types of blindness caused by damaged or missing rods and cones, the eye’s light receptor cells. Most gene therapies for blindness so far have concentrated on replacing faulty genes in rarer, specific forms of inherited blindness, such as Leber congenital amaurosis.
Deep down
The new treatment works by enabling other cells that lie deeper within the retina to capture light. While rod and cone cells normally detect light and convert this into an electrical signal, the ganglion and bipolar cells behind them are responsible for processing these signals and sending them to the brain. By giving these cells the ability to produce their own light-detecting pigment, they can to some extent compensate for the lost receptors, or so it seems.
Link ID: 21298 - Posted: 08.15.2015
By Robert F. Service Move over, poppies. In one of the most elaborate feats of synthetic biology to date, a research team has engineered yeast with a medley of plant, bacterial, and rodent genes to turn sugar into thebaine, the key opiate precursor to morphine and other powerful painkilling drugs that have been harvested for thousands of years from poppy plants. The team also showed that with further tweaks, the yeast could make hydrocodone, a widely used painkiller that is now made chemically from thebaine. “This is a major milestone,” says Jens Nielsen, a synthetic biologist at Chalmers University of Technology in Göteborg, Sweden. The work, he adds, demonstrates synthetic biology’s increasing sophistication at transferring complex metabolic pathways into microbes. By tweaking the yeast pathways, medicinal chemists may be able to produce more effective, less addictive versions of opiate painkillers. But some biopolicy experts worry that morphine-making yeast strains could also allow illicit drugmakers to brew heroin as easily as beer enthusiasts home brew today—the drug is a simple chemical conversion from morphine. That concern is one reason the research team, led by Christina Smolke, a synthetic biologist at Stanford University in Palo Alto, California, stopped short of making a yeast strain with the complete morphine pathway; medicinal drug makers also primarily use thebaine to make new compounds. Synthetic biologists had previously engineered yeast to produce artemisinin, an antimalarial compound, but that required inserting just a handful of plant genes. To get yeast to make thebaine, … © 2015 American Association for the Advancement of Science.
By James Gallagher Health editor, BBC News website Cutting fat from your diet leads to more fat loss than reducing carbohydrates, a US health study shows. Scientists intensely analysed people on controlled diets by inspecting every morsel of food, minute of exercise and breath taken. Both diets, analysed by the National Institutes of Health, led to fat loss when calories were cut, but people lost more when they reduced fat intake. Experts say the most effective diet is one people can stick to. It has been argued that restricting carbs is the best way to get rid of a "spare tyre" as it alters the body's metabolism. The theory goes that fewer carbohydrates lead to lower levels of insulin, which in turn lead to fat being released from the body's stores. "All of those things do happen with carb reduction and you do lose body fat, but not as much as when you cut out the fat," said lead researcher Dr Kevin Hall, from the US-based National Institute of Diabetes and Digestive and Kidney Diseases. In the study, 19 obese people were initially given 2,700 calories a day. Then, over a period of two weeks they tried diets which cut their calorie intake by a third, either by reducing carbohydrates or fat. The team analysed the amount of oxygen and carbon dioxide being breathed out and the amount of nitrogen in participants' urine to calculate precisely the chemical processes taking place inside the body. The results published in Cell Metabolism showed that after six days on each diet, those reducing fat intake lost an average 463g of body fat - 80% more than those cutting down on carbs, whose average loss was 245g. © 2015 BBC.
Link ID: 21296 - Posted: 08.15.2015
Alison Abbott The octopus genome offers clues to how cephalopods evolved intelligence to rival the craftiest vertebrates. With its eight prehensile arms lined with suckers, camera-like eyes, elaborate repertoire of camouflage tricks and spooky intelligence, the octopus is like no other creature on Earth. Added to those distinctions is an unusually large genome, described in Nature on 12 August, that helps to explain how a mere mollusc evolved into an otherworldly being. “It’s the first sequenced genome from something like an alien,” jokes neurobiologist Clifton Ragsdale of the University of Chicago in Illinois, who co-led the genetic analysis of the California two-spot octopus (Octopus bimaculoides). The work was carried out by researchers from the University of Chicago, the University of California, Berkeley, the University of Heidelberg in Germany and the Okinawa Institute of Science and Technology in Japan. The scientists also investigated gene expression in twelve different types of octopus tissue. “It’s important for us to know the genome, because it gives us insights into how the sophisticated cognitive skills of octopuses evolved,” says neurobiologist Benny Hochner at the Hebrew University of Jerusalem in Israel, who has studied octopus neurophysiology for 20 years. Researchers want to understand how the cephalopods, a class of free-floating molluscs, produced a creature that is clever enough to navigate highly complex mazes and open jars filled with tasty crabs. © 2015 Nature Publishing Group
A healthy motor neuron needs to transport its damaged components from the nerve-muscle connection all the way back to the cell body in the spinal cord. If it cannot, the defective components pile up and the cell becomes sick and dies. Researchers at the National Institutes of Health’s National Institute of Neurological Disorders and Stroke (NINDS) have learned how a mutation in the gene for superoxide dismutase 1 (SOD1), which causes ALS, leads cells to accumulate damaged materials. The study, published in the journal Neuron, suggests a potential target for treating this familial form of ALS. More than 12,000 Americans have ALS, also known as Lou Gehrig’s disease, and roughly 5-10 percent of them inherited a genetic mutation from a parent. These cases of familial ALS are often caused by mutations in the gene that codes for SOD1, an important enzyme located in the neuron’s mitochondria, the cell’s energy-producing structures. This mutation causes the death of motor neurons that control the patient’s muscles, resulting in progressive paralysis. “About 90 percent of the energy in the brain is generated by mitochondria,” said Zu-Hang Sheng, Ph.D., an NINDS scientist and the study’s senior author. “If the mitochondria aren’t healthy, they produce energy less efficiently; they can also release harmful chemicals called reactive oxygen species that cause cell death. As a consequence, mitochondrial damage can cause neurodegeneration.” In healthy neurons, storage containers called late endosomes collect damaged mitochondria and various destructive chemicals. A motor protein called dynein then transports the endosomes to structures called lysosomes, which use the chemicals to break down the endosomes. Dr. Sheng’s team discovered that this crucial process is faulty in nerve cells with SOD1 mutations because mutant SOD1 interferes with a critical molecule called snapin that hooks the endosome to the dynein motor protein.
Keyword: ALS-Lou Gehrig's Disease
Link ID: 21294 - Posted: 08.13.2015
Richard Harris Hospitals have a free and powerful tool that they could use more often to help reduce the pain that surgery patients experience: music. Scores of studies over the years have looked at the power of music to ease this kind of pain; an analysis published Wednesday in The Lancet that pulls all those findings together builds a strong case. When researchers in London started combing the medical literature for studies about music's soothing power, they found hundreds of small studies suggesting some benefit. The idea goes back to the days of Florence Nightingale, and music was used to ease surgical pain as early as 1914. (My colleague Patricia Neighmond reported on one of these studies just a few months ago). Dr. Catherine Meads at Brunel University focused her attention on 73 rigorous, randomized clinical trials about the role of music among surgery patients. "As the studies themselves were small, they really didn't find all that much," Meads says. "But once we put them all together, we had much more power to find whether music worked or not." She and her colleagues now report that, yes indeed, surgery patients who listened to music, either before, during or after surgery, were better off — in terms of reduced pain, less anxiety and more patient satisfaction. © 2015 NPR
Keyword: Pain & Touch
Link ID: 21293 - Posted: 08.13.2015
by Julia Belluz When neuroscientists stuck a dead salmon in an fMRI machine and watched its brain light up, they knew they had a problem. It wasn't that there was a dead fish in their expensive imaging machine; they'd put it there on purpose, after all. It was that the medical device seemed to be giving these researchers impossible results. Dead fish should not have active brains. The lit-up brain of a dead salmon — a cautionary neuroscience tale. (University of California Santa Barbara research poster) The researchers shared their findings in 2009 as a cautionary tale: If you don't run the proper statistical tests on your neuroscience data, you can come up with any number of implausible conclusions — even emotional reactions from a dead fish. In the 1990s, neuroscientists started using the massive, round fMRI (or functional magnetic resonance imaging) machines to peer into their subjects' brains. But since then, the field has suffered from a rash of false positive results and studies that lack enough statistical power — the likelihood of finding a real result when it exists — to deliver insights about the brain. When other scientists try to reproduce the results of original studies, they too often fail. Without better methods, it'll be difficult to develop new treatments for brain disorders and diseases like Alzheimer's and depression — let alone learn anything useful about our most mysterious organ. © 2015 Vox Media, Inc
Keyword: Brain imaging
Link ID: 21292 - Posted: 08.13.2015
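The dead-salmon result is the multiple-comparisons problem in miniature: an fMRI scan tests many thousands of voxels, and at an uncorrected threshold of p < 0.05 a predictable fraction of pure-noise voxels will "light up" by chance. A minimal simulation (the voxel count here is illustrative, not the original study's):

```python
# Sketch of the multiple-comparisons trap behind the dead-salmon result:
# run an uncorrected p < 0.05 test on many independent "voxels" of pure
# noise, and some come out "significant" by chance alone.
# The voxel count is illustrative, not taken from the original study.
import random

random.seed(0)
n_voxels = 10_000
alpha = 0.05

# Under the null hypothesis, each noise voxel's p-value is uniform on [0, 1].
false_positives = sum(1 for _ in range(n_voxels) if random.random() < alpha)

print(false_positives)   # roughly 500 "active" voxels in a brain with no signal
print(n_voxels * alpha)  # 500.0, the expected count
```

Correction procedures (Bonferroni, false-discovery-rate control, cluster-level thresholds) exist precisely to keep that expected count of spurious hits near zero, which is why the salmon poster became shorthand for skipping them.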
Alison Abbott This is the crackle of neural activity that allows a fruit-fly (Drosophila melanogaster) larva to crawl backwards: a flash in the brain and a surge that undulates through the nervous system from the top of the larva’s tiny body to the bottom. When the larva moves forwards, the surge flows the other way. The video — captured almost at the resolution of single neurons — demonstrates the latest development in a technique to film neural activity throughout an entire organism. The original method was invented by Philipp Keller and Misha Ahrens at the Howard Hughes Medical Institute's Janelia Research Campus in Ashburn, Virginia. The researchers genetically modify neurons so that each cell fluoresces when it fires; they then use innovative microscopy that involves firing sheets of light into the brain to record that activity. In 2013, the researchers produced a video of neural activity across the brain of a (transparent) zebrafish larva. The fruit-fly larva that is mapped in the latest film, published in Nature Communications on 11 August, is more complicated. The video shows neural activity not just in the brain, but throughout the entire central nervous system (CNS), including the fruit-fly equivalent of a mammalian spinal cord. And unlike the zebrafish, the fruit fly's nervous system is not completely transparent, which makes it harder to image. The researchers stripped the CNS from the larva’s body to examine it. For up to an hour after removal, the CNS continues to spontaneously fire the coordinated patterns of activity that typically drive crawling (and other behaviours). © 2015 Nature Publishing Group
Keyword: Brain imaging
Link ID: 21291 - Posted: 08.12.2015
Your body may be still, but as you dream, your eyes can flicker manically. The rapid eye movement stage of sleep is when we have our most vivid dreams – but do our flickering eyes actually “see” anything? It is a question psychologists have been asking since REM sleep was first described in the 1950s, says Yuval Nir at Tel Aviv University in Israel. “The idea was that we scan an imaginary scene,” says Nir. “It’s an intuitive idea, but it has been very difficult to provide evidence for it.” Until now, much of the evidence has been anecdotal, says Nir. “People who were woken up when their eyes were moving from left to right would say they were dreaming about tennis, for example,” he says. More evidence comes from a previous study that monitored the sleep of people who have a disorder that means they often physically act out their dreams. Their eye movements matched their actions around 80 per cent of the time – a man dreaming about smoking, for example, appeared to look at a dream ashtray as he put out a cigarette. But most of the REM sleep these people had was not accompanied by body movements, making it hard to know for sure. And other researchers have argued that the eye flickers can’t be linked to “seeing” anything because rapid eye movements happen in both fetuses and people who are blind – neither group would have experience of vision and so wouldn’t be expected to move their eyes to follow an object, for example. © Copyright Reed Business Information Ltd.
Link ID: 21290 - Posted: 08.12.2015
Ashley Yeager A mouse scurries across a round table rimmed with Dixie cup–sized holes. Without much hesitation, the rodent heads straight for the hole that drops it into a box lined with cage litter. Any other hole would have led to a quick fall to the floor. But this mouse was more than lucky. It had an advantage — human glial cells were growing in its brain. Glia are thought of as the support staff for the brain’s nerve cells, or neurons, which transmit and receive the brain’s electrical and chemical signals. Named for the Greek term for “glue,” glia have been known for nearly 170 years as the cells that hold the brain’s bits together. Some glial cells help feed neurons. Other glia insulate nerve cell branches with myelin. Still others attack brain invaders responsible for infection or injury. Glial cells perform many of the brain’s most important maintenance jobs. But recent studies suggest they do a lot more. Glia can shape the conversation between neurons, speeding or slowing the electrical signals and strengthening neuron-to-neuron connections. When scientists coaxed human glia to grow in the brains of baby mice, the mice grew up to be supersmart, navigating tabletops full of holes and mastering other tasks much faster than normal mice. This experiment and others suggest that glia may actually orchestrate learning and memory, says neuroscientist R. Douglas Fields. “Glia aren’t doing vibrato. That’s for the neurons,” says Fields, of the National Institute of Child Health and Human Development in Bethesda, Md. “Glia are the conductors.” © Society for Science & the Public 2000 - 2015
There may finally be a way to stop people progressing beyond the first signs of schizophrenia – fish oil. When people with early-stage symptoms took omega-3 supplements for three months, they had much lower rates of progression than those who did not, according to one small-scale trial. People with schizophrenia are usually diagnosed in their teens or 20s, but may experience symptoms for years beforehand, such as minor delusions or paranoid thoughts. Only about a third of people with such symptoms do go on to develop psychosis, however, and antipsychotic drugs can cause nasty side effects, so these are rarely given as a preventative. Fish oil supplements, which contain polyunsaturated fatty acids like omega-3, may be a benign alternative. These fatty acids may normally help dampen inflammation in the brain and protect neurons from damage, and lower levels in the brain have been implicated in several mental illnesses. Tests have found that people with schizophrenia have lower levels of these fatty acids in their blood cells, suggesting the same could be true for their brain cells. Fish oil supplements have been investigated as a treatment for adults with schizophrenia, but so far results have been mixed – four trials found no benefit while another four found a small reduction in symptoms. But a study that gave omega-3 fish oil pills to younger people suggests that what matters is catching the condition in time. The trial followed 81 people aged 13 to 25 with early signs of schizophrenia. Roughly half took fish oil pills and half took placebo tablets for three months. A year later, those given fish oils were less likely to have developed psychosis. © Copyright Reed Business Information Ltd.
Link ID: 21288 - Posted: 08.12.2015
Despite virtual reality’s recent renaissance, the technology still has some obvious problems. One, you look like a dumbass using it. Two, the stomach-churning mismatch between what you see and what you feel contributes to “virtual reality sickness.” But there’s another, less obvious flaw that could add to that off-kilter sensation: an eye-focusing problem called vergence-accommodation conflict. It’s only less obvious because, well, you rarely experience it outside of virtual reality. At SIGGRAPH in Los Angeles this week, Stanford professor Gordon Wetzstein and his colleagues are presenting a new head-mounted display that minimizes the vergence-accommodation conflict. This isn’t just some esoteric academic problem. Leading VR companies like Oculus and Microsoft know all too well their headsets are off, and Magic Leap, the super secret augmented reality company in Florida, is betting the house on finding a solution first. “It’s an exciting area of research,” says Martin Banks, a vision scientist at the University of California, Berkeley. “I think it’s going to be the next big thing in displays.” Okay okay, so what’s the big deal with the vergence-accommodation conflict? Two things happen when you simply “look” at an object. First, you point your eyeballs. If an object is close, your eyes naturally converge on it; if it’s far, they diverge. Hence, vergence. If your eyes don’t line up correctly, you end up seeing double. The second thing that happens is the lenses inside your eyes focus on the object, aka accommodation. Normally, vergence and accommodation are coupled. “The visual system has developed a circuit where the two responses talk to each other,” says Banks. “That makes perfect sense in the natural environment. They’re both trying to get to the same distance, so why wouldn’t they talk to one another?” In other words, your meat brain has figured out a handy shortcut for the real world.
Link ID: 21287 - Posted: 08.12.2015
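The vergence half of the conflict is simple trigonometry: the angle between the two eyes' lines of sight grows as an object gets closer, while a fixed headset screen demands accommodation at one unchanging distance. A sketch of that geometry (the 6.3 cm interpupillary distance is an assumed typical value, not a figure from the article):

```python
# Sketch: vergence angle as a function of object distance.
# The eyes converge by angle = 2 * atan((IPD / 2) / distance),
# where IPD is the interpupillary distance (assumed 6.3 cm here).
import math

IPD = 0.063  # metres; a typical adult value, assumed for illustration

def vergence_deg(distance_m: float) -> float:
    """Angle in degrees between the two eyes' lines of sight."""
    return math.degrees(2 * math.atan((IPD / 2) / distance_m))

for d in (0.25, 0.5, 2.0, 10.0):
    print(f"{d:>5} m -> {vergence_deg(d):.2f} degrees")
```

The steep fall-off with distance is the crux: a virtual object rendered at arm's length asks the eyes to converge by over ten degrees while the lenses stay focused on the far-away-seeming screen, breaking the coupling Banks describes.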
Could taking iodine pills in pregnancy help to raise children’s IQ? Some researchers suggest women in the UK should take such supplements, but others say the evidence is unclear, and that it could even harm development. Iodine is found in dairy foods and fish, and is used in the body to make thyroid hormone, which is vital for brain development in the womb. In some parts of the world, such as inland areas where little fish is consumed or the soil is low in iodine, severe deficiencies can markedly lower intelligence in some people. In most affected areas, iodine is now added to salt. The UK was not thought to need this step, but in 2013 a large study of urine samples from pregnant women found that about two-thirds had mild iodine deficiency, and that the children of those with the lowest levels had the lowest IQs. Now another team has combined data from this study with other data to calculate that if all women in the UK were given iodine supplements from three months before pregnancy until they finished breastfeeding, average IQ would increase by 1.2 points per child. And the children of mothers who were most iodine deficient would probably benefit more, says Kate Jolly of the University of Birmingham, who was involved in the study. “We are talking about very small differences but on a population basis it could mean quite a lot,” she says. The team calculated that providing these iodine supplements would be worth the cost to the UK’s National Health Service because it would boost the country’s productivity. © Copyright Reed Business Information Ltd.