Most Recent Links

Links 41 - 60 of 20771

By Alexander Christie-Miller. You could say they sent the first tweets. An ancient whistling language that sounds a little like birdsong has been found to use both sides of the brain – challenging the idea that the left side is all-important for communicating. The whistling language is still used by around 10,000 people in the mountains of north-east Turkey, and can carry messages as far as 5 kilometres. Researchers have now shown that this language involves the brain’s right hemisphere, which was already known to be important for understanding music. Until recently, it was thought that the task of interpreting language fell largely to the brain’s left hemisphere. Onur Güntürkün of Ruhr University Bochum in Germany wondered whether the musical melodies and frequencies of whistled Turkish might require people to use both sides of their brain to communicate. His team tested 31 fluent whistlers by playing slightly different spoken or whistled syllables into their left and right ears at the same time, and asking them to say what they heard. The left hemisphere depends slightly more on sounds received by the right ear, and vice versa for the right hemisphere. By comparing the number of times the whistlers reported the syllables that had been played into either their right or left ear, they could tell how often each side of the brain was dominant. As expected, when the syllables were spoken, the right ear and left hemisphere were dominant 75 per cent of the time. But when syllables were whistled, the split between right and left dominance was about even. © Copyright Reed Business Information Ltd.
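To make the dichotic-listening logic concrete, here is a minimal scoring sketch in Python. The trial counts below are made-up numbers chosen only to mirror the percentages quoted above; they are not data from Güntürkün's study.

    # Toy scoring of a dichotic-listening session (hypothetical counts, not study data).
    # Each trial plays different syllables to the two ears; the listener reports one,
    # and the winning ear hints at which hemisphere handled the material.

    def right_ear_advantage(right_ear_reports, left_ear_reports):
        """Fraction of scorable trials won by the right ear (left hemisphere)."""
        total = right_ear_reports + left_ear_reports
        if total == 0:
            raise ValueError("no scorable trials")
        return right_ear_reports / total

    # Hypothetical listener: spoken syllables show a strong right-ear advantage,
    # whistled syllables come out close to an even split.
    print(right_ear_advantage(75, 25))   # 0.75
    print(right_ear_advantage(52, 48))   # 0.52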

Keyword: Language; Laterality
Link ID: 21309 - Posted: 08.18.2015

By Perri Klass. A little more than a year ago, the American Academy of Pediatrics issued a policy statement saying that all pediatric primary care should include literacy promotion, starting at birth. That means pediatricians taking care of infants and toddlers should routinely be advising parents about how important it is to read to even very young children. The policy statement, which I wrote with Dr. Pamela C. High, included a review of the extensive research on the links between growing up with books and reading aloud, and later language development and school success. But while we know that reading to a young child is associated with good outcomes, there is only limited understanding of what the mechanism might be. Two new studies examine the unexpectedly complex interactions that happen when you put a small child on your lap and open a picture book. This month, the journal Pediatrics published a study that used functional magnetic resonance imaging to study brain activity in 3- to 5-year-old children as they listened to age-appropriate stories. The researchers found differences in brain activation according to how much the children had been read to at home. Children whose parents reported more reading at home and more books in the home showed significantly greater activation of brain areas in a region of the left hemisphere called the parietal-temporal-occipital association cortex. This brain area is “a watershed region, all about multisensory integration, integrating sound and then visual stimulation,” said the lead author, Dr. John S. Hutton, a clinical research fellow at Cincinnati Children’s Hospital Medical Center. This region of the brain is known to be very active when older children read to themselves, but Dr. Hutton notes that it also lights up when younger children are hearing stories. What was especially novel was that children who were exposed to more books and home reading showed significantly more activity in the areas of the brain that process visual association, even though the child was in the scanner just listening to a story and could not see any pictures. © 2015 The New York Times Company

Keyword: Language; Development of the Brain
Link ID: 21308 - Posted: 08.18.2015

Every brain cell has a nucleus, or a central command station. Scientists have shown that the passage of molecules through the nucleus of a star-shaped brain cell, called an astrocyte, may play a critical role in health and disease. The study, published in the journal Nature Neuroscience, was partially funded by the National Institutes of Health (NIH). “Unexpectedly we may have discovered a hidden pathway to understanding how astrocytes respond to injury and control brain processes. The pathway may be common to many brain diseases and we’re just starting to follow it,” said Katerina Akassoglou, Ph.D., a senior investigator at the Gladstone Institute for Neurological Disease, a professor of neurology at the University of California, San Francisco, and a senior author of the study. Some neurological disorders are associated with higher than normal brain levels of the growth factor TGF-beta, including Alzheimer's disease and brain injury. Previous studies found that after brain injury, astrocytes produce greater amounts of p75 neurotrophin receptor (p75NTR), a protein that helps cells detect growth factors. The cells also react to TGF-beta by changing their shapes and secreting proteins that alter neuronal activity. Dr. Akassoglou’s lab showed that eliminating the p75NTR gene prevented hydrocephalus in mice genetically engineered to have astrocytes that produce higher levels of TGF-beta. Hydrocephalus is a disorder that fills the brain with excess cerebral spinal fluid. Eliminating the p75NTR gene also prevented astrocytes in the brains of the mice from forming scars after injuries and restored gamma oscillations, which are patterns of neuronal activity associated with learning and memory.

Keyword: Brain Injury/Concussion; Glia
Link ID: 21307 - Posted: 08.18.2015

By Kate Kelland LONDON (Reuters) - Scientists have genetically modified mice to be super-intelligent and found they are also less anxious, a discovery that may help the search for treatments for disorders such as Alzheimer's, schizophrenia and post traumatic stress disorder (PTSD). Researchers from Britain and Canada found that altering a single gene to block the phosphodiesterase-4B (PDE4B) enzyme, which is found in many organs including the brain, made mice cleverer and at the same time less fearful. "Our work using mice has identified phosphodiesterase-4B as a promising target for potential new treatments," said Steve Clapcote, a lecturer in pharmacology at Britain's Leeds University, who led the study. He said his team is now working on developing drugs that will specifically inhibit PDE4B. The drugs will be tested first in animals to see whether any of them might be suitable to go forward into clinical trials in humans. In the experiments, published on Friday in the journal Neuropsychopharmacology, the scientists ran a series of behavioral tests on the PDE4B-inhibited mice and found they tended to learn faster, remember events longer and solve complex problems better than normal mice. The "brainy" mice were better at recognizing a mouse they had seen the previous day, the researchers said, and were also quicker at learning the location of a hidden escape platform.

Keyword: Learning & Memory; Genes & Behavior
Link ID: 21306 - Posted: 08.18.2015

By Dean Burnett. Yesterday, an article in the Entrepreneurs section of the Guardian purported to reveal a “cloth cap that could help treat depression”. This claim has caused some alarm in the neuroscience and mental health fields, so it’s important to look a little more closely at what the manufacturers are actually claiming. The piece in question concerns a product from Neuroelectrics: a soft helmet containing electrodes and sensors. According to the company’s website, it can be used to monitor brain activity (electroencephalography, or EEG), or administer light electrical currents to different areas of the brain in order to treat certain neurological and psychiatric conditions (known as transcranial direct current stimulation or tDCS). While this would obviously be great news to the millions of people who deal with such conditions every day, such claims should be treated with a considerable amount of caution. The fields of science dedicated to researching and, hopefully, treating serious brain-based problems like depression, stroke, personality disorder, etc., work hard to find new and inventive methods for doing so, or refining and improving existing ones. Sometimes they succeed, but probably not as often as they’d like. The problem is that when a new development occurs or a new approach is found, it doesn’t automatically mean it’s widely applicable or even effective for everyone. The brain is furiously complicated. There is no magic bullet for brain problems [Note: you shouldn’t use bullets, magic or otherwise, when dealing with the brain]. © 2015 Guardian News and Media Limited

Keyword: Depression
Link ID: 21305 - Posted: 08.18.2015

By NICHOLAS BAKALAR “Insanity Treated By Electric Shock” read the headline of an article published on July 6, 1940, in The New York Times. The article described “a new method, introduced in Italy, of treating certain types of mental disorders by sending an electric shock through the brain.” It was the first time that what is now called electroconvulsive therapy, or ECT, had been mentioned in The Times. The electric shock, the article said, “is produced by a small portable electric box which was invented in Italy by Professor Ugo Cerletti of the Rome University Clinic.” Dr. S. Eugene Barrera, the principal researcher on the project, “emphasized that hope for any ‘miracle cure’ must not be pinned on the new method.” On April 29, 1941, the subject came up again, this time in an article about a scientific meeting at which a professor of psychiatry at Northwestern reported “ ‘very promising instantaneous results’ in the recently developed electric shock method of relieving schizophrenic patients of their malady.” The treatment entered clinical practice fairly quickly. In October 1941, The Times reported on the opening of several new buildings at Hillside Hospital in Queens (today called Zucker Hillside Hospital). “The hospital has pioneered in the use of insulin and metrazol, and also in the electric shock treatment, which has proved useful in shortening the average stay of patients,” the article read. Over the years, ECT has had its ups and downs in the public imagination and in the pages of The Times. In an article on Nov. 25, 1980, the reporter Dava Sobel seemed to relegate it to another age. © 2015 The New York Times Company

Keyword: Depression
Link ID: 21304 - Posted: 08.18.2015

By CLAIRE MARTIN The eyeglass lenses that Don McPherson invented were meant for surgeons. But through serendipity he found an entirely different use for them: as a possible treatment for colorblindness. Mr. McPherson is a glass scientist and an avid Ultimate Frisbee player. He discovered that the lenses he had invented, which protect surgeons’ eyes from lasers and help them differentiate human tissue, caused the world at large to look candy-colored — including the Frisbee field. At a tournament in Santa Cruz, Calif., in 2002, while standing on a grassy field dotted with orange goal-line cones, he lent a pair of glasses with the lenses to a friend who happened to be colorblind. “He said something to the effect of, ‘Dude, these are amazing,’ ” Mr. McPherson says. “He’s like, ‘I see orange cones. I’ve never seen them before.’ ” Mr. McPherson was intrigued. He said he did not know the first thing about colorblindness, but felt compelled to figure out why the lenses were having this effect. Mr. McPherson had been inserting the lenses into glasses that he bought at stores, then selling them through Bay Glass Research, his company at the time. Mr. McPherson went on to study colorblindness, fine-tune the lens technology and start a company called EnChroma that now sells glasses for people who are colorblind. His is among a range of companies that have brought inadvertent or accidental inventions to market. Such inventions have included products as varied as Play-Doh, which started as a wallpaper cleaner, and the pacemaker, discovered through a study of hypothermia. To learn more about color vision and the feasibility of creating filters to correct colorblindness, Mr. McPherson applied for a grant from the National Institutes of Health in 2005. He worked with vision scientists and a mathematician and computer scientist named Andrew Schmeder. They weren’t the first to venture into this industry; the history of glassmakers claiming to improve colorblindness is long and riddled with controversy. © 2015 The New York Times Company

Keyword: Vision
Link ID: 21303 - Posted: 08.17.2015

A new clinical trial is set to begin in the United Kingdom using the powerful noses of dogs to detect prostate cancer in humans. While research has been done before, these are the first trials approved by Britain's National Health Service. The trials, at the Milton Keynes University Hospital in Buckinghamshire, will use animals from a nonprofit organization called Medical Detection Dogs, co-founded in 2008 by behavioral psychologist Claire Guest. "What we've now discovered is that lots of diseases and conditions — and cancer included — that they actually have different volatile organic compounds, these smelly compounds, that are associated with them," Guest tells NPR's Rachel Martin. "And dogs can smell them." The dogs offer an inexpensive, non-invasive method to accompany the existing blood tests for prostate cancer, which detect prostate-specific antigen, or PSA, Guest says. "It's a low false-negative but a very high false-positive, meaning that three out of four men that have a raised PSA haven't got cancer," she explains. "So the physician has a very difficult decision to make: Which of the four men does he biopsy? What we want to do is provide an additional test — not a test that stands alone but an additional test that runs alongside the current testing, which a physician can use as part of that patient's picture." Describing how the testing works, Guest says: "The samples come to the dogs — the dogs never go to the patient. At the moment, our dogs would be screening between a 0.5- and 1-ml drop of urine [or 1/10 to 1/5 teaspoon], so a very small amount. In the early days, of course, we know whether the samples have come from a patient with cancer or if the patient has another disease or condition, or is in fact healthy." © 2015 NPR
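The "three out of four" figure is a statement about the test's positive predictive value. As a rough illustration, here is a minimal Bayes'-rule sketch in Python; the sensitivity, specificity, and prevalence values are hypothetical numbers chosen so the arithmetic lands near one-in-four, not figures reported for PSA screening or for this trial.

    # Hypothetical worked example of positive predictive value (PPV).
    # All three inputs are assumptions for illustration, not published PSA figures.

    def positive_predictive_value(sensitivity, specificity, prevalence):
        """P(cancer | positive test) by Bayes' rule."""
        true_positives = sensitivity * prevalence
        false_positives = (1 - specificity) * (1 - prevalence)
        return true_positives / (true_positives + false_positives)

    ppv = positive_predictive_value(sensitivity=0.90, specificity=0.70, prevalence=0.10)
    print(round(ppv, 2))  # 0.25: only about one in four positive tests reflects a true cancer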

Keyword: Chemical Senses (Smell & Taste)
Link ID: 21302 - Posted: 08.17.2015

John von Radowitz, Press Association. Psychologists have confirmed that playing violent video games is linked to aggressive and callous behaviour. A review of almost a decade of studies found that exposure to violent video games was a "risk factor" for increased aggression. But the same team of experts said there was insufficient evidence to conclude that the influence of games such as Call Of Duty and Grand Theft Auto led to criminal acts. The findings have prompted a call for more parental control over violent scenes in video games from the American Psychological Association (APA). [Photo caption: The original version of Doom, released in 1993, was widely controversial for its unprecedented levels of graphic violence.] A report from the APA task force on violent media concludes: "The research demonstrates a consistent relation between violent video game use and increases in aggressive behaviour, aggressive cognitions and aggressive affect, and decreases in pro-social behaviour, empathy and sensitivity to aggression." The report said no single influence led a person to act aggressively or violently. Rather, it was an "accumulation of risk factors" that resulted in such behaviour. It added: "The research reviewed here demonstrates that violent video game use is one such risk factor." The APA has urged game creators to increase levels of parental control over the amount of violence video games contain.

Keyword: Aggression
Link ID: 21301 - Posted: 08.17.2015

By Geoff Brumfiel. Learning to make sounds by listening to others is a skill that helps make us human. But research now suggests a species of monkey may have evolved similar abilities. Marmosets have the capacity to learn calls from their parents, according to research published Thursday in the journal Science. The results mean that studying marmosets might provide insights into developmental disorders found in humans. It also suggests that vocal learning may be more widespread than many researchers thought. Many animals can link sounds with meaning. Dogs respond to simple calls; chimpanzees can even communicate with people using sign language. But the ability to hear a sound and mimic it is possessed by only a small number of species: primarily song birds and humans. "We didn't think that mammals and primates in particular — besides us — had any type of vocal learning," says Asif Ghazanfar, a neuroscientist at Princeton University who led the new study. Enter the small, adorable common marmoset. These fuzzy South American primates look more like squirrels than monkeys. "They're cute, and they smell. They wash themselves in their own urine," Ghazanfar says. "I'm not sure why they do that." But once you get over the stink, these little guys are interesting. Marmoset mommies always give birth to twins and they need help rearing them. So, unlike many mammal species, fathers lend a hand, along with siblings and other community members. Ghazanfar thinks all that child care is what gives marmosets another special trait: They're super talkative. "They're chattering nonstop," he says. "That is also very different from our close relatives the chimpanzees." © 2015 NPR

Keyword: Language; Evolution
Link ID: 21300 - Posted: 08.15.2015

By Carl Zimmer. You are what you eat, and so were your ancient ancestors. But figuring out what they actually dined on has been no easy task. There are no Pleistocene cookbooks to consult. Instead, scientists must sift through an assortment of clues, from the chemical traces in fossilized bones to the scratch marks on prehistoric digging sticks. Scientists have long recognized that the diets of our ancestors went through a profound shift with the addition of meat. But in the September issue of The Quarterly Review of Biology, researchers argue that another item added to the menu was just as important: carbohydrates, bane of today’s paleo diet enthusiasts. In fact, the scientists propose, by incorporating cooked starches into their diet, our ancestors were able to fuel the evolution of our oversize brains. Roughly seven million years ago, our ancestors split off from the apes. As far as scientists can tell, those so-called hominins ate a diet that included a lot of raw, fiber-rich plants. After several million years, hominins started eating meat. The oldest clues to this shift are 3.3-million-year-old stone tools and 3.4-million-year-old mammal bones scarred with cut marks. The evidence suggests that hominins began by scavenging meat and marrow from dead animals. At some point hominins began to cook meat, but exactly when they invented fire is a question that inspires a lot of debate. Humans were definitely making fires by 300,000 years ago, but some researchers claim to have found campfires dating back as far as 1.8 million years. Cooked meat provided increased protein, fat and energy, helping hominins grow and thrive. But Mark G. Thomas, an evolutionary geneticist at University College London, and his colleagues argue that there was another important food sizzling on the ancient hearth: tubers and other starchy plants. © 2015 The New York Times Company

Keyword: Evolution; Obesity
Link ID: 21299 - Posted: 08.15.2015

When the owl swooped, the “blind” mice ran away. This was thanks to a new type of gene therapy to reprogramme cells deep in the eye to sense light. After treatment, the mice ran for cover when played a video of an approaching owl, just like mice with normal vision. “You could say they were trying to escape, but we don’t know for sure,” says Rob Lucas of the University of Manchester, UK, co-leader of the team that developed and tested the treatment. “What we can say is that they react to the owl in the same way as sighted mice, whereas the untreated mice didn’t do anything.” This is the team’s best evidence yet that injecting the gene for a pigment that detects light into the eyes of blind mice can help them see real objects again. This approach aims to treat all types of blindness caused by damaged or missing rods and cones, the eye’s light receptor cells. Most gene therapies for blindness so far have concentrated on replacing faulty genes in rarer, specific forms of inherited blindness, such as Leber congenital amaurosis. The new treatment works by enabling other cells that lie deeper within the retina to capture light. While rod and cone cells normally detect light and convert this into an electrical signal, the ganglion and bipolar cells behind them are responsible for processing these signals and sending them to the brain. By giving these cells the ability to produce their own light-detecting pigment, they can to some extent compensate for the lost receptors, or so it seems.

Keyword: Vision
Link ID: 21298 - Posted: 08.15.2015

By Robert F. Service. Move over, poppies. In one of the most elaborate feats of synthetic biology to date, a research team has engineered yeast with a medley of plant, bacterial, and rodent genes to turn sugar into thebaine, the key opiate precursor to morphine and other powerful painkilling drugs that have been harvested for thousands of years from poppy plants. The team also showed that with further tweaks, the yeast could make hydrocodone, a widely used painkiller that is now made chemically from thebaine. “This is a major milestone,” says Jens Nielsen, a synthetic biologist at Chalmers University of Technology in Göteborg, Sweden. The work, he adds, demonstrates synthetic biology’s increasing sophistication at transferring complex metabolic pathways into microbes. By tweaking the yeast pathways, medicinal chemists may be able to produce more effective, less addictive versions of opiate painkillers. But some biopolicy experts worry that morphine-making yeast strains could also allow illicit drugmakers to brew heroin as easily as beer enthusiasts home brew today—the drug is a simple chemical conversion from morphine. That concern is one reason the research team, led by Christina Smolke, a synthetic biologist at Stanford University in Palo Alto, California, stopped short of making a yeast strain with the complete morphine pathway; medicinal drugmakers also primarily use thebaine to make new compounds. Synthetic biologists had previously engineered yeast to produce artemisinin, an antimalarial compound, but that required inserting just a handful of plant genes. To get yeast to make thebaine, © 2015 American Association for the Advancement of Science.

Keyword: Drug Abuse; Pain & Touch
Link ID: 21297 - Posted: 08.15.2015

By James Gallagher, Health editor, BBC News website. [Photo caption: Fat or carbs? Scientists have shed new light on which diet might be more effective at reducing fat.] Cutting fat from your diet leads to more fat loss than reducing carbohydrates, a US health study shows. Scientists intensely analysed people on controlled diets by inspecting every morsel of food, minute of exercise and breath taken. Both diets, analysed by the National Institutes of Health, led to fat loss when calories were cut, but people lost more when they reduced fat intake. Experts say the most effective diet is one people can stick to. It has been argued that restricting carbs is the best way to get rid of a "spare tyre" as it alters the body's metabolism. The theory goes that fewer carbohydrates lead to lower levels of insulin, which in turn lead to fat being released from the body's stores. "All of those things do happen with carb reduction and you do lose body fat, but not as much as when you cut out the fat," said lead researcher Dr Kevin Hall, from the US-based National Institute of Diabetes and Digestive and Kidney Diseases. [Photo caption: Cutting down on carbohydrates might not be as effective after all, the study suggests.] In the study, 19 obese people were initially given 2,700 calories a day. Then, over a period of two weeks they tried diets which cut their calorie intake by a third, either by reducing carbohydrates or fat. The team analysed the amount of oxygen and carbon dioxide being breathed out and the amount of nitrogen in participants' urine to calculate precisely the chemical processes taking place inside the body. The results published in Cell Metabolism showed that after six days on each diet, those reducing fat intake lost an average 463g of body fat, 80% more than those cutting down on carbs, whose average loss was 245g. © 2015 BBC.

Keyword: Obesity
Link ID: 21296 - Posted: 08.15.2015

By Alison Abbott. The octopus genome offers clues to how cephalopods evolved intelligence to rival the craftiest vertebrates. With its eight prehensile arms lined with suckers, camera-like eyes, elaborate repertoire of camouflage tricks and spooky intelligence, the octopus is like no other creature on Earth. Added to those distinctions is an unusually large genome, described in Nature on 12 August, that helps to explain how a mere mollusc evolved into an otherworldly being. “It’s the first sequenced genome from something like an alien,” jokes neurobiologist Clifton Ragsdale of the University of Chicago in Illinois, who co-led the genetic analysis of the California two-spot octopus (Octopus bimaculoides). The work was carried out by researchers from the University of Chicago, the University of California, Berkeley, the University of Heidelberg in Germany and the Okinawa Institute of Science and Technology in Japan. The scientists also investigated gene expression in twelve different types of octopus tissue. “It’s important for us to know the genome, because it gives us insights into how the sophisticated cognitive skills of octopuses evolved,” says neurobiologist Benny Hochner at the Hebrew University of Jerusalem in Israel, who has studied octopus neurophysiology for 20 years. Researchers want to understand how the cephalopods, a class of free-floating molluscs, produced a creature that is clever enough to navigate highly complex mazes and open jars filled with tasty crabs. © 2015 Nature Publishing Group

Keyword: Intelligence; Genes & Behavior
Link ID: 21295 - Posted: 08.13.2015

A healthy motor neuron needs to transport its damaged components from the nerve-muscle connection all the way back to the cell body in the spinal cord. If it cannot, the defective components pile up and the cell becomes sick and dies. Researchers at the National Institutes of Health’s National Institute of Neurological Disorders and Stroke (NINDS) have learned how a mutation in the gene for superoxide dismutase 1 (SOD1), which causes ALS, leads cells to accumulate damaged materials. The study, published in the journal Neuron, suggests a potential target for treating this familial form of ALS. More than 12,000 Americans have ALS, also known as Lou Gehrig’s disease, and roughly 5-10 percent of them inherited a genetic mutation from a parent. These cases of familial ALS are often caused by mutations in the gene that codes for SOD1, an important enzyme located in the neuron’s mitochondria, the cell’s energy-producing structures. This mutation causes the death of motor neurons that control the patient’s muscles, resulting in progressive paralysis. “About 90 percent of the energy in the brain is generated by mitochondria,” said Zu-Hang Sheng, Ph.D., an NINDS scientist and the study’s senior author. “If the mitochondria aren’t healthy, they produce energy less efficiently; they can also release harmful chemicals called reactive oxygen species that cause cell death. As a consequence, mitochondrial damage can cause neurodegeneration.” In healthy neurons, storage containers called late endosomes collect damaged mitochondria and various destructive chemicals. A motor protein called dynein then transports the endosomes to structures called lysosomes, which use the chemicals to break down the endosomes. Dr. Sheng’s team discovered that this crucial process is faulty in nerve cells with SOD1 mutations because mutant SOD1 interferes with a critical molecule called snapin that hooks the endosome to the dynein motor protein.

Keyword: ALS-Lou Gehrig's Disease
Link ID: 21294 - Posted: 08.13.2015

By Richard Harris. Hospitals have a free and powerful tool that they could use more often to help reduce the pain that surgery patients experience: music. Scores of studies over the years have looked at the power of music to ease this kind of pain; an analysis published Wednesday in The Lancet that pulls all those findings together builds a strong case. When researchers in London started combing the medical literature for studies about music's soothing power, they found hundreds of small studies suggesting some benefit. The idea goes back to the days of Florence Nightingale, and music was used to ease surgical pain as early as 1914. (My colleague Patricia Neighmond reported on one of these studies just a few months ago). Dr. Catherine Meads at Brunel University focused her attention on 73 rigorous, randomized clinical trials about the role of music among surgery patients. "As the studies themselves were small, they really didn't find all that much," Meads says. "But once we put them all together, we had much more power to find whether music worked or not." She and her colleagues now report that, yes indeed, surgery patients who listened to music, either before, during or after surgery, were better off — in terms of reduced pain, less anxiety and more patient satisfaction. © 2015 NPR

Keyword: Pain & Touch
Link ID: 21293 - Posted: 08.13.2015

By Julia Belluz. When neuroscientists stuck a dead salmon in an fMRI machine and watched its brain light up, they knew they had a problem. It wasn't that there was a dead fish in their expensive imaging machine; they'd put it there on purpose, after all. It was that the medical device seemed to be giving these researchers impossible results. Dead fish should not have active brains. [Image caption: The lit-up brain of a dead salmon — a cautionary neuroscience tale. (University of California Santa Barbara research poster)] The researchers shared their findings in 2009 as a cautionary tale: If you don't run the proper statistical tests on your neuroscience data, you can come up with any number of implausible conclusions — even emotional reactions from a dead fish. In the 1990s, neuroscientists started using the massive, round fMRI (or functional magnetic resonance imaging) machines to peer into their subjects' brains. But since then, the field has suffered from a rash of false positive results and studies that lack enough statistical power — the likelihood of finding a real result when it exists — to deliver insights about the brain. When other scientists try to reproduce the results of original studies, they too often fail. Without better methods, it'll be difficult to develop new treatments for brain disorders and diseases like Alzheimer's and depression — let alone learn anything useful about our most mysterious organ. © 2015 Vox Media, Inc
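The salmon result is, at bottom, a multiple-comparisons problem: test tens of thousands of voxels at an uncorrected threshold and some will cross it by chance alone. Here is a minimal, hypothetical Python sketch of that effect on pure noise, and of how a simple Bonferroni correction suppresses it; the voxel and scan counts are made-up numbers, and this is not a re-analysis of the original poster.

    # Pure-noise simulation of the multiple-comparisons problem (hypothetical sizes,
    # not any real fMRI dataset).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_voxels, n_scans = 50_000, 30
    noise = rng.normal(size=(n_voxels, n_scans))

    # One-sample t-test per voxel; with pure noise, every "active" voxel is a false positive.
    t_vals, p_vals = stats.ttest_1samp(noise, popmean=0.0, axis=1)

    print(int((p_vals < 0.05).sum()))             # roughly 2,500 spurious hits at p < 0.05
    print(int((p_vals < 0.05 / n_voxels).sum()))  # Bonferroni-corrected: almost always 0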

Keyword: Brain imaging
Link ID: 21292 - Posted: 08.13.2015

By Alison Abbott. This is the crackle of neural activity that allows a fruit-fly (Drosophila melanogaster) larva to crawl backwards: a flash in the brain and a surge that undulates through the nervous system from the top of the larva’s tiny body to the bottom. When the larva moves forwards, the surge flows the other way. The video — captured almost at the resolution of single neurons — demonstrates the latest development in a technique to film neural activity throughout an entire organism. The original method was invented by Philipp Keller and Misha Ahrens at the Howard Hughes Medical Institute's Janelia Research Campus in Ashburn, Virginia. The researchers genetically modify neurons so that each cell fluoresces when it fires; they then use innovative microscopy that involves firing sheets of light into the brain to record that activity. In 2013, the researchers produced a video of neural activity across the brain of a (transparent) zebrafish larva. The fruit-fly larva that is mapped in the latest film, published in Nature Communications on 11 August, is more complicated. The video shows neural activity not just in the brain, but throughout the entire central nervous system (CNS), including the fruit-fly equivalent of a mammalian spinal cord. And unlike the zebrafish, the fruit fly's nervous system is not completely transparent, which makes it harder to image. The researchers stripped the CNS from the larva’s body to examine it. For up to an hour after removal, the CNS continues to spontaneously fire the coordinated patterns of activity that typically drive crawling (and other behaviours). © 2015 Nature Publishing Group

Keyword: Brain imaging
Link ID: 21291 - Posted: 08.12.2015

Your body may be still, but as you dream, your eyes can flicker manically. The rapid eye movement stage of sleep is when we have our most vivid dreams – but do our flickering eyes actually “see” anything? It is a question psychologists have been asking since REM sleep was first described in the 1950s, says Yuval Nir at Tel Aviv University in Israel. “The idea was that we scan an imaginary scene,” says Nir. “It’s an intuitive idea, but it has been very difficult to provide evidence for it.” Until now, much of the evidence has been anecdotal, says Nir. “People who were woken up when their eyes were moving from left to right would say they were dreaming about tennis, for example,” he says. More evidence comes from a previous study that monitored the sleep of people who have a disorder that means they often physically act out their dreams. Their eye movements matched their actions around 80 per cent of the time – a man dreaming about smoking, for example, appeared to look at a dream ashtray as he put out a cigarette. But most of the REM sleep these people had was not accompanied by body movements, making it hard to know for sure. And other researchers have argued that the eye flickers can’t be linked to “seeing” anything because rapid eye movements happen in both fetuses and people who are blind – neither group would have experience of vision and so wouldn’t be expected to move their eyes to follow an object, for example. © Copyright Reed Business Information Ltd.

Keyword: Sleep
Link ID: 21290 - Posted: 08.12.2015