Most Recent Links

Links 61 - 80 of 20272

By KEN BELSON The developers of a new drug aimed at diagnosing chronic traumatic encephalopathy, a degenerative brain disease linked to repeated head trauma, are under scrutiny by the Food and Drug Administration. In February, the F.D.A.’s Office of Prescription Drug Promotion sent a letter to two researchers at U.C.L.A. warning them that they had improperly marketed their drug on the Internet and had made overstated claims about the drug’s potential efficacy. The researchers at U.C.L.A. have been developing a biomarker called FDDNP, which aims to identify tau protein deposits in the brain (a signature of C.T.E.) when patients are given a PET scan. To date, researchers have been able to detect C.T.E. only in brain tissue obtained posthumously. The demand for a technique that can diagnose the disease in living patients is potentially large, given growing concerns about the impact of head trauma in athletes, soldiers and others. In its letter, the F.D.A. warned that the researchers, who are partners with the company Taumark, were not allowed to market the drug and make claims about its safety or effectiveness. “Thus, these claims and presentations suggest in a promotional context that FDDNP, an investigational new drug, is safe or effective for such uses, when F.D.A. has not approved FDDNP for any use,” the letter said. The Los Angeles Times first reported the details of the F.D.A.’s letter to the researchers, Dr. Gary Small and Dr. Jorge Barrio. The researchers were told to adjust the language on Taumark’s website, which is now disabled. © 2015 The New York Times Company

Keyword: Brain imaging; Brain Injury/Concussion
Link ID: 20788 - Posted: 04.13.2015

by Bethany Brookshire In 2011, a group of scientists “turned mice gay.” The only issue is, of course, they didn’t. Rather, Yi Rao and colleagues at Peking University in Beijing, China, showed that male mice will cheerfully mount both male and female mice, as long as their brains are deficient in one chemical messenger: serotonin. The paper, published in Nature, received plenty of media coverage. Now, two other research groups report seemingly opposite findings: Male mice with no serotonin in their brains still prefer female mice to males. The researchers contend that serotonin is about social communication and impulsive behaviors, not sex. Mounting behavior aside, sexual preference in mice is not about “turning mice gay.” It never has been. Instead, it’s about the role that a single chemical can play in animal behavior. And it’s about what, exactly, those behaviors really mean. Serotonin serves as a messenger between cells. It plays important roles in mood. Serotonin-related drugs are used to treat some forms of migraine. And of course, serotonin plays a role in the psychedelic effects of recreational drugs such as hallucinogens. So when the Peking University group set out to show serotonin’s role in sexual preference, they attacked it from several angles. They used mice that had been genetically engineered to lack the brain cells that usually produce serotonin. They used a chemical to deplete serotonin in the brains of normal mice. And they created another strain of mice that lacked the enzyme that makes serotonin in the brain. © Society for Science & the Public 2000 - 2015

Keyword: Sexual Behavior
Link ID: 20787 - Posted: 04.11.2015

by Catherine Brahic TRANSLUCENT comb jellies are some of the most primitive animals on Earth, yet they have remarkable nervous systems. Controversial data discussed at a meeting in London last month suggests that their neurons are unlike any others on Earth. This could be evidence that neurons evolved more than once in the history of animal life. The suggestion that neurons evolved in parallel multiple times has divided biologists for over a century. Ultimately, Erich Jarvis of Duke University in North Carolina told a Royal Society conference in March, the question relates to how special we are. If neurons evolved several times on our planet, then it becomes more likely that they could evolve elsewhere in the universe. Until recently, the consensus has leaned towards a very Darwinian story. In this scenario, sometime around 600 million years ago, the common ancestor to all animals gave rise to some organisms with simple neural networks. Central nervous systems arose later, allowing for greater coordination and more complex behaviours. These perhaps started out as tight balls of neurons, but eventually gave rise to the magnificently complex primate brain. This single origin scenario offers a tidy explanation for why some animals, like sponges and flat, simple placozoans, still don't have neurons: they must have branched off before these evolved and are relics of ancestral animals. The story was somewhat turned on its head by the recent whole genome sequence of comb jellies. These small marine animals look like jellyfish but in fact seem to be only distantly related. They use a neural network just beneath their skin and a brain-like knot of neurons at one end to catch food, respond to light, sense gravity and escape predators. © Copyright Reed Business Information Ltd.

Keyword: Evolution
Link ID: 20786 - Posted: 04.11.2015

Robin McKie, science editor A smile is the universal welcome, the writer Max Eastman once remarked. But how sure can we be that a person’s smile is genuine? The answer is the empathy test, created by psychologist Richard Wiseman, which probes our ability to appreciate the feelings of others – from their appearance. A photographer asks a subject to imagine meeting an individual they don’t like and to put on a fake smile. Later the subject sits with a real friend and as they converse, the photographer records their genuine smile. Thus two versions of their smile are recorded. The question is: how easy is it to tell the difference? “If you lack empathy, you are very bad at differentiating between the two photographs,” says Wiseman, who teaches at the University of Hertfordshire. But how do professions differ in their ability to spot a fake? And in particular, how do scientists and journalists score? Neither are particularly renowned for their empathy, after all. Last month’s Scientists Meet the Media party, for which the Observer is the media sponsor, gave Wiseman a perfect opportunity to compare the two professions. At the party, hosted by the Science Museum in London, some of Britain’s top researchers mingled with UK science journalists. About 150 guests were shown photographs of subjects with fake and genuine smiles. Guests were then asked to spot the false and the true. The results were intriguing. © 2015 Guardian News and Media Limited

Keyword: Emotions
Link ID: 20785 - Posted: 04.11.2015

By James Gallagher Health editor, BBC News website Being overweight cuts the risk of dementia, according to the largest and most precise investigation into the relationship. The researchers admit they were surprised by the findings, which run contrary to current health advice. The analysis of nearly two million British people, in the Lancet Diabetes & Endocrinology, showed underweight people had the highest risk. Dementia charities still advised not smoking, exercise and a balanced diet. Dementia is one of the most pressing modern health issues. The number of patients globally is expected to treble to 135 million by 2050. There is no cure or treatment, and the mainstay of advice has been to reduce risk by maintaining a healthy lifestyle. Yet it might be misguided. The team at Oxon Epidemiology and the London School of Hygiene and Tropical Medicine analysed medical records from 1,958,191 people aged 55, on average, for up to two decades. Their most conservative analysis showed underweight people had a 39% greater risk of dementia compared with being a healthy weight. But those who were overweight had an 18% reduction in dementia - and the figure was 24% for the obese. "Yes, it is a surprise," said lead researcher Dr Nawab Qizilbash. He told the BBC News website: "The controversial side is the observation that overweight and obese people have a lower risk of dementia than people with a normal, healthy body mass index. "That's contrary to most if not all studies that have been done, but if you collect them all together our study overwhelms them in terms of size and precision." © 2015 BBC

Keyword: Obesity; Alzheimers
Link ID: 20784 - Posted: 04.11.2015

Leana Wen Every doctor and nurse in our hospital's emergency room knew Jerome. He was one of our regulars. In his 20s, he had back problems that led him to become addicted to prescription painkillers. That habit proved too expensive, and he switched to heroin. Jerome used to come to the ER nearly every week. Often, he just wanted a sandwich and someone to talk to. He had lost his job and his home. Several months ago, he decided he had to quit heroin. We helped him with health insurance so that he could find a primary care doctor. Our social worker connected him with addiction treatment, including medications and mental health counseling. He was also working on rekindling a relationship with his estranged family. One day, paramedics brought Jerome to the ER. They had found him in an abandoned building. He'd relapsed and was shooting heroin. A friend saw him unconscious and called for help. By the time paramedics arrived, he wasn't breathing and his heart had stopped beating. In the ER, we tried to resuscitate him for nearly an hour. We weren't successful. In Baltimore, where I serve as health commissioner, more people die from drug and alcohol overdoses than from homicide. In 2013, there were 246 deaths related to drugs and alcohol, compared with 235 homicides. © 2015 NPR

Keyword: Drug Abuse
Link ID: 20783 - Posted: 04.11.2015

By Jennifer Couzin-Frankel Sudden death, a mysterious and devastating outcome of epilepsy, could result from a brain stem shutdown following a seizure, researchers report today in Science Translational Medicine. Although the idea is still preliminary, it’s engendering hope that neurologists are one step closer to intervening before death strikes. Sudden unexpected death in epilepsy (SUDEP) has long bedeviled doctors and left heartbroken families in its wake. “It’s as big a mystery as epilepsy itself,” says Jeffrey Noebels, a neurologist at Baylor College of Medicine in Houston, Texas, and the senior author of the new paper. As its name suggests, SUDEP attacks without warning: People with epilepsy are found dead, often following a seizure, sometimes face down in bed. Many are young—the median age is 20—and patients with uncontrolled generalized seizures, the most severe type, are at highest risk. About 3000 people are thought to die of SUDEP each year in the United States. And doctors have struggled to understand why. “How can you have seizures your whole life, and all of a sudden, it’s your last one?” Noebels asks. In 2013, an international team of researchers described its study of epilepsy patients who had died while on hospital monitoring units. In 10 SUDEP cases for which they had the patients’ heart function and breathing patterns, the authors found that the patients’ cardiorespiratory systems collapsed over several minutes, and their brain activity was severely depressed. “Their EEG went flat after a seizure,” says Stephan Schuele, an epileptologist at Northwestern University Feinberg School of Medicine in Chicago, Illinois, who wasn’t involved in the study. © 2015 American Association for the Advancement of Science

Keyword: Epilepsy
Link ID: 20782 - Posted: 04.10.2015

By Nicholas Bakalar A new study suggests that early to bed and early to rise makes a man healthy — although not necessarily wealthy or wise. Korean researchers recruited 1,620 men and women, ages 47 to 59, and administered a questionnaire to establish whether they were morning people or night owls. They found 480 morning types, 95 night owls, and 1,045 who fit into neither group. The scientists measured all for glucose tolerance, body composition and waist size, and gathered information on other health and behavioral characteristics. The study is online in The Journal of Clinical Endocrinology & Metabolism. After controlling for an array of variables, they found that compared with morning people, men who were night owls were significantly more likely to have diabetes, and women night owls were more than twice as likely to have metabolic syndrome — high blood sugar levels, excess body fat around the waist, and abnormal lipid readings. The reasons for the effect are unclear, but the scientists suggest that consuming more calories after 8 p.m. and exposure to artificial light at night can both affect metabolic regulation. Can a night owl become a morning person? “Yes,” said the lead author, Dr. Nan Hee Kim, an endocrinologist at the Korea University College of Medicine. “It can be modified by external cues such as light, activity and eating behavior. But it isn’t known if this would improve the metabolic outcomes.” © 2015 The New York Times Company

Keyword: Sleep
Link ID: 20781 - Posted: 04.10.2015

Jon Hamilton Researchers have discovered the exact structure of the receptor that makes our sensory nerves tingle when we eat sushi garnished with wasabi. And because the "wasabi receptor" is also involved in pain perception, knowing its shape should help pharmaceutical companies develop new drugs to fight pain. The receptor, which scientists call TRPA1, is "an important molecule in the pain pathway," says David Julius, a professor of physiology at the University of California, San Francisco and an author of a paper published in this week's Nature. "A dream of mine is that some of the work we do will translate into medicines people can take for chronic pain." Julius led a team that discovered the receptor about a decade ago. Since then, researchers have shown that TRPA1 receptors begin sending distress signals to the brain whenever they encounter pungent chemical irritants, including not only wasabi but tear gas and air pollution from cars or wood fires. The receptors also become activated in response to chemicals released by the body itself when tissue becomes inflamed from an injury or a disease like rheumatoid arthritis. © 2015 NPR

Keyword: Pain & Touch
Link ID: 20780 - Posted: 04.10.2015

By Emily Underwood A splashy headline appeared on the websites of many U.K. newspapers this morning, claiming that men whose brothers or fathers have been convicted of a sex offense are “five times more likely to commit sex crimes than the average male” and that this increased risk of committing rape or molesting a child “may run in a family’s male genes.” The study, published online today in the International Journal of Epidemiology, analyzed data from 21,566 male sex offenders convicted in Sweden between 1973 and 2009 and concluded that genetics may account for at least 40% of the likelihood of committing a sex crime. (Women, who commit less than 1% of Sweden’s sexual offenses, were omitted from the analysis.) The scientists have suggested that the new research could be used to help identify potential offenders and target high-risk families for early intervention efforts. But independent experts—and even the researchers who led the work, to a certain degree—warn that the study has some serious limitations. Here are a few reasons to take its conclusions, and the headlines, with a generous dash of salt. Alternate explanations: Most studies point to early life experiences, such as childhood abuse, as the most important risk factor for becoming a perpetrator of abuse in adulthood. The new study, however, did not include any detail about the convicted sex criminals’ early life exposure to abuse. Instead, by comparing fathers with sons, and full brothers and half-brothers reared together or apart, the scientists attempted to tease out the relative contributions of shared environment and shared genes to the risk of sexual offending. Based on their analyses, the researchers concluded that shared environment accounted for just 2% of the risk of sexual offense, while genetics accounted for roughly 40%. Although there is likely some genetic contribution to sexual offending—perhaps related to impulsivity or sex drive—the group “may be overestimating the role of genes” because their assumptions were inaccurate, says Fred Berlin, a psychiatrist and sexologist at Johns Hopkins University in Baltimore, Maryland. © 2015 American Association for the Advancement of Science.

Keyword: Aggression; Genes & Behavior
Link ID: 20779 - Posted: 04.10.2015

By Rachel Feltman If you give a mouse an eating disorder, you might just figure out how to treat the disease in humans. In a new study published Thursday by Cell Press, researchers created mice that lacked a gene associated with disordered eating in humans. Without it, the mice showed behaviors not unlike those seen in humans with eating disorders: They tended to be obsessive compulsive and have trouble socializing, and they were less interested in eating high-fat food than the control mice. The findings could lead to novel drug treatments for some of the 24 million Americans estimated to suffer from eating disorders. In a 2013 study, the same researchers went looking for genes that might contribute to the risk of an eating disorder. Anorexia nervosa and bulimia nervosa aren't straightforwardly inherited -- there's definitely more to an eating disorder than your genes -- but it does seem like some families might have higher risks than others. Sure enough, the study of two large families, each with several members who had eating disorders, yielded mutations in two interacting genes. In one family, the estrogen-related receptor α (ESRRA) gene was mutated. The other family had a mutation on another gene that seemed to affect how well ESRRA could do its job. So in the latest study, they created mice that didn't have ESRRA in the parts of the brain associated with eating disorders. "You can't go testing this kind of gene expression in a human," lead author and University of Iowa neuroscientist Michael Lutter said. "But in mice, you can manipulate the expression of the gene and then look at how it changes their behavior."

Keyword: Anorexia & Bulimia; Genes & Behavior
Link ID: 20778 - Posted: 04.10.2015

by Anil Ananthaswamy HOLD that thought. When it comes to consciousness, the brain may be doing just that. It now seems that conscious perception requires brain activity to hold steady for hundreds of milliseconds. This signature in the pattern of brainwaves can be used to distinguish between levels of impaired consciousness in people with brain injury. The new study by Aaron Schurger at the Swiss Federal Institute of Technology in Lausanne doesn't explain the so-called "hard problem of consciousness" – how roughly a kilogram of nerve cells is responsible for the miasma of sensations, thoughts and emotions that make up our mental experience. However, it does chip away at it, and support the idea that it may one day be explained in terms of how the brain processes information. Neuroscientists think that consciousness requires neurons to fire in such a way that they produce a stable pattern of brain activity. The exact pattern will depend on what the sensory information is, but once information has been processed, the idea is that the brain should hold a pattern steady for a short period of time – almost as if it needs a moment to read out the information. In 2009, Schurger tested this theory by scanning 12 people's brains with fMRI machines. The volunteers were shown two images simultaneously, one for each eye. One eye saw a red-on-green line drawing and the other eye saw green-on-red. This confusion caused the volunteers to sometimes consciously perceive the drawing and sometimes not. © Copyright Reed Business Information Ltd.

Keyword: Consciousness; Brain imaging
Link ID: 20777 - Posted: 04.10.2015

By Ariana Eunjung Cha Autism has always been a tricky disorder to diagnose. There’s no such thing as a blood test, cheek swab or other accepted biological marker, so specialists must depend on parent and teacher reports, observations and play assessments. Figuring out a child's trajectory once he or she is diagnosed is just as challenging. The spectrum is wide, and some are destined to be on the mild end and be very talkative, sometimes almost indistinguishable from those without the disorder in some settings, while others will suffer from a more severe form and have trouble speaking even basic words. Now scientists believe that they have a way to distinguish between those paths, at least in terms of language ability, in the toddler years using brain imaging. In an article published Thursday in the journal Neuron, scientists at the University of California-San Diego have found that children with autism spectrum disorder, or ASD, with good language outcomes have strikingly distinct patterns of brain activation as compared to those with poor language outcomes and typically developing toddlers. "Why some toddlers with ASD get better and develop good language and others do not has been a mystery that is of the utmost importance to solve," Eric Courchesne, one of the study’s authors and co-director of the University of California-San Diego's Autism Center, said in a statement. The images of the children in the study -- MRIs of the brain -- were taken at 12 to 29 months while their language was assessed one to two years later at 30 to 48 months.

Keyword: Autism; Language
Link ID: 20776 - Posted: 04.10.2015

Mo Costandi In 2009, researchers at the University of California, Santa Barbara performed a curious experiment. In many ways, it was routine — they placed a subject in the brain scanner, displayed some images, and monitored how the subject's brain responded. The measured brain activity showed up on the scans as red hot spots, like many other neuroimaging studies. Except that this time, the subject was an Atlantic salmon, and it was dead. Dead fish do not normally exhibit any kind of brain activity, of course. The study was a tongue-in-cheek reminder of the problems with brain scanning studies. Those colorful images of the human brain found in virtually all news media may have captivated the imagination of the public, but they have also been subject of controversy among scientists over the past decade or so. In fact, neuro-imagers are now debating how reliable brain scanning studies actually are, and are still mostly in the dark about exactly what it means when they see some part of the brain "light up." Functional magnetic resonance imaging (fMRI) measures brain activity indirectly by detecting changes in the flow of oxygen-rich blood, or the blood oxygen-level dependent (BOLD) signal, with its powerful magnets. The assumption is that areas receiving an extra supply of blood during a task have become more active. Typically, researchers would home in on one or a few "regions of interest," using 'voxels,' tiny cube-shaped chunks of brain tissue containing several million neurons, as their units of measurement.

Keyword: Brain imaging
Link ID: 20775 - Posted: 04.10.2015

By Megan Griffith-Greene The idea of playing a game to make you sharper seems like a no-brainer. That's the thinking behind a billion-dollar industry selling brain training games and programs designed to boost cognitive ability. But an investigation by CBC's Marketplace reveals that brain training games such as Lumosity may not make your brain perform better in everyday life. Almost 15 per cent of Canadians over the age of 65 are affected by some kind of dementia. And many people of all ages are worried about maintaining their brain health and possibly preventing a decline in their mental abilities. "I don't think there's anything to say that you can train your brain to be cognitively better in the way that we know that we can train our bodies to be physically better," neuroscientist Adrian Owen told Marketplace co-host Tom Harrington. To test how effective the games are at improving cognitive function, Marketplace partnered with Owen, who holds the Canada Excellence Research Chair in Cognitive Neuroscience and Imaging at the Brain and Mind Institute at Western University. A group of 54 adults, including Harrington, did the brain training at least three times per week for 15 minutes or more over a period of between two and a half and four weeks. The group underwent a complete cognitive assessment at the beginning and end of the training to see if there had been any change as the result of the training program. © 2015 CBC/Radio-Canada.

Keyword: Learning & Memory; Alzheimers
Link ID: 20774 - Posted: 04.10.2015

Jordan Gaines Lewis Hodor hodor hodor. Hodor hodor? Hodor. Hodor-hodor. Hodor! Oh, um, excuse me. Did you catch what I said? Fans of the hit HBO show Game of Thrones, the fifth season of which premieres this Sunday, know what I’m referencing, anyway. Hodor is the brawny, simple-minded stableboy of the Stark family in Winterfell. His defining characteristic, of course, is that he only speaks a single word: “Hodor.” But those who read the A Song of Ice and Fire book series by George R R Martin may know something that the TV fans don’t: his name isn’t actually Hodor. According to his great-grandmother Old Nan, his real name is Walder. “No one knew where ‘Hodor’ had come from,” she says, “but when he started saying it, they started calling him by it. It was the only word he had.” Whether he intended it or not, Martin created a character who is a textbook example of someone with a neurological condition called expressive aphasia. In 1861, French physician Paul Broca was introduced to a man named Louis-Victor Leborgne. While his comprehension and mental functioning remained relatively normal, Leborgne progressively lost the ability to produce meaningful speech over a period of 20 years. Like Hodor, the man was nicknamed Tan because he only spoke a single word: “Tan.”

Keyword: Language
Link ID: 20773 - Posted: 04.10.2015

By Gareth Cook The wait has been long, but the discipline of neuroscience has finally delivered a full-length treatment of the zombie phenomenon. In their book, Do Zombies Dream of Undead Sheep?, scientists Timothy Verstynen and Bradley Voytek cover just about everything you might want to know about the brains of the undead. It's all good fun, and if you learn some serious neuroscience along the way, well, that's fine with them, too. Voytek answered questions from contributing editor Gareth Cook. How is it that you and your co-author came to write a book about zombies? Clearly, it is an urgent public health threat, but I would not have expected a book from neuroscientists on the topic. Indeed! You think you're prepared for the zombie apocalypse and then—BAM!—it happens, and only then do you realize how poorly prepared you really were. Truly the global concern of our time. Anyway, this whole silly thing started when Tim and I would get together to watch zombie movies with our wives and friends. Turns out when you get some neuroscientists together to watch zombie movies, after a few beers they start to diagnose them and mentally dissect their brains. Back in the summer of 2010 zombie enthusiast and author—and head of the Zombie Research Society—Matt Mogk got in touch with me to see if we were interested in doing something at the intersection of zombies and neuroscience. © 2015 Scientific American

Keyword: Miscellaneous
Link ID: 20772 - Posted: 04.10.2015

Cari Romm “As humans, we can identify galaxies light-years away. We can study particles smaller than an atom,” President Barack Obama said in April 2013, “But we still haven’t unlocked the mystery of the three pounds of matter that sits between our ears.” The observation was part of the president’s announcement of the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative, an effort to fast-track the development of new technology that will help scientists understand the workings of the human brain and its diseases. With progress, though, comes a whole new set of ethical questions. Can drugs used to treat conditions like ADHD, for example, also be used to make healthy people into sharper, more focused versions of themselves—and should they? Can a person with Alzheimer’s truly consent to testing that may help scientists better understand their disease? Can brain scans submitted as courtroom evidence reveal anything about a defendant’s intent? To address these questions, the Presidential Commission for the Study of Bioethical Issues, an independent advisory group, recently released the second volume of a report examining the issues that may arise as neuroscience advances. The commission outlined three areas it deemed particularly fraught: cognitive enhancement, consent, and the use of neuroscience in the legal system. © 2015 by The Atlantic Monthly Group

Keyword: Drug Abuse; Learning & Memory
Link ID: 20771 - Posted: 04.08.2015

By Simon Makin People can control prosthetic limbs, computer programs and even remote-controlled helicopters with their mind, all by using brain-computer interfaces. What if we could harness this technology to control things happening inside our own body? A team of bioengineers in Switzerland has taken the first step toward this cyborglike setup by combining a brain-computer interface with a synthetic biological implant, allowing a genetic switch to be operated by brain activity. It is the world's first brain-gene interface. The group started with a typical brain-computer interface, an electrode cap that can register subjects' brain activity and transmit signals to another electronic device. In this case, the device is an electromagnetic field generator; different types of brain activity cause the field to vary in strength. The next step, however, is totally new—the experimenters used the electromagnetic field to trigger protein production within human cells in an implant in mice. The implant uses a cutting-edge technology known as optogenetics. The researchers inserted bacterial genes into human kidney cells, causing them to produce light-sensitive proteins. Then they bioengineered the cells so that stimulating them with light triggers a string of molecular reactions that ultimately produces a protein called secreted alkaline phosphatase (SEAP), which is easily detectable. They then placed the human cells plus an LED light into small plastic pouches and inserted them under the skin of several mice. © 2015 Scientific American

Keyword: Robotics; Genes & Behavior
Link ID: 20770 - Posted: 04.08.2015

By Jonathan Webb Science reporter, BBC News A study of blind crustaceans living in deep, dark caves has revealed that evolution is rapidly withering the visual parts of their brain. The findings catch evolution in the act of making this adjustment - as none of the critters have eyes, but some of them still have stumpy eye-stalks. Three different species were studied, each representing a different subgroup within the same class of crustaceans. The research is published in the journal BMC Neuroscience. The class of "malacostracans" also includes much better-known animals like lobsters, shrimps and wood lice, but this study focussed on three tiny and obscure examples that were only discovered in the 20th Century. It is the first investigation of these mysterious animals' brains. "We studied three species. All of them live in caves, and all of them are very rare or hardly accessible," said lead author Dr Martin Stegner, from the University of Rostock in Germany. Specifically, his colleagues retrieved the specimens from the coast of Bermuda, from Table Mountain in South Africa, and from Monte Argentario in Italy. The animals were preserved rather than living, so the team could not observe their tiny brains in action. But by looking at the physical shape of the brain, and making comparisons with what we know about how the brain works in their evolutionary relatives, the researchers were able to assign jobs to the various lobes, lumps and spindly structures they could see under the microscope. © 2015 BBC.

Keyword: Evolution; Vision
Link ID: 20769 - Posted: 04.08.2015