Most Recent Links



Featured Article

'Language Gene' Has a Partner

Few genes have made the headlines as much as FOXP2. The first gene associated with language disorders, it was later implicated in the evolution of human speech. Girls make more of the FOXP2 protein, which may help explain their precociousness in learning to talk. Now, neuroscientists have figured out how one of its molecular partners helps Foxp2 exert its effects.

The findings may eventually lead to new therapies for inherited speech disorders, says Richard Huganir, the neurobiologist at Johns Hopkins University School of Medicine in Baltimore, Maryland, who led the work. Foxp2 controls the activity of a gene called Srpx2, he notes, which helps some of the brain's nerve cells beef up their connections to other nerve cells. By establishing what SRPX2 does, researchers can look for defective copies of it in people who have trouble talking or learning to talk.

Until 2001, scientists were not sure how genes influenced language. Then Simon Fisher, a neurogeneticist now at the Max Planck Institute for Psycholinguistics in Nijmegen, the Netherlands, and his colleagues fingered FOXP2 as the culprit in a family with several members who had trouble with pronunciation, putting words together, and understanding speech. These people cannot move their tongue and lips precisely enough to talk clearly, so even family members often can’t figure out what they are saying. It “opened a molecular window on the neural basis of speech and language,” Fisher says.

Photo credit: Yoichi Araki, Ph.D.


Links 1 - 20 of 21685

By Kelly Servick There’s an unfortunate irony for people who rely on morphine, oxycodone, and other opioid painkillers: The drug that’s supposed to offer you relief can actually make you more sensitive to pain over time. That effect, known as hyperalgesia, could render these medications gradually less effective for chronic pain, leading people to rely on higher and higher doses. A new study in rats—the first to look at the interaction between opioids and nerve injury for months after the pain-killing treatment was stopped—paints an especially grim picture. An opioid sets off a chain of immune signals in the spinal cord that amplifies pain rather than dulling it, even after the drug leaves the body, the researchers found. Yet drugs already under development might be able to reverse the effect. It’s no secret that powerful painkillers have a dark side. Overdose deaths from prescription opioids have roughly quadrupled over 2 decades, in near lockstep with increased prescribing. And many researchers see hyperalgesia as a part of that equation—a force that compels people to take more and more medication, while prolonging exposure to sometimes addictive drugs known to dangerously slow breathing at high doses. Separate from their pain-blocking interaction with receptors in the brain, opioids seem to reshape the nervous system to amplify pain signals, even after the original illness or injury subsides. Animals given opioids become more sensitive to pain, and people already taking opioids before a surgery tend to report more pain afterward. © 2016 American Association for the Advancement of Science

Keyword: Pain & Touch; Drug Abuse
Link ID: 22268 - Posted: 05.31.2016

By Gary Stix Scientists will never find a single gene for depression—nor two, nor 20. But among the 20,000 human genes and the hundreds of thousands of proteins and molecules that switch on those genes or regulate their activity in some way, there are clues that point to the roots of depression. Tools to identify biological pathways that are instrumental in either inducing depression or protecting against it have recently debuted—and hold the promise of providing leads for new drug therapies for psychiatric and neurological diseases. A recent paper in the journal Neuron illustrates both the dazzling complexity of this approach and the ability of these techniques to pinpoint key genes that may play a role in governing depression. Scientific American talked with the senior author on the paper—neuroscientist Eric Nestler from the Icahn School of Medicine at Mount Sinai in New York. Nestler spoke about the potential of this research to break the logjam in pharmaceutical research that has impeded development of drugs to treat brain disorders. Scientific American: The first years in the war on cancer met with a tremendous amount of frustration. Things look like they're improving somewhat now for cancer. Do you anticipate a similar trajectory may occur in neuroscience for psychiatric disorders? Eric Nestler: I do. I just think it will take longer. I was in medical school 35 years ago when the idea of identifying a person's specific pathophysiology was put forward as a means of directing treatment of cancer. We're now, three decades later, finally seeing the day when that’s happening. I definitely think the same will occur for major brain disorders. The brain is just more complicated and the disorders are more complicated so it will take longer. © 2016 Scientific American

Keyword: Depression; Genes & Behavior
Link ID: 22267 - Posted: 05.31.2016

What do large tables, large breakfasts, and large servers have in common? They all affect how much you eat. This week on Hidden Brain, we look at the hidden forces that drive our diets. First we hear from Adam Brumberg at Cornell University's Food and Brand Lab about how to make healthier choices more easily (hint: good habits and pack your lunch!). Then, Senior (Svelte) Stopwatch Correspondent Daniel Pink returns for another round of Stopwatch Science to tell you about those tables, breakfasts, and servers. If you don't like spoilers, stop reading and go listen to the episode! Here are the studies: You may have heard that smaller portions can help you eat fewer calories. That's true. But what about larger tables? Researchers Brennan Davis, Collin Payne, and My Bui hypothesized that one of the ways smaller food units lead us to eat less is by playing with our perception. They tested this with pizza and found that while study participants tended to eat more small slices, they consumed fewer calories overall because it seemed like they were eating more. The researchers tried to distort people's perception even further by making the smaller slices seem bigger by putting them on a bigger table. What they found is that even hungry college students ate fewer calories of (free) pizza when it was chopped into tiny slices and put on a big table. What about who's around that big table? That seems to matter, too. Researchers found both men and women order more food when they eat with women but choose smaller portions when they eat in the company of men. They say breakfast is the most important meal of the day. Well, it may also be the most slimming. When researchers assigned two groups of overweight women to eat a limited number of calories each day, they found those who ate more at breakfast and less at dinner shed about twice as many pounds as the other group. © 2016 npr

Keyword: Obesity
Link ID: 22266 - Posted: 05.31.2016

By Anil Ananthaswamy and Alice Klein Our brain’s defence against invading microbes could cause Alzheimer’s disease – which suggests that vaccination could prevent the condition. Alzheimer’s disease has long been linked to the accumulation of sticky plaques of beta-amyloid proteins in the brain, but the function of plaques has remained unclear. “Does it play a role in the brain, or is it just garbage that accumulates?” asks Rudolph Tanzi of Harvard Medical School. Now he has shown that these plaques could be defences for trapping invading pathogens. Working with Robert Moir at the Massachusetts General Hospital in Boston, Tanzi’s team has shown that beta-amyloid can act as an anti-microbial compound, and may form part of our immune system. To test whether beta-amyloid defends us against microbes that manage to get into the brain, the team injected bacteria into the brains of mice that had been bred to develop plaques like humans do. Plaques formed straight away. “When you look in the plaques, each one had a single bacterium in it,” says Tanzi. “A single bacterium can induce an entire plaque overnight.” This suggests that infections could be triggering the formation of plaques. These sticky plaques may trap and kill bacteria, viruses or other pathogens, but if they aren’t cleared away fast enough, they may lead to inflammation and tangles of another protein, called tau, causing neurons to die and the progression towards Alzheimer’s. © Copyright Reed Business Information Ltd.

Keyword: Alzheimers; Neuroimmunology
Link ID: 22265 - Posted: 05.31.2016

Robert Plomin: Scientists have investigated this question for more than a century, and the answer is clear: the differences between people on intelligence tests are substantially the result of genetic differences. But let's unpack that sentence. We are talking about average differences among people and not about individuals. Any one person's intelligence might be blown off course from its genetic potential by, for example, an illness in childhood. By genetic, we mean differences passed from one generation to the next via DNA. But we all share 99.5 percent of our three billion DNA base pairs, so only 15 million DNA differences separate us genetically. And we should note that intelligence tests include diverse examinations of cognitive ability and skills learned in school. Intelligence, more appropriately called general cognitive ability, reflects someone's performance across a broad range of varying tests. Genes make a substantial difference, but they are not the whole story. They account for about half of all differences in intelligence among people, so half is not caused by genetic differences, which provides strong support for the importance of environmental factors. This estimate of 50 percent reflects the results of twin, adoption and DNA studies. From them, we know, for example, that later in life, children adopted away from their biological parents at birth are just as similar to their biological parents as are children reared by their biological parents. Similarly, we know that adoptive parents and their adopted children do not typically resemble one another in intelligence. © 2016 Scientific American

Keyword: Intelligence; Genes & Behavior
Link ID: 22264 - Posted: 05.31.2016

By Viviane Callier Bees don’t just recognize flowers by their color and scent; they can also pick up on their minute electric fields. Such fields—which form from the imbalance of charge between the ground and the atmosphere—are unique to each species, based on the plant’s distance from the ground and shape. Flowers use them as an additional way to advertise themselves to pollinators, but until now researchers had no idea how bees sensed these fields. In a new study, published online today in the Proceedings of the National Academy of Sciences, researchers used a laser vibrometer—a tiny machine that hits the bee hair with a laser—to measure how the hair on a bee’s body responds to a flower’s tiny electric field. As the hair moves because of the electric field, it changes the frequency of the laser light that hits it, allowing the vibrometer to keep track of the velocity of motion of the hair. When the bees buzzed within 10 centimeters of the flower, the electric field—like static electricity from a balloon—caused the bee’s hair to bend. This bending activates neurons at the base of bee hair sockets, which allows the insects to “sense” the field, the team found. Electric fields can only be sensed from a distance of 10 cm or so, so they’re not very useful for large animals like ourselves. But for small insects, this distance represents several body lengths, a relatively long distance. Because sensing such fields is useful to small animals, the team suspects this ability could be important to other insect species as well. © 2016 American Association for the Advancement of Science.

Keyword: Pain & Touch
Link ID: 22263 - Posted: 05.31.2016

By Jane E. Brody Joanne Reitano is a professor of history at LaGuardia Community College in Long Island City, Queens. She writes wonderful books about the history of the city and state, and has recently been spending many hours — sometimes all day — at her computer to revise her first book, “The Restless City.” But while sitting in front of the screen, she told me, “I developed burning in my eyes that made it very difficult to work.” After resting her eyes for a while, the discomfort abates, but it quickly returns when she goes back to the computer. “If I was playing computer games, I’d turn off the computer, but I need it to work,” the frustrated professor said. Dr. Reitano has a condition called computer vision syndrome. She is hardly alone. It can affect anyone who spends three or more hours a day in front of computer monitors, and the population at risk is potentially huge. Worldwide, up to 70 million workers are at risk for computer vision syndrome, and those numbers are only likely to grow. In a report about the condition written by eye care specialists in Nigeria and Botswana and published in Medical Practice and Reviews, the authors detail an expanding list of professionals at risk — accountants, architects, bankers, engineers, flight controllers, graphic artists, journalists, academicians, secretaries and students — all of whom “cannot work without the help of computer.” And that’s not counting the millions of children and adolescents who spend many hours a day playing computer games. Studies have indicated 70 percent to 90 percent of people who use computers extensively, whether for work or play, have one or more symptoms of computer vision syndrome. The effects of prolonged computer use are not just vision-related. Complaints include neurological symptoms like chronic headaches and musculoskeletal problems like neck and back pain. © 2016 The New York Times Company

Keyword: Vision
Link ID: 22262 - Posted: 05.30.2016

By C. CLAIBORNE RAY Q. Does the size of an animal’s brain really correlate with intelligence on a species-by-species basis? A. “It’s not necessarily brain size but rather the ratio of brain size to body size that really tells the story,” said Rob DeSalle, a curator at the Sackler Institute for Comparative Genomics at the American Museum of Natural History. Looking at this ratio over a large number of vertebrate animals, he said, scientists have found that “brain size increases pretty linearly with body size, except for some critical species like Homo sapiens and some cetaceans,” the order of mammals that includes whales, dolphins and porpoises. “So if there is a deviation from this general ratio, one can predict how smart a vertebrate might be,” Dr. DeSalle continued. Therefore, living vertebrates that deviate toward inordinately big brains relative to their bodies are, for the most part, smarter, he said. As for dinosaurs, he said, scientists really can’t tell how smart they may have been. “But the Sarmientosaurus, with its lime-sized brain, was a big animal, so the extrapolation is that it would have been pretty dense,” he said. “On the other hand, Troodon, a human-sized dinosaur, had a huge brain relative to its body size and is widely considered the smartest dinosaur ever found.” © 2016 The New York Times Company

Keyword: Evolution
Link ID: 22261 - Posted: 05.30.2016

By David Shultz We still may not know what causes consciousness in humans, but scientists are at least learning how to detect its presence. A new application of a common clinical test, the positron emission tomography (PET) scan, seems to be able to differentiate between minimally conscious brains and those in a vegetative state. The work could help doctors figure out which brain trauma patients are the most likely to recover—and even shed light on the nature of consciousness. “This is really cool what these guys did here,” says neuroscientist Nicholas Schiff at Cornell University, who was not involved in the study. “We’re going to make great use of it.” PET scans work by introducing a small amount of radionuclides into the body. These radioactive compounds act as a tracer and naturally emit subatomic particles called positrons over time, and the gamma rays indirectly produced by this process can be detected by imaging equipment. The most common PET scan uses fluorodeoxyglucose (FDG) as the tracer in order to show how glucose concentrations change in tissue over time—a proxy for metabolic activity. Compared with other imaging techniques, PET scans are relatively cheap and easy to perform, and are routinely used to survey for cancer, heart problems, and other diseases. In the new study, researchers used FDG-PET scans to analyze the resting cerebral metabolic rate—the amount of energy being used by the tissue—of 131 patients with a so-called disorder of consciousness and 28 healthy controls. Disorders of consciousness can refer to a wide range of problems, ranging from a full-blown coma to a minimally conscious state in which patients may experience brief periods where they can communicate and follow instructions. Between these two extremes, patients may be said to be in a vegetative state or exhibit unresponsive wakefulness, characterized by open eyes and basic reflexes, but no signs of awareness. Most disorders of consciousness result from head trauma, and where someone falls on the consciousness continuum is typically determined by the severity of the injury. © 2016 American Association for the Advancement of Science

Keyword: Consciousness; Brain imaging
Link ID: 22260 - Posted: 05.28.2016

By Roland Pease BBC Radio Science Unit Researchers have invented a DNA "tape recorder" that can trace the family history of every cell in an organism. The technique is being hailed as a breakthrough in understanding how the trillions of complex cells in a body are descended from a single egg. "It has the potential to provide profound insights into how normal, diseased or damaged tissues are constructed and maintained," one UK biologist told the BBC. The work appears in Science journal. The human body has around 40 trillion cells, each with a highly specialised function. Yet each can trace its history back to the same starting point - a fertilised egg. Developmental biology is the business of unravelling how the genetic code unfolds at each cycle of cell division, how the body plan develops, and how tissues become specialised. But much of what it has revealed has depended on inference rather than a complete cell-by-cell history. "I actually started working on this problem as a graduate student in 2000," confessed Jay Shendure, lead researcher on the new scientific paper. "Could we find a way to record these relationships between cells in some compact form we could later read out in adult organisms?" The project failed then because there was no mechanism to record events in a cell's history. That changed with recent developments in so-called CRISPR gene editing, a technique that allows researchers to make much more precise alterations to the DNA in living organisms. The molecular tape recorder developed by Prof Shendure's team at the University of Washington in Seattle, US, is a length of DNA inserted into the genome that contains a series of edit points which can be changed throughout an organism's life. © 2016 BBC.

Keyword: Development of the Brain; Neurogenesis
Link ID: 22259 - Posted: 05.28.2016

By BENEDICT CAREY Suzanne Corkin, whose painstaking work with a famous amnesiac known as H.M. helped clarify the biology of memory and its disorders, died on Tuesday in Danvers, Mass. She was 79. Her daughter, Jocelyn Corkin, said the cause was liver cancer. Dr. Corkin met the man who would become a lifelong subject and collaborator in 1964, when she was a graduate student in Montreal at the McGill University laboratory of the neuroscientist Brenda Milner. Henry Molaison — known in published reports as H.M., to protect his privacy — was a modest, middle-aged former motor repairman who had lost the ability to form new memories after having two slivers of his brain removed to treat severe seizures when he was 27. In a series of experiments, Dr. Milner had shown that a part of the brain called the hippocampus was critical to the consolidation of long-term memories. Most scientists had previously thought that memory was not dependent on any one cortical area. Mr. Molaison lived in Hartford, and Dr. Milner had to take the train down to Boston and drive from there to Connecticut to see him. It was a long trip, and transporting him to Montreal proved to be so complicated, largely because of his condition, that Dr. Milner did it just once. Yet rigorous study of H.M., she knew, would require proximity and a devoted facility — with hospital beds — to accommodate extended experiments. The psychology department at the Massachusetts Institute of Technology offered both, and with her mentor’s help, Dr. Corkin landed a position there. Thus began a decades-long collaboration between Dr. Corkin and Mr. Molaison that would extend the work of Dr. Milner, focus intense interest on the hippocampus, and make H.M. the most famous patient in the history of modern brain science. © 2016 The New York Times Company

Keyword: Learning & Memory
Link ID: 22258 - Posted: 05.28.2016

By Jordana Cepelewicz General consensus among Alzheimer’s researchers has it that the disease’s main culprit, a protein called amyloid beta, is an unfortunate waste product that is not known to play any useful role in the body—and one that can have devastating consequences. When not properly cleared from the brain it builds up into plaques that destroy synapses, the junctions between nerve cells, resulting in cognitive decline and memory loss. The protein has thus become a major drug target in the search for a cure to Alzheimer’s. Now a team of researchers at Harvard Medical School and Massachusetts General Hospital are proposing a very different story. In a study published this week in Science Translational Medicine, neurologists Rudolph Tanzi and Robert Moir report evidence that amyloid beta serves a crucial purpose: protecting the brain from invading microbes. “The original idea goes back to 2010 or so when Rob had a few too many Coronas,” Tanzi jokes. Moir had come across surprising similarities between amyloid beta and LL37, a protein that acts as a foot soldier in the brain’s innate immune system, killing potentially harmful bugs and alerting other cells to their presence. “These types of proteins, although small, are very sophisticated in what they do,” Moir says. “And they’re very ancient, going back to the dawn of multicellular life.” © 2016 Scientific American,

Keyword: Alzheimers; Neuroimmunology
Link ID: 22257 - Posted: 05.28.2016

Martha Bebinger Labels for the first long-acting opioid addiction treatment device are rolling off printing machines Friday. Trainings begin Saturday for doctors who want to learn to insert four matchstick-size rods under the skin. They contain the drug buprenorphine, which staves off opioid cravings. The implant, called Probuphine, was approved by the Food and Drug Administration on Thursday, and is expected to be available to patients by the end of June. "This is just the starting point for us to continue to fight for the cause of patients with opioid addiction," said Behshad Sheldon, CEO of Braeburn Pharmaceuticals, which manufactures Probuphine. But debate continues about how effective the implant will be and whether insurers will cover it. Nora Volkow, head of the National Institute on Drug Abuse, calls Probuphine a game changer, saying it will help addiction patients stay on their meds while their brain circuits recover from the ravages of drug use. And addiction experts say it will be much harder for patients prescribed the implant to sell their medication on the street, which can be a problem with addiction patients prescribed pills. "I think it's fantastic news," said Dr. Sarah Wakeman, medical director of the Substance Use Disorder Initiative at Massachusetts General Hospital. "We need as many tools in the toolbox as possible to deal with the opioid epidemic." © 2016 npr

Keyword: Drug Abuse
Link ID: 22256 - Posted: 05.28.2016

By GINA KOLATA Could it be that Alzheimer’s disease stems from the toxic remnants of the brain’s attempt to fight off infection? Provocative new research by a team of investigators at Harvard leads to this startling hypothesis, which could explain the origins of plaque, the mysterious hard little balls that pockmark the brains of people with Alzheimer’s. It is still early days, but Alzheimer’s experts not associated with the work are captivated by the idea that infections, including ones that are too mild to elicit symptoms, may produce a fierce reaction that leaves debris in the brain, causing Alzheimer’s. The idea is surprising, but it makes sense, and the Harvard group’s data, published Wednesday in the journal Science Translational Medicine, supports it. If it holds up, the hypothesis has major implications for preventing and treating this degenerative brain disease. The Harvard researchers report a scenario seemingly out of science fiction. A virus, fungus or bacterium gets into the brain, passing through a membrane — the blood-brain barrier — that becomes leaky as people age. The brain’s defense system rushes in to stop the invader by making a sticky cage out of proteins, called beta amyloid. The microbe, like a fly in a spider web, becomes trapped in the cage and dies. What is left behind is the cage — a plaque that is the hallmark of Alzheimer’s. So far, the group has confirmed this hypothesis in neurons growing in petri dishes as well as in yeast, roundworms, fruit flies and mice. There is much more work to be done to determine if a similar sequence happens in humans, but plans — and funding — are in place to start those studies, involving a multicenter project that will examine human brains. “It’s interesting and provocative,” said Dr. Michael W. Weiner, a radiology professor at the University of California, San Francisco, and a principal investigator of the Alzheimer’s Disease Neuroimaging Initiative, a large national effort to track the progression of the disease and look for biomarkers like blood proteins and brain imaging to signal the disease’s presence. © 2016 The New York Times Company

Keyword: Alzheimers; Neuroimmunology
Link ID: 22255 - Posted: 05.26.2016

Ronald Crystal The goal of antiaddiction vaccines is to prevent addictive molecules from reaching the brain, where they produce their effects and can create chemical dependencies. Vaccines can accomplish this task, in theory, by generating antibodies—proteins produced by the immune system—that bind to addictive particles and essentially stop them in their tracks. But challenges remain. Among them, addictive molecules are often too small to be spotted by the human immune system. Thus, they can circulate in the body undetected. Researchers have developed two basic strategies for overcoming this problem. One invokes so-called active immunity by tethering an addictive molecule to a larger molecule, such as the proteins that encase a common cold virus. This viral shell does not make people sick but does prompt the immune system to produce high levels of antibodies against it and whatever is attached to it. In our laboratory, we have tested this method in animal models and successfully blocked chemical forms of cocaine or nicotine from reaching the brain. Another approach researchers are testing generates what is known as passive immunity against addictive molecules in the body. They have cultured monoclonal antibodies that can bind selectively to addictive molecules. The hurdle with this particular method is that monoclonal antibodies are expensive to produce and need to be administered frequently to be effective. © 2016 Scientific American

Keyword: Drug Abuse; Neuroimmunology
Link ID: 22254 - Posted: 05.26.2016

By RUSSELL GOLDMAN There’s an elephant at a zoo outside Seoul that speaks Korean. — You mean, it understands some Korean commands, the way a dog can be trained to understand “sit” or “stay”? No, I mean it can actually say Korean words out loud. — Pics or it didn’t happen. Here, watch the video. To be fair, the elephant, a 26-year-old Asian male named Koshik, doesn’t really speak Korean, any more than a parrot can speak Korean (or English or Klingon). But parrots are supposed to, well, parrot — and elephants are not. And Koshik knows how to say at least five Korean words, which are about five more than I do. The really amazing part is how he does it. Koshik places his trunk inside his mouth and uses it to modulate the tone and pitch of the sounds his voice makes, a bit like a person putting his fingers in his mouth to whistle. In this way, Koshik is able to emulate human speech “in such detail that Korean native speakers can readily understand and transcribe the imitations,” according to the journal Current Biology. What’s in his vocabulary? Things he hears all the time from his keepers: the Korean words for hello, sit down, lie down, good and no. Lest you think this is just another circus trick that any Jumbo, Dumbo or Babar could pull off, the team of international scientists who wrote the journal article say Koshik’s skills represent “a wholly novel method of vocal production and formant control in this or any other species.” Like many innovations, Koshik’s may have been born of sad necessity. Researchers say he started to imitate his keepers’ sounds only after he was separated from other elephants at the age of 5 — and that his desire to speak like a human arose from sheer loneliness. © 2016 The New York Times Company

Keyword: Language; Animal Communication
Link ID: 22253 - Posted: 05.26.2016

By Teal Burrell In neuroscience, neurons get all the glory. Or rather, they used to. Researchers are beginning to discover the importance of something outside the neurons—a structure called the perineuronal net. This net might reveal how memories are stored and how various diseases ravage the brain. The realization of important roles for structures outside neurons serves as a reminder that the brain is a lot more complicated than we thought. Or, it’s exactly as complicated as neuroscientists thought it was 130 years ago. In 1882, Italian physician and scientist Camillo Golgi described a structure that enveloped cells in the brain in a thin layer. He later named it the pericellular net. His word choice was deliberate; he carefully avoided the word “neuron” since he was engaged in a battle with another neuroscience luminary, Santiago Ramón y Cajal, over whether the nervous system was a continuous meshwork of cells that were fused together—Golgi’s take—or a collection of discrete cells, called neurons—Ramón y Cajal’s view. Ramón y Cajal wasn’t having it. He argued Golgi was wrong about the existence of such a net, blaming the findings on Golgi’s eponymous staining technique, which, incidentally, is still used today. Ramón y Cajal’s influence was enough to shut down the debate. While some Golgi supporters labored in vain to prove the nets existed, their findings never took hold. Instead, over the next century, neuroscientists focused exclusively on neurons, the discrete cells of the nervous system that relay information between one another, giving rise to movements, perceptions, and emotions. (The two adversaries would begrudgingly share a Nobel Prize in 1906 for their work describing the nervous system.) © 1996-2016 WGBH Educational Foundation

Keyword: Glia
Link ID: 22252 - Posted: 05.26.2016

By Amina Zafar Tragically Hip frontman Gord Downie's resilience and openness about his terminal glioblastoma and his plans to tour could help to reduce stigma and improve awareness, some cancer experts say. Tuesday's news revealed that the singer has an aggressive form of cancer that originated in his brain. An MRI scan last week showed the tumour has responded well to surgery, radiation and chemotherapy, doctors said. "I was quickly impressed by Gord's resilience and courage," Downie's neuro-oncologist, Dr. James Perry of Sunnybrook Health Sciences Centre, told a news conference. Perry said it's daunting for many of his patients to reveal the diagnosis to their family, children and co-workers. "The news today, while sad, also creates for us in brain tumour research an unprecedented opportunity to create awareness and to create an opportunity for fundraising for research that's desperately needed to improve the odds for all people with this disease," Perry said. "Gord's courage in coming forward with his diagnosis will be a beacon for all patients with glioblastoma in Canada. They will see a survivor continuing with his craft despite its many challenges." ©2016 CBC/Radio-Canada.

Keyword: Glia
Link ID: 22251 - Posted: 05.26.2016

Bradley George All sorts of health information is now a few taps away on your smartphone, from how many steps you take — to how well you sleep at night. But what if you could use your phone and a computer to test your vision? A company is doing just that — and eye care professionals are upset. Some states have even banned it. A Chicago-based company called Opternative offers the test. The site asks some questions about your eyes and overall health; it also wants to know your shoe size to make sure you're the right distance from your computer monitor. You keep your smartphone in your hand and use the Web browser to answer questions about what you see on the computer screen. Like a traditional eye test, there are shapes, lines and letters. It takes about 30 minutes. "We're trying to identify how bad your vision is, so we're kind of testing your vision to failure, is the way I would describe it," says Aaron Dallek, CEO of Opternative. Dallek co-founded the company with an optometrist, who was searching for ways to offer eye exams online. "Me being a lifetime glasses and contact wearer, I was like 'Where do we start?' So, that was about 3 1/2 years ago, and we've been working on it ever since," Dallek says. © 2016 npr

Keyword: Vision
Link ID: 22250 - Posted: 05.26.2016

Susan Milius Forget it, peacocks. Nice try, elk. Sure, sexy feathers and antlers are showy, but the sperm of a fruit fly could be the most over-the-top, exaggerated male ornamentation of all. In certain fruit fly species, such as Drosophila bifurca, males measuring just a few millimeters produce sperm with a tail as long as 5.8 centimeters, researchers report May 25 in Nature. Adjusted for body size, the disproportionately supersized sperm outdoes such exuberant body parts as pheasant display feathers, deer antlers, scarab beetle horns and the forward-grasping forceps of earwigs. Fruit flies’ giant sperm have been challenging to explain, says study coauthor Scott Pitnick of Syracuse University in New York. Now he and his colleagues propose that a complex interplay of male and female benefits has accelerated sperm length in a runaway-train scenario. Males with longer sperm deliver fewer sperm, bucking a more-is-better trend. Yet, they still manage to transfer a few dozen to a few hundred per mating. And as newly arrived sperm compete to displace those already waiting in a female’s storage organ, longer is better. Fewer sperm per mating means females tend to mate more often, intensifying the sperm-vs.-sperm competition. Females that have the longest storage organs, which favor the longest sperm, benefit too: Males producing megasperm, the researchers found, tend to be the ones with good genes likely to produce robust offspring. “Sex,” says Pitnick, “is a powerful force.” © Society for Science & the Public 2000 - 2016

Keyword: Sexual Behavior; Evolution
Link ID: 22249 - Posted: 05.26.2016