Chapter 16.
By GRETCHEN REYNOLDS

For some people with early-stage Alzheimer’s disease, frequent, brisk walks may help to bolster physical abilities and slow memory loss, according to one of the first studies of physical activity as an experimental treatment for dementia. But the study’s results, while encouraging, showed that improvements were modest and not universal, raising questions about just how and why exercise helps some people with dementia and not others.

Alzheimer’s disease affects more than five million people in the United States and more than 35 million worldwide, a number that is expected to double within 20 years. There are currently no reliable treatments for the disease. But past studies of healthy elderly people have found relationships between regular exercise and improved memories. Physically active older people are, for instance, significantly less likely than those who are sedentary to develop mild cognitive impairment, a frequent precursor to Alzheimer’s disease. Physically fit older people also tend to have more volume in their brain’s hippocampus than do sedentary people of the same age, brain scans show. The hippocampus is the portion of the brain most intimately linked with memory function.

But most of this research has examined whether exercise might prevent Alzheimer’s disease. Little has been known about whether it might change the trajectory of the disease in people who already have the condition. So for the new study, published in February in PLoS One, researchers at the University of Kansas decided to work directly with people who had previously been given a diagnosis of Alzheimer’s disease. Because the disease can affect coordination as it progresses, the researchers focused on men and women in its early stages, who were still living at home and could safely walk by themselves or perform other types of light exercise. © 2017 The New York Times Company
Link ID: 23294 - Posted: 03.01.2017
By Victoria Sayo Turner

When you want to learn something new, you practice. Once you get the hang of it, you can hopefully do what you learned—whether it’s parallel parking or standing backflips—on the next day, and the next. If not, you fall back to stage one and practice some more. But your brain may have a shortcut that helps you lock in learning. Instead of practicing until you’re decent at something and then taking a siesta, practicing just a little longer could be the fast track to solidifying a skill.

“Overlearning” is the process of rehearsing a skill even after you no longer improve. Even though you seem to have already learned the skill, you continue to practice at that same level of difficulty. A recent study suggests that this extra practice could be a handy way to lock in your hard-earned skills.

In the experiment, participants were asked to look at a screen and say when they saw a stripe pattern. Then two images were flashed one after the other. The images were noisy, like static on an old TV, and only one contained a hard-to-see stripe pattern. It took about twenty minutes of practice before people could usually recognize the image with stripes in it. The participants then continued to practice for another twenty minutes for the overlearning portion.

Next, the participants took a break before spending another twenty minutes learning a similar “competitor” task where the stripes were oriented at a new angle. Under normal circumstances, this second task would compete with the first and actually overwrite that skill, meaning people should now be able to detect the second pattern but no longer see the first. The researchers wanted to see if overlearning could prevent the first skill from disappearing. © 2017 Scientific American
Keyword: Learning & Memory
Link ID: 23293 - Posted: 03.01.2017
Ed Yong

It’s a good time to be interested in the brain. Neuroscientists can now turn neurons on or off with just a flash of light, allowing them to manipulate the behavior of animals with exceptional precision. They can turn brains transparent and seed them with glowing molecules to divine their structure. They can record the activity of huge numbers of neurons at once. And those are just the tools that currently exist. In 2013, Barack Obama launched the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative—a $115 million plan to develop even better technologies for understanding the enigmatic gray blobs that sit inside our skulls.

John Krakauer, a neuroscientist at Johns Hopkins Hospital, has been asked to BRAIN Initiative meetings before, and describes it as “Maleficent being invited to Sleeping Beauty’s birthday.” That’s because he and four like-minded friends have become increasingly disenchanted by their colleagues’ obsession with their toys. And in a new paper that’s part philosophical treatise and part shot across the bow, they argue that this technological fetish is leading the field astray. “People think technology + big data + machine learning = science,” says Krakauer. “And it’s not.”

He and his fellow curmudgeons argue that brains are special because of the behavior they create—everything from a predator’s pounce to a baby’s cry. But the study of such behavior is being de-prioritized, or studied “almost as an afterthought.” Instead, neuroscientists have been focusing on using their new tools to study individual neurons, or networks of neurons. According to Krakauer, the unspoken assumption is that if we collect enough data about the parts, the workings of the whole will become clear. If we fully understand the molecules that dance across a synapse, or the electrical pulses that zoom along a neuron, or the web of connections formed by many neurons, we will eventually solve the mysteries of learning, memory, emotion, and more.
“The fallacy is that more of the same kind of work in the infinitely postponed future will transform into knowing why that mother’s crying or why I’m feeling this way,” says Krakauer. And, as he and his colleagues argue, it will not. © 2017 by The Atlantic Monthly Group
Keyword: Brain imaging
Link ID: 23292 - Posted: 02.28.2017
By Robert F. Service

Scientists are chasing a new lead on a class of drugs that may one day fight both pain and opioid addiction. It’s still early days, but researchers report that they’ve discovered a new small molecule that binds selectively to a long-targeted enzyme, halting its role in pain and addiction while not interfering with enzymes critical to healthy cell function. The newly discovered compound isn’t likely to become a medicine any time soon. But it could jumpstart the search for other binders that could do the job.

Pain and addiction have many biochemical roots, which makes it difficult to treat them without affecting other critical functions in cells. Today, the most potent painkillers are opioids, including heroin, oxycodone, and hydrocodone. In addition to interrupting pain, they inhibit enzymes known as adenylyl cyclases (ACs) that convert cells’ energy currency, ATP, into a molecule involved in intracellular chemical communication known as cyclic AMP (cAMP). Chronic opioid use can make cells increase the activity of ACs to compensate, causing cAMP levels to skyrocket. When opioid users try to stop using, their cAMP levels remain high, and drugs that reduce those levels—like buprenorphine—have unwanted side effects.

A promising target for selectively reducing cAMP is one particular AC enzyme, known as AC1. Humans have 10 ACs, all of which convert ATP to cAMP. But they are expressed at different levels in different tissues, suggesting they serve disparate purposes. Over the last 15 years, experiments on mice without the gene for AC1 have shown they have reduced sensitivity to pain and fewer signs of opioid dependence. But the enzyme, along with its close relative AC8, also appears to be heavily involved in memory formation in a brain region known as the hippocampus. © 2017 American Association for the Advancement of Science.
Keyword: Pain & Touch
Link ID: 23291 - Posted: 02.28.2017
By Daniel Barron

On January 2, 1979, Dr. Rafael Osheroff was admitted to Chestnut Lodge, an inpatient psychiatric hospital in Maryland. Osheroff had a bustling nephrology practice. He was married with three children, two from a previous marriage. Everything had been going well except his mood.

For the previous two years, Osheroff had suffered from bouts of anxiety and depression. Dr. Nathan Kline, a prominent psychopharmacologist in New York City, had begun Osheroff on a tricyclic antidepressant and, according to Kline’s notes—which were later revealed in court—he improved. But then Osheroff decided, against Kline’s advice, to change his dose. He got worse. So much worse that he was brought to Chestnut Lodge.

For the next seven months, Osheroff was treated with intensive psychotherapy for narcissistic personality disorder and depression. It didn’t help. He lost 40 pounds, suffered from excruciating insomnia, and began pacing the floor so incessantly that his feet became swollen and blistered. Osheroff’s family, distressed by the progressive unraveling of his mind, hired a psychiatrist in Washington D.C. to intervene. In response, Chestnut Lodge held a clinical case conference yet decided to not change treatment. Importantly, they decided to not begin medications but to continue psychotherapy. They considered themselves “traditional psychiatrists”—practitioners of psychodynamic psychotherapy, the technique used by Sigmund Freud and other pioneers. © 2017 Scientific American
By Steve Mirsky

To conserve water, members of my household abide by the old aphorism “If it's yellow, let it mellow.” You're in a state of ignorance about that wizened phrase? If so, it recommends that one not flush the toilet after each relatively innocent act of micturition. But there's one exception to the rule: after asparagus, it's one and done—because those delicious stalks make urine smell like hell. To me and mine, anyway.

The digestion of asparagus produces methanethiol and S-methyl thioesters, chemical compounds containing stinky sulfur, also known as brimstone. Hey, when I said that postasparagus urine smells like hell, I meant it literally. Methanethiol is the major culprit in halitosis and flatus, which covers both ends of that discussion. And although thioesters can also grab your nostrils by the throat, they might have played a key role in the origin of life. So be glad they were there stinking up the abiotic Earth.

But does a compound reek if nobody is there to sniff it? Less philosophically, does it reek if you personally can't smell it? For only some of us are genetically gifted enough to fully appreciate the distinctive scents of postasparagus urine. The rest wander around unaware of their own olfactory offenses. Recently researchers dove deep into our DNA to determine, although we've all dealt it, exactly who smelt it.

Their findings can be found in a paper entitled “Sniffing Out Significant ‘Pee Values’: Genome Wide Association Study of Asparagus Anosmia.” Asparagus anosmia refers to the inability “to smell the metabolites of asparagus in urine,” the authors helpfully explain. They don't bother to note that their bathroom humor plays on the ubiquity in research papers of the p-value, a statistical evaluation of the data that assesses whether said data look robust or are more likely the stuff that should never be allowed to mellow. © 2017 Scientific American
By Jessica Hamzelou

Fancy a coffee after that cigarette? Smoking makes you drink more caffeinated drinks, possibly by changing your metabolism so that you break down caffeine quicker, pushing you to drink more to get the same hit. That’s according to Marcus Munafò at the University of Bristol, UK, and his colleagues, who have looked into the smoking and drinking habits of about 250,000 people.

It’s impossible to do a randomised controlled trial (the most rigorous kind of scientific trial) when it comes to smoking, because it would be unethical to ask a randomly selected group of people to smoke. The next best thing is to study huge biobanks of health data. These biobanks contain information about people’s genes, diets and lifestyles.

To explore the relationship between smoking and caffeine, Munafò and his colleagues analysed data from biobanks in the UK, Norway and Denmark. They were particularly interested in people who had inherited a variant of a gene that has already been shown to increase cigarette smoking. The team found that people who had this gene variant also consumed more coffee – but only if they smoked. British people with the same variant also drank more tea, although their Danish and Norwegian counterparts didn’t. This is probably due to cultural differences, says Munafò. “People in Norway and Denmark don’t chain drink tea in the same way that people in the UK do,” he says. © Copyright Reed Business Information Ltd.
Keyword: Drug Abuse
Link ID: 23287 - Posted: 02.27.2017
By JANE E. BRODY

In letters to The Times, blind readers reacted with heartfelt reassurance and practical guidance to Edward Hoagland’s essay, “Feeling My Way Into Blindness,” published in November.

Stanley F. Wainapel, clinical director of physical medicine and rehabilitation at Montefiore Medical Center in the Bronx, admitted that “adapting to vision loss is a major challenge.” But he disputed Mr. Hoagland’s allusion to “enforced passivity,” pointing out that many advances in technology — from screen-reading software for computers to portable devices that read menus or printed letters “with a delay of only seconds” — can keep productivity, creativity and pleasure very much alive for people who can no longer see.

Rabbi Michael Levy, president of Yad HaChazakah, the Jewish Disability Empowerment Center, also acknowledged that “transition to a world without sight is far from easy.” But he insisted, “Blindness does not cut me off from the world.” He cited skillful use of a cane, travel devices that tell him where he is and what is around him, and periodicals available in real time by telephone among myriad other gadgets that “see” for him.

Annika Ariel, a blind student double-majoring in English and political science at Amherst College, wrote that her problems stem not from her blindness but rather from people’s attitudes that depict the blind as helpless and dependent. She said she travels independently, uses assistive technologies to complete her work as efficiently as others who can see, and excels academically and socially. Equally inspiring was the response of Mark Riccobono, president of the National Federation of the Blind, who became legally blind at age 5 and lost all useful vision to glaucoma at 14. © 2017 The New York Times Company
Link ID: 23284 - Posted: 02.27.2017
Geoff Brumfiel

When the half-brother of North Korean leader Kim Jong Un collapsed at a Malaysian airport last week, poisoning was instantly suspected. But on Friday, Malaysian authorities revealed that an autopsy had turned up not just any poison, but a rare nerve agent known as VX.

VX is among the deadliest chemical weapons ever devised. A colorless, odorless liquid, similar in consistency to motor oil, it kills in tiny quantities that can be absorbed through the skin. A relative of the nerve agent Sarin, VX disrupts communications between nerves and muscles. Victims of VX initially experience nausea and dizziness. Without an antidote, the chemical eventually paralyzes the diaphragm, causing suffocation.

That may have been the fate of Kim Jong Nam, the estranged half-brother of North Korea's leader. Security footage showed that Kim was approached by two women who appeared to cover his face with a cloth. Moments later, he fell ill and sought help. He died before reaching a hospital.

If the Malaysian analysis is correct and VX was the culprit, that would seem to suggest that the North Korean state itself is behind the killing. "Hardly anybody has it," says Dan Kaszeta, a chemical weapons expert and consultant based in London. The U.S. has destroyed nearly all of its stocks of VX in recent years. North Korea is among the few states in the world that have an active chemical weapons program. It is not a signatory to the Chemical Weapons Convention, which bans the use of such weapons. © 2017 npr
Link ID: 23282 - Posted: 02.25.2017
By James Gallagher, health and science reporter

Maps have revealed "hotspots" of schizophrenia and other psychotic illnesses in England, based on the amount of medication prescribed by GPs. The analysis by the University of East London showed North Kesteven, in Lincolnshire, had the highest rates. The lowest rate of schizophrenia prescriptions was in East Dorset. However, explaining the pattern across England is complicated and the research team says the maps pose a lot of questions.

They were developed using anonymous prescription records that are collected from doctors' surgeries in England. They record only prescriptions given out by GPs - not the number of patients treated - so hospital treatment is missed in the analysis. Data between October 2015 and September 2016 showed the average number of schizophrenia prescriptions across England was 19 for every 1,000 people.

Prof Allan Brimicombe, one of the researchers from UEL, said: "The pattern is not uniformly spread across the country." He suggests this could be due to "environmental effects" such as different rates of drink or drug abuse. Prof Brimicombe told the BBC: "The top one is in the Lincolnshire countryside and there are others in the countryside." © 2017 BBC
Link ID: 23281 - Posted: 02.25.2017
By Carolyn Gramling

Trilobites—three-sectioned, crablike critters that dominated the early Paleozoic—are so abundant that they have become the gateway fossil for most collectors. But paleontologists have found little evidence of how the extinct arthropods reproduced—until now.

Researchers studying a fossil specimen of the trilobite Triarthrus eatoni spotted something odd just next to the animal’s head: a collection of small (about 200 micrometers across), round objects. Those, they determined, are actually eggs—the first time anyone had observed fossil trilobite eggs right next to the critters themselves. The structures were exceptionally well preserved, the eggs and exoskeletons of the trilobites replaced with an iron sulfide ore called pyrite. They came from the Lorraine Group, a rock formation that spans much of the northeastern United States and dates to the Ordovician period (about 485 million to 444 million years ago); it has long been a mecca for trilobite hunters because of the pyritization.

The placement of the eggs is suggestive, the researchers report in the March issue of Geology: They hypothesize that trilobites released their eggs and sperm through a genital pore somewhere in the head—much like modern horseshoe crabs do today. One possible reason for the rarity of the find may be that the brooding behavior of T. eatoni was relatively unusual in the trilobite world: The species tended to prefer a harsh, low-oxygen environment, and may have kept a closer eye on their eggs than other trilobite species. But, the authors note, one idea this finding does lay to rest is that trilobites might reproduce via copulation—a titillating but little-regarded hypothesis based on the fact that trilobites are sometimes found clustered on top of one another.
Instead, trilobites were most likely spawners—and, in fact, that clustering behavior may be another parallel to horseshoe crabs, which can climb on top of one another in competition to fertilize released eggs. © 2017 American Association for the Advancement of Science
Rae Ellen Bichell

Initially, Clint Perry wanted to make a vending machine for bumblebees. He wanted to understand how they solve problems. Perry, a cognitive biologist at Queen Mary University of London, is interested in testing the limits of animal intelligence. "I want to know: How does the brain do stuff? How does it make decisions? How does it keep memory?" says Perry. And how big does a brain need to be in order to do all of those things?

He decided to test this on bumblebees by presenting the insects with a puzzle that they'd likely never encounter in the wild. He didn't end up building that vending machine, but he did put bees through a similar scenario. Perry and his colleagues wrote Thursday in the journal Science that, despite bees' miniature brains, they can solve new problems quickly just by observing a demonstration. This suggests that bees, which are important crop pollinators, could in time adapt to new food sources if their environment changed. As we have reported on The Salt before, bee populations around the world have declined in recent years. Scientists think a changing environment is at least partly responsible.

Perry and colleagues built a platform with a porous ball sitting at the center of it. If a bee went up to the ball, it would find that it could access a reward, sugar water. One by one, bumblebees walked onto the platform, explored a bit, and then slurped up the sugar water in the middle. "Essentially, the first experiment was: Can bees learn to roll a ball?" says Perry. © 2017 npr
By KATHRYN SHATTUCK

After his short film screened at the Sundance Film Festival in 2008, a euphoric Simon Fitzmaurice was walking the snowy streets of Park City, Utah, when his foot began to hurt. Back home in Ireland that summer, by then dealing with a pronounced limp, he received a shattering diagnosis: motor neuron disease, or M.N.D. (more commonly known in the United States as A.L.S., or Lou Gehrig’s disease), a neurological disorder that causes increasing muscle weakness and eventual paralysis and is, in most cases, fatal. The doctor gave Mr. Fitzmaurice, then 33, three or four years to live.

That might have been the end of any normal existence. But Mr. Fitzmaurice, by his own measure a “bit of a stubborn bastard,” was determined to leave his wife, Ruth, and their two young sons — with a third on the way — a legacy other than self-pity. The result is Mr. Fitzmaurice’s first feature film, and perhaps his salvation — “My Name Is Emily.”

The movie, which opened in limited release in the United States on Feb. 17, stars Evanna Lynch, the airy Luna Lovegood of “Harry Potter” fame, as a teenage outlier in both her Dublin foster home and high school who goes on the lam with her only friend (George Webster) to free her father (Michael Smiley) from a mental hospital. The film — with gorgeous scenes of Ms. Lynch plunged, nymphlike, into a cerulean sea or riding shotgun through the emerald countryside in a canary-yellow vintage Renault — won for best cinematography when it debuted at the Galway Film Fleadh in 2015.

“I am not trying to prove anything,” Mr. Fitzmaurice wrote in an email, before quickly reconsidering. “Actually, I am trying to prove something. I remember thinking, ‘I must do this to show my children to never give up.’” Mr. Fitzmaurice was writing with his hands when he began the script for “My Name Is Emily.” By the time he was finished, he was writing with his eyes. © 2017 The New York Times Company
Keyword: ALS-Lou Gehrig's Disease
Link ID: 23275 - Posted: 02.24.2017
By Diana Kwon

Neuroscientists have long debated how itch and pain overlap in the nervous system. Although itch was once thought to arise from the same neurons that generate pain, later observations disputing this theory led many to believe these sensations had distinct neural circuits. In a study published today (February 22) in Neuron, researchers report that a subset of “itch-specific” nerve cells in the murine spinal cord are also involved in sensing pain, bringing the specificity theory into question.

“We were surprised that contrary to what the field believes, neurons [in the spinal cord] coded for both pain and itch sensations,” coauthor Shuhao Sun, a neuroscience graduate student at Johns Hopkins University, told The Scientist. “[This] means there can be some crosstalk between these two sensations in the central nervous system.”

Historically, the observation that pain could quell itch led many neuroscientists to subscribe to the intensity theory, which suggested that, in the same neurons, weak stimulation generated itch, while strong activation led to pain. However, this theory was largely abandoned around the 1980s when several groups discovered that weak painful stimuli did not lead to itch and that strong itch stimuli did not lead to pain. Instead, many researchers adopted the labeled-line theory, which proposed that there were separate neurons dedicated to each sensation. © 1986-2017 The Scientist
Keyword: Pain & Touch
Link ID: 23273 - Posted: 02.24.2017
Tina Hesman Saey

Humans and Neandertals are still in an evolutionary contest, a new study suggests. Geneticist Joshua Akey of the University of Washington in Seattle and colleagues examined gene activity of more than 700 genes in which at least one person carried a human and a Neandertal version of the gene. Human versions of some genes are more active than Neandertal versions, especially in the brain and testes, the researchers report February 23 in Cell. In other tissues, some Neandertal versions of genes were more active than their human counterparts.

In the brain, human versions were favored over Neandertal variants in the cerebellum and basal ganglia. That finding may help explain why Neandertals had proportionally smaller cerebellums than humans do. Neandertal versions of genes in the testes, including some needed for sperm function, were also less active than human varieties. That finding is consistent with earlier studies that suggested male human-Neandertal hybrids may have been infertile, Akey says.

But Neandertal genes don’t always lose. In particular, the Neandertal version of an immunity gene called TLR1 is more active than the human version, the researchers discovered. Lopsided gene activity may help explain why carrying Neandertal versions of some genes has been linked to human diseases, such as lupus and depression (SN: 3/5/16, p. 18). Usually, both copies contribute equally to a gene’s total activity. Less robust activity of a version inherited from Neandertals might cause total activity to dip to unhealthy levels, for instance. © Society for Science & the Public 2000 - 2017
By RONI CARYN RABIN

Older adults who started sleeping more than nine hours a night — but had not previously slept so much — had more than double the risk of developing dementia a decade later, compared with those who slept nine hours or less, researchers report. The increased risk was not seen in people who had always slept more than nine hours.

“We’re not suggesting you go wake up Grandpa. We think this might be a marker for the risk of dementia, not a cause” of the illness, said Dr. Sudha Seshadri, a professor of neurology at Boston University School of Medicine and the senior author of the study, in Neurology.

Using data from 2,457 people, average age 72, who were part of a study in Framingham, Mass., the researchers found that those with a new habit of excessive slumber were at a greater risk of all forms of dementia, including Alzheimer’s, which is characterized by a buildup of beta amyloid, a toxic protein fragment that forms plaques in the brain. “My suspicion is that this is a compensatory mechanism: that at a time when amyloid is building up in the brain, people may be sleeping longer as the body is reacting and trying to remove it from the brain,” Dr. Seshadri added. © 2017 The New York Times Company
Link ID: 23270 - Posted: 02.24.2017
By Jessica Hamzelou

Three people with paralysis have learned to type by thought alone using a brain implant – at the fastest speeds recorded using such a system. Two have motor neurone disease, also known as ALS – a degenerative disorder that destroys neurons associated with movement – while the other has a spinal cord injury. All three have weakness or paralysis in all of their limbs. There is a chance that those with ALS will eventually lose the ability to speak, too, says Jaimie Henderson, a neurosurgeon at Stanford University Medical Center in California.

People who have lost the ability to talk may be offered devices that allow them to select letters on a screen using head, cheek or eye movements. This is how Stephen Hawking communicates, for example. But brain-machine interfaces are also being developed in the hope that they may one day be a more intuitive way of communicating. These involve reading brain activity, either externally or via an implant embedded in the brain, and turning it into a signal that can be used to direct something in the environment.

At the moment, these devices are a little slow. Henderson and his colleagues wanted to make a device that was quicker and easier to use than those currently in trials. © Copyright Reed Business Information Ltd.
By Sam Wong

Can a mouse be mindful? Researchers believe they have created the world’s first mouse model of meditation by using light to trigger brain activity similar to what meditation induces. The mice involved appeared less anxious, too.

Human experiments show that meditation reduces anxiety, lowers levels of stress hormones and improves attention and cognition. In one study of the effects of two to four weeks of meditation training, Michael Posner of the University of Oregon and colleagues discovered changes in the white matter in volunteers’ brains, related to the efficiency of communication between different brain regions. The changes, picked up in scans, were particularly noticeable between the anterior cingulate cortex (ACC) and other areas. Since the ACC regulates activity in the amygdala, a region that controls fearful responses, Posner’s team concluded that the changes in white matter could be responsible for meditation’s effects on anxiety.

The mystery was how meditation could alter the white matter in this way. Posner’s team figured that it was related to changes in theta brainwaves, measured using electrodes on the scalp. Meditation increases theta wave activity, even when people are no longer meditating. To test the theory, the team used optogenetics – they genetically engineered certain cells to be switched on by light. In this way, they were able to use pulses of light on mice to stimulate theta brainwave-like activity in the ACC. © Copyright Reed Business Information Ltd.
Link ID: 23260 - Posted: 02.21.2017
Many people think of fish and seafood as being healthy. However, new research suggests eating certain species that tend to have high levels of mercury may be linked to a greater risk of developing amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig's disease. Questions remain about the possible impact of mercury in fish, according to a preliminary study released Monday that will be presented at the American Academy of Neurology's 69th annual meeting in Boston in April.

Fish and seafood consumption as a regular part of the diet was not associated with ALS, the study said. "For most people, eating fish is part of a healthy diet," said study author Elijah Stommel of Dartmouth College in Hanover, N.H., and a fellow of the academy. In addition, the authors said their study does not negate the fact that eating fish provides many health benefits. Instead, it suggests people may want to choose species that are known to have a lower mercury content, and avoid consuming fish caught in waters where there is mercury contamination.

The researchers stressed that more research is needed before fish consumption guidelines for neurodegenerative illness can be made. While the exact cause of ALS is not known, some previous studies have suggested the neurotoxic metal to be a risk factor for ALS, a progressive neurological disease. © 2017 CBC/Radio-Canada.
By Robert F. Service

Predicting color is easy: Shine a light with a wavelength of 510 nanometers, and most people will say it looks green. Yet figuring out exactly how a particular molecule will smell is much tougher. Now, 22 teams of computer scientists have unveiled a set of algorithms able to predict the odor of different molecules based on their chemical structure. It remains to be seen how broadly useful such programs will be, but one hope is that such algorithms may help fragrance makers and food producers design new odorants with precisely tailored scents.

This latest smell prediction effort began with a recent study by olfactory researcher Leslie Vosshall and colleagues at The Rockefeller University in New York City, in which 49 volunteers rated the smell of 476 vials of pure odorants. For each one, the volunteers labeled the smell with one of 19 descriptors, including “fish,” “garlic,” “sweet,” or “burnt.” They also rated each odor’s pleasantness and intensity, creating a massive database of more than 1 million data points for all the odorant molecules in their study.

When computational biologist Pablo Meyer learned of the Rockefeller study 2 years ago, he saw an opportunity to test whether computer scientists could use it to predict how people would assess smells. Besides working at IBM’s Thomas J. Watson Research Center in Yorktown Heights, New York, Meyer heads something called the DREAM challenges, contests that ask teams of computer scientists to solve outstanding biomedical problems, such as predicting the outcome of prostate cancer treatment based on clinical variables or detecting breast cancer from mammogram data. “I knew from graduate school that olfaction was still one of the big unknowns,” Meyer says. Even though researchers have discovered some 400 separate odor receptors in humans, he adds, just how they work together to distinguish different smells remains largely a mystery. © 2017 American Association for the Advancement of Science