Chapter 16

Rae Ellen Bichell Initially, Clint Perry wanted to make a vending machine for bumblebees. He wanted to understand how they solve problems. Perry, a cognitive biologist at Queen Mary University of London, is interested in testing the limits of animal intelligence. "I want to know: How does the brain do stuff? How does it make decisions? How does it keep memory?" says Perry. And how big does a brain need to be in order to do all of those things? He decided to test this on bumblebees by presenting the insects with a puzzle that they'd likely never encounter in the wild. He didn't end up building that vending machine, but he did put bees through a similar scenario. Perry and his colleagues wrote Thursday in the journal Science that, despite bees' miniature brains, they can solve new problems quickly just by observing a demonstration. This suggests that bees, which are important crop pollinators, could in time adapt to new food sources if their environment changed. As we have reported on The Salt before, bee populations around the world have declined in recent years. Scientists think a changing environment is at least partly responsible. Perry and colleagues built a platform with a porous ball sitting at the center of it. If a bee went up to the ball, it would find that it could access a reward, sugar water. One by one, bumblebees walked onto the platform, explored a bit, and then slurped up the sugar water in the middle. "Essentially, the first experiment was: Can bees learn to roll a ball?" says Perry. © 2017 npr

Keyword: Learning & Memory; Evolution
Link ID: 23278 - Posted: 02.24.2017

By KATHRYN SHATTUCK After his short film screened at the Sundance Film Festival in 2008, a euphoric Simon Fitzmaurice was walking the snowy streets of Park City, Utah, when his foot began to hurt. Back home in Ireland that summer, by then dealing with a pronounced limp, he received a shattering diagnosis: motor neuron disease, or M.N.D. (more commonly known in the United States as A.L.S., or Lou Gehrig’s Disease), a neurological disorder that causes increasing muscle weakness and eventual paralysis and is, in most cases, fatal. The doctor gave Mr. Fitzmaurice, then 33, three or four years to live. That might have been the end of any normal existence. But Mr. Fitzmaurice, by his own measure a “bit of a stubborn bastard,” was determined to leave his wife, Ruth, and their two young sons — with a third on the way — a legacy other than self-pity. The result is Mr. Fitzmaurice’s first feature film, and perhaps his salvation — “My Name Is Emily.” The movie, which opened in limited release in the United States on Feb. 17, stars Evanna Lynch, the airy Luna Lovegood of “Harry Potter” fame, as a teenage outlier in both her Dublin foster home and high school who goes on the lam with her only friend (George Webster) to free her father (Michael Smiley) from a mental hospital. The film — with gorgeous scenes of Ms. Lynch plunged, nymphlike, into a cerulean sea or riding shotgun through the emerald countryside in a canary-yellow vintage Renault — won for best cinematography when it debuted at the Galway Film Fleadh in 2015. “I am not trying to prove anything,” Mr. Fitzmaurice wrote in an email, before quickly reconsidering. “Actually, I am trying to prove something. I remember thinking, ‘I must do this to show my children to never give up.’” Mr. Fitzmaurice was writing with his hands when he began the script for “My Name Is Emily.” By the time he was finished, he was writing with his eyes. © 2017 The New York Times Company

Keyword: ALS-Lou Gehrig's Disease
Link ID: 23275 - Posted: 02.24.2017

By Diana Kwon Neuroscientists have long debated how itch and pain overlap in the nervous system. Although itch was once thought to arise from the same neurons that generate pain, later observations disputing this theory led many to believe these sensations had distinct neural circuits. In a study published today (February 22) in Neuron, researchers report that a subset of “itch-specific” nerve cells in the murine spinal cord is also involved in sensing pain, bringing the specificity theory into question. “We were surprised that contrary to what the field believes, neurons [in the spinal cord] coded for both pain and itch sensations,” coauthor Shuhao Sun, a neuroscience graduate student at Johns Hopkins University, told The Scientist. “[This] means there can be some crosstalk between these two sensations in the central nervous system.” Historically, the observation that pain could quell itch led many neuroscientists to subscribe to the intensity theory, which suggested that, in the same neurons, weak stimulation generated itch, while strong activation led to pain. However, this theory was largely abandoned around the 1980s when several groups discovered that weak painful stimuli did not lead to itch and that strong itch stimuli did not lead to pain. Instead, many researchers adopted the labeled-line theory, which proposed that there were separate neurons dedicated to each sensation. © 1986-2017 The Scientist

Keyword: Pain & Touch
Link ID: 23273 - Posted: 02.24.2017

Tina Hesman Saey Humans and Neandertals are still in an evolutionary contest, a new study suggests. Geneticist Joshua Akey of the University of Washington in Seattle and colleagues examined gene activity of more than 700 genes in which at least one person carried a human and a Neandertal version of the gene. Human versions of some genes are more active than Neandertal versions, especially in the brain and testes, the researchers report February 23 in Cell. In other tissues, some Neandertal versions of genes were more active than their human counterparts. In the brain, human versions were favored over Neandertal variants in the cerebellum and basal ganglia. That finding may help explain why Neandertals had proportionally smaller cerebellums than humans do. Neandertal versions of genes in the testes, including some needed for sperm function, were also less active than human varieties. That finding is consistent with earlier studies that suggested male human-Neandertal hybrids may have been infertile, Akey says. But Neandertal genes don’t always lose. In particular, the Neandertal version of an immunity gene called TLR1 is more active than the human version, the researchers discovered. Lopsided gene activity may help explain why carrying Neandertal versions of some genes has been linked to human diseases, such as lupus and depression (SN: 3/5/16, p. 18). Usually, both copies contribute equally to a gene’s total activity. Less robust activity of a version inherited from Neandertals might cause total activity to dip to unhealthy levels, for instance. |© Society for Science & the Public 2000 - 2017

Keyword: Evolution; Genes & Behavior
Link ID: 23272 - Posted: 02.24.2017

By RONI CARYN RABIN Older adults who started sleeping more than nine hours a night — but had not previously slept so much — had more than double the risk of developing dementia a decade later, compared with those who slept nine hours or less, researchers report. The increased risk was not seen in people who had always slept more than nine hours. “We’re not suggesting you go wake up Grandpa. We think this might be a marker for the risk of dementia, not a cause” of the illness, said Dr. Sudha Seshadri, a professor of neurology at Boston University School of Medicine and the senior author of the study, in Neurology. Using data from 2,457 people, average age 72, who were part of a study in Framingham, Mass., the researchers found that those with a new habit of excessive slumber were at a greater risk of all forms of dementia, including Alzheimer’s, which is characterized by a buildup of beta amyloid, a toxic protein fragment that forms plaques in the brain. “My suspicion is that this is a compensatory mechanism: that at a time when amyloid is building up in the brain, people may be sleeping longer as the body is reacting and trying to remove it from the brain,” Dr. Seshadri added. © 2017 The New York Times Company

Keyword: Alzheimers
Link ID: 23270 - Posted: 02.24.2017

By Jessica Hamzelou Three people with paralysis have learned to type by thought alone using a brain implant – at the fastest speeds recorded using such a system. Two have motor neurone disease, also known as ALS – a degenerative disorder that destroys neurons associated with movement – while the other has a spinal cord injury. All three have weakness or paralysis in all of their limbs. There is a chance that those with ALS will eventually lose the ability to speak, too, says Jaimie Henderson, a neurosurgeon at Stanford University Medical Center in California. People who have lost the ability to talk may be offered devices that allow them to select letters on a screen using head, cheek or eye movements. This is how Stephen Hawking communicates, for example. But brain-machine interfaces are also being developed in the hope that they may one day be a more intuitive way of communicating. These involve reading brain activity, either externally or via an implant embedded in the brain, and turning it into a signal that can be used to direct something in the environment. At the moment, these devices are a little slow. Henderson and his colleagues wanted to make a device that was quicker and easier to use than those currently in trials. © Copyright Reed Business Information Ltd.

Keyword: ALS-Lou Gehrig's Disease ; Robotics
Link ID: 23264 - Posted: 02.22.2017

By Sam Wong Can a mouse be mindful? Researchers believe they have created the world’s first mouse model of meditation by using light to trigger brain activity similar to what meditation induces. The mice involved appeared less anxious, too. Human experiments show that meditation reduces anxiety, lowers levels of stress hormones and improves attention and cognition. In one study of the effects of two to four weeks of meditation training, Michael Posner of the University of Oregon and colleagues discovered changes in the white matter in volunteers’ brains, related to the efficiency of communication between different brain regions. The changes, picked up in scans, were particularly noticeable between the anterior cingulate cortex (ACC) and other areas. Since the ACC regulates activity in the amygdala, a region that controls fearful responses, Posner’s team concluded that the changes in white matter could be responsible for meditation’s effects on anxiety. The mystery was how meditation could alter the white matter in this way. Posner’s team figured that it was related to changes in theta brainwaves, measured using electrodes on the scalp. Meditation increases theta wave activity, even when people are no longer meditating. To test the theory, the team used optogenetics – they genetically engineered certain cells to be switched on by light. In this way, they were able to use pulses of light on mice to stimulate theta brainwave-like activity in the ACC. © Copyright Reed Business Information Ltd.

Keyword: Attention
Link ID: 23260 - Posted: 02.21.2017

Many people think of fish and seafood as being healthy. However, new research suggests eating certain species that tend to have high levels of mercury may be linked to a greater risk of developing amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig's disease. Questions remain about the possible impact of mercury in fish, according to a preliminary study released Monday that will be presented at the American Academy of Neurology's 69th annual meeting in Boston in April. Fish and seafood consumption as a regular part of the diet was not associated with ALS, the study said. "For most people, eating fish is part of a healthy diet," said study author Elijah Stommel of Dartmouth College in Hanover, N.H., and a fellow of the academy. In addition, the authors said their study does not negate the fact that eating fish provides many health benefits. Instead, it suggests people may want to choose species that are known to have a lower mercury content, and avoid consuming fish caught in waters where there is mercury contamination. The researchers stressed that more research is needed before fish consumption guidelines for neurodegenerative illness can be made. While the exact cause of ALS is not known, some previous studies have suggested the neurotoxic metal to be a risk factor for ALS, a progressive neurological disease. ©2017 CBC/Radio-Canada.

Keyword: ALS-Lou Gehrig's Disease ; Neurotoxins
Link ID: 23257 - Posted: 02.21.2017

By Robert F. Service Predicting color is easy: Shine a light with a wavelength of 510 nanometers, and most people will say it looks green. Yet figuring out exactly how a particular molecule will smell is much tougher. Now, 22 teams of computer scientists have unveiled a set of algorithms able to predict the odor of different molecules based on their chemical structure. It remains to be seen how broadly useful such programs will be, but one hope is that such algorithms may help fragrance makers and food producers design new odorants with precisely tailored scents. This latest smell prediction effort began with a recent study by olfactory researcher Leslie Vosshall and colleagues at The Rockefeller University in New York City, in which 49 volunteers rated the smell of 476 vials of pure odorants. For each one, the volunteers labeled the smell with one of 19 descriptors, including “fish,” “garlic,” “sweet,” or “burnt.” They also rated each odor’s pleasantness and intensity, creating a massive database of more than 1 million data points for all the odorant molecules in their study. When computational biologist Pablo Meyer learned of the Rockefeller study 2 years ago, he saw an opportunity to test whether computer scientists could use it to predict how people would assess smells. Besides working at IBM’s Thomas J. Watson Research Center in Yorktown Heights, New York, Meyer heads something called the DREAM challenges, contests that ask teams of computer scientists to solve outstanding biomedical problems, such as predicting the outcome of prostate cancer treatment based on clinical variables or detecting breast cancer from mammogram data. “I knew from graduate school that olfaction was still one of the big unknowns,” Meyer says. Even though researchers have discovered some 400 separate odor receptors in humans, he adds, just how they work together to distinguish different smells remains largely a mystery. © 2017 American Association for the Advancement of Science
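Framed as a machine-learning problem, what the DREAM teams had to build is a multi-output regression: a vector of numeric molecular descriptors goes in, and predicted ratings for each perceptual attribute (the 19 labels plus pleasantness and intensity) come out, judged by how closely the predictions track the human panel. The following is a minimal, hypothetical Python sketch of that setup; the random arrays are stand-ins for real descriptor and rating data, and the feature count and model choice are assumptions, not the contestants' actual pipelines.

# Illustrative sketch only: the DREAM odor-prediction task framed as
# multi-output regression. The arrays below are random stand-ins for the
# Rockefeller data; feature count, model, and names are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_molecules, n_features = 476, 200   # 476 odorants; descriptor count assumed
n_targets = 21                       # 19 labels plus pleasantness and intensity

X = rng.normal(size=(n_molecules, n_features))            # molecular descriptors
y = rng.uniform(0, 100, size=(n_molecules, n_targets))    # averaged panel ratings

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Random forests handle multi-output regression out of the box.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
y_pred = model.predict(X_test)

# Score each perceptual attribute by how well predictions track the observed
# ratings, in the spirit of how the challenge compared submissions.
for k in range(n_targets):
    r = np.corrcoef(y_test[:, k], y_pred[:, k])[0, 1]
    print(f"attribute {k}: r = {r:.2f}")

With random inputs the correlations will hover around zero; the sketch is only meant to show the shape of the prediction task the 22 teams tackled, not a working model.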

Keyword: Chemical Senses (Smell & Taste); Robotics
Link ID: 23254 - Posted: 02.20.2017

By Michael Price BOSTON--Among mammals, primates are unique in that certain species have three different types of light-sensitive cone cells in their eyes rather than two. This allows humans and their close relatives to see what we think of as the standard spectrum of color. (Humans with red-green color blindness, of course, see a different spectrum.) The standard explanation for why primates developed trichromacy, as this kind of vision is called, is that it allowed our early ancestors to see colorful ripe fruit more easily against a background of mostly green forest. A particular Old World monkey, the rhesus macaque, has a genetic distinction that offers a convenient natural test of this hypothesis: a common genetic variation makes some females have three types of cone cells and others have two. Studies with captive macaques have shown that trichromatic females are faster than their dichromatic peers at finding fruit, but attempts to see whether that’s true for wild monkeys have been complicated by the fact that macaques are hard to find, and age and rank also play big roles in determining who eats when. A vision researcher reported today at the annual meeting of AAAS, which publishes Science, that after making more than 20,000 individual observations of 80 different macaques feeding from 30 species of trees on Cayo Santiago, Puerto Rico, she can say with confidence that wild trichromatic female monkeys do indeed appear to locate and eat fruit more quickly than dichromatic ones, lending strong support to the idea that this advantage helped drive the evolution of trichromacy in humans and our relatives. © 2017 American Association for the Advancement of Science.

Keyword: Vision; Evolution
Link ID: 23252 - Posted: 02.20.2017

Emotions are cognitive processes that rely on “higher-order states” embedded in cortical (conscious) brain circuits; emotions are not innately programmed into subcortical (nonconscious) brain circuits, according to a potentially earth-shattering new paper by Joseph LeDoux and Richard Brown. The February 2017 paper, “A Higher-Order Theory of Emotional Consciousness,” was published online today ahead of print in the journal Proceedings of the National Academy of Sciences. This paper was written by neuroscience legend Joseph LeDoux of New York University and Richard Brown, professor of philosophy at the City University of New York's LaGuardia College. Joseph LeDoux has been working on the link between emotion, memory, and the brain since the 1990s. He's credited with putting the amygdala in the spotlight and making this previously esoteric subcortical brain region a household term. LeDoux founded the Emotional Brain Institute (EBI). He’s also a professor in the Departments of Psychiatry and Child and Adolescent Psychiatry at NYU Langone Medical Center. Why Is This New Report From LeDoux and Brown Significant? In the world of cognitive neuroscience, there's an ongoing debate about the interplay between emotional states of consciousness (or feelings) within cortical and subcortical brain regions. (Most experts believe that cortical brain regions house “thinking” neural circuits within the cerebral cortex. Subcortical brain regions are considered to house “non-thinking” neural circuits beneath the 'thinking cap' of the cerebral cortex.) © 1991-2017 Sussex Publishers, LLC

Keyword: Emotions
Link ID: 23249 - Posted: 02.18.2017

By Timothy Revell It can be difficult to communicate when you can only move your eyes, as is often the case for people with ALS (also known as motor neurone disease). Microsoft researchers have developed an app to make talking with your eyes easier, called GazeSpeak. GazeSpeak runs on a smartphone and uses artificial intelligence to convert eye movements into speech, so a conversation partner can understand what is being said in real time. The app runs on the listener’s device. They point their smartphone at the speaker as if they are taking a photo. A sticker on the back of the phone, visible to the speaker, shows a grid with letters grouped into four boxes corresponding to looking left, right, up and down. As the speaker gives different eye signals, GazeSpeak registers them as letters. “For example, to say the word ‘task’ they first look down to select the group containing ‘t’, then up to select the group containing ‘a’, and so on,” says Xiaoyi Zhang, who developed GazeSpeak whilst he was an intern at Microsoft. GazeSpeak selects the appropriate letter from each group by predicting the word the speaker wants to say based on the most common English words, similar to predictive text messaging. The speaker indicates they have finished a word by winking or looking straight ahead for two seconds. The system also takes into account added lists of words, like names or places that the speaker is likely to use. The top four word predictions are shown onscreen, and the top one is read aloud. © Copyright Reed Business Information Ltd.
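In effect, the scheme the article describes is a four-key predictive keyboard: each gaze direction only narrows the next letter down to a group, and a frequency-ranked word list resolves the ambiguity, much as T9 texting did with nine keys. The short Python sketch below illustrates that decoding idea; the letter grouping, vocabulary, and function names are invented for illustration and are not taken from GazeSpeak itself.

# Illustrative sketch only: decoding a sequence of gaze directions into
# candidate words, T9-style. The letter grouping and word list are invented
# for demonstration and are not GazeSpeak's actual grouping or language model.

# Hypothetical assignment of the alphabet to the four gaze directions.
GROUPS = {
    "up":    set("abcdef"),
    "down":  set("ghijklm"),
    "left":  set("nopqrs"),
    "right": set("tuvwxyz"),
}

# Toy vocabulary, ordered from most to least frequent; a real system would
# use a large corpus plus user-specific words such as names and places.
VOCAB = ["the", "to", "that", "task", "ask", "tusk"]

def letter_group(ch):
    """Return the gaze direction whose letter group contains this character."""
    for direction, letters in GROUPS.items():
        if ch in letters:
            return direction
    return None

def decode(gaze_sequence, vocab=VOCAB, top_n=4):
    """Return up to top_n frequency-ranked words consistent with the gazes."""
    matches = []
    for word in vocab:
        if len(word) != len(gaze_sequence):
            continue
        if all(letter_group(c) == g for c, g in zip(word, gaze_sequence)):
            matches.append(word)
            if len(matches) == top_n:
                break
    return matches

# With this made-up grouping, "task" is spelled right, up, left, down
# (the app's real grouping differs, hence the article's down-then-up example).
print(decode(["right", "up", "left", "down"]))   # -> ['task']

The top few matches would then be shown on the listener's phone and the best one read aloud, as the article describes.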

Keyword: ALS-Lou Gehrig's Disease ; Robotics
Link ID: 23248 - Posted: 02.18.2017

By Emma Hiolski There’s more to those love handles than meets the eye. Fat tissue can communicate with other organs from afar, sending out tiny molecules that control gene activity in other parts of the body, according to a new study. This novel route of cell-to-cell communication could indicate fat plays a much bigger role in regulating metabolism than previously thought. It could also mean new treatment options for diseases such as obesity and diabetes. “I found this very interesting and, frankly, very exciting,” says Robert Freishtat of Children’s National Health System in Washington, D.C., a pediatrician and researcher who has worked with metabolic conditions like obesity and diabetes. Scientists have long known that fat is associated with all sorts of disease processes, he says, but they don’t fully understand how the much-reviled tissue affects distant organs and their functions. Scientists have identified hormones made by fat that signal the brain to regulate eating, but this new study—in which Freishtat was not involved—takes a fresh look at another possible messenger: small snippets of genetic material called microRNAs, or miRNAs. MiRNAs, tiny pieces of RNA made inside cells, help control the expression of genes and, consequently, protein production throughout the body. But some tumble freely through the bloodstream, bundled into tiny packets called exosomes. There, high levels of some miRNAs have been associated with obesity, diabetes, cancer, and cardiovascular disease. © 2017 American Association for the Advancement of Science.

Keyword: Obesity
Link ID: 23247 - Posted: 02.17.2017

By Alice Callahan Once fat cells are formed, can you ever get rid of them? The number of fat cells in a person’s body seems to be able to change in only one direction: up. Fat cell number increases through childhood and adolescence and generally stabilizes in adulthood. But this doesn’t mean that fat cells, or adipocytes, are stagnant. The size of individual fat cells is remarkably variable, expanding and contracting with weight gain or weight loss. And as with most cell types in the body, adipocytes die eventually. “Usually when old ones die, they are replaced by new fat cells,” said Dr. Michael Jensen, an endocrinologist and obesity researcher at the Mayo Clinic. Cell death and production appear to be tightly coupled, so although about 10 percent of adipocytes die each year, they’re replaced at the same rate. Even among bariatric surgery patients, who can lose massive amounts of weight, the number of fat cells tends to remain the same, although the cells shrink in size, studies show. Liposuction reduces the number of fat cells in a person’s body, but studies show the weight lost is typically regained within a year. It isn’t known whether this regain occurs through the production of new fat cells or expansion of existing ones. People who are obese tend to have more fat cells than those who are not, and several studies have found an increase in fat cell number with weight regain following weight loss. The fact that fat cell number can be increased but not decreased most likely contributes to the body’s drive to regain weight after weight loss, said Dr. Kirsty L. Spalding, a cell biologist at the Karolinska Institute in Sweden and the lead author of a 2008 study showing that fat cells die and are replaced. Beyond their role in storing fat, adipocytes secrete proteins and hormones that affect energy metabolism. © 2017 The New York Times Company

Keyword: Obesity
Link ID: 23246 - Posted: 02.17.2017

By Kelly Clancy More than two hundred years ago, a French weaver named Joseph Jacquard invented a mechanism that greatly simplified textile production. His design replaced the lowly draw boy—the young apprentice who meticulously chose which threads to feed into the loom to create a particular pattern—with a series of paper punch cards, which had holes dictating the lay of each stitch. The device was so successful that it was repurposed in the first interfaces between humans and computers; for much of the twentieth century, programmers laid out their code like weavers, using a lattice of punched holes. The cards themselves were fussy and fragile. Ethereal information was at the mercy of its paper substrate, coded in a language only experts could understand. But successive computer interfaces became more natural, more flexible. Immutable program instructions were softened to “If x, then y. When a, try b.” Now, long after Jacquard’s invention, we simply ask Amazon’s Echo to start a pot of coffee, or Apple’s Siri to find the closest car wash. In order to make our interactions with machines more natural, we’ve learned to model them after ourselves. Early in the history of artificial intelligence, researchers came up against what is referred to as Moravec’s paradox: tasks that seem laborious to us (arithmetic, for example) are easy for a computer, whereas those that seem easy to us (like picking out a friend’s voice in a noisy bar) have been the hardest for A.I. to master. It is not profoundly challenging to design a computer that can beat a human at a rule-based game like chess; a logical machine does logic well. But engineers have yet to build a robot that can hopscotch. The Austrian roboticist Hans Moravec theorized that this might have something to do with evolution. Since higher reasoning has only recently evolved—perhaps within the last hundred thousand years—it hasn’t had time to become optimized in humans the way that locomotion or vision has. The things we do best are largely unconscious, coded in circuits so ancient that their calculations don’t percolate up to our experience. But because logic was the first form of biological reasoning that we could perceive, our thinking machines were, by necessity, logic-based. © 2017 Condé Nast.

Keyword: Brain imaging; Robotics
Link ID: 23245 - Posted: 02.17.2017

Bret Stetka In a series of recent interviews, President Donald Trump's longtime personal physician Dr. Harold N. Bornstein told The New York Times that our new commander in chief has what amounts to a pretty unremarkable medical chart. Like about a quarter of American adults, Trump is on a statin for high cholesterol. He also takes a daily baby aspirin for heart health, an occasional antibiotic for rosacea, a skin condition, and Propecia, a pill to promote hair growth. Bornstein also told the Times that should he be appointed White House doctor, he probably wouldn't test the president for baseline dementia risk, something many doctors have argued should be mandatory. At 70, Trump is the oldest American president to ever take office. Couple his age with a family history of dementia — his father Fred developed Alzheimer's disease in his 80s — and one could argue that the question of baseline cognitive testing for the U.S. head of state has taken on new relevance. An assortment of fairly simple tests exist that can establish a reference point for cognitive capacity and detect early symptoms of mental decline. One of the most common such screens is the Mini-Mental Status Examination, a series of questions that gauges attention, orientation and short-term memory. It takes about five to 10 minutes to complete. Yet admitting vulnerability of any kind isn't something politicians have been keen to do. The true health of politicians has likely been cloaked in secrecy since the days of Mesopotamian kings, but definitely since the Wilson administration. © 2017 npr

Keyword: Alzheimers
Link ID: 23243 - Posted: 02.17.2017

By LISA SANDERS, M.D. The 3-year-old girl was having a very bad day — a bad week, really. She’d been angry and irritable, screaming and kicking at her mother over nothing. Her mother was embarrassed by this unusual behavior, because her husband’s sister, Amber Bard, was visiting. Bard, a third-year medical student at Michigan State, was staying in the guest room while working with a local medical practice in Grand Rapids so that she could spend a little time with her niece. The behavior was strange, but the mother was more concerned about her child’s left eye. A few days earlier it was red and bloodshot. It no longer was, but now the girl had little bumps near the eye. The mother asked Bard whether she could look at the eye. “I’m a third-year medical student,” Bard told her. “I know approximately nothing.” But Bard was happy to try. She turned to the girl, who immediately averted her face. “Can you show me your eye?” she asked. The girl shouted: “No! No, no, no!” Eventually Bard was able to coax her into allowing her a quick look at the eye. She saw a couple of tiny pimples along the lower lid, near the lashes, and a couple more just next to the eye. The eye itself wasn’t red; the lid wasn’t swollen. She couldn’t see any discharge. Once the child was in bed, Bard opened her laptop and turned to a database she’d been using for the past week when she started to see patients. Called VisualDx, it’s one of a dozen or so programs known as decision-support software, designed to help doctors make a diagnosis. This one focuses mostly on skin findings.

Keyword: Vision
Link ID: 23242 - Posted: 02.17.2017

By Pallab Ghosh Scientists are appealing for more people to donate their brains for research after they die. They say they lack the brains of people with disorders such as depression and post-traumatic stress disorder. In part, this shortage results from a lack of awareness that such conditions are due to changes in brain wiring. The researchers' aim is to develop new treatments for mental and neurological disorders. The human brain is as beautiful as it is complex. Its wiring changes and grows as we do. The organ is a physical embodiment of our behaviour and who we are. In recent years, researchers have made links between the shape of the brain and mental and neurological disorders. Most of their specimens are from people with mental or neurological disorders. Samples are requested by scientists to find new treatments for Parkinson's, Alzheimer's and a whole host of psychiatric disorders. But there is a problem. Scientists at McLean Hospital and at brain banks across the world do not have enough specimens for the research community. According to Dr Kerry Ressler, who is the chief scientific officer at McLean Hospital, new treatments for many mental and neurological diseases are within the grasp of the research community. However, he says it is the lack of brain tissue that is holding back their development. © 2017 BBC.

Keyword: Brain imaging
Link ID: 23241 - Posted: 02.17.2017

By John Carroll, Scratch yet another Phase III Alzheimer’s drug hopeful. Merck announced late Tuesday that it is shuttering its EPOCH trial for the BACE inhibitor verubecestat in mild-to-moderate Alzheimer’s after the external data monitoring committee concluded that the drug was a bust, with “virtually” no chance of success. A separate Phase III study in prodromal patients, set to read out in two years, will continue as investigators found no signs of safety issues. This is one of Merck’s top late-stage drugs, and news of the failure drove down the pharma giant’s shares in after-market trading by 2.45%. BACE drugs essentially seek to interfere in the process that creates amyloid beta, a toxic protein often found in the brains of Alzheimer’s patients. As the top amyloid beta drugs like bapineuzumab and solanezumab — which sought to extract existing amyloid beta loads — ground their way to repeated failures, developers in the field turned increasingly to BACE therapies as an alternative mechanism that could provide the key to slowing this disease down. Merck’s effort was the most advanced in the pipeline, but Eli Lilly and others are still in hot pursuit with their own persistent BACE efforts. Teams from Biogen/Eisai and Novartis/Amgen are also beavering away on BACE. “Alzheimer’s disease is one of the most pressing and daunting medical issues of our time, with inherent, substantial challenges to developing an effective disease-modifying therapy for people with mild-to-moderate disease. Studies such as EPOCH are critical, and we are indebted to the patients in this study and their caregivers,” said Dr. Roger M. Perlmutter, president, Merck Research Laboratories. © 2017 American Association for the Advancement of Science.

Keyword: Alzheimers
Link ID: 23238 - Posted: 02.16.2017

By Mitch Leslie Fasting is all the rage. Self-help books promise it will incinerate excess fat, spruce up your DNA, and prolong your life. A new scientific study has backed up some health claims about eating less. The clinical trial reveals that cutting back on food for just 5 days a month could help prevent or treat age-related illnesses like diabetes and cardiovascular disease. “It’s not trivial to do this kind of study,” says circadian biologist Satchidananda Panda of the Salk Institute for Biological Studies in San Diego, California, who wasn’t connected to the research. “What they have done is commendable.” Previous studies in rodents and humans have suggested that periodic fasting can reduce body fat, cut insulin levels, and provide other benefits. But there are many ways to fast. One of the best known programs, the 5:2 diet, allows you to eat normally for 5 days a week. On each of the other 2 days, you restrict yourself to 500 to 600 calories, about one-fourth of what the average American consumes. An alternative is the so-called fasting-mimicking diet, devised by biochemist Valter Longo of the University of Southern California in Los Angeles and colleagues. For most of the month, participants eat as much of whatever they want. Then for five consecutive days they stick to a menu that includes chips, energy bars, and soups, consuming about 700 to 1100 calories a day. © 2017 American Association for the Advancement of Science

Keyword: Obesity
Link ID: 23236 - Posted: 02.16.2017