Chapter 16.


By GINA KOLATA

After three decades of failure, researchers have found a treatment that greatly improves the prognosis for people having the most severe and disabling strokes. By directly removing large blood clots blocking blood vessels in the brain, they can save brain tissue that would otherwise have died, enabling many to return to an independent life. The study, published online Wednesday in The New England Journal of Medicine and conducted by researchers in the Netherlands, is being met with an outpouring of excitement.

One reason the treatment worked, researchers suspect, is that doctors used a new type of snare to grab the clots. It is a stent, basically a small wire cage, on the end of a catheter that is inserted in the groin and threaded through an artery to the brain. When the tip of the catheter reaches the clot, the stent is opened and pushed into the clot. It snags the clot, allowing the doctor to withdraw the catheter and pull out the stent with the clot attached.

About 630,000 Americans each year have strokes caused by clots blocking blood vessels in the brain. In roughly a third to a half of these cases, the clot is in a large vessel, with potentially devastating consequences. People with smaller clots are helped by the lifesaving drug tPA, which dissolves them. But for those with big clots, tPA often does not help, and until now no other treatments had been shown to work.

One in five patients who had tPA alone recovered enough to return to living independently. But one in three who also had their clot removed directly were able to take care of themselves after their stroke. And that, said Dr. Larry B. Goldstein, director of the Duke Stroke Center, is “a significant and meaningful improvement in what people are able to do.”

© 2014 The New York Times Company
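
Those two fractions imply a concrete effect size. Here is a minimal back-of-envelope sketch in Python, assuming nothing beyond the one-in-five and one-in-three figures quoted above, converting them into an absolute risk difference and the implied number needed to treat:

```python
# Outcomes quoted in the article: proportion of patients recovering to
# independent living with tPA alone vs. tPA plus direct clot removal.
p_tpa_only = 1 / 5   # 20% independent with tPA alone
p_combined = 1 / 3   # ~33% independent with tPA + thrombectomy

arr = p_combined - p_tpa_only   # absolute risk difference
nnt = 1 / arr                   # number needed to treat

print(f"Absolute improvement: {arr:.1%}")                    # ~13.3 points
print(f"Treated per extra independent patient: {nnt:.1f}")   # ~7.5
```

By this rough reckoning, about eight large-clot patients would need thrombectomy for one additional person to regain independence; the trial's formal estimates will differ in detail, but this is the scale of benefit the quoted fractions describe.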

Keyword: Stroke
Link ID: 20429 - Posted: 12.18.2014

By James Gallagher, Health editor, BBC News website

A link between autism and air pollution exposure during pregnancy has been suggested by scientists. The Harvard School of Public Health team said high levels of pollution had been linked to a doubling of autism in their study of 1,767 children. They said tiny particulate matter, which can pass from the lungs to the bloodstream, may be to blame. Experts said pregnant women should minimise their exposure, although the link had still to be proven.

Air pollution is definitely damaging. The World Health Organization estimates it causes 3.7 million deaths each year. The study, published in Environmental Health Perspectives, investigated any possible link with autism. It analysed 245 children with autism and 1,522 without. By looking at estimated pollution exposure during pregnancy, based on the mother's home address, the scientists concluded high levels of pollution were more common in children with autism. The strongest link was with fine particulate matter - invisible specks of mineral dust, carbon and other chemicals - that enter the bloodstream and cause damage throughout the body. Yet the research is unable to conclusively say that pollution causes autism, as there could be other factors that were not accounted for in the study.

Consistent pattern

There is a large inherited component to autism, but lead researcher Dr Marc Weisskopf said there was mounting evidence that air pollution may play a role too.

BBC © 2014
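
For readers wondering what a "doubling" means here: case-control studies of this design usually report it as an odds ratio. The sketch below invents the exposure split (only the 245-case and 1,522-control totals come from the excerpt) purely to show how that number is computed:

```python
# Hypothetical 2x2 table. The 245 cases and 1,522 controls are from the
# article; the split by exposure level is INVENTED for illustration only.
cases_high, cases_low = 95, 150          # children with autism: 245 total
controls_high, controls_low = 370, 1152  # children without: 1,522 total

# Odds ratio: odds of high prenatal exposure among cases, divided by the
# same odds among controls.
odds_ratio = (cases_high / cases_low) / (controls_high / controls_low)
print(f"Odds ratio: {odds_ratio:.2f}")  # ~2.0 with these made-up counts
```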

Keyword: Autism; Neurotoxins
Link ID: 20428 - Posted: 12.18.2014

Jason G Goldman

We humans don’t typically agree on all that much, but there is at least one thing that an impressive number of us accept: which hand is easiest to control. If you use one hand for writing, you probably use the same one for eating as well, and most of us – around 85% of our species – prefer our right hands. In fact, "there has never been any report of a human population in which left-handed individuals predominate", according to archaeologist Natalie Uomini at the University of Liverpool in the UK.

Lateralisation of limb use – that is, a bias towards one side or the other – usually begins in the brain. We know that some tasks are largely controlled by brain activity in the left hemisphere, while the right hemisphere governs other tasks. Confusingly, there is some crossing of nerves between the body and the brain, which means it’s actually the left side of the brain that has more control over the right side of the body and vice versa. In other words, the brain’s left hemisphere helps control the operation of the right hand, eye, leg and so on.

Some argue that this division of neurological labour has been a feature of animals for half a billion years. Perhaps it evolved because it is more efficient to allow the two hemispheres to carry out different computations at the same time. The left side of the brain, for instance, might have evolved to carry out routine operations – things like foraging for food – while the right side was kept free to detect and react rapidly to unexpected challenges in the environment – an approaching predator, for instance. This can be seen in various fish, toads and birds, which are all more likely to attack prey seen in the right eye.

© 2014 BBC.

Keyword: Laterality; Evolution
Link ID: 20426 - Posted: 12.18.2014

By Candy Schulman

My mother’s greatest fear was Alzheimer’s. She got Lewy body dementia, or LBD, instead. This little-known, oddly named, debilitating illness afflicts an estimated 1.3 million Americans, the actor and comedian Robin Williams possibly among them. It is often misdiagnosed because its signs, such as hallucinations and body rigidity, do not seem like those of dementia, but in the end it robs people of themselves even more painfully.

I first noticed my mother’s cognitive difficulties when she was 88. Until then, she’d led an extraordinarily active life: She was a competitive golfer with a bureau full of trophies, a painter and a sculptor. Every Hanukkah she hosted a lively feast for her eight grandchildren and nine great-grandchildren. This time, though, she needed my help planning, shopping and cooking. She was having difficulty with the guest list, trying to write every family member’s name on a piece of paper and adding up the numbers to see how many potatoes to buy for latkes. Her concentration became frayed, and she kept ripping the list up and starting again, close to tears.

Several months before that, she had sent me a Mother’s Day card that was illustrated with childlike prose, colorful illustrations and glitter hearts. The poem on the cover was printed in a playful purple font: “For you, Mom. For kissing my boo-boos, for wiping my face. . . . For calming my fears with your loving embrace.” On Mother’s Day and the rest of the year, Mom added in a shaky script, “thanks.”

Keyword: Alzheimers
Link ID: 20422 - Posted: 12.16.2014

By Emilie Reas

If you carried a gene that doubled your likelihood of getting Alzheimer's disease, would you want to know? What if there was a simple lifestyle change that virtually abolished that elevated risk? People with a gene known as APOE e4 have a higher risk of cognitive impairment and dementia in old age. Even before behavioral symptoms appear, their brains show reduced metabolism, altered activity and more deterioration than those without the high-risk gene. Yet accumulating research is showing that carrying this gene is not necessarily a sentence for memory loss and confusion—if you know how to work it to your advantage with exercise.

Scientists have long known that exercise can help stave off cognitive decline. Over the past decade evidence has mounted suggesting that this benefit is even greater for those at higher genetic risk for Alzheimer's. For example, two studies by a team in Finland and Sweden found that exercising at least twice a week in midlife lowers one's chance of getting dementia more than 20 years later, and this protective effect is stronger in people with the APOE e4 gene. Several others reported that frequent exercise—at least three times a week in some studies; up to more than an hour a day in others—can slow cognitive decline only in those carrying the high-risk gene. Furthermore, for those who carry the gene, being sedentary is associated with increased brain accumulation of the toxic protein beta-amyloid, a hallmark of Alzheimer's.

More recent studies, including a 2012 paper published in Alzheimer's & Dementia and a 2011 paper in NeuroImage, found that high-risk individuals who exercise have greater brain activity and glucose uptake during a memory task compared with their less active counterparts or with those at low genetic risk.

© 2014 Scientific American

Keyword: Alzheimers; Genes & Behavior
Link ID: 20421 - Posted: 12.16.2014

By Ingrid Wickelgren

Confusion is one symptom of a concussion. But confusion may also characterize decisions about how soon to let an athlete play after taking a hit to the head. Sizing up symptoms such as dizziness and nausea is subjective, after all. Now a study suggests that a blood test could objectively determine whether or not the damage is bad enough to put a player on the bench. The work is in the Journal of Neurotrauma. [Robert Siman et al., Serum SNTF Increases in Concussed Professional Ice Hockey Players and Relates to the Severity of Post Concussion Symptoms]

A strong blow to the head causes chemical changes within nerve cells that damage their structural proteins. Among the debris is a protein fragment called SNTF, which in more severe cases spills into the bloodstream. The new study followed 20 professional hockey players who got concussions with symptoms that lasted six days or more. Their blood levels of SNTF, measured from one hour to six days after injury, were much higher than those of eight other athletes whose concussions cleared up within five days. Levels were also low in 45 non-concussed players tested during the pre-season.

A blood test for SNTF might thus forecast recovery time from a head injury. Combined with other neurological tests, levels of this molecule could help doctors tell athletes when it’s safe to suit up again.

© 2014 Scientific American
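
In testing terms, the proposal amounts to a single-biomarker cutoff. The sketch below uses invented SNTF values (the excerpt reports group sizes, not measurements) purely to show how such a cutoff would be scored for sensitivity and specificity:

```python
# Invented serum SNTF values (arbitrary units), for illustration only; the
# study reports group sizes and a group difference, not individual numbers.
long_recovery  = [2.1, 1.8, 2.6, 1.9, 2.4]  # symptoms lasting 6+ days
short_recovery = [0.6, 0.9, 0.7, 1.1, 0.8]  # cleared within 5 days

cutoff = 1.5  # hypothetical decision threshold

# Sensitivity: fraction of long-recovery cases the test flags.
sensitivity = sum(x > cutoff for x in long_recovery) / len(long_recovery)
# Specificity: fraction of quick-recovery cases the test correctly clears.
specificity = sum(x <= cutoff for x in short_recovery) / len(short_recovery)
print(f"sensitivity={sensitivity:.0%}, specificity={specificity:.0%}")
```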

Keyword: Brain Injury/Concussion
Link ID: 20419 - Posted: 12.16.2014

By Bruce Bower

In the movie Roxanne, Steve Martin plays a lovesick guy who mocks his own huge schnoz by declaring: “It’s not the size of a nose that’s important. It’s what’s in it that matters.” Scientists demonstrated the surprising truth behind that joke this year: People can whiff an average of more than 1 trillion different odors, regardless of nose size (SN: 4/19/14, p. 6).

No one had systematically probed how many scents people can actually tell apart. So a team led by Leslie Vosshall of Rockefeller University in New York City asked 26 men and women to discriminate between pairs of scents created from mixes of 128 odor molecules. Volunteers easily discriminated between smells that shared as much as 51 percent of their odor molecules. Errors gradually rose as pairs of scents became chemically more alike. Vosshall’s group calculated that an average participant could tell apart a minimum of more than 1 trillion smells made up of different combinations of 30 odor molecules. Really good smellers could have detected way more than 1 trillion odor mixtures, the scientists said.

Smell lags behind sight and hearing as a sense that people need to find food, avoid dangers and otherwise succeed at surviving. Still, detecting the faint odor of spoiled food and other olfactory feats must have contributed to the success of Homo sapiens over the last 200,000 years. Perhaps many animals can whiff the difference between a trillion or more smells. For now, odor-detection studies modeled on Vosshall’s approach have been conducted only with humans.

© Society for Science & the Public 2000 - 2014.
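
The trillion figure is a lower bound extrapolated from discrimination errors, not a raw count of mixtures: the combinatorial space of 30-molecule blends drawn from a 128-molecule panel is vastly larger, as a one-line calculation shows (Python):

```python
from math import comb

# Number of distinct 30-component mixtures drawn from 128 odor molecules.
mixtures = comb(128, 30)
print(f"{mixtures:.2e}")  # ~1.5e+29 possible blends; the study's statistical
                          # analysis pares this down to a floor of ~1 trillion
                          # that an average nose could actually tell apart.
```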

Keyword: Chemical Senses (Smell & Taste)
Link ID: 20417 - Posted: 12.16.2014

by Andy Coghlan

It may not sound very appetising, but an edible powder made from waste excreted by bacteria in our guts may help people to avoid gaining weight. Stabilising a person's weight could have a major health impact, says Gary Frost of Imperial College London, because as people on Western diets grow older, they tend to put on between 0.3 and 0.8 kilograms per year on average.

A fatty acid called propionate is released when the bacteria in our gut digest fibre. Propionate makes people feel full by activating cells in the large intestine that produce the satiety hormones GLP-1 and PYY: these tell the brain that it's time to stop eating. But to trigger a big enough dose of this appetite-suppressing signal from gut bacteria alone, people would have to eat extremely large amounts of fibre. To get around that, Frost and his team made the molecule in a concentrated form called inulin-propionate ester (IPE). "That gives you eight times the amount of someone following a typical Western diet," he says.

To test its appetite-stemming properties, the team gave powdered IPE, mixed in with fruit juice or a milkshake, to a group of overweight volunteers every day for six months. A type of ordinary fibre was given to another set of people, who acted as controls. Only one of the 25 volunteers taking IPE put on more than 3 per cent of their body weight over that time, compared with six of the 24 controls. One reason for this might be that the IPE recipients ate around 9 per cent less over the six months.

© Copyright Reed Business Information Ltd.
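
Those counts are small, so a natural question is whether the one-in-25 versus six-in-24 split could plausibly be chance. A quick sketch in Python, using SciPy's Fisher exact test on exactly the counts quoted above:

```python
from scipy.stats import fisher_exact

# Rows: IPE group, control group.
# Columns: gained >3% body weight, did not.
table = [[1, 24],
         [6, 18]]

odds_ratio, p_value = fisher_exact(table)
print(f"IPE: {1/25:.0%} gained >3%; controls: {6/24:.0%}")
print(f"Fisher exact p = {p_value:.3f}")  # ~0.05 with these counts
```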

Keyword: Obesity
Link ID: 20416 - Posted: 12.13.2014

By Lindsey Konkel

For 28 years, Bill Gilmore lived in a New Hampshire beach town, where he surfed and kayaked. “I’ve been in water my whole life,” he said. “Before the ocean, it was lakes. I’ve been a water rat since I was four.” Now Gilmore can no longer swim, fish or surf, let alone button a shirt or lift a fork to his mouth. Earlier this year, he was diagnosed with amyotrophic lateral sclerosis (ALS), or Lou Gehrig’s disease.

In New England, medical researchers are now uncovering clues that appear to link some cases of the lethal neurological disease to people’s proximity to lakes and coastal waters. About five years ago, doctors at a New Hampshire hospital noticed a pattern in their ALS patients—many of them, like Gilmore, lived near water. Since then, researchers at Dartmouth-Hitchcock Medical Center have identified several ALS hot spots in lake and coastal communities in New England, and they suspect that toxic blooms of blue-green algae—which are becoming more common worldwide—may play a role. Now scientists are investigating whether breathing a neurotoxin produced by the algae may raise the risk of the disease. They have a long way to go, however: While the toxin does seem to kill nerve cells, no research, even in animals, has confirmed the link to ALS.

As with all ALS patients, no one knows what caused Bill Gilmore’s disease. He was a big, strong guy—a carpenter by profession. One morning in 2011, his arms felt weak. “I couldn’t pick up my tools. I thought I had injured myself,” said Gilmore, 59, who lived half his life in Hampton and now lives in Rochester, N.H.

© 2014 Scientific American

Keyword: ALS-Lou Gehrig's Disease ; Neurotoxins
Link ID: 20415 - Posted: 12.13.2014

by Colin Barras

It's not just great minds that think alike. Dozens of the genes involved in the vocal learning that underpins human speech are also active in some songbirds. This discovery suggests that birds could become a standard model for investigating the genetics of speech production – and speech disorders.

Complex language is a uniquely human trait, but vocal learning – the ability to pick up new sounds by imitating others – is not. Some mammals, including whales, dolphins and elephants, share our ability to learn new vocalisations. So do three groups of birds: the songbirds, parrots and hummingbirds.

The similarities between vocal learning in humans and birds are not just superficial. We know, for instance, that songbirds have specialised vocal learning brain circuits that are similar to those that mediate human speech. What's more, a decade ago we learned that FOXP2, a gene known to be involved in human language, is also active in "area X" of the songbird brain – one of the brain regions involved in those specialised vocal learning circuits.

Andreas Pfenning at the Massachusetts Institute of Technology and his colleagues have now built on these discoveries. They compared maps of genetic activity – transcriptomes – in brain tissue taken from the zebra finch, budgerigar and Anna's hummingbird, representing the three groups of vocal-learning birds.

© Copyright Reed Business Information Ltd.

Keyword: Language; Genes & Behavior
Link ID: 20414 - Posted: 12.13.2014

By Claudia Wallis

Touch a hot frying pan and the searing message of pain sprints up to your brain and back down to your hand so fast that the impulse to withdraw your fingers seems instantaneous. That rapid-fire signal begins in a heat-sensing molecule called a TRPV1 channel. This specialized protein, abundant on the surface of sensory nerve cells in our fingers and elsewhere, is a shape-shifter that can take an open or a closed configuration. Heat opens a central pore in the molecule; so do certain spider toxins and capsaicin—the substance that gives chili peppers their burn. Once the pore is open, charged ions of sodium and calcium flow into the nerve cell, triggering the pain signal. Ouch!

As neuroscientist-journalist Stephani Sutherland explains in “Pain that Won’t Quit,” in the December Scientific American, researchers have long been interested in finding ways to moderate the action of this channel—and other ion channels—in patients who suffer from chronic pain. Shutting down the TRPV1 channel completely, however, is not an option because it plays a vital role in regulating body temperature.

In two papers published in Nature in December 2013, investigators at the University of California, San Francisco, gave pain researchers a big leg up in understanding TRPV1. They revealed, in exquisite atomic detail, the structure of the channel molecule (from a rat) using an electron cryomicroscope, an instrument designed to explore the 3-D structure of molecules at very low temperatures. One of those investigators, Yifan Cheng, also created this colorful animation, showing how the molecule looks when the channel is open.

© 2014 Scientific American
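
Channels that flip between open and closed states are often described with a two-state Boltzmann model: the hotter it gets, the larger the fraction of channels sitting open. The sketch below is a generic illustration of that idea, not the structural model from the Nature papers; the midpoint (42 °C, near TRPV1's commonly cited activation threshold) and the slope are illustrative values:

```python
import math

def p_open(temp_c, t_half=42.0, slope=2.0):
    """Open probability of a hypothetical two-state heat-gated channel.

    t_half: temperature (deg C) at which half the channels are open.
    slope:  steepness of the opening transition (deg C). Both illustrative.
    """
    return 1.0 / (1.0 + math.exp((t_half - temp_c) / slope))

for t in (30, 37, 42, 47, 52):
    print(f"{t:>2} degC -> P(open) = {p_open(t):.2f}")
# At 37 degC (body temperature) most channels stay closed; a few degrees
# above the midpoint nearly all are open, hence the sharp "ouch".
```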

Keyword: Pain & Touch
Link ID: 20411 - Posted: 12.13.2014

By Gary Stix

Our site recently ran a great story about how brain training really doesn’t endow you instantly with genius IQ. The games you play just make you better at playing those same games. They aren’t a direct route to a Mensa membership. Just a few days before that story came out, Proceedings of the National Academy of Sciences published a report suggesting that playing action video games—Call of Duty: Black Ops II and the like—actually lets gamers learn the essentials of a particular visual task (the orientation of a Gabor signal—don’t ask) more rapidly than non-gamers, a skill that has real-world relevance beyond the confines of the artificial reality of the game itself. As psychologists say, it has “transfer effects.” Gamers appear to have learned how to do stuff like home in quickly on a target or multitask better than those who inhabit the non-gaming world. Their skills might, in theory, make them great pilots or laparoscopic surgeons, not just high scorers among their peers.

Action video games are not billed as brain training, but Call of Duty and nominally accredited training programs like Lumosity are both structured as computer games. So that leads to the question: what’s going on here? Every new finding that brain training is B.S. appears to be contradicted by another that points to the promise of cognitive exercise, if that’s what you call a session with Call of Duty. It may boil down to a realization that the whole story about exercising your neurons to keep the brain supple may be a lot less simple than proponents make it out to be.

© 2014 Scientific American

Keyword: Learning & Memory
Link ID: 20409 - Posted: 12.13.2014

By Nsikan Akpan

Gut surgery is often the only option for life-threatening obesity and diabetes, but what if doctors could cut the pounds without using a knife? Scientists have engineered an antiobesity drug that rivals the dramatic benefits seen with surgery, dropping excess body weight by a third. Though the work was done only in rodents, the drug is the first to influence three obesity-related hormones in the gut at once.

Bariatric surgery, including gastric bypass, typically involves limiting food intake by removing part of the stomach or intestines. Yet it does more than shrink the size of a patient’s stomach or intestines. It also changes the release of multiple gut-related hormones, explains clinical endocrinologist Stephen O'Rahilly of the University of Cambridge in the United Kingdom, who wasn’t involved with the study. That’s important, because years of eating a diet high in fat and sugar can throw a person’s metabolism into disarray. Cells undergo genetic reprogramming that negatively impacts how they process sugar and store fat, locking in obesity. This pattern makes it harder and harder to lose weight, even if a person changes their diet and begins exercising.

Bariatric surgery interrupts that cycle by stimulating the production of several hormones that reduce blood sugar, burn fat, and curb appetite. (It may also change the composition of the gut’s microbes.) Three of these hormones are glucagon-like peptide-1 (GLP-1), gastric inhibitory peptide (GIP), and glucagon. Cells in your gut release GLP-1 and GIP after a meal to keep your body’s blood sugar levels in a normal range. GLP-1 also curbs appetite, signaling to your brain that you are full. In type 2 diabetes, the body stops responding to GLP-1 and GIP, which contributes to hyperglycemia, or too much blood sugar. Hyperglycemia causes the devastating hallmarks of diabetes, such as kidney injury, cardiovascular disease, and nerve damage.

© 2014 American Association for the Advancement of Science.

Keyword: Obesity
Link ID: 20408 - Posted: 12.10.2014

By Tina Rosenberg

When Ebola ends, the people who have suffered, who have lost loved ones, will need many things. They will need ways to rebuild their livelihoods. They will need a functioning health system, which can ensure that future outbreaks do not become catastrophes. And they will need mental health care.

Depression is the most important thief of productive life for women around the world, and the second-most important for men. We sometimes imagine it is a first-world problem, but depression is just as widespread, if not more so, in poor countries, where there is a good deal more to be depressed about. And it is more debilitating, as a vast majority of sufferers have no safety net. Health care for all must include mental health care. It’s hard to believe, but Liberia and Sierra Leone each have only a single psychiatrist. The Ebola crisis has exposed these countries’ malignant neglect of their health systems. People can’t get care for diarrhea and malaria. How will these countries take care of an epidemic of depression?

This isn’t really a medical question. We know how to treat depression. What we don’t know yet is how to make effective treatment cheap, culturally appropriate, convenient and non-stigmatizing — all needed to get treatment out to millions and millions of people. But some researchers are finding out. They are doing so despite the fact that growing attention to this issue hasn’t been accompanied by money. The U.S. National Institute of Mental Health last year provided just $24.5 million for global mental health efforts, and the Canadian government’s Grand Challenges Canada, which is said to have the largest portfolio of mental health innovation in developing countries, has spent only $28 million on them since it began in 2010.

© 2014 The New York Times Company

Keyword: Depression
Link ID: 20404 - Posted: 12.08.2014

By Lenny Bernstein

There are 60 million epileptics on the planet, and while advances in medication and implantable devices have helped them, the ability to better detect and even predict when they will have debilitating seizures would be a significant improvement in their everyday lives. Imagine, for example, if an epileptic knew with reasonable certainty that his next seizure would not occur for an hour or a day or a week. That might allow him to run to the market or go out for the evening or plan a short vacation with less concern.

Computers and even dogs have been tested in the effort to do this, but now a group of organizations battling epilepsy is employing "big data" to help. They sponsored an online competition that drew 504 entrants who tried to develop algorithms that would detect and predict epileptic seizures. Instead of the traditional approach of asking researchers in a handful of labs to tackle the problem, the groups put online huge amounts of data recorded from the brains of dogs and people as they had seizures over a number of months. They then challenged anyone interested to use the information to develop detection and prediction models.

"Seizure detection and seizure prediction," said Walter J. Koroshetz, deputy director of the National Institute of Neurological Disorders and Stroke (NINDS), are "two fundamental problems in the field that are poised to take significant advantage of large data computation algorithms and benefit from the concept of sharing data and generating reproducible results."
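
At its core, the detection half of the challenge is binary classification: label short clips of brain recording as seizure or non-seizure. The following Python sketch is a toy version under loud assumptions: random noise stands in for the competition's intracranial recordings, and the features and classifier are the simplest that could work, not any entrant's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

def features(clip):
    """Two classic EEG screening features: variance and 'line length'
    (the summed absolute sample-to-sample differences)."""
    return np.array([clip.var(), np.abs(np.diff(clip)).sum()])

# Stand-in data: simulated seizure clips simply have larger amplitude.
non_seizure = [rng.normal(0.0, 1.0, 400) for _ in range(50)]
seizure     = [rng.normal(0.0, 3.0, 400) for _ in range(50)]

X = np.array([features(c) for c in non_seizure + seizure])
y = np.array([0] * 50 + [1] * 50)

# Nearest-centroid classifier: assign each clip to the closer class mean.
centroids = np.array([X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)])
dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
pred = dists.argmin(axis=1)
print(f"Training accuracy on the toy data: {(pred == y).mean():.0%}")
```

Real entries worked from multichannel recordings, richer spectral features and held-out test sets; the point here is only the shape of the pipeline: clips in, features out, a decision rule on top.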

Keyword: Epilepsy
Link ID: 20403 - Posted: 12.08.2014

By Quassim Cassam

Most people wonder at some point in their lives how well they know themselves. Self-knowledge seems a good thing to have, but hard to attain. To know yourself would be to know such things as your deepest thoughts, desires and emotions, your character traits, your values, what makes you happy and why you think and do the things you think and do. These are all examples of what might be called “substantial” self-knowledge, and there was a time when it would have been safe to assume that philosophy had plenty to say about the sources, extent and importance of self-knowledge in this sense. Not any more. With few exceptions, philosophers of self-knowledge nowadays have other concerns.

Here’s an example of the sort of thing philosophers worry about: suppose you are wearing socks and believe you are wearing socks. How do you know that that’s what you believe? Notice that the question isn’t: “How do you know you are wearing socks?” but rather “How do you know you believe you are wearing socks?” Knowledge of such beliefs is seen as a form of self-knowledge. Other popular examples of self-knowledge in the philosophical literature include knowing that you are in pain and knowing that you are thinking that water is wet. For many philosophers the challenge is to explain how these types of self-knowledge are possible.

This is usually news to non-philosophers. Most of them imagine that philosophy tries to answer the Big Questions, and “How do you know you believe you are wearing socks?” doesn’t sound much like one of them. If knowing that you believe you are wearing socks qualifies as self-knowledge at all — and even that isn’t obvious — it is self-knowledge of the most trivial kind. Non-philosophers find it hard to figure out why philosophers would be more interested in trivial than in substantial self-knowledge.

© 2014 The New York Times Company

Keyword: Consciousness
Link ID: 20402 - Posted: 12.08.2014

By JOHN McWHORTER

“TELL me, why should we care?” he asks. It’s a question I can expect whenever I do a lecture about the looming extinction of most of the world’s 6,000 languages, a great many of which are spoken by small groups of indigenous people. For some reason the question is almost always posed by a man seated in a row somewhere near the back. Asked to elaborate, he says that if indigenous people want to give up their ancestral language to join the modern world, why should we consider it a tragedy? Languages have always died as time has passed. What’s so special about a language?

The answer I’m supposed to give is that each language, in the way it applies words to things and in the way its grammar works, is a unique window on the world. In Russian there’s no word just for blue; you have to specify whether you mean dark or light blue. In Chinese, you don’t say next week and last week but the week below and the week above. If a language dies, a fascinating way of thinking dies along with it.

I used to say something like that, but lately I have changed my answer. Certainly, experiments do show that a language can have a fascinating effect on how its speakers think. Russian speakers are on average 124 milliseconds faster than English speakers at identifying when dark blue shades into light blue. A French person is a tad more likely than an Anglophone to imagine a table as having a high voice if it were a cartoon character, because the word is marked as feminine in his language. This is cool stuff. But the question is whether such infinitesimal differences, perceptible only in a laboratory, qualify as worldviews — cultural standpoints or ways of thinking that we consider important. I think the answer is no.

Furthermore, extrapolating cognitive implications from language differences is a delicate business. In Mandarin Chinese, for example, you can express “If you had seen my sister, you’d have known she was pregnant” with the same sentence you would use to express the more basic “If you see my sister, you know she’s pregnant.” One psychologist argued some decades ago that this meant that Chinese makes a person less sensitive to such distinctions, which, let’s face it, is discomfitingly close to saying Chinese people aren’t as quick on the uptake as the rest of us. The truth is more mundane: Hypotheticality and counterfactuality are established more by context in Chinese than in English.

© 2014 The New York Times Company

Keyword: Language
Link ID: 20401 - Posted: 12.08.2014

Carl Zimmer

For thousands of years, fishermen knew that certain fish could deliver a painful shock, even though they had no idea how it happened. Only in the late 1700s did naturalists contemplate a bizarre possibility: These fish might release jolts of electricity — the same mysterious substance as in lightning.

That possibility led an Italian physicist named Alessandro Volta in 1800 to build an artificial electric fish. He observed that electric rays had dense stacks of muscles, and he wondered if they allowed the animals to store electric charges. To mimic the muscles, he built a stack of metal disks, alternating between copper and zinc. Volta found that his model could store a huge amount of electricity, which he could unleash as shocks and sparks. Today, much of society runs on updated versions of Volta’s artificial electric fish. We call them batteries.

Now a new study suggests that electric fish have anticipated other kinds of technology. The research, by Kenneth C. Catania, a biologist at Vanderbilt University, reveals a remarkable sophistication in the way electric eels deploy their shocks. Dr. Catania, who published the study on Thursday in the journal Science, found that the eels use short shocks like a remote control on their victims, flushing their prey out of hiding. They can then deliver longer shocks that paralyze their prey at a distance, in precisely the same way that a Taser stops a person cold.

“It shows how finely adapted eels are to attack prey,” said Harold H. Zakon, a biologist at the University of Texas at Austin, who was not involved in the study. He considered Dr. Catania’s findings especially impressive since scientists have studied electric eels for more than 200 years.

© 2014 The New York Times Company

Keyword: Evolution
Link ID: 20400 - Posted: 12.06.2014

by Michael Slezak

The elusive link between obesity and high blood pressure has been pinned down to the action of leptin in the brain, and we might be able to block it with drugs. We've known for more than 30 years that fat and high blood pressure are linked, but finding what ties them together has been difficult. One of the favourite candidates has been leptin – a hormone produced by fat cells.

Under normal circumstances, when fat cells produce leptin, the hormone sends the message that you've had enough food. But in people with obesity, the body stops responding to this message, and large levels of leptin build up. Leptin is known to activate the regulatory network called the sympathetic nervous system, and it's the activation of sympathetic nerves on the kidneys that seems to be responsible for raising blood pressure. Leptin has thus been linked to blood pressure. However, conclusive evidence has been hard to come by.

Michael Cowley of Monash University in Melbourne, Australia, and his colleagues have now conducted a string of experiments that provide some evidence. Through genetic and drug experiments in mice, they have pinpointed an area in the mouse brain that increases blood pressure when it is exposed to high leptin levels. This region is called the dorsomedial hypothalamus, and is thought to be involved in controlling energy consumption. Their findings show that high levels of leptin do indeed boost blood pressure, via this brain region.

© Copyright Reed Business Information Ltd.

Keyword: Obesity
Link ID: 20398 - Posted: 12.06.2014

By Neuroskeptic

An important new study could undermine the concept of ‘endophenotypes’ – and thus derail one of the most promising lines of research in neuroscience and psychiatry. The findings are out now in Psychophysiology. Unusually, an entire special issue of the journal is devoted to presenting the various results of the study, along with commentary, but here’s the summary paper: Knowns and unknowns for psychophysiological endophenotypes, by Minnesota researchers William Iacono, Uma Vaidyanathan, Scott Vrieze and Stephen Malone.

In a nutshell, the researchers ran seven different genetic studies to try to find the genetic basis of a total of seventeen neurobehavioural traits, also known as ‘endophenotypes’. Endophenotypes are a hot topic in psychiatric neuroscience, although the concept is somewhat vague. The motivation behind interest in endophenotypes comes mainly from the failure of recent studies to pin down the genetic cause of most psychiatric syndromes.

Essentially, an endophenotype is some trait, which could be almost anything, that is supposed to be related to (or part of) a psychiatric disorder or symptom, but which is “closer to genetics” or “more biological” than the disorder itself. Rather than thousands of genes all mixed together determining the risk of a psychiatric disorder, each endophenotype might be controlled by only a handful of genes – which would thus be easier to find.

Keyword: Schizophrenia; Genes & Behavior
Link ID: 20396 - Posted: 12.06.2014