Most Recent Links



Links 8821 - 8840 of 29351

Henry Nicholls Andy Russell had entered the lecture hall late and stood at the back, listening to the close of a talk by Marta Manser, an evolutionary biologist at the University of Zurich who works on animal communication. Manser was explaining some basic concepts in linguistics to her audience: how humans use meaningless sounds or “phonemes” to generate a vast dictionary of meaningful words. In English, for instance, just 40 different phonemes can be recombined into a rich vocabulary of some 200,000 words. But, explained Manser, this linguistic trick of reorganising the meaningless to create new meaning had not been demonstrated in any non-human animal. This was back in 2012. Russell’s “Holy shit, man” excitement was because he was pretty sure he had evidence for phoneme structuring in the chestnut-crowned babbler, a bird he’s been studying in the semi-arid deserts of south-east Australia for almost a decade. After the talk, Russell (a behavioural ecologist at the University of Exeter) travelled to Zurich to present his evidence to Manser’s colleague Simon Townsend, whose research explores the links between animal communication systems and human language. The fruits of their collaboration are published today in PLoS Biology. One of Russell’s students, Jodie Crane, had been recording the calls of the chestnut-crowned babbler for her PhD. The PLoS Biology paper focuses on two of these calls, which appear to be made up of two identical elements, just arranged in a different way. © 2015 Guardian News and Media Limited
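The arithmetic behind Manser's claim is easy to check: even very short strings drawn from 40 phonemes yield far more combinations than a 200,000-word vocabulary needs. A minimal back-of-envelope sketch (the cap of four phonemes per word is an illustrative assumption, not a figure from the article):

```python
# Back-of-envelope check: how many distinct strings can 40 phonemes form?
# Assumption (illustrative only): words are 1 to 4 phonemes long.
PHONEMES = 40
MAX_LEN = 4

# Count all strings of length 1..MAX_LEN: 40 + 40^2 + 40^3 + 40^4
combinations = sum(PHONEMES ** length for length in range(1, MAX_LEN + 1))

print(combinations)              # 2625640 possible strings
print(combinations >= 200_000)   # True: far more than English's ~200,000 words
```

Even under this deliberately tight length cap, the combinatorial space exceeds the English lexicon by more than an order of magnitude, which is why recombining a small phoneme inventory is such a powerful trick.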

Keyword: Language; Evolution
Link ID: 21110 - Posted: 06.30.2015

By Ariana Eunjung Cha One of the most heartbreaking things about Alzheimer's is that it has been impossible for doctors to predict who will get it before symptoms begin. And without early detection, researchers say, a treatment or cure may be impossible. Governments, drug companies and private foundations have poured huge amounts of money into trying to come up with novel ways to detect risk through cutting-edge technologies ranging from brain imaging to protein analysis of cerebrospinal fluid and DNA profiling. Now a new study, published in the journal Neurology, shows that perhaps something more old-fashioned could be the answer: a memory test. The researchers tracked 2,125 participants in four Chicago neighborhoods for 18 years, giving them tests of memory and thinking every three years. They found that those who scored lowest on the tests during the first year were 10 times more likely to be diagnosed with Alzheimer's down the road -- indicating that cognitive impairment may be affecting the brain "substantially earlier than previously established," the researchers wrote.

Keyword: Alzheimers; Learning & Memory
Link ID: 21109 - Posted: 06.30.2015

Allison Aubrey Bite into that bread before your main meal, and you'll spike your blood sugar and amp up your appetite. Waiting until the end of your dinner to nosh on bread can blunt those effects. Ah, the bread basket. You sit down for a nice meal out, and there it appears: piping hot, giving off a waft of yeasty divinity. There's a reason this age-old tradition prevails. Even in the era of paleo and gluten-free, there are still hordes of us who will gladly nosh on crusty, chewy, soul-warming bread. But the downside may be more than just some extra calories. Turns out, eating all those carbs before a meal can amp up our appetites and spike our blood sugar. "The worst situation is having refined carbohydrates on an empty stomach, because there's nothing to slow down the digestion of that carbohydrate into sugar," explains David Ludwig, director of the Optimal Weight for Life Clinic at Boston Children's Hospital. © 2015 NPR

Keyword: Obesity
Link ID: 21108 - Posted: 06.30.2015

Emma Bowman In a small, sparse makeshift lab, Melissa Malzkuhn practices her range of motion in a black, full-body unitard dotted with light-reflecting nodes. She's strapped on a motion capture, or mocap, suit. Infrared cameras that line the room will capture her movement and translate it into a 3-D character, or avatar, on a computer. But she's not making a Disney animated film. Three-dimensional motion capture has developed quickly in the last few years, most notably as a Hollywood production tool for computer animation in films like Planet of the Apes and Avatar. Behind the scenes though, leaders in the deaf community are taking on the technology to create and improve bilingual learning tools in American Sign Language. Malzkuhn has suited up to record a simple nursery rhyme. Being deaf herself, she spoke with NPR through an interpreter. "I know in English there's just a wealth of nursery rhymes available, but we really don't see as much in ASL," she says. "So we're gonna be doing some original work here in developing nursery rhymes." That's because sound-based rhymes don't cross over well into the visual language of ASL. Malzkuhn heads the Motion Light Lab, or ML2. It's the newest hub of the National Science Foundation Science of Learning Center, Visual Language and Visual Learning (VL2) at Gallaudet University, the premier school for deaf and hard of hearing students. © 2015 NPR

Keyword: Language
Link ID: 21107 - Posted: 06.29.2015

By Sunnie Huang, CBC News The story of a Newfoundland man who was struck by a moose but doesn't remember it is not just a curious tale of luck. It also highlights the complex underpinnings of human memory, a neuroscience expert says. Stephen Bromley, from Conche, N.L., struck a moose with his car on Monday, but said he had no recollection of it, even days after the collision. It's not the first time that something was amiss about human memory after a moose encounter. Another Newfoundlander, Michelle Higgins, drove about 40 kilometres with her car's roof peeled back "like a sardine can" after crashing into a moose in 2012. Three years later, she said she still can't recall the incident. The blackout doesn't surprise Scott Watter, a McMaster University professor who specializes in neuroscience, psychology and behaviour. "They are lucky in that sense, but it doesn't seem like a thing that breaks the rules of everything we know about how brains work," he told CBC News. People who sustain head trauma often have poor memory of the event, especially when tested on specific details, Watter said. Also, the more severe the injury gets, the further back the memory loss extends, Watter said. The system at the heart of our memory is a seahorse-shaped section of the brain called the hippocampus, Watter explained. It's responsible for linking different parts of human experience to form a coherent memory. In the most severe — but rare — cases of hippocampus damage, the person can no longer create or retain new memories, as seen in Christopher Nolan's 2000 box office hit Memento. ©2015 CBC/Radio-Canada.

Keyword: Learning & Memory
Link ID: 21106 - Posted: 06.29.2015

Vaughan Bell Marketing has discovered neuroscience and the shiny new product has plenty of style but very little substance. “Neuromarketing” is lighting up the eyes of advertising executives and lightening the wallets of public relations companies. It promises to target the unconscious desires of consumers, which are supposedly revealed by measuring the brain. The more successful agencies have some of the world’s biggest brands on their books and these mega-corporations are happy to trumpet their use of brain science in targeting their key markets. The holy grail of neuromarketing is to predict which ads will lead to most sales before they’ve been released but the reality is a mixture of bad science, bullshit and hope. First, it’s important to realise that the concept of neuroscience is used in different ways in marketing. Sometimes, it’s just an empty ploy aimed at consumers – the equivalent of putting a bikini-clad body next to your product for people who believe they’re above the bikini ploy. A recent Porsche advert apparently showed a neuroscience experiment suggesting that the brain reacts in a similar way to driving their car and flying a fighter jet, but it was all glitter and no gold. The images were computer-generated, the measurements impossible, and the scientist an actor. In complete contrast, neuromarketing is also a serious research area. This is a scientifically sound, genuinely interesting field in cognitive science, where the response to products and consumer decision-making is understood on the level of body and mind. This might involve looking at how familiar brand logos engage the memory systems in the brain, or examining whether the direction of eye gaze of people in ads affects how attention-grabbing they are, or testing whether the brain’s electrical activity varies when watching subtly different ads. Like most of cognitive neuroscience, the studies are abstract, ultra-focused and a long way from everyday experience. 
© 2015 Guardian News and Media Limited

Keyword: Emotions
Link ID: 21105 - Posted: 06.29.2015

Amy Standen A doctor I interviewed for this story told me something that stuck with me. He said for every person with dementia he treats, he finds himself caring for two patients. That's how hard it can be to be a caregiver for someone with dementia. The doctor is Bruce Miller. He directs the Memory and Aging Center at the University of California, San Francisco. According to Miller, 50 percent of caregivers develop a major depressive illness because of the caregiving. "The caregiver is so overburdened that they don't know what to do next," he says. "This adds a huge burden to the medical system." This burden is going to increase dramatically in the coming decade. By 2025, 7 million Americans will have Alzheimer's disease, according to one recent estimate. Millions more will suffer from other types of dementia. Together these diseases may become the most expensive segment of the so-called "silver tsunami" — 80 million baby boomers who are getting older and needing more medical care. The cost of caring for Alzheimer's patients alone is expected to triple by 2050, to more than $1 trillion a year. So UCSF, along with the University of Nebraska Medical Center, is beginning a $10 million study funded by the federal Centers for Medicare & Medicaid Innovation. Researchers plan to develop a dementia "ecosystem," which aims to reduce the cost of caring for the growing number of dementia patients and to ease the strain on caregivers. © 2015 NPR

Keyword: Alzheimers
Link ID: 21104 - Posted: 06.29.2015

by Clare Wilson Do you dream of where you'd like to go tomorrow? It looks like rats do. When the animals are shown a food treat at the end of a path they cannot access and then take a nap, the neurons representing that route in their brains fire as they sleep – as if they are dreaming about running down the corridor to grab the grub. "It's like looking at a holiday brochure for Greece the day before you go – that night you might dream about the pictures," says Hugo Spiers of University College London. Like people, rats store mental maps of the world in their hippocampi, two curved structures on either side of the brain. Putting electrodes into rats' brains as they explore their environment has shown that different places are recorded and remembered by different combinations of hippocampal neurons firing together. These "place cells" fire not only when a rat is in a certain location, but also when it sleeps, as if it is dreaming about where it has been in the past. Spiers's team wondered whether this activity during sleep might also reflect where a rat wants to go in future. They placed four rats at the bottom of a T-shaped pathway, with entry to the top bar of the T blocked by a grille. Food was placed at the end of one arm, in a position visible to the animals. Next they encouraged the rats to sleep in a cosy nest and recorded their hippocampus activity with about 50 electrodes each as they rested. Finally they put the rats back into the maze, but now with the grille and the treat removed. © Copyright Reed Business Information Ltd.

Keyword: Sleep
Link ID: 21103 - Posted: 06.27.2015

Sharon Darwish Bottlenose dolphins have an average brain mass of 1.6 kg, slightly greater than that of humans, and about four times the size of chimpanzee brains. Although you couldn’t really imagine a dolphin writing poetry, dolphins demonstrate high levels of intelligence and social behaviour. For example, they display mirror self-recognition, as well as an understanding of symbol-based communication systems. Research into the differing brain sizes and intellectual capabilities within the animal kingdom is fascinating. Why have some species evolved to be more intelligent than others? Does brain size affect cognitive ability? Some studies say yes, but some insist otherwise. It really depends on which species we are talking about. In humans, for example, larger brains do not indicate higher intelligence – otherwise Einstein, who had an average-sized brain, might not have been quite as successful in his career. (Yes, that link was to a 23-pager on the analysis of Einstein’s brain. It makes for great bedtime reading.) Most neuroscientists now believe that it is the structure of the brain on a cellular and molecular level that determines its computational capacity. Within certain animal species, however, a larger brain offers evolutionary advantage. For example, large-brained female guppies are better survivors and demonstrate greater cognitive strengths than their smaller-brained counterparts. © 2015 Guardian News and Media Limited

Keyword: Development of the Brain; Genes & Behavior
Link ID: 21102 - Posted: 06.27.2015

By Bret Stetka Plenty of us have known a dog on Prozac. We have also witnessed the eye rolls that come with the mention of canine psychiatry. Doting pet owners—myself included—ascribe all kinds of questionable psychological ills to our pawed companions. But in fact, the science suggests that numerous nonhuman species do suffer from psychiatric symptoms. Birds obsess; horses on occasion get pathologically compulsive; dolphins and whales, especially those in captivity, self-mutilate. And that thing when your dog woefully watches you pull out of the driveway from the window—that might be DSM-certified separation anxiety. “Every animal with a mind has the capacity to lose hold of it from time to time,” wrote science historian and author Laurel Braitman in her 2014 book Animal Madness. But at least one mental malady, while common in humans, seems to have spared other animals: schizophrenia, which affects an estimated 0.4 to 1 percent of adults. Although animal models of psychosis exist in laboratories, and odd behavior has been observed in creatures confined to cages, most experts agree that psychosis has not typically been seen in other species, whereas depression, obsessive-compulsive disorder and anxiety traits have been reported in many nonhuman species. This raises the question of why such a potentially devastating, often lethal disease is still hanging around plaguing humanity. We know from an abundance of recent research that schizophrenia is heavily genetic in origin. One would think that natural selection would have eliminated the genes that predispose to psychosis. A study published earlier this year in Molecular Biology and Evolution provides clues as to how the potential for schizophrenia may have arisen in the human brain and, in doing so, suggests possible treatment targets. It turns out that psychosis may be an unfortunate cost of having a big brain that is capable of complex cognition. © 2015 Scientific American

Keyword: Schizophrenia
Link ID: 21101 - Posted: 06.27.2015

Matthew C Keller & Peter M Visscher Epidemiological studies and anecdotal evidence show overlap between psychiatric disorders and creativity, but why? A new study uses genome-wide association data from schizophrenia and bipolar disorder to show that genetics are part of the explanation. Thinkers contemplating the human condition have long associated creativity with psychiatric illness—the 'mad genius' archetype. According to Aristotle, “no great genius was without a mixture of insanity.” And there are the oft-repeated anecdotes: the psychotic breaks of Vincent van Gogh and John Nash, the manic and depressive episodes of Virginia Woolf and Ernest Hemingway. There is, in fact, some empirical evidence that the psychological factors underlying psychiatric disorders are linked to increased creativity. Unaffected relatives of those with bipolar disorder (BD) have greater creativity [1] and are over-represented in creative professions [2], and similar findings have been reported for schizophrenia (SCZ) [2,3]. What these studies have not shown is whether this overlap is a result of genetic variation that influences both creativity and BD/SCZ or whether some environmental factor explains the association. For example, highly unstructured rearing environments might contribute to both creativity and risk of the disorders. Understanding whether shared gene variants are responsible for the overlap is important. It can help to elucidate the biological underpinnings of these disorders and shine light on the puzzle of why psychiatric diseases persist in the population. Power et al. [4], in work reported in this issue of Nature Neuroscience, asked whether creativity and psychiatric disorders might be associated through common variation in the genome. They used a large discovery sample of 86,292 adults from Iceland and four replication samples totaling over 27,000 adults from Sweden and the Netherlands. All had genome-wide SNP genotyping and their professions were known. None of them knowingly suffered from a psychiatric illness. About 1% of them were artists, including actors, dancers, musicians and writers. © 2015 Macmillan Publishers Limited

Keyword: Schizophrenia; Genes & Behavior
Link ID: 21100 - Posted: 06.27.2015

By GARY MARCUS SCIENCE has a poor track record when it comes to comparing our brains to the technology of the day. Descartes thought that the brain was a kind of hydraulic pump, propelling the spirits of the nervous system through the body. Freud compared the brain to a steam engine. The neuroscientist Karl Pribram likened it to a holographic storage device. Many neuroscientists today would add to this list of failed comparisons the idea that the brain is a computer — just another analogy without a lot of substance. Some of them actively deny that there is much useful in the idea; most simply ignore it. Often, when scientists resist the idea of the brain as a computer, they have a particular target in mind, which you might call the serial, stored-program machine. Here, a program (or “app”) is loaded into a computer’s memory, and an algorithm, or recipe, is executed step by step. (Calculate this, then calculate that, then compare what you found in the first step with what you found in the second, etc.) But humans don’t download apps to their brains, the critics note, and the brain’s nerve cells are too slow and variable to be a good match for the transistors and logic gates that we use in modern computers. If the brain is not a serial algorithm-crunching machine, though, what is it? A lot of neuroscientists are inclined to disregard the big picture, focusing instead on understanding narrow, measurable phenomena (like the mechanics of how calcium ions are trafficked through a single neuron), without addressing the larger conceptual question of what it is that the brain does. This approach is misguided. Too many scientists have given up on the computer analogy, and far too little has been offered in its place. In my view, the analogy is due for a rethink. To begin with, all the standard arguments about why the brain might not be a computer are pretty weak. 
Take the argument that “brains are parallel, but computers are serial.” Critics are right to note that virtually every time a human does anything, many different parts of the brain are engaged; that’s parallel, not serial. © 2015 The New York Times Company

Keyword: Brain imaging
Link ID: 21099 - Posted: 06.27.2015

By Sarah Lewin Evolutionary biologists have long wondered why the eardrum—the membrane that relays sound waves to the inner ear—in humans and other mammals looks remarkably like the one in reptiles and birds. Did the membrane and therefore the ability to hear in these groups evolve from a common ancestor? Or did the auditory systems evolve independently to perform the same function, a phenomenon called convergent evolution? A recent set of experiments performed at the University of Tokyo and the RIKEN Evolutionary Morphology Laboratory in Japan resolves the issue. When the scientists genetically inhibited lower jaw development in both fetal mice and chickens, the mice formed neither eardrums nor ear canals. In contrast, the birds grew two upper jaws, from which two sets of eardrums and ear canals sprouted. The results, published in Nature Communications, confirm that the middle ear grows out of the lower jaw in mammals but emerges from the upper jaw in birds—all supporting the hypothesis that the similar anatomy evolved independently in mammals and in reptiles and birds. (Scientific American is part of Springer Nature.) Fossils of auditory bones had supported this conclusion as well, but eardrums do not fossilize and so could not be examined directly. © 2015 Scientific American

Keyword: Hearing; Evolution
Link ID: 21098 - Posted: 06.27.2015

By Michael Balter For much of the time dinosaurs were lording over the land, sleek marine reptiles called ichthyosaurs were the masters of the sea. The dolphinlike predators had enormous eyes for hunting and grew as long as 20 meters. But paleontologists have long been baffled by their brain structure, because most fossil specimens have been squished flat by marine sediments. One rare exception—discovered in the 1800s in southern England’s Bristol Channel—is a spectacularly preserved, 180-million-year-old ichthyosaur named Hauffiopteryx. Now, using computerized tomography (CT) scanning, researchers have created a 3D digital reconstruction of Hauffiopteryx’s skull, making a “ghost image” of its brain known as a digital endocast (above). The team, which reported its findings online earlier this month in Palaeontology, found that the brain’s optic lobes were particularly large; so were the cerebellum, which controls motor functions, and the olfactory region, where odors are processed. Taken together, the team concludes these features show ichthyosaurs were highly mobile predators with a keen sense of sight and smell. © 2015 American Association for the Advancement of Science.

Keyword: Evolution
Link ID: 21097 - Posted: 06.27.2015

Helen Shen In April 2011, Robert Froemke and his team were reprogramming the brains of virgin mice with a single hormone injection. Before the treatment, the female mice were largely indifferent to the cries of a distressed baby, and were even known to trample over them. But after an injection of oxytocin, the mice started to respond more like mothers, picking up the mewling pup in their mouths. Froemke, a neuroscientist at New York University's Langone Medical Center in New York City, was monitoring the animals' brains to find out why that happened. At first, the mice showed an irregular smattering of neural impulses when they heard the baby's cries. Then, as the oxytocin kicked in, the signal evolved into a more orderly pattern typical of a maternal brain. The study showed in unusual detail how the hormone changed the behaviour of neurons [1]. “Oxytocin is helping to transform the brain, to make it respond to those pup calls,” Froemke says. Oxytocin has been of keen interest to neuroscientists since the 1970s, when studies started to show that it could drive maternal behaviour and social attachment in various species. Its involvement in a range of social behaviours [2], including monogamy in voles, mother–infant bonding in sheep, and even trust between humans, has earned it a reputation as the 'hug hormone'. “People just concluded it was a bonding molecule, a cuddling hormone, and that's the pervasive view in the popular press,” says Larry Young, a neuroscientist at Emory University in Atlanta, Georgia, who has been studying the molecule since the 1990s. “What we need to start thinking about is the more fundamental role that oxytocin has in the brain.” © 2015 Nature Publishing Group

Keyword: Hormones & Behavior; Sexual Behavior
Link ID: 21095 - Posted: 06.25.2015

By JAIME LOWE The manila folder is full of faded faxes. The top sheet contains a brief description of my first medically confirmed manic episode, more than 20 years ago, when I was admitted as a teenager to U.C.L.A.’s Neuropsychiatric Institute: “Increased psychomotor rate, decreased need for sleep (about two to three hours a night), racing thoughts and paranoid ideation regarding her parents following her and watching her, as well as taping the phone calls that she was making.” I believed I had special powers, the report noted; I knew “when the end of the world was coming due to toxic substances” and felt that I was the only one who could stop it. There was also an account of my elaborate academic sponsorship plan so I could afford to attend Yale — some corporation would pay for a year of education in exchange for labor or repayment down the line. (Another grand delusion. I was a B-plus student, at best.) After I was admitted to the institute's adolescent ward, I thought the nurses and doctors and therapists were trying to poison me. So was the TV in the rec room. I warned my one friend in the ward that its rays were trying to kill him. The generator outside my window was pumping in gas. The place, I was sure, was a death camp. I refused meds because they were obviously agents of annihilation. It took four orderlies to medicate me: They pinned me to the floor while a nurse plunged a syringe into my left hip. Over time, I became too tired to refuse medication. Or perhaps the cocktail of antipsychotics started working. The Dixie cup full of pills included lithium, which slowly took hold of my mania. After a few weeks, I stopped whispering to the other patients that we were all about to be killed. Eventually, I stopped believing it myself. Mark DeAntonio, the U.C.L.A. psychiatrist who was treating me, said I had bipolar disorder. 
Here’s the phrasing from the National Institute of Mental Health: “unusually intense emotional states that occur in distinct periods called ‘mood episodes.’ Each mood episode represents a drastic change from a person’s usual mood and behavior. An overly joyful or overexcited state is called a manic episode, and an extremely sad or hopeless state is called a depressive episode.” The generic definition doesn’t quite cover the extremes of the disease or its symptoms, which include inflated self-esteem, sleeplessness, loquaciousness, racing thoughts and doing things that, according to the Mayo Clinic, “have a high potential for painful consequences — for example, unrestrained buying sprees, sexual indiscretions or foolish business investments.” © 2015 The New York Times Company

Keyword: Schizophrenia
Link ID: 21094 - Posted: 06.25.2015

By Nellie Bowles One recent Friday night, at a software-development firm’s warehouse in San Francisco, Mikey Siegel called to order the hundred and fifty or so meditators, video gamers, and technocrats who had gathered for one of the city’s biweekly Consciousness Hacking meet-ups. Siegel, the primary organizer of the event and the founder of a Santa Cruz–based biofeedback startup called Bio Fluent, asked the crowd, men and women of widely varied ages, to go around the room introducing themselves in three words. Everyone laughed, but took the task seriously. The introductions moved quickly through the room in a brisk beat:
“Me Technological Cartoon”
“Heather Curious About Brains”
“Neuromore Singularity Atom Here”
“Dan Thoughtful Helpful Software”
“Harry Self-Modification Exploration”
“David Psychiatrist Technological Retarded Curious”
“Jordan Moving Meditation Butts”
“Juliana Joel’s Aunt”
“Ben Existence Existence Existence”
“Zohara Chocolate Maker Meditation Awareness”
“Lila Awake Empath Warrior”
San Francisco’s Consciousness Hacking meet-ups are an opportunity for engineers, entrepreneurs, and enthusiasts to test the fleet of still experimental self-examination technologies emerging largely from Silicon Valley. The region’s tech community is a body culture, obsessed with monitoring and perfecting its food (Soylent), fitness (Fitbit), and physiology (23andMe). As brain-wave technologies get cheaper and more popular, some company founders hope that consumers, who seem to be acclimating to devices like the increasingly ubiquitous Fitbit, will consider other, more cumbersome devices and procedures. Consciousness Hackers are a kind of self-selected early market-research group. Tonight, that was especially clear.

Keyword: Brain imaging; Depression
Link ID: 21093 - Posted: 06.25.2015

By John Horgan Transcranial magnetic stimulation is becoming an increasingly popular treatment for depression in spite of a lack of objective evidence of effectiveness. Delving into the history of treatments for mental illness can be depressing. Rather than developing ever-more-potent therapies, psychiatrists and others in the mental-health industry seem merely to recycle old ones. Consider, for example, therapies that stimulate the brain with electricity. In 1901, H. Lewis Jones, a physician, stated in the Journal of Mental Science: "The employment of electricity in medicine has passed through many vicissitudes, being at one time recognized and employed at the hospitals, and again being neglected, and left for the most part in the hands of ignorant persons, who continue to perpetrate the grossest impositions in the name of electricity. As each fresh important discovery in electric science has been reached, men’s minds have been turned anew to the subject, and interest in its therapeutic properties has been stimulated. Then after extravagant hopes and promises of cure, there have followed failures, which have thrown the employment of this agent into disrepute, to be again after time revived and brought into popular favor." Jones’s concerns could apply to our era, when electro-cures for mental illness have once again been "brought into popular favor." Below I briefly review the evidence—or lack thereof—for five electrotherapies: transcranial magnetic stimulation, cranial electrotherapy stimulation, vagus-nerve stimulation, deep-brain stimulation and electroconvulsive therapy.

Keyword: Depression
Link ID: 21092 - Posted: 06.25.2015

By Sarah C. P. Williams Parrots are masters of mimicry, able to repeat hundreds of unique sounds, including human phrases, with uncanny accuracy. Now, scientists say they have pinpointed the neurons that turn these birds into copycats. The discovery could not only illuminate the origins of bird-speak, but might shed light on how new areas of the brain arise during evolution. Parrots, songbirds, and hummingbirds—which can all chirp different dialects, pick up new songs, and mimic sound—all have a “song nucleus” in their brains: a group of interconnected neurons that synchronizes singing and learning. But the exact boundaries of that region are fuzzy; some researchers define it as larger or smaller than others do, depending on what criteria they use to outline the area. And differences between the song nuclei of parrots—which can better imitate complex sounds—and other birds are hard to pinpoint. Neurobiologist Erich Jarvis of Duke University in Durham, North Carolina, was studying the activation of PVALB—a gene that had been previously found in songbirds—within the brains of parrots when he noticed something strange. Stained sections of deceased parrot brains revealed that the gene was turned on at distinct levels within two areas of what he thought was the song nucleus of the birds’ brains. Sometimes, the gene was activated in a spherical central core of the nucleus. But other times, it was only active in an outer shell of cells surrounding that core. When he and collaborators looked more closely, they found that the inner core and the outer shell—like the chocolate and surrounding candy shell of an M&M—varied in many more ways as well.

Keyword: Language; Evolution
Link ID: 21091 - Posted: 06.25.2015

By Megan Cartwright Don’t pet the platypus. I know it’s tempting: Given the chance, I’d want to stroke their thick brown fur, tickle those big webbed feet, and pat that funny duck bill. And why not? What harm could come from this cute, egg-laying mammal from eastern Australia? Plenty. As someone who doesn’t enjoy “long lasting excruciating pain that cannot be relieved with conventional painkillers,” I’d really regret petting a platypus. Especially a male platypus, in late winter, when there’s only one thing on his mind and, even worse, something nasty on his feet. When British biologist Sir Everard Home got ahold of some platypus specimens in 1801, he told his fellow nerds at the Royal Society how the male specimen had a half-inch long “strong crooked spur” on the heel of each rear foot. The female, however, was spur-free. Home suggested that it “is probably by means of these spurs or hooks, that the female is kept from withdrawing herself in the act of copulation.” A very reasonable suggestion. But a wrong one. To be fair to Home, he could only study dead platypuses. If Home could have spent a year hanging out with living platypuses in their river homes, he would’ve seen that this “shy, semi-aquatic, mainly nocturnal” mammal is mostly interested in hunting on the river bottom for delicious insect larvae, crayfish, and shrimp. In other words, the platypus is usually an eater, not a lover. © 2014 The Slate Group

Keyword: Neurotoxins; Pain & Touch
Link ID: 21090 - Posted: 06.25.2015