Most Recent Links
By Charles Schmidt The notion that the state of our gut governs our state of mind dates back more than 100 years. Many 19th- and early 20th-century scientists believed that accumulating wastes in the colon triggered a state of “auto-intoxication,” whereby poisons emanating from the gut produced infections that were in turn linked with depression, anxiety and psychosis. Patients were treated with colonic purges and even bowel surgeries until these practices were dismissed as quackery. The ongoing exploration of the human microbiome promises to bring the link between the gut and the brain into clearer focus. Scientists are increasingly convinced that the vast assemblage of microfauna in our intestines may have a major impact on our state of mind. The gut-brain axis seems to be bidirectional—the brain acts on gastrointestinal and immune functions that help to shape the gut's microbial makeup, and gut microbes make neuroactive compounds, including neurotransmitters and metabolites that also act on the brain. These interactions could occur in various ways: microbial compounds communicate via the vagus nerve, which connects the brain and the digestive tract, and microbially derived metabolites interact with the immune system, which maintains its own communication with the brain. Sven Pettersson, a microbiologist at the Karolinska Institute in Stockholm, has recently shown that gut microbes help to control leakage through both the intestinal lining and the blood-brain barrier, which ordinarily protects the brain from potentially harmful agents. Microbes may have their own evolutionary reasons for communicating with the brain. They need us to be social, says John Cryan, a neuroscientist at University College Cork in Ireland, so that they can spread through the human population. © 2015 Scientific American
By Roni Caryn Rabin When my mother, Pauline, was 70, she lost her sense of balance. She started walking with an odd shuffling gait, taking short steps and barely lifting her feet off the ground. She often took my hand, holding it and squeezing my fingers. Her decline was precipitous. She fell repeatedly. She stopped driving, and she could no longer ride her bike in a straight line along the C&O Canal. The woman who taught me the sidestroke couldn’t even stand in the shallow end of the pool. “I feel like I’m drowning,” she’d say. A retired psychiatrist, my mother had numerous advantages — education, resources and insurance — but, still, getting the right diagnosis took nearly 10 years. Each expert saw the problem through the narrow prism of a single specialty. Surgeons recommended surgery. Neurologists screened for common incurable conditions. The answer was under their noses, in my mother’s hunches and her family history. But it took a long time before someone connected the dots. My mother was using a walker by the time she was told she had a rare condition that causes gait problems and cognitive loss, and is one of the few treatable forms of dementia. The bad news was that it had taken so long to get the diagnosis that some of the damage might not be reversible. “This should be one of the first things physicians look for in an older person,” my mother said recently. “You can actually do something about it.”
Link ID: 20643 - Posted: 03.03.2015
By Felicity Muth Visual illusions are fun: we know with our rational mind that, for example, these lines are parallel to each other, yet they don’t appear that way. Similarly, I could swear that squares A and B are different colours. But they are not. This becomes clearer when a connecting block is drawn between the two squares (see the image below). Illusions aren’t just fun tricks for us to play with; they can also tell us something about our minds. Things in the world look to us a certain way, but that doesn’t mean that they are that way in reality. Rather, our brain represents the world to us in a particular way; one that has been selected over evolutionary time. Having such a system means that, for example, we can see some animals running but not others; we couldn’t see a mouse moving from a mile away like a hawk could. This is because there haven’t been the evolutionary selective pressures on our visual system to be able to do such a thing, whereas there have been on the hawk’s. We can also see a range of wavelengths of light, represented as particular colours in our brain, while not being able to see other wavelengths (that, for example, bees and birds can see). Having a system limited by what evolution has given us means that there are many things we are essentially blind to (and wouldn’t know about if it weren’t for technology). It also means that sometimes our brain misrepresents physical properties of the external world in a way that can be confusing once our rational mind realises it. Of course, all animals have their own representation of the world. How a dog visually perceives the world will be different to how we perceive it. But how can we know how other animals perceive the world? What is their reality? One way we can try to get this is through visual illusions. © 2015 Scientific American
By Nicholas Bakalar Sleeping more than eight hours a day is associated with a higher risk for stroke, a new study has found. Researchers studied 9,692 people, ages 42 to 81, who had never had a stroke. The study tracked how many hours a night the people slept at the beginning of the study and how much nightly sleep they were getting four years later. Over the 10-year study, 346 of the study subjects suffered strokes. After controlling for more than a dozen other health and behavioral variables, the researchers found that people who slept more than eight hours a day were 46 percent more likely to have had a stroke than those who slept six to eight hours. The study, published online last week in Neurology, also found that the risk of stroke was higher among people who reported that their need for sleep had increased over the study period. The authors caution that the data on sleep duration depended on self-reports, which can be unreliable. In addition, the study identified an association between sleep and stroke risk, rather than cause and effect. Sleeping more may be an early symptom of disease that leads to stroke, rather than a cause. “It could be that there’s already something happening in the brain that precedes the stroke risk and of which excessive sleep is an early sign,” said the lead author, Yue Leng, a doctoral candidate at the University of Cambridge. In any case, she added, “we don’t have enough evidence to apply this in clinical settings. We don’t want people to think if they sleep longer it will necessarily lead to stroke.” © 2015 The New York Times Company
By Christof Koch In the Dutch countryside, a tall, older man, dressed in a maroon sports coat, his back slightly stooped, stands out because of his height and a pair of extraordinarily bushy eyebrows. His words, inflected by a British accent, are directed at a middle-aged man with long, curly brown hair, penetrating eyes and a dark, scholarly gown, who talks in only a halting English that reveals his native French origins. Their strangely clashing styles of speaking and mismatched clothes do not seem to matter to them as they press forward, with Eyebrows peering down intently at the Scholar. There is something distinctly odd about the entire meeting—a crossing of time, place and disciplines. Eyebrows: So I finally meet the man who doubts everything. The Scholar: (not missing a beat) At this time, I admit nothing that is not necessarily true. I'm famous for that! Eyebrows: Is there anything that you are certain of? (sotto voce) Besides your own fame? The Scholar: (evading the sarcastic jibe) I can't be certain of my fame. Indeed, I can't even be certain that there is a world out there, for I could be dreaming or hallucinating it. I can't be certain about the existence of my own body, its shape and extension, its corporality, for again I might be fooling myself. But now what am I, when I suppose that there is some supremely powerful and, if I may be permitted to say so, malicious deceiver who deliberately tries to fool me in any way he can? Given this evil spirit, how do I know that my sensations about the outside world—that is, it looks, feels and smells in a particular way—are not illusions, conjured up by Him to deceive me? It seems to me that therefore I can never know anything truly about the world. Nothing, rien du tout. I have to doubt everything. © 2015 Scientific American
Link ID: 20640 - Posted: 03.03.2015
By JULIE HOLLAND WOMEN are moody. By evolutionary design, we are hard-wired to be sensitive to our environments, empathic to our children’s needs and intuitive of our partners’ intentions. This is basic to our survival and that of our offspring. Some research suggests that women are often better at articulating their feelings than men because as the female brain develops, more capacity is reserved for language, memory, hearing and observing emotions in others. These are observations rooted in biology, not intended to mesh with any kind of pro- or anti-feminist ideology. But they do have social implications. Women’s emotionality is a sign of health, not disease; it is a source of power. But we are under constant pressure to restrain our emotional lives. We have been taught to apologize for our tears, to suppress our anger and to fear being called hysterical. The pharmaceutical industry plays on that fear, targeting women in a barrage of advertising on daytime talk shows and in magazines. More Americans are on psychiatric medications than ever before, and in my experience they are staying on them far longer than was ever intended. Sales of antidepressants and antianxiety meds have been booming in the past two decades, and they’ve recently been outpaced by an antipsychotic, Abilify, that is the No. 1 seller among all drugs in the United States, not just psychiatric ones. As a psychiatrist practicing for 20 years, I must tell you, this is insane. At least one in four women in America now takes a psychiatric medication, compared with one in seven men. Women are nearly twice as likely as men to receive a diagnosis of depression or anxiety disorder. For many women, these drugs greatly improve their lives. But for others they aren’t necessary. The increase in prescriptions for psychiatric medications, often by doctors in other specialties, is creating a new normal, encouraging more women to seek chemical assistance.
Whether a woman needs these drugs should be a medical decision, not a response to peer pressure and consumerism. © 2015 The New York Times Company
By ROBERT PEAR WASHINGTON — Federal investigators say they have found evidence of widespread overuse of psychiatric drugs by older Americans with Alzheimer’s disease, and are recommending that Medicare officials take immediate action to reduce unnecessary prescriptions. The findings will be released Monday by the Government Accountability Office, an arm of Congress, and come as the Obama administration has already been working with nursing homes to reduce the inappropriate use of antipsychotic medications like Abilify, Risperdal, Zyprexa and clozapine. But in the study, investigators said officials also needed to focus on overuse of such drugs by people with dementia who live at home or in assisted living facilities. The Department of Health and Human Services “has taken little action” to reduce the use of antipsychotic drugs by older adults living outside nursing homes, the report said. Doctors sometimes prescribe antipsychotic drugs to calm patients with dementia who display disruptive behavior like hitting, yelling or screaming, the report said. Researchers said this was often the case in nursing homes that had inadequate numbers of employees. Dementia is most commonly associated with a decline in memory, but doctors say it can also cause changes in mood or personality and, at times, agitation or aggression. Experts have raised concern about the use of antipsychotic drugs to address behavioral symptoms of Alzheimer’s and other forms of dementia. The Food and Drug Administration says antipsychotic drugs are often associated with an increased risk of death when used to treat older adults with dementia who also have psychosis. © 2015 The New York Times Company
By Neuroskeptic In an interesting short paper just published in Trends in Cognitive Sciences, Caltech neuroscientist Ralph Adolphs offers his thoughts on The Unsolved Problems of Neuroscience. Here’s Adolphs’ list of the top 23 questions (including 3 “meta” issues), which, he says, was inspired by Hilbert’s famous set of 23 mathematical problems:
Problems that are solved, or soon will be:
I. How do single neurons compute?
II. What is the connectome of a small nervous system, like that of Caenorhabditis elegans (300 neurons)?
III. How can we image a live brain of 100,000 neurons at cellular and millisecond resolution?
IV. How does sensory transduction work?
Problems that we should be able to solve in the next 50 years:
V. How do circuits of neurons compute?
VI. What is the complete connectome of the mouse brain (70,000,000 neurons)?
VII. How can we image a live mouse brain at cellular and millisecond resolution?
VIII. What causes psychiatric and neurological illness?
IX. How do learning and memory work?
X. Why do we sleep and dream?
XI. How do we make decisions?
XII. How does the brain represent abstract ideas?
Problems that we should be able to solve, but who knows when:
XIII. How does the mouse brain compute?
XIV. What is the complete connectome of the human brain (80,000,000,000 neurons)?
XV. How can we image a live human brain at cellular and millisecond resolution?
XVI. How could we cure psychiatric and neurological diseases?
XVII. How could we make everybody’s brain function best?
Problems we may never solve:
XVIII. How does the human brain compute?
XIX. How can cognition be so flexible and generative?
XX. How and why does conscious experience arise?
Meta-questions:
XXI. What counts as an explanation of how the brain works? (and which disciplines would be needed to provide it?)
XXII. How could we build a brain? (how do evolution and development do it?)
XXIII. What are the different ways of understanding the brain? (what is function, algorithm, implementation?)
Adolphs R (2015). The unsolved problems of neuroscience. Trends in Cognitive Sciences. PMID: 25703689
Link ID: 20637 - Posted: 03.02.2015
Antidepressants are the most commonly prescribed medication in the U.S., with one in 10 Americans currently taking pills like Zoloft and Lexapro to treat depression. But these pharmaceuticals are only fully effective roughly 30 percent of the time, and often come with troublesome side effects. In a controversial new paper published in the journal Neuroscience & Biobehavioral Reviews, psychologist Paul Andrews of McMaster University in Ontario argues that this failure of medication may be based in a misunderstanding of the underlying chemistry related to depression. Andrews surveyed 50 years' worth of research supporting the serotonin theory of depression, which suggests that the disease is caused by low levels of the "happiness" neurotransmitter, serotonin. But Andrews argues that depression may actually be caused by elevated levels of serotonin. And this fundamental misunderstanding may be responsible for inappropriate treatment: The most common antidepressants are selective serotonin re-uptake inhibitors (SSRIs), which block the reabsorption of serotonin in the brain, leaving more of it available in the synapse. Currently, scientists are unable to measure precisely how the brain releases and uses serotonin, because it can't be safely observed in a human brain. But Andrews points to research on animals which suggests that serotonin might work just the opposite from what we've assumed. ©2015 TheHuffingtonPost.com, Inc.
Link ID: 20636 - Posted: 03.02.2015
By JONATHAN MAHLER The mother of the bride wore white and gold. Or was it blue and black? From a photograph of the dress the bride posted online, there was broad disagreement. A few days after the wedding last weekend on the Scottish island of Colonsay, a member of the wedding band was so frustrated by the lack of consensus that she posted a picture of the dress on Tumblr, and asked her followers for feedback. “I was just looking for an answer because it was messing with my head,” said Caitlin McNeill, a 21-year-old singer and guitarist. Within a half-hour, her post attracted some 500 likes and shares. The photo soon migrated to Buzzfeed and Facebook and Twitter, setting off a social media conflagration that few were able to resist. As the debate caught fire across the Internet — even scientists could not agree on what was causing the discrepancy — media companies rushed to get articles online. Less than a half-hour after Ms. McNeill’s original Tumblr post, Buzzfeed posted a poll: “What Colors Are This Dress?” As of Friday afternoon, it had been viewed more than 28 million times. (White and gold was winning handily.) At its peak, more than 670,000 people were simultaneously viewing Buzzfeed’s post. Between that and the rest of Buzzfeed’s blanket coverage of the dress Thursday night, the site easily smashed its previous records for traffic. So did Tumblr. Everyone, it seems, had an opinion. And everyone was convinced that he, or she, was right. “I don’t understand this odd dress debate and I feel like it’s a trick somehow,” Taylor Swift wrote on Twitter. “PS it’s OBVIOUSLY BLUE AND BLACK.” “IT’S A BLUE AND BLACK DRESS!” wrote Mindy Kaling. “ARE YOU KIDDING ME,” she continued, including an unprintable modifier for emphasis. © 2015 The New York Times Company
Link ID: 20635 - Posted: 02.28.2015
By Pascal Wallisch If you are just encountering The Dress for the first time, you might first want to click here to see what all the fuss was about. The brain lives in a bony shell. The completely light-tight nature of the skull renders this home a place of complete darkness. So the brain relies on the eyes to supply an image of the outside world, but there are many processing steps between the translation of light energy into electrical impulses that happens in the eye and the neural activity that corresponds to a conscious perception of the outside world. In other words, the brain is playing a game of telephone and—contrary to popular belief—our perception corresponds to the brain’s best guess of what is going on in the outside world, not necessarily to the way things actually are. This has been recognized for at least 150 years, since the time of Hermann von Helmholtz. This week, it was recognized by masses of people on the Internet, who have been debating furiously over what should be a simple question: What color is this dress? Many parts of the brain contribute to any given perception, and it should not be surprising that different people can reconstruct the outside world in different ways. This is true for many perceptual qualities, including form and motion. While this guessing game is going on all the time, it is possible to demonstrate it clearly by generating impoverished stimulus displays that are consistent with different, mutually exclusive interpretations. That means the brain will not necessarily commit to one interpretation, but will switch back and forth. These are known as ambiguous or bi-stable stimuli, and they illustrate the point that the brain is ultimately only guessing when perceiving the world. It usually just has more information to disambiguate the interpretation. © 2014 The Slate Group LLC.
Link ID: 20634 - Posted: 02.28.2015
Carmen Fishwick It’s not every day that fashion and science come together to polarise the world. Tumblr blogger Caitlin posted a photograph of what is now known as #TheDress – a layered lace dress and jacket that was causing much distress among her friends. The distress spread rapidly across social media, with Taylor Swift admitting she was “confused and scared”. The internet is now divided firmly into two camps: the white and gold, and the blue and black – with each thinking the other is completely wrong. But Ron Chrisley, director of the Centre for Research in Cognitive Science at the University of Sussex, believes that the problem mainly lies in the fact that everyone has forgotten we are dealing with a colour illusion. Chrisley said: “The first step in reaching a truce in the dress war is to construct a demonstration that can show to the white-and-gold crowd how the very same dress can also look blue and black under different conditions.” The image below, tweeted by @namin3485, demonstrates that even though the right-hand side of each image is the same, in the context of the two different left halves, the right is interpreted as being either white and gold, or blue and black. So does this mean people who are less self-confident are more likely to be able to see both, at least eventually? Chrisley said: “My guess is it’s not to do with self-confidence. It’s a perceptual issue. I could imagine someone who’s open-minded could still see it only one way. This is below the level of us trying to understand other people’s views. It’s more physiological than that.” © 2015 Guardian News and Media Limited
Link ID: 20633 - Posted: 02.28.2015
By Adam Rogers The fact that a single image could polarize the entire Internet into two aggressive camps is, let’s face it, just another Thursday. But for the past half-day, people across social media have been arguing about whether a picture depicts a perfectly nice bodycon dress as blue with black lace fringe or white with gold lace fringe. And neither side will budge. This fight is about more than just social media—it’s about primal biology and the way human eyes and brains have evolved to see color in a sunlit world. Light enters the eye through the lens—different wavelengths corresponding to different colors. The light hits the retina in the back of the eye where pigments fire up neural connections to the visual cortex, the part of the brain that processes those signals into an image. Critically, though, that first burst of light is made of whatever wavelengths are illuminating the world, reflecting off whatever you’re looking at. Without you having to worry about it, your brain figures out what color light is bouncing off the thing your eyes are looking at, and essentially subtracts that color from the “real” color of the object. “Our visual system is supposed to throw away information about the illuminant and extract information about the actual reflectance,” says Jay Neitz, a neuroscientist at the University of Washington. “But I’ve studied individual differences in color vision for 30 years, and this is one of the biggest individual differences I’ve ever seen.” (Neitz sees white-and-gold.) Usually that system works just fine. This image, though, hits some kind of perceptual boundary. That might be because of how people are wired. Human beings evolved to see in daylight, but daylight changes color. WIRED.com © 2015 Condé Nast
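The illuminant "subtraction" Neitz describes resembles a textbook white-balance trick known as the gray-world correction: estimate the illuminant as the average color of the scene, then divide it out of every pixel. The sketch below is an illustrative approximation of that idea in plain Python (the function name and the toy pixel values are my own, not from any vision-science source), not a model of what the visual system literally does.

```python
# Illustrative "gray-world" color-constancy sketch: estimate the
# illuminant as the average color of the scene, then rescale each
# channel so that average becomes neutral gray. A crude stand-in
# for the discounting the visual system performs.

def gray_world(pixels):
    """pixels: list of (r, g, b) floats. Returns illuminant-corrected pixels."""
    n = len(pixels)
    # Estimated illuminant: the mean of each color channel.
    avg = [sum(p[c] for p in pixels) / n for c in range(3)]
    gray = sum(avg) / 3.0
    # Scale each channel so the scene average becomes neutral gray.
    return [tuple(p[c] * gray / avg[c] for c in range(3)) for p in pixels]

# A scene with a strong bluish cast: after correction, both surfaces
# come out neutral, because the blue bias was attributed to the light.
scene = [(0.4, 0.4, 0.8), (0.2, 0.2, 0.4)]
corrected = gray_world(scene)
```

When two people's visual systems settle on different illuminant estimates for the same ambiguous photograph (bluish daylight versus warm indoor light), the "corrected" surface colors they perceive diverge in exactly the way the dress debate illustrates.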
Distinct changes in the immune systems of patients with ME or chronic fatigue syndrome have been found, say scientists. Increased levels of immune molecules called cytokines were found in people during the early stages of the disease, a Columbia University study reported. It said the findings could help improve diagnosis and treatments. UK experts said further refined research was now needed to confirm the results. People with ME (myalgic encephalopathy) or CFS (chronic fatigue syndrome) suffer from exhaustion that affects everyday life and does not go away with sleep or rest. They can also have muscle pain and difficulty concentrating. ME can also cause long-term illness and disability, although many people improve over time. It is estimated that around 250,000 people in the UK have the disease. Disease pattern The US research team, who published their findings in the journal Science Advances, tested blood samples from nearly 300 ME patients and around 350 healthy people. They found specific patterns of immune molecules in patients who had the disease for up to three years. These patients had higher levels of cytokines, particularly one called interferon gamma, which has been linked to the fatigue that follows many viral infections. Healthy patients and those who had the disease for longer than three years did not show the same pattern. Lead author Dr Mady Hornig said this was down to the way viral infections could disrupt the immune system. "It appears that ME/CFS patients are flush with cytokines until around the three-year mark, at which point the immune system shows evidence of exhaustion and cytokine levels drop."
Fatty liver disease, or the buildup of fat in the liver, and sleep apnea, marked by snoring and interruptions of breathing at night, share some things in common. The two conditions frequently strike people who are overweight or obese. Each afflicts tens of millions of Americans, and often the diseases go undiagnosed. Researchers used to believe that sleep apnea and fatty liver were essentially unrelated, even though they occur together in many patients. But now studies suggest that the two may be strongly linked, with sleep apnea directly exacerbating fatty liver. In a study published last year in the journal Chest, researchers looked at 226 obese middle-aged men and women who were referred to a clinic because they were suspected of having sleep apnea. They found that two-thirds had fatty liver disease, and that the severity of the disease increased with the severity of their sleep apnea. A study last year in The Journal of Pediatrics found a similar relationship in children. The researchers identified sleep apnea in 60 percent of young subjects with fatty liver disease. The worse their apnea episodes, the more likely they were to have fibrosis, or scarring of the liver. Though it is still somewhat unclear, some doctors suspect that the loss of oxygen from sleep apnea may increase chronic inflammation, which worsens fatty liver. Although fat in the liver can be innocuous at first, as inflammation sets in, the fat turns to scar tissue, and that can lead to liver failure. © 2015 The New York Times Company
Link ID: 20630 - Posted: 02.28.2015
By Nicholas Weiler The grizzled wolf stalks from her rival’s den, her mouth caked with blood of the pups she has just killed. It’s a brutal form of birth control, but only the pack leader is allowed to keep her young. For her, this is a selfish strategy—only her pups will carry on the future of the pack. But it may also help the group keep its own numbers in check and avoid outstripping its resources. A new survey of mammalian carnivores worldwide proposes that many large predators have the ability to limit their own numbers. The results, though preliminary, could help explain how top predators keep the food chains beneath them in balance. Researchers often assume that predator numbers grow and shrink based on their food supply, says evolutionary biologist Blaire Van Valkenburgh of the University of California, Los Angeles, who was not involved in the new study. But several recent examples, including an analysis of the resurgent wolves of Yellowstone National Park, revealed that some large predators keep their own numbers in check. The new paper is the first to bring all the evidence together, Van Valkenburgh says, and presents a “convincing correlation.” Hunting and habitat loss are killing off big carnivores around the world, just as ecologists are discovering how important they are for keeping ecosystems in balance. Mountain lions sustain woodlands by hunting deer that would otherwise graze the landscape bare. Coyotes protect scrub-dwelling birds by keeping raccoons and foxes in line. Where top carnivores disappear, these smaller predators often explode in numbers, with potentially disastrous consequences for small birds and mammals. © 2015 American Association for the Advancement of Science
By Elizabeth Pennisi Last week, researchers expanded the size of the mouse brain by giving rodents a piece of human DNA. Now another team has topped that feat, pinpointing a human gene that not only grows the mouse brain but also gives it the distinctive folds found in primate brains. The work suggests that scientists are finally beginning to unravel some of the evolutionary steps that boosted the cognitive powers of our species. “This study represents a major milestone in our understanding of the developmental emergence of human uniqueness,” says Victor Borrell Franco, a neurobiologist at the Institute of Neurosciences in Alicante, Spain, who was not involved with the work. The new study began when Wieland Huttner, a developmental neurobiologist at the Max Planck Institute of Molecular Cell Biology and Genetics in Dresden, Germany, and his colleagues started closely examining aborted human fetal tissue and embryonic mice. “We specifically wanted to figure out which genes are active during the development of the cortex, the part of the brain that is greatly expanded in humans and other primates compared to rodents,” says Marta Florio, the Huttner graduate student who carried out the main part of the work. That was harder than it sounded. Building a cortex requires several kinds of starting cells, or stem cells. The stem cells divide and sometimes specialize into other types of “intermediate” stem cells that in turn divide and form the neurons that make up brain tissue. To learn what genes are active in the two species, the team first had to develop a way to separate out the various types of cortical stem cells. © 2015 American Association for the Advancement of Science
by Helen Thomson We meet in a pub, we have a few drinks, some dinner and then you lean in for a kiss. You predict, based on our previous interactions, that the kiss will be reciprocated – rather than landing you with a slap in the face. All our social interactions require us to anticipate another person's undecided intentions and actions. Now, researchers have discovered specific brain cells that allow monkeys to do this. It is likely that the cells do the same job in humans. Keren Haroush and Ziv Williams at Harvard Medical School trained monkeys to play a version of the prisoner's dilemma, a game used to study cooperation. The monkeys sat next to each other and decided whether or not they wanted to cooperate with their companion, by moving a joystick to pick either option. Moving the joystick towards an orange circle meant cooperate, a blue triangle meant "not this time". Neither monkey could see the other's face, or receive any clues about their planned action. If the monkeys cooperated, both received four drops of juice. If one cooperated and the other decided not to, the one who cooperated received one drop, and the other received six drops of juice. If both declined to work together they both received two drops of juice. Once both had made their selections, they could see what the other monkey had chosen and hear the amount of juice their companion was enjoying. © Copyright Reed Business Information Ltd.
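The payoff structure described above can be sketched in a few lines of Python. The names and layout are illustrative (not taken from the study's materials); the juice amounts are exactly those in the article.

```python
# The monkeys' prisoner's dilemma: each chooses to cooperate (orange
# circle) or defect (blue triangle), and juice rewards, in drops,
# depend on the pair of choices.

PAYOFFS = {  # (choice_a, choice_b) -> (drops_a, drops_b)
    ("cooperate", "cooperate"): (4, 4),
    ("cooperate", "defect"):    (1, 6),
    ("defect",    "cooperate"): (6, 1),
    ("defect",    "defect"):    (2, 2),
}

def play_round(choice_a, choice_b):
    """Return the juice reward for each monkey in one round."""
    return PAYOFFS[(choice_a, choice_b)]

# Mutual cooperation beats mutual defection (4 > 2 drops each), but each
# monkey is tempted to defect unilaterally (6 > 4) -- the classic dilemma,
# which is why predicting the partner's undecided choice matters.
print(play_round("cooperate", "defect"))  # -> (1, 6)
```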
Link ID: 20627 - Posted: 02.27.2015
Elizabeth Gibney DeepMind, the Google-owned artificial-intelligence company, has revealed how it created a single computer algorithm that can learn how to play 49 different arcade games, including the 1970s classics Pong and Space Invaders. In more than half of those games, the computer became skilled enough to beat a professional human player. The algorithm — which has generated a buzz since publication of a preliminary version in 2013 (V. Mnih et al. Preprint at http://arxiv.org/abs/1312.5602; 2013) — is the first artificial-intelligence (AI) system that can learn a variety of tasks from scratch given only the same, minimal starting information. “The fact that you have one system that can learn several games, without any tweaking from game to game, is surprising and pretty impressive,” says Nathan Sprague, a machine-learning scientist at James Madison University in Harrisonburg, Virginia. DeepMind, which is based in London, says that the brain-inspired system could also provide insights into human intelligence. “Neuroscientists are studying intelligence and decision-making, and here’s a very clean test bed for those ideas,” says Demis Hassabis, co-founder of DeepMind. He and his colleagues describe the gaming algorithm in a paper published this week (V. Mnih et al. Nature 518, 529–533; 2015). Games are to AI researchers what fruit flies are to biology — a stripped-back system in which to test theories, says Richard Sutton, a computer scientist who studies reinforcement learning at the University of Alberta in Edmonton, Canada. “Understanding the mind is an incredibly difficult problem, but games allow you to break it down into parts that you can study,” he says. But so far, most human-beating computers — such as IBM’s Deep Blue, which beat chess world champion Garry Kasparov in 1997, and the recently unveiled algorithm that plays Texas Hold ’Em poker essentially perfectly (see Nature http://doi.org/2dw; 2015)—excel at only one game. © 2015 Nature Publishing Group
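The core idea Sutton studies, learning from reward alone with no game-specific tweaking, can be sketched with the classic tabular Q-learning update on a toy task. This is a minimal illustration of the reinforcement-learning principle, not DeepMind's method: their deep Q-network replaces the lookup table below with a deep neural network over raw screen pixels and adds techniques such as experience replay.

```python
import random

# Tabular Q-learning on a toy 5-state corridor: the agent moves left or
# right, and receives a reward only on reaching the rightmost state.
# The agent is told nothing about the task except the reward signal.

N_STATES = 5             # states 0..4; state 4 gives reward and ends the episode
ACTIONS = (-1, +1)       # move left, move right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    """Environment dynamics: clamp to the corridor, reward at the right end."""
    nxt = min(max(state + action, 0), N_STATES - 1)
    reward = 1.0 if nxt == N_STATES - 1 else 0.0
    return nxt, reward

random.seed(0)
for _ in range(500):                      # training episodes
    s = 0
    while s != N_STATES - 1:
        if random.random() < EPS:         # occasional random exploration
            a = random.choice(ACTIONS)
        else:                             # otherwise act greedily
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        nxt, r = step(s, a)
        best_next = max(Q[(nxt, act)] for act in ACTIONS)
        # Temporal-difference update: nudge Q toward reward + discounted future value.
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])
        s = nxt

# After training, the greedy policy moves right from every state.
policy = {s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(N_STATES - 1)}
```

The same update rule, scaled up with a neural network as the value estimator, is what lets one system learn many different games from the same starting point.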
By Michael Erard Freckle, a male rhesus monkey, was greeted warmly by his fellow monkeys at his new home in Amherst, Massachusetts, when he arrived in 2000. But he didn’t return the favor: He terrorized his cagemate by stealing his fleece blanket and nabbed each new blanket the researchers added, until he had 10 and his cagemate none. After a few months, Freckle had also acquired a new name: Ivan, short for Ivan the Terrible. Freckle/Ivan, now at Melinda Novak’s primate research lab at the University of Massachusetts, may be unusual in having two names, but all of his neighbors have at least one moniker, Novak says. “You can say, ‘Kayla and Zoe are acting out today,’ and everybody knows who Kayla and Zoe are,” Novak says. “If you say ‘ZA-56 and ZA-65 are acting up today,’ people pause.” Scientists once shied away from naming research animals, and many of the millions of mice and rats used in U.S. research today go nameless, except for special individuals. But a look at many facilities suggests that most of the other 891,161 U.S. research animals have proper names, including nonhuman primates, dogs, pigs, rabbits, cats, and sheep. Rats are Pia, Splinter, Oprah, Persimmon. Monkeys are Nyah, Nadira, Tas, Doyle. One octopus is called Nixon. Breeder pairs of mice are “Tom and Katie,” or “Brad and Angelina.” If you’re a mouse with a penchant for escape, you’ll be Mighty Mouse or Houdini. If you’re a nasty mouse, you’ll be Lucifer or Lucifina. Animals in research are named after shampoos, candy bars, whiskeys, family members, movie stars, and superheroes. They’re named after Russians (Boris, Vladimir, Sergei), colors, the Simpsons, historical figures, and even rival scientists. These unofficial names rarely appear in publications, except sometimes in field studies of primates. But they’re used daily. © 2015 American Association for the Advancement of Science.
Keyword: Animal Rights
Link ID: 20625 - Posted: 02.27.2015