Chapter 16.
Carl Zimmer In the early 1970s, Sarah Blaffer Hrdy, then a graduate student at Harvard, traveled to India to study Hanuman langurs, monkeys that live in troops, each made up of several females and a male. From time to time, Dr. Hrdy observed a male invade a troop, driving off the patriarch. And sometimes the new male performed a particularly disturbing act of violence: he attacked the troop’s infants.

There had been earlier reports of infanticide by adult male mammals, but scientists mostly dismissed the behavior as an unimportant pathology. Then, in 1974, Dr. Hrdy made a provocative counterproposal: infanticide, she said, is the product of mammalian evolution. By killing off babies of other fathers, a male improves his chances of having more of his own offspring.

Dr. Hrdy went on to become a professor at the University of California, Davis, and over the years she broadened her analysis, arguing that infanticide might well be a common feature of mammalian life. She spurred generations of scientists to document the behavior in hundreds of species. “She’s the goddess of all this stuff,” said Kit Opie, a primatologist at University College London.

Forty years after Dr. Hrdy’s initial proposal, two evolutionary biologists at the University of Cambridge have surveyed the evolution of infanticide across all mammals. In a paper published Thursday in Science, the scientists concluded that only certain conditions favor the evolution of infanticide — the conditions that Dr. Hrdy had originally proposed. “My main comment is, ‘Well done,’” said Dr. Hrdy. She said the study was particularly noteworthy for its scope, ranging from opossums to lions.

The authors of the new study, Dieter Lukas and Elise Huchard, started by plowing through the scientific literature, looking for evidence of infanticide in a variety of mammalian species. The researchers ended up with data on 260 species, and in 119 of them — over 45 percent — males had been observed killing unrelated young animals.
© 2014 The New York Times Company
by Helen Thomson Could a futuristic society of humans with the power to control their own biological functions ever become reality? It's not as out there as it sounds, now the technical foundations have been laid. Researchers have created a link between thoughts and cells, allowing people to switch on genes in mice using just their thoughts. "We wanted to be able to use brainwaves to control genes. It's the first time anyone has linked synthetic biology and the mind," says Martin Fussenegger, a bioengineer at ETH Zurich in Basel, Switzerland, who led the team behind the work. They hope to use the technology to help people who are "locked-in" – that is, fully conscious but unable to move or speak – to do things like self-administer pain medication. It might also be able to help people with epilepsy control their seizures. In theory, the technology could be used for non-medical purposes, too. For example, we could give ourselves a hormone burst on demand, much like in the Culture – Iain M. Banks's utopian society, where people are able to secrete hormones and other chemicals to change their mood. Fussenegger's team started by inserting a light-responsive gene into human kidney cells in a dish. The gene is activated, or expressed, when exposed to infrared light. The cells were engineered so that when the gene activated, it caused a cascade of chemical reactions leading to the expression of another gene – the one the team wanted to switch on. © Copyright Reed Business Information Ltd.
Details of the role of glutamate, the brain’s excitatory chemical, in a drug reward pathway have been identified for the first time. This discovery in rodents — published today in Nature Communications — shows that stimulation of glutamate neurons in a specific brain region (the dorsal raphe nucleus) leads to activation of dopamine-containing neurons in the brain’s reward circuit (dopamine reward system). Dopamine is a neurotransmitter present in regions of the brain that regulate movement, emotion, motivation, and feelings of pleasure. Glutamate is a neurotransmitter whose receptors are important for neural communication, memory formation, and learning. The research was conducted at the Intramural Research Program (IRP) of the National Institute on Drug Abuse (NIDA), which is part of the National Institutes of Health. The research focused on the dorsal raphe nucleus, which has long been a brain region of interest to drug abuse researchers, since nerve cells in this area connect to part of the dopamine reward system. Many of the pathways are rich in serotonin, a neurotransmitter linked to mood regulation. Even though electrical stimulation of the dorsal raphe nucleus promotes reward-related behaviors, drugs that increase serotonin have low abuse potential. As a result, this region of the brain has always presented a seeming contradiction, since it is involved in drug reward but is also abundant in serotonin, a chemical not known for a role in drug reinforcement. This has led researchers to theorize that another neurotransmitter may be responsible for the role that the dorsal raphe nucleus plays in reward.
Keyword: Drug Abuse
Link ID: 20308 - Posted: 11.13.2014
By Kate Kelland LONDON (Reuters) - British scientists say they have found the best way yet to analyze the effects of smoking on the brain -- by taking functional magnetic resonance imaging (fMRI) scans of people while they puff on e-cigarettes. In a small pilot study, the researchers used electronic cigarettes, or e-cigarettes, to mimic the behavioral aspects of smoking tobacco cigarettes, and say future studies could help scientists understand why smoking is so addictive. E-cigarettes use battery-powered cartridges to produce a nicotine-laced vapor to inhale -- hence the new term "vaping". Their use has rocketed in recent years, but there is fierce debate about the risks and benefits. Some public health experts say they could help millions quit tobacco cigarettes, while others argue they could "normalize" the habit and lure children into smoking. While that argument rages, tobacco kills some 6 million people a year, and the World Health Organization estimates that could rise beyond 8 million by 2030. Matt Wall, an imaging scientist at Imperial College London who led the study using e-cigarettes, said he was not aiming to pass judgment on their rights or wrongs, but to use them to dig deeper into smoking addiction. The fact that other forms of nicotine replacement therapy, such as patches or gum, have had only limited success in getting hardened smokers to quit suggests they are hooked on more than just nicotine, he noted. © 2014 Scientific American
Emily Anthes Anna's life began to unravel in 2005 when her husband of 30 years announced that he had fallen in love with another woman. “It had never even occurred to me that my marriage could ever end,” recalls Anna, a retired lawyer then living in Philadelphia, Pennsylvania. “It was pretty shocking.” Over the course of several months, Anna stopped wanting to get up in the morning. She felt tired all the time, and consumed by negative thoughts. “'I'm worthless.' 'I messed up everything.' 'It's all my fault.'” She needed help, but her first therapist bored her and antidepressants only made her more tired. Then she found Cory Newman, director of the Center for Cognitive Therapy at the University of Pennsylvania, who started her on a different kind of therapy. Anna learned how to obsess less over her setbacks and give herself more credit for her triumphs. “It was so helpful to talk to someone who steered me to more positive ways of thinking,” says Anna, whose name has been changed at her request. Cognitive therapy, commonly known as cognitive behavioural therapy (CBT), aims to help people to identify and change negative, self-destructive thought patterns. And although it does not work for everyone with depression, data have been accumulating in its favour. “CBT is one of the clear success stories in psychotherapy,” says Stefan Hofmann, a psychologist at Boston University in Massachusetts. Antidepressant drugs are usually the first-line treatment for depression. They are seen as a quick, inexpensive fix — but clinical trials reveal that only 22–40% of patients emerge from depression with drugs alone. Although there are various approaches to psychotherapy, CBT is the most widely studied; a meta-analysis published this year revealed that, depending on how scientists measure outcomes, between 42% and 66% of patients no longer meet the criteria for depression after therapy. © 2014 Nature Publishing Group
Link ID: 20306 - Posted: 11.13.2014
Heidi Ledford If the extent of human suffering were used to decide which diseases deserve the most medical attention, then depression would be near the top of the list. More than 350 million people are affected by depression, making it one of the most common disorders in the world. It is the biggest cause of disability, and as many as two-thirds of those who commit suicide have the condition. But although depression is common, it is often ignored. Three-quarters of people with depression in the United Kingdom go undiagnosed or untreated — and even if the disorder is diagnosed, today's medications will work well for only about half of those who seek help. “It's unbelievable,” says Tom Foley, a psychiatrist at Newcastle University, UK. “If that was the case in cancer care, it would be an absolute scandal.” The comparison between depression and cancer is a common one. Cancer, too, is a terrible blight: it affects more than 32 million people and kills some 8 million a year, many more than depression. But at least in developed countries, the vast majority of people with recognized cancers do receive treatment. In research, too, depression has failed to keep up with cancer. Cancer research today is a thriving field, unearthing vast catalogues of disease-associated mutations, cranking out genetically targeted therapies and developing sophisticated animal models. Research into depression, meanwhile, seems to have floundered: once-hopeful therapies have failed in clinical trials, and genetic studies have come up empty-handed. The field is still struggling even to define the disease — and to overcome the stigma associated with it. © 2014 Nature Publishing Group
Link ID: 20305 - Posted: 11.13.2014
By Abby Phillip You know the ones: They seem to be swaying to their own music or clapping along to a beat only they can hear. You may even think that describes you. The majority of humans, however, do this very well. We clap, dance, march in unison with few problems; that ability is part of what sets us apart from other animals. But it is true that rhythm — specifically, coordinating your movement with something you hear — doesn't come naturally to some people. Those people represent a very small sliver of the population and have a real disorder called "beat deafness." Unfortunately, your difficulty dancing or keeping time in band class probably doesn't quite qualify. A new study by McGill University researchers looked more closely at what might be going on with "beat deaf" individuals, and the findings may shed light on why some people seem to be rhythm masters while others struggle. Truly beat deaf people have a very difficult time clapping or tapping to an auditory beat or swaying to one. It's a problem that is far more severe than a lack of coordination. And it isn't attributable to motor skills, hearing problems or even a person's inability to create a regular rhythm. Illustrating how rare the disorder really is, McGill scientists received hundreds of inquiries from people who thought they were beat deaf, but only two qualified as having truly severe problems.
Link ID: 20304 - Posted: 11.13.2014
By Jia You Like humans, flies are attracted to alcohol. Fruit flies (Drosophila melanogaster) prefer to lay their eggs on rotten food that can contain ethanol at concentrations as high as 7%. (That’s 14 proof to you bar hoppers.) And just like people, the insects differ in their ability to hold their drinks. Biologists know that compared with flies from tropical Africa, flies from temperate regions such as Europe survive longer when exposed to ethanol vapors of high concentrations, and they know it has something to do with enzymes on the flies’ second chromosomes, which break down alcohol and are more active in European flies. But now, biologist James Fry of the University of Rochester in New York has pinpointed a missing piece of the story: the role played by the flies’ third chromosomes. After studying flies collected from Vienna and Cameroon, Fry found that the Vienna flies break down alcohol much faster than Cameroon ones, as expected. But when he replaced the third chromosomes in Cameroon flies with those from Vienna, the African flies gained much more resistance, Fry reports online today in The Journal of Experimental Biology. In a specialized population of flies that could not detoxify alcohol, however, the genetic engineering made no difference whatsoever. Fry suggests that’s because the third chromosomes in European flies help them tolerate acetic acid, a byproduct of internal alcohol breakdown that also gives vinegar its sour taste. There’s no telling what the acetic acid does to the flies, but previous studies on mice have found that it may be responsible for hangover headaches, Fry says. © 2014 American Association for the Advancement of Science
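The proof figure in the parenthetical above is simple arithmetic: on the US scale, proof is twice the alcohol-by-volume percentage. A minimal sketch of that conversion (the function name is ours, not from the study):

```python
def abv_to_proof(abv_percent: float) -> float:
    """Convert alcohol by volume (%) to US proof (proof = 2 x ABV)."""
    return 2.0 * abv_percent

# The ~7% ethanol in rotten fruit mentioned above:
print(abv_to_proof(7.0))  # -> 14.0, i.e. 14 proof
```

The same rule gives the familiar anchor points: 50% ABV spirits are 100 proof.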
By David Grimm Place a housecat next to its direct ancestor, the Near Eastern wildcat, and it may take you a minute to spot the difference. They’re about the same size and shape, and, well, they both look like cats. But the wildcat is fierce and feral, whereas the housecat, thanks to nearly 10,000 years of domestication, is tame and adaptable enough to have become the world’s most popular pet. Now scientists have begun to pinpoint the genetic changes that drove this remarkable transformation. The findings, based on the first high-quality sequence of the cat genome, could shed light on how other creatures, even humans, become tame. “This is the closest thing to a smoking gun we’ve ever had,” says Greger Larson, an evolutionary biologist at the University of Oxford in the United Kingdom who has studied the domestication of pigs, dogs, and other animals. “We’re much closer to understanding the nitty-gritty of domestication than we were a decade ago.” Cats first entered human society about 9500 years ago, not long after people first took up farming in the Middle East. Drawn to rodents that had invaded grain stores, wildcats slunk out of the deserts and into villages. There, many scientists suspect, they mostly domesticated themselves, with the friendliest ones able to take advantage of human table scraps and protection. Over thousands of years, cats shrank slightly in size, acquired a panoply of coat colors and patterns, and (largely) shed the antisocial tendencies of their past. Domestic animals from cows to dogs have undergone similar transformations, yet scientists know relatively little about the genes involved. Researchers led by Michael Montague, a postdoc at the Washington University School of Medicine in St. Louis, have now pinpointed some of them. The scientists started with the genome of a domestic cat—a female Abyssinian—that had been published in draft form in 2007, then filled in missing sequences and identified genes.
They compared the resulting genome with those of cows, tigers, dogs, and humans. © 2014 American Association for the Advancement of Science.
By Meeri Kim Patients suffering from pagophagia compulsively crave and chomp on ice, even scraping buildup off freezer walls for a fix. The disorder appears to be caused by an iron deficiency, and supplements of the mineral tend to ease the cravings. But what is it about ice that makes it so irresistible? A new study proposes that, like a strong cup of coffee, ice may give those with insufficient iron a much-needed mental boost. Fatigue is the most common symptom of iron-deficiency anemia, which occurs when the body can’t produce enough oxygen-carrying hemoglobin because of low iron. “I had a friend who was suffering from iron-deficiency anemia who was just crunching through massive amounts of ice a day,” said study author Melissa Hunt, a clinical psychologist at the University of Pennsylvania. “She said: ‘It’s like a cup of coffee. I don’t feel awake until I have a cup of ice in my hand.’ ” Hunt and her colleagues had both anemic and healthy subjects complete a standardized, 22-minute attention test commonly used to diagnose attention deficit hyperactivity disorder. Just before the test, participants were given either a cup of ice or lukewarm water to consume. Iron-deficient subjects who had sipped on water performed far more sluggishly on the test than controls, as expected. But those who ate ice beforehand did just as well as their healthy counterparts. For healthy subjects, having a cup of ice instead of water appeared to make no difference in test performance. “It’s not like craving a dessert. It’s more like needing a cup of coffee or that cigarette,” Hunt said.
Link ID: 20296 - Posted: 11.10.2014
By James Gallagher Health editor, BBC News website The brain has specialist neurons for each of the five taste categories - salty, bitter, sour, sweet and umami - US scientists have discovered. The study, published in the journal Nature, should settle years of debate on how the brain perceives taste. The Columbia University team showed the separate taste sensors on the tongue had a matching partner in the brain. The scientists hope the findings could be used to help reverse the loss of taste sensation in the elderly. It is a myth that you taste sweet only on the tip of the tongue. Each of the roughly 8,000 taste buds scattered over the tongue is capable of sensing the full suite of tastes. But specialised cells within the taste bud are tuned to either salty, bitter, sour, sweet or umami tastes. When they detect the signal, a message is sent to the brain. How the brain deals with that information, however, has been a matter of debate. A team at Columbia University engineered mice so that their taste neurons would fluoresce when they were activated. They then trained their endoscopes on the neurons deep at the base of the brain. The animals were fed chemicals to trigger either a salty, bitter, sour, sweet or umami response on the tongue and the researchers monitored the change in the brain. They found a "hard wired" connection between tongue and brain. Prof Charles Zuker told the BBC News website: "The cells were beautifully tuned to discrete individual taste qualities, so you have a very nice match between the nature of the cells in your tongue and the quality they represent [in the brain]." It scotches the alternative idea that brain cells respond to multiple tastes. BBC © 2014
Keyword: Chemical Senses (Smell & Taste)
Link ID: 20295 - Posted: 11.10.2014
Stem cells can be used to heal the damage in the brain caused by Parkinson's disease, according to scientists in Sweden. They said their study on rats heralded a "huge breakthrough" towards developing effective treatments. There is no cure for the disease, but medication and brain stimulation can alleviate symptoms. Parkinson's UK said there were many questions still to be answered before human trials could proceed. The disease is caused by the loss of nerve cells in the brain that produce the chemical dopamine, which helps to control mood and movement. To simulate Parkinson's, Lund University researchers killed dopamine-producing neurons on one side of the rats' brains. They then converted human embryonic stem cells into neurons that produced dopamine. These were injected into the rats' brains, and the researchers found evidence that the damage was reversed. There have been no human clinical trials of stem-cell-derived neurons, but the researchers said they could be ready for testing by 2017. Malin Parmar, associate professor of developmental and regenerative neurobiology, said: "It's a huge breakthrough in the field [and] a stepping stone towards clinical trials." A similar method has been tried in a limited number of patients. It involved taking brain tissue from multiple aborted foetuses to heal the brain. Clinical trials were abandoned after mixed results, but about a third of the patients had foetal brain cells that functioned for 25 years. BBC © 2014
Link ID: 20292 - Posted: 11.08.2014
By Tracy Jarrett Autism advocates on Friday applauded Jerry Seinfeld's disclosure that he may be autistic, while warning against making him the poster boy for a disorder that is no laughing matter. “I think, on a very drawn-out scale, I think I’m on the spectrum,” Seinfeld told NBC Nightly News’ Brian Williams. "Basic social engagement is really a struggle. I'm very literal, when people talk to me and they use expressions, sometimes I don't know what they're saying," he said. "But I don't see it as dysfunctional, I just think of it as an alternate mindset." Seinfeld's revelation sends a positive message that the autism community is much larger and more diverse than people often understand, Ari Ne’eman, president of the Autistic Self Advocacy Network, told NBC News. Ne’eman is living with autism and says that there is still a tremendous amount of stigma surrounding autism that hinders the opportunities available to those with the disorder. “Think about what this does for a closeted autistic person who goes into the workplace knowing that their co-workers have just seen somebody they know, respect, and have a positive opinion of, like Jerry Seinfeld, identify in this way — it’s a valuable and important step in building a greater tolerance for autism,” Ne’eman said. Liz Feld, president of Autism Speaks, agreed, pointing out that “there are many people on the autism spectrum who can relate to Jerry’s heartfelt comments about his own experiences.”
Link ID: 20289 - Posted: 11.08.2014
by Penny Sarchet It's frustrating when your smartphone loses its signal in the middle of a call or when downloading a webpage. But for a bat, a sudden loss of its sonar signal means missing an insect meal in mid-flight. Now there's evidence to suggest that bats are sneakily using sonar jamming techniques to make their fellow hunters miss their tasty targets. Like other bats, the Mexican free-tailed bat uses echolocation to pinpoint prey insects in the dark. But when many bats hunt in the same space, they can interfere with each other's echoes, making detection more difficult. Jamming happens when a sound disrupts a bat's ability to extract location information from the echoes returning from its prey, explains Aaron Corcoran of Johns Hopkins University in Baltimore, Maryland. Previous research has shown that Mexican free-tailed bats can get around this jamming by switching to higher pitches. Using different sound frequencies to map the hunting grounds around them allows many bats to hunt in the same space. In these studies, jamming of each other's signals was seemingly inadvertent – a simple consequence of two bats attempting to echolocate in close proximity. But Corcoran has found evidence of sneakier goings-on: a second type of sonar jamming in these bats – intentional sabotage of a fellow bat. "In this study, the jamming is on purpose and the jamming signal has been designed by evolution to maximally disrupt the other bat's echolocation," he says. © Copyright Reed Business Information Ltd.
Link ID: 20288 - Posted: 11.08.2014
By Dwayne Godwin and Jorge Cham © 2014 Scientific American
by Helen Thomson A man with the delusional belief that an impostor has taken his wife's place is helping shed light on how we recognise loved ones. Capgras syndrome is a rare condition in which someone insists that a person they are close to – most commonly a spouse – has been replaced by an impostor. Sometimes they even believe that a much-loved pet has been replaced by a lookalike. Anecdotal evidence suggests that people with Capgras only misidentify the people that they are closest to. Chris Fiacconi at Western University in London, Ontario, Canada, and his team wanted to explore this. They performed recognition tests and brain scans on two male volunteers with dementia – one who had Capgras, and one who didn't – and compared the results with those of 10 healthy men of a similar age. For months, the man with Capgras believed that his wife had been replaced by an impostor and was resistant to any counterargument, often asking his son why he was so convinced that the woman was his mother. First the team tested whether or not the volunteers could recognise celebrities they would have been familiar with throughout their lifetime, such as Marilyn Monroe. Volunteers were presented with celebrities' names, voices or pictures, and asked if they recognised them and, if so, how much information they could recall about that person. The man with Capgras was more likely to misidentify the celebrities by face or voice compared with the volunteer without Capgras, or the 10 healthy men. None of the volunteers had problems identifying celebrities by name (Frontiers in Human Neuroscience, doi.org/wrw). © Copyright Reed Business Information Ltd.
Kate Baggaley Much of the increase in autism diagnoses in recent decades may be tied to changes in how the condition is reported. Sixty percent of the increase in autism cases in Denmark can be explained by these changes, scientists report November 3 in JAMA Pediatrics. The researchers followed all 677,915 people born in Denmark from 1980 through 1991, monitoring them from birth through the end of 2011. Among children born in this period, diagnoses increased fivefold, until 1 percent of children born in the early 1990s were diagnosed with autism by age 20. During these decades, Denmark experienced two changes in the way autism is reported. In 1994, the criteria physicians rely on to diagnose autism were updated in both the International Classification of Diseases manual used by Denmark and in its American counterpart, the Diagnostic and Statistical Manual of Mental Disorders. Then in 1995, the Danish Psychiatric Register began reporting diagnoses where doctors had only outpatient contact with children, in addition to cases where autism was diagnosed after children had been kept overnight. The researchers estimated Danish children’s likelihood of being diagnosed with autism before and after the two reporting changes. These changes accounted for 60 percent of the increase in diagnoses. © Society for Science & the Public 2000 - 2014.
Link ID: 20280 - Posted: 11.05.2014
By Amy Robinson Whether you’re walking, talking or contemplating the universe, a minimum of tens of billions of synapses are firing at any given second within your brain. “The weak link in understanding ourselves is really about understanding how our brains generate our minds and how our minds generate our selves,” says MIT neuroscientist Ed Boyden. One cubic millimeter in the brain contains over 100,000 neurons connected through a billion synapses computing on a millisecond timescale. To understand how information flows within these circuits, we first need a “brain parts” list of neurons and glia. But such a list is not enough. We’ll also need to chart how cells are connected and to monitor their activity over time both electrically and chemically. Researchers can do this at small scale thanks to a technology developed in the 1970s called patch clamping. Bringing a tiny glass needle very near to a neuron living within a brain allows researchers to perform microsurgery on single neurons, piercing the cell membrane to do things like record the millivolt electrical impulses flowing through it. Patch clamping also facilitates measurement of proteins contained within the cell, revealing characteristic molecules and contributing to our understanding of why one neuron may behave differently than another. Neuroscientists can even inject glowing dyes in order to see the shape of cells. Patch clamping is a technique that has been used in neuroscience for 40 years. Why now does it make an appearance as a novel neuroscience technology? In a word: robots. © 2014 Scientific American
Keyword: Brain imaging
Link ID: 20279 - Posted: 11.05.2014
By Kelly Servick Using data from old clinical trials, two groups of researchers have found a better way to predict how amyotrophic lateral sclerosis (ALS) progresses in different patients. The winning algorithms—designed by non-ALS experts—outperformed the judgments of a group of ALS clinicians given the same data. The advances could make it easier to test whether new drugs can slow the fatal neurodegenerative disease. The new work was inspired by the so-called ALS Prediction Prize, a joint effort by the ALS-focused nonprofit Prize4Life and Dialogue for Reverse Engineering Assessments and Methods, a computational biology project whose sponsors include IBM, Columbia University, and the New York Academy of Sciences. Announced in 2012, the $50,000 award was designed to bring in experts from outside the ALS field to tackle the notoriously unpredictable illness. Liuxia Wang, a data analyst at the marketing company Sentrana in Washington, D.C., was used to helping companies make business decisions based on big data sets, such as information about consumer choices, but says she “didn’t know too much about this life science thing” until she got an unusual query from a client. One of the senior managers she worked with revealed that her son had just been diagnosed with ALS and wondered if Sentrana’s analytics could apply to patient data, too. When Wang set out to investigate, she found the ALS Prediction Prize. The next step, she said, was to learn something about ALS. The disease destroys the neurons that control muscle movement, causing gradual paralysis and eventually killing about half of patients within 3 years of diagnosis. But the speed of its progression varies widely. About 10% of patients live a decade or more after being diagnosed. That makes it hard for doctors to answer patients’ questions about the future, and it’s a big problem for testing new ALS treatments. © 2014 American Association for the Advancement of Science.
Keyword: ALS-Lou Gehrig's Disease
Link ID: 20278 - Posted: 11.04.2014
James Gorman Here is something to keep arachnophobes up at night. The inside of a spider is under pressure, like the air in a balloon, because spiders move by pushing fluid through valves. They are hydraulic. This works well for the spiders, but less so for those who want to study what goes on in the brain of a jumping spider, an aristocrat of arachnids that, according to Ronald R. Hoy, a professor of neurobiology and behavior at Cornell University, is one of the smartest of all invertebrates. If you insert an electrode into the spider’s brain, what’s inside might squirt out, and while that is not the kind of thing that most people want to think about, it is something that the researchers at Cornell had to consider. Dr. Hoy and his colleagues wanted to study jumping spiders because they are very different from most of their kind. They do not wait in a sticky web for lunch to fall into a trap. They search out prey, stalk it and pounce. “They’ve essentially become cats,” Dr. Hoy said. And they do all this with a brain the size of a poppy seed and a visual system that is completely different from that of a mammal: two big eyes dedicated to high-resolution vision and six smaller eyes that pick up motion. Dr. Hoy gathered four graduate students in various disciplines to solve the problem of recording activity in a jumping spider’s brain when it spots something interesting — a feat nobody had accomplished before. In the end, they not only managed to record from the brain, but discovered that one neuron seemed to be integrating the information from the spider’s two independent sets of eyes, a computation that might be expected to involve a network of brain cells. © 2014 The New York Times Company