Most Recent Links



Links 61 - 80 of 20539

By BENEDICT CAREY Marijuana use did not increase among teenagers in the states in which medical marijuana has become legal, researchers reported Monday. The new analysis is the most comprehensive effort to date to answer a much-debated question: Does decriminalization of marijuana lead more adolescents to begin using it? The study found that states that had legalized medical use had higher prevailing rates of teenage marijuana use before enacting the laws, compared with states where the drug remains illegal. Those higher levels were unaffected by the changes in the law, the study found. The report, published in The Lancet Psychiatry, covered a 24-year period and was based on surveys of more than one million adolescents in 48 states. The research says nothing about the effect of legalizing recreational use, however. A primary concern on both sides of the debate over medical marijuana has been that loosening marijuana restrictions might send the wrong message to young people, and make the drug both more available and more appealing. Teenagers who develop and sustain a heavy, daily habit increase their risk of having cognitive difficulties later on, several studies now suggest. Previous research on usage trends in the wake of the laws has been mixed, some reporting evidence of an increase among adolescents and others — including two recent, multistate studies — finding no difference. The new analysis should carry far more weight, experts said, not only because of its size and scope but also because the funders included the National Institute on Drug Abuse, whose director has been outspoken about the risks of increased use. © 2015 The New York Times Company

Keyword: Drug Abuse
Link ID: 21056 - Posted: 06.16.2015

by Colin Barras Bacteria aren't renowned for their punctuality – but perhaps one day they will be. A working circadian clock has been inserted in E. coli that allows the microbes to keep to a 24-hour schedule. The tiny timekeepers could eventually be used in biological computers or for combating the effects of jet lag. Many plants and animals use circadian clocks to regulate their daily activities – but bacterial circadian rhythms are much less well understood. The best studied belongs to photosynthetic cyanobacteria: other common microbes, like E. coli, don't carry clocks at all, says Pamela Silver of Harvard Medical School. The cyanobacterial clock is based around the kaiABC gene cluster and ATP – the molecular fuel that nearly all living cells rely on. During the day, while the cyanobacteria are active, the KaiA protein encourages the KaiC protein to bind to phosphate groups from ATP. At night, the KaiB protein kicks into action, disrupting the activity of KaiA and encouraging KaiC to hand back the phosphate. Silver, her former student Anna Chen and other colleagues have transplanted this kaiABC clock wholesale into E. coli – the first time such a sophisticated clock has been slotted into a new microbe. But would the bacteria use their new clocks to keep time? "That's the cleverest part – and it's down to Anna's genius," says Silver. Chen suggested hooking up the kaiABC clock to a green fluorescent protein so that the phosphorylated KaiC protein would make the E. coli glow. Sure enough, the E. coli became gradually more fluorescent and then returned to a non-fluorescent state over a 24-hour period, proving that the kaiABC clock kept ticking even after it was transplanted. © Copyright Reed Business Information Ltd.

Keyword: Biological Rhythms
Link ID: 21055 - Posted: 06.16.2015

Aaron E. Carroll One of my family’s favorite shows is “The Biggest Loser.” Although some viewers don’t appreciate how it pushes people so hard to lose weight, the show probably inspires some overweight people to regain control of their lives. But one of the most frustrating parts of the show, at least for me, is its overwhelming emphasis on exercise. Because when it comes to reaching a healthy weight, what you don’t eat is much, much more important. Think about it this way: If an overweight man is consuming 1,000 more calories than he is burning and wants to be in energy balance, he can do it by exercising. But exercise consumes far fewer calories than many people think. Thirty minutes of jogging or swimming laps might burn off 350 calories. Many people, fat or fit, can’t keep up a strenuous 30-minute exercise regimen, day in and day out. They might exercise a few times a week, if that. Or they could achieve the same calorie reduction by eliminating two 16-ounce sodas each day. Proclamations that people need to be more active are ubiquitous in the media. The importance of exercise for proper weight management is reinforced when people bemoan the loss of gym class in schools as a cause of the obesity epidemic. Michelle Obama’s Let’s Move program places the focus on exercise as a critical component in combating excess weight and obesity. Exercise has many benefits, but there are problems with relying on it to control weight. First, it’s just not true that Americans, in general, aren’t listening to calls for more activity. From 2001 to 2009, the percentage of people who were sufficiently physically active increased. But so did the percentage of Americans who were obese. The former did not prevent the latter. © 2015 The New York Times Company

Keyword: Obesity
Link ID: 21054 - Posted: 06.15.2015

By Stephen L. Macknik, Susana Martinez-Conde and Bevil Conway This past February a photograph of a dress nearly broke the Internet. It all started when a proud mother-in-law-to-be snapped a picture of the dress she planned to wear to her daughter's wedding. When she shared her picture with her daughter and almost-son-in-law, the couple could not agree on the color: she saw white and gold, but he saw blue and black. A friend of the bride posted the confusing photo on Tumblr. Followers then reposted it to Twitter, and the image went viral. “The Dress” pitted the opinions of superstar celebrities against one another (Kanye and Kim disagreed, for instance) and attracted millions of views on social media. The public at large was split into white-and-gold and blue-and-black camps. So much attention was drawn, you would have thought the garment was conjured by a fairy godmother and accessorized with glass slippers. To sort out the conundrum, the media tapped dozens of neuroscientists and psychologists for comment. Pride in our heightened relevance to society gave way to embarrassment as we realized that our scientific explanations for the color wars were not only diverse but also incomplete. Especially perplexing was the fact that people saw it differently on the same device under the same viewing conditions. This curious inconsistency suggests that The Dress is a new type of perceptual phenomenon, previously unknown to scientists. Although some early explanations for the illusion focused on individual differences in the ocular structure of the eye, such as the patterning and function of rod and cone photoreceptor cells or the light-filtering properties internal to the eye, the most important culprit may be the brain's color-processing mechanisms. These might vary from one person to the next and can depend on prior experiences and beliefs. © 2015 Scientific American

Keyword: Vision
Link ID: 21053 - Posted: 06.15.2015

Hannah Devlin Science correspondent When Lucy Tonge started drifting off in front of the television as a 13-year-old, her parents put it down to typical teenage lethargy. And when she developed a strange habit of slumping forward when she laughed, her mum told her: “Stop doing that stupid thing when you laugh. It makes you look silly.” But she couldn’t. It was only when she started collapsing with no warning that her family sought medical advice that led to a diagnosis of narcolepsy. Soon afterwards, Tonge discovered that her sleeping disorder was very likely to have been triggered by the swine flu vaccine, which she had received in 2009 a couple of months before her symptoms first emerged. The government has acknowledged the rare side-effect of the Pandemrix jab, which was given to 6 million people in Britain during the 2009 and 2010 swine flu pandemic, but the Department for Work and Pensions (DWP) has rejected the compensation claims of about 80 people including Tonge on the grounds that their disabilities were not “severe”. This week, the group was given fresh hope that the challenges they face will be acknowledged after a tribunal ordered the government to pay £120,000 in damages to a 12-year-old boy whose narcolepsy was also linked to Pandemrix. © 2015 Guardian News and Media Limited

Keyword: Sleep
Link ID: 21052 - Posted: 06.15.2015

By Michael Hedrick I’ve had a little success dating in the nearly 10 years I’ve lived with schizophrenia. But there are a lot of obstacles. Schizophrenia is a terrifying word for many people. It conjures up ideas of murderous intent, lack of control and a host of other scary things. I live with this word, though; I am the word. But it is not a word you can just drop into a conversation and follow with “It’s not a big deal, though.” I seem to fall in love easily, but it’s always with girls who don’t feel the same way about me. I have seen more rejection than I care to admit, putting myself on the line like that, and it’s been a chore for me not to let my emotions get the best of me. If it’s not outright rejection, something else always seems to happen. I can remember one date I went on some months back. She was a big girl with blonde hair and eyes that had that squinty “I’m up to no good” look. We met over Match.com, and I was struck by how much time she spent going to Phish shows. Her profile was scattered with a number of bands that I had loved at different points in my life. She was a teacher, and she mentioned in her profile something to the effect that, because of her love of sparkles, arts and crafts, and rainbows, she was a 6-year-old in a woman’s body. Before I knew it, I was asking if she wanted to go get a beer. She said yes, a little too eagerly I thought. I got to the restaurant about 15 minutes early and ordered a beer, apprehensive knowing that eventually I would have to tell her about my illness. Soon enough she walked in, and I was struck by the fact that she seemed a little disappointed to be there. There was no smile as she sat down to join me. © 2015 The New York Times Company

Keyword: Schizophrenia
Link ID: 21051 - Posted: 06.15.2015

by Meghan Rosen When we brought Baby S home from the hospital six months ago, his big sister, B, was instantly smitten. She leaned her curly head over his car seat, tickled his toes and cooed like a pro — in a voice squeakier than Mickey Mouse’s. B’s voice — already a happy toddler squeal — sounded as if she'd sucked in some helium. My husband and I wondered about her higher pitch. Are humans hardwired to chitchat squeakily to babies, or did B pick up vocal cues from us? (I don’t sound like that, do I?) If I’m like other mothers, I probably do. American English-speaking moms dial up their pitch drastically when talking to their children. But dads’ voices tend to stay steady, researchers reported May 19 in Pittsburgh at the 169th Meeting of the Acoustical Society of America. “Dads talk to kids like they talk to adults,” says study coauthor Mark VanDam, a speech scientist at Washington State University. But that doesn’t mean fathers are doing anything wrong, he says. Rather, they may be doing something right: offering their kids a kind of conversational bridge to the outside world. Scientists have studied infant- or child-directed speech (often called “motherese” or “parentese”) for decades. In American English, this type of babytalk typically uses high pitch, short utterances, repetition, loud volume and slowed-down speech. Mothers who speak German, Japanese, French, and other languages also tweak their pitch and pace when talking to children. But no one had really studied dads, VanDam says. © Society for Science & the Public 2000 - 2015.

Keyword: Language; Development of the Brain
Link ID: 21050 - Posted: 06.15.2015

by Michael Sean Pepper and Beverley Kramer People who are attracted to others of the same sex develop their orientation before they are born. This is not a choice. And scientific evidence shows their parents cannot be blamed. Biological evidence bearing on sexual orientation has been accumulating since the 1980s, and the links have been reinforced by new research. In 2014, researchers confirmed the association between same-sex orientation in men and a specific chromosomal region. This is similar to findings originally published in the 1990s, which, at the time, gave rise to the idea that a “gay gene” must exist. That argument has never been substantiated, even though studies have shown that homosexuality is a heritable trait. The evidence instead points towards a complex interaction between genes and environment that accounts for the heritable nature of sexual orientation. These findings are part of a report released by the Academy of Science of South Africa. The report is the outcome of work conducted by a panel put together in 2014 to evaluate all research on sexual orientation from the past 50 years. It did this against the backdrop of a growing number of new laws in Africa that discriminate against people attracted to others of the same sex. The work was conducted in conjunction with the Ugandan Academy of Science.

Keyword: Sexual Behavior
Link ID: 21049 - Posted: 06.15.2015

By ANDREW HIGGINS WIJK BIJ DUURSTEDE, Netherlands — The hiss of gas, released by a red lever turned by Arie den Hertog in the back of his white van, signaled the start of the massacre. The victims, crammed into a sealed, coffin-like wooden case, squawked as they struggled to breathe. Then, after barely two minutes, they fell silent. Glancing at the timer on his cellphone, Mr. Den Hertog declared the deed done. “Now it is all over,” he said proudly of his gruesomely efficient handiwork, on a gloriously sunny day beneath a row of poplar trees on the banks of the Lower Rhine. Reviled as a Nazi by animal rights activists but hailed as a hero by Dutch farmers, Mr. Den Hertog, 40, is the Netherlands’ peerless expert in the theory and practice of killing large numbers of wild geese. On his recent outing to Wijk bij Duurstede, a village in the Utrecht region southeast of Amsterdam, he killed 570 graylag geese in his portable gas chamber, fitted with two big canisters of carbon dioxide. That brought his death toll to more than 7,000 for the week. “It is not fun, but it has to be done,” he said of his work. The Dutch authorities insist it must be done, too. They pay Mr. Den Hertog to keep a ballooning geese population from devouring the grass of cow pastures and flying into planes taking off from Amsterdam’s Schiphol Airport, a major hub in Europe. He is the unpleasant answer to what has become a problem on a grand scale for the Netherlands. Geese populations here have skyrocketed, buoyed by a 1999 ban on hunting them; farmers’ increasing use of nitrogen-rich fertilizer, which geese apparently love; and the expansion of protected nature areas. That combination, plus an abundance of rivers and canals, has made the country a “goose El Dorado,” said Julia Stahl, head of research at Sovon, a group that monitors wild bird populations in the Netherlands. © 2015 The New York Times Company

Keyword: Animal Rights
Link ID: 21048 - Posted: 06.15.2015

By Nicholas Bakalar Statins, the widely used cholesterol-lowering drugs, have been blamed for memory loss, but a new study suggests that the association is an illusion. The report, in JAMA Internal Medicine, found that the apparent association was likely a result of detection bias — visiting the doctor and starting a new medicine makes people more acutely aware of health issues they might otherwise not notice. Researchers compared 482,543 statin users with the same number of people using no lipid-lowering drugs and with 26,484 people using non-statin lipid-lowering drugs. Use of statin drugs was associated with an increase in memory loss during the first 30 days after starting the drugs compared with people who did not take cholesterol-lowering drugs. But so was use of non-statin lipid-lowering drugs. After accounting for many health and behavioral variables, the scientists concluded that either all lipid-lowering drugs, statins or not, cause memory loss or, more likely, that previous findings were based on the expectations of the patients rather than any physiological effect of the medicine. “As you think about whether you should be taking statins, there are questions about uncommon side effects worth raising,” said the lead author, Dr. Brian L. Strom, chancellor of Rutgers Biomedical and Health Sciences. “But the question of impairing memory is a nonissue.” © 2015 The New York Times Company

Keyword: Learning & Memory
Link ID: 21047 - Posted: 06.15.2015

Sara Reardon Traumatic experiences, such as those encountered during warfare, can cause long-lasting stress. Tweaking the immune system could be key to treating, or even preventing, post-traumatic stress disorder (PTSD). Research in rodents suggests that immunizing animals can lessen fear if they are later exposed to stress. Researchers have known for some time that depression and immune-system health are linked and can affect each other. Early clinical trials have shown that anti-inflammatory drugs can reduce symptoms of depression [1], raising hopes that such treatments might be useful in other types of mental illness, such as PTSD. “I think there’s kind of a frenzy about inflammation in psychiatry right now,” says Christopher Lowry, a neuroscientist at the University of Colorado Boulder. He presented results of experiments probing the link between fearful behaviour and immune response at a meeting in Victoria, Canada, last week of the International Behavioral Neuroscience Society. Studies of military personnel suggest that immune function can influence the development of PTSD. Soldiers whose blood contains high levels of the inflammatory protein CRP before they are deployed [2], or who have a genetic mutation that makes CRP more active [3], are more likely to develop the disorder. To directly test whether altering the immune system affects fear and anxiety, Lowry and colleagues injected mice with a common bacterium, Mycobacterium vaccae, three times over three weeks to modulate their immune systems. The scientists then placed these mice, and a control group of unimmunized mice, in cages with larger, more aggressive animals. © 2015 Nature Publishing Group

Keyword: Stress; Neuroimmunology
Link ID: 21046 - Posted: 06.13.2015

By Jessica Schmerler Approximately one in 68 children is identified with some form of autism, from extremely mild to severe, according to the U.S. Centers for Disease Control. On average, diagnosis does not occur until after age four, yet all evidence indicates that early intervention is the best way to maximize the treatment impact. Various tests that look for signs of autism in infants have not been conclusive, but a new exercise could improve early diagnosis, and also help reduce worry among parents that they did not intervene as soon as possible. The two most widely used tests to measure symptoms, the Autism Observation Scale for Infants (AOSI) and the Autism Diagnostic Observation Schedule (ADOS), cannot be used before the ages of 12 or 16 months, respectively. The AOSI measures precursors to symptoms, such as a baby’s response to name, eye contact, social reciprocity, and imitation. The ADOS measures the characteristics and severity of autism symptoms such as social affect and repetitive and restrictive behaviors. Now a group of scientists at the Babylab at Birkbeck, University of London think they have identified a marker that can predict symptom development more accurately and at an earlier age: enhanced visual attention. Experts have long recognized that certain individuals with autism have superior visual skills, such as increased visual memory or artistic talent. Perhaps the most well known example is Temple Grandin, a high-functioning woman with autism who wrote, “I used to become very frustrated when a verbal thinker could not understand something I was trying to express because he or she couldn’t see the picture that was crystal clear to me.” © 2015 Scientific American

Keyword: Autism; Attention
Link ID: 21045 - Posted: 06.13.2015

by Penny Sarchet Children with ADHD are more likely to succeed in cognitive tasks when they are fidgeting. Rather than telling them to stop, is it time to let them squirm in class? The results, from a small study of teens and pre-teens, add to growing evidence that movement may help children with attention-deficit hyperactivity disorder to think. One of the theories about ADHD is that the brain is somehow under-aroused. Physical movements could help wake it up or maintain alertness, perhaps by stimulating the release of brain-signalling chemicals like dopamine or norepinephrine. This hypothesis would help explain why countries like the US are experiencing an epidemic of ADHD – it might be that a lack of physical activity leads to reduced brain function. In the latest study, Julie Schweitzer of the University of California, Davis, and her colleagues asked 44 children with ADHD and 29 kids without to describe an arrangement of arrows. The children with ADHD were more likely to focus on the task and answer correctly if the test coincided with them fidgeting, as tracked by an ankle monitor. Intriguingly, Schweitzer found that it is the vigour of movements, rather than how often children make them, that seems to be related to improvements in test scores. This might mean, for example, that it helps children to swing their legs in longer arcs, but not to swing them faster. "I think we need to consider that fidgeting is helpful," says Schweitzer. "We need to find ways that children with ADHD can move without being disruptive to others." Dustin Sarver at the University of Mississippi, who recently found a link between fidgeting and improved working memory, agrees. "We should revisit the targets we want for these children, such as improving the work they complete and paying attention, rather than focusing on sitting still." He suggests that movements that are not disruptive to other schoolchildren, such as squirming, bouncing and leg movements, as opposed to getting up in the middle of lessons, could be encouraged in classrooms. © Copyright Reed Business Information Ltd

Keyword: ADHD; Attention
Link ID: 21044 - Posted: 06.13.2015

Boer Deng A genetic variant protected some practitioners of cannibalism from prion disease. Scientists who study a rare brain disease that once devastated entire communities in Papua New Guinea have described a genetic variant that appears to stop misfolded proteins known as prions from propagating in the brain [1]. Kuru was first observed in the mid-twentieth century among the Fore people of Papua New Guinea. At its peak in the late 1950s, the disease killed up to 2% of the group's population each year. Scientists later traced the illness to ritual cannibalism [2], in which tribe members ate the brains and nervous systems of their dead. The outbreak probably began when a Fore person consumed body parts from someone who had sporadic Creutzfeldt-Jakob disease (CJD), a prion disease that spontaneously strikes about one person in a million each year. Scientists have noted previously that some people seem less susceptible to prion diseases if they have an amino-acid substitution in a particular region of the prion protein — codon 129 [3]. And in 2009, a team led by John Collinge — a prion researcher at University College London who is also the lead author of the most recent analysis — found another protective mutation among the Fore, in codon 127 [4]. The group's latest work, reported on 10 June in Nature [1], shows that the amino-acid change that occurs at this codon, replacing a glycine with a valine, has a different and more powerful effect than the substitution at codon 129. The codon 129 variant confers some protection against prion disease only when it is present on one of the two copies of the gene that encodes the protein. But transgenic mice with the codon-127 mutation were completely resistant to kuru and CJD regardless of whether they bore one or two copies of it. The researchers say that the mutation in codon 127 appears to confer protection by preventing prion proteins from becoming misshapen. © 2015 Nature Publishing Group

Keyword: Prions
Link ID: 21043 - Posted: 06.13.2015

Joe Palca Scientists found a molecule crucial to perceiving the sensation of itching. It affects how the brain responds to serotonin, and may explain why anti-depressants that boost serotonin make some itch. JOE PALCA, BYLINE: How do you go about discovering what makes us itch? Well, if you're Diana Bautista at the University of California, Berkeley, you ask what molecules are involved. DIANA BAUTISTA: We say OK, what are the possible molecular players out there that might be contributing to itch or touch? PALCA: Bautista says it turns out itch and touch, and even pain, all seem to be related - at least in the way our brains make sense of these sensations. But how to tell which molecules are key players? Bautista says basically you try everything you can. BAUTISTA: We test a lot of candidates. And if we're really lucky, one of our candidates - we can prove that it plays a really important role. PALCA: And now she thinks she's found one. Working with colleagues at the Buck Institute for Research on Aging, she's found a molecule that's made by a gene called HTR7. When there's less of this molecule, animals with itchy skin conditions, like eczema, do less scratching. When there's more of it, itching gets worse. The way this molecule works is kind of interesting. It changes how sensitive brain cells are to a chemical called serotonin. Now, serotonin is a chemical that's related to depression. So Bautista's research might explain why certain antidepressant drugs that boost serotonin have a peculiar side effect. For some people, the drugs make them itch. Bautista says the new research is certainly not the end of the story when it comes to understanding itch. © 2015 NPR

Keyword: Pain & Touch
Link ID: 21042 - Posted: 06.13.2015

by Penny Sarchet Simon Sponberg of Georgia Institute of Technology in Atlanta and his team have figured out the secret to hawkmoths' night vision by testing them with robotic artificial flowers. By varying the speed of a fake flower's horizontal motion and changing brightness levels, the team tested moths' abilities under different conditions. It has been theorised that the moth brain slows down, allowing their visual system to collect light for longer, a bit like lengthening a camera's exposure. But the strategy might also introduce blur, making it hard to detect fast movement. If the moths were using this brain-slowing tactic, they would be expected to react to fast flower movements more slowly in darker conditions. The team found that there was indeed a lag. It helped them see motion in the dark while still allowing them to keep up with flowers swaying at normal speeds. The size of the lags matched the expected behaviour of a slowed nervous system, providing evidence that moths could be slowing down the action of neurons in their visual system. Previously, placing hawkmoths in a virtual obstacle course revealed that they vary their navigation strategies depending on visibility conditions. Journal reference: Science, DOI: 10.1126/science.aaa3042 © Copyright Reed Business Information Ltd

Keyword: Vision
Link ID: 21041 - Posted: 06.13.2015

By David Grimm In 2013, the Nonhuman Rights Project filed a series of lawsuits asking courts to recognize four New York chimpanzees as legal persons and free them from captivity. The animal rights group, which hopes to set a precedent for research chimps everywhere, has yet to succeed, but in April a judge ordered Stony Brook University to defend its possession of two of these animals, Hercules and Leo. Last month, the group and the university squared off in court, and the judge is expected to issue a decision soon. But the scientist working with the chimps, anatomist Susan Larson, has remained largely silent until now. In an exclusive interview, Larson talks about her work with these animals and the impact the litigation is having on her studies—and research animals in general. This interview has been edited for clarity and brevity. Q: Where did Hercules and Leo come from? A: They were born 8 years ago at the New Iberia Research Center in Louisiana. They were among the last juveniles New Iberia had. We've had them on loan for 6 years. Q: What kind of work do you do with them? A: We're interested in learning about the evolution of bipedalism by actually looking at what real animals do. Over the past 30 years, we've looked at 17 different species of primates, including 11 chimpanzees. Chimpanzees are the best model because they are so close to us. When we compare how they walk to how we walk, we can feed those data into computer models that may help us understand how early hominids like Lucy moved around. The work we're doing with Hercules and Leo is the most important work we've done. © 2015 American Association for the Advancement of Science

Keyword: Animal Rights
Link ID: 21040 - Posted: 06.13.2015

Dogs do not like people who are mean to their owners and will refuse food offered by people who have snubbed their master, Japanese researchers have said. The findings reveal that canines have the capacity to cooperate socially – a characteristic found in a relatively small number of species, including humans and some other primates. Researchers led by Kazuo Fujita, a professor of comparative cognition at Kyoto University, tested three groups of 18 dogs using role plays in which their owners needed to open a box. In all three groups, the owner was accompanied by two people whom the dog did not know. In the first group, the owner sought assistance from one of the other people, who actively refused to help. In the second group, the owner asked for, and received, help from one person. In both groups, the third person was neutral and not involved in either helping or refusing to help. Neither person interacted with the dog’s owner in the control – third – group. After watching the box-opening scene, the dog was offered food by the two unfamiliar people in the room. Dogs that saw their owner being rebuffed were far more likely to choose food from the neutral observer, and to ignore the offer from the person who had refused to help, Fujita said on Friday. Dogs whose owners were helped and dogs whose owners did not interact with either person showed no marked preference for accepting snacks from the strangers. “We discovered for the first time that dogs make social and emotional evaluations of people regardless of their direct interest,” Fujita said. If the dogs were acting solely out of self-interest, there would be no differences among the groups, and a roughly equal number of animals would have accepted food from each person. © 2015 Guardian News and Media Limited

Keyword: Emotions; Evolution
Link ID: 21039 - Posted: 06.13.2015

Owning a cat as a kid could put you at risk for schizophrenia and bipolar disorder later on because of parasites found in feline feces, new research says. Previous studies have linked the parasite Toxoplasma gondii (T. gondii) to the development of mental disorders, and two more research papers published recently provide further evidence. Researchers from the Academic Medical Centre in Amsterdam looked at more than 50 studies and found that a person infected with the parasite is nearly twice as likely to develop schizophrenia. The other study, led by Dr. Robert H. Yolken of Johns Hopkins University School of Medicine in Baltimore, confirmed the results of a 1982 questionnaire that found half of people who had a cat as a kid were diagnosed with mental illnesses later in life compared to 42% of those who didn't grow up with a cat. "Cat ownership in childhood has now been reported in three studies to be significantly more common in families in which the child is later diagnosed with schizophrenia or another serious mental illness," the authors said in a press release. The findings were published in Schizophrenia Research and Acta Psychiatrica Scandinavica. T. gondii, which causes the disease toxoplasmosis, is especially risky for pregnant women and people with weak immune systems. The parasite can also be found in undercooked meat and unwashed fruits and vegetables.

Keyword: Schizophrenia; Neurotoxins
Link ID: 21038 - Posted: 06.10.2015

By Michael Balter Alcoholic beverages are imbibed in nearly every human society across the world—sometimes, alas, to excess. Although recent evidence suggests that tippling might have deep roots in our primate past, nonhuman primates are only rarely spotted in the act of indulgence. A new study of chimpanzees with easy access to palm wine shows that some drink it enthusiastically, fashioning leaves as makeshift cups with which to lap it up. The findings could provide new insights into why humans evolved a craving for alcohol, with all its pleasures and pains. Scientists first hypothesized an evolutionary advantage to humans’ taste for ethanol about 15 years ago, when a biologist at the University of California, Berkeley, proposed what has come to be called the “drunken monkey hypothesis.” Robert Dudley argued that our primate ancestors got an evolutionary benefit from being able to eat previously unpalatable fruit that had fallen to the ground and started to undergo fermentation. The hypothesis received a boost last year, when a team led by Matthew Carrigan—a biologist at Santa Fe College in Gainesville, Florida—found that the key enzyme that helps us metabolize ethanol underwent an important mutation about 10 million years ago. This genetic change, which occurred in the common ancestor of humans, chimps, and gorillas, made ethanol metabolism some 40 times faster than the process in other primates—such as monkeys—that do not have it. According to the hypothesis, the mutation allowed apes to consume fermented fruit without immediately getting drunk or, worse, succumbing to alcohol poisoning. Nevertheless, researchers had turned up little evidence that primates in the wild regularly eat windfall fruit or are attracted to the ethanol that such fruit contains. Now, a team led by Kimberley Hockings, a primatologist at the Center for Research in Anthropology in Lisbon, concludes from a 17-year study of chimps in West Africa that primates can tolerate significant levels of ethanol and may actually crave it, as humans do. © 2015 American Association for the Advancement of Science

Keyword: Drug Abuse; Evolution
Link ID: 21037 - Posted: 06.10.2015