Most Recent Links

Follow us on Facebook or subscribe to our mailing list to receive news updates.


Links 5421 - 5440 of 29538

By Virginia Morell Want to say “Hello,” but don’t know the local language? Try waving your hand. Such gestures, common among humans, are also surprisingly similar among chimpanzees and bonobos, our closest great ape relatives. Now, a new study has identified numerous gestures that mean the same thing to both species. That suggests these signals have biological underpinnings and could be inherited from our last common ancestor. Gestures, signals often used to get someone’s attention or ask for or stop something, are not technically languages. They don’t have specific linguistic and grammatical rules or accepted vocabularies. But gestures still have meaning: Among chimpanzees, for example, scientists have documented that many of their movements—from mouth stroking to request food or arm raising to request grooming—are used to elicit specific responses from other chimpanzees. Researchers have now found something similar in bonobos, great apes closely related to chimpanzees but with longer legs, pink lips, and long hair that’s parted in the middle on their heads. Scientists started by shooting and analyzing videos of wild bonobos in the Democratic Republic of the Congo. When a bonobo made a common gesture that brought a consistent, satisfying response from others, it was added to the list. For example, when one bonobo looked at another while loudly scratching one arm, the second often responded by grooming the first. Because the first bonobo was almost always satisfied by this response, the researchers concluded that a “big, loud scratch” is a request for grooming. The scientists next compared the bonobo gestures to those of chimpanzees, and found that their repertoires overlapped by about 90%, significantly more than “would be expected by chance,” says lead author Kirsty Graham, a comparative psychologist at the University of York in the United Kingdom. © 2018 American Association for the Advancement of Science

Keyword: Animal Communication; Evolution
Link ID: 24709 - Posted: 02.28.2018

by Amy Ellis Nutt In the first broad demographic study of trends in gender-affirming surgeries in the United States, researchers found that the number of operations increased fourfold from 2000 to 2014. Some of the dramatic rise, according to a study published Wednesday in the journal JAMA Surgery, may be related to an increase in insurance coverage for the procedures. “Early on we recognized there’s been a lot of work on health disparities having to do with age, race and so on that get collected in health-care settings,” said Brandyn Lau, an assistant professor of surgery and health sciences informatics at Johns Hopkins University School of Medicine. “One of the things we need to know is whether [lesbian, gay and transgender] patients are getting the same care.” Lau and other researchers from Johns Hopkins Medicine and Harvard University analyzed 15 years of data from the National Inpatient Sample, a collection of hospital inpatient information from across the country, and found a total of 4,118 gender-affirming surgeries. The surgeries took place as LGBTQ people are finding increasing acceptance, especially among younger generations. The majority of the surgeries that occurred between 2000 and 2011 involved patients not covered by health insurance. About half of the transgender patients in the study paid out of pocket between 2000 and 2005. That number rose to 65 percent between 2006 and 2011. However, the trend reversed between 2012 and 2014, with the number plummeting to 39 percent. Much of that decrease, say the study's authors, is due to Medicare and Medicaid. In May 2014, Medicare ended its 33-year ban on transgender surgeries. Loren Schechter, who specializes in transgender surgeries, says he does about 300 procedures a year, whereas it was only about 50 in 2000. The plastic surgeon also accepts Medicare, which others do not. © 1996-2018 The Washington Post

Keyword: Sexual Behavior
Link ID: 24708 - Posted: 02.28.2018

Rhiannon Lucy Cosslett In my first year of university, just after I had been prescribed fluoxetine for depression, I had an argument about it with a close friend. He told me that taking antidepressants would make my feelings false, my emotions manufactured. I wouldn’t be able to tell if what I was feeling was real – and that was wrong. At the time I did not know how to articulate that all of our feelings are linked to chemicals: that even eating a chocolate bar can give me a blood-sugar spike and alter my behaviour, that feeling the sunshine on my skin can give me hope and energy. Furthermore, that the contraceptive pills his girlfriends took were liable to make them angry, not to mention less horny. I did not know how to say that the antidepressant I took in order to cope with my life was not that different to the ketamine and cocaine he used to cope with his. In any case, it was a pretentious argument of the kind one has at university, and both of us lacked the scientific knowledge to really underpin our views. It was all posturing. Once I accepted that I needed help and began treatment, I felt calmer within a week. I think of it now because antidepressants are in the news again: whether they work or don’t work, whether other treatments – therapy, mindfulness, exercise, volunteering, being a 96-year-old Italian with a diet of fish and olive oil – are more effective than that “magic” pill. The chemical imbalance theory is posited, then debunked, in a never-ending cycle, as we, the mentally ill and medicated, watch on with hope but also exasperation. Because for all the scientific advances, therapeutic studies and happiness indexes, the only thing an individual can say with any certainty is whether or not antidepressants worked for them.

Keyword: Depression
Link ID: 24707 - Posted: 02.28.2018

By Kimberly Hickok If you ever wanted to know what a moth was thinking, this might be as close as you’re going to get. In a new study published today in Cell Reports, researchers placed female hawkmoths (Manduca sexta) in a wind tunnel containing two pieces of filter paper—one covered in a test odor, and one with no odor. Perhaps not surprisingly, the insects were most attracted to odors containing aromatic chemicals, which are present in plants that are common nectar sources. Some odors consistently caused the moths to touch their feet to the paper while curving their abdomen, which is how they lay eggs, indicating that moths associate those odors with egg laying. With six different odors, the moths alternated touching their feet and their mouths to the same odor, suggesting that plants containing one or all of those chemicals, such as jimson weed, are important for both feeding and egg laying. By combining these data with imaging of nerve cells at the base of the moths’ antennae, the researchers identified four clusters of nerves specifically associated with feeding behavior and six specifically associated with egg laying, but none associated with both behaviors. This means moths use specific odors to direct their behavior. The scientists say more research is needed to see whether nerve clusters respond to odor the same way in other species of moths and pollinating insects, which can help identify important odors and the plants that make them. © 2018 American Association for the Advancement of Science.

Keyword: Chemical Senses (Smell & Taste); Sexual Behavior
Link ID: 24706 - Posted: 02.28.2018

By Ashley Yeager Human neural stem cells transplanted into the injured spines of monkeys matured into nerve cells, spurring neuronal connections and giving the animals an improved ability to grasp an orange, researchers report today (February 26) in Nature Medicine. “This type of cellular therapy, though still in its infancy, may eventually be a reasonable approach to treating central nervous system injury and possibly even neurodegenerative disease in humans,” Jonathan Glass, a neurologist at Emory University School of Medicine, tells The Scientist by email. Glass, who was not involved in the study, notes that the differentiation of stem cells over time is “impressive,” as is their ability to make connections in the monkeys’ central nervous systems, but more work needs to be done to show whether the cells can grow extremely long axons to connect motor and sensory neurons after spinal injury in humans. Up to this point, most of the work on transplanting neural stem cells had been done in rats. This is the first study to show the treatment can be successfully scaled up to primates. “We definitely have more confidence to do this type of treatment in humans,” study coauthor Mark Tuszynski, a neuroscientist at the University of California, San Diego, School of Medicine, tells The Scientist. In the study, Tuszynski and his colleagues cut into a section of the spinal cord of rhesus monkeys and then, two weeks later, inserted a graft of human neural progenitor cells into the injury site. In the first four monkeys, the grafts did not stay in position, a finding that forced the researchers to add more fibrinogen–thrombin to the transplants, a protein-enzyme mixture that makes the graft adhere more quickly to the site. The team also had to tilt the operating table to drain cerebrospinal fluid, which would otherwise wash the graft away. © 1986-2018 The Scientist

Keyword: Regeneration; Stem Cells
Link ID: 24705 - Posted: 02.27.2018

Laura Sanders With fevers, chills and aches, the flu can pound the body. Some influenza viruses may hammer the brain, too. Months after being infected with influenza, mice had signs of brain damage and memory trouble, researchers report online February 26 in the Journal of Neuroscience. It’s unclear if people’s memories are affected in the same way as those of mice. But the new research adds to evidence suggesting that some body-wracking infections could also harm the human brain, says epidemiologist and neurologist Mitchell Elkind of Columbia University, who was not involved in the study. Obvious to anyone who has been waylaid by the flu, brainpower can suffer at the infection’s peak. But not much is known about any potential lingering effects on thinking or memory. “It hasn’t occurred to people that it might be something to test,” says neurobiologist Martin Korte of Technische Universität Braunschweig in Germany. The new study examined the effects of three types of influenza A — H1N1, the strain behind 2009’s swine flu outbreak; H7N7, a dangerous strain that only rarely infects people; and H3N2, the strain behind much of the 2017–2018 flu season misery (SN: 1/19/18, p. 12). Korte and colleagues shot these viruses into mice’s noses, and then looked for memory problems 30, 60 and 120 days later. A month after infection, the mice all appeared to have recovered and gained back weight. But those that had received H3N2 and H7N7 had trouble remembering the location of a hidden platform in a pool of water, the researchers found. Mice that received no influenza or the milder H1N1 virus performed normally at the task. © Society for Science & the Public 2000 - 2018

Keyword: Learning & Memory; Neuroimmunology
Link ID: 24704 - Posted: 02.27.2018

By Dina Fine Maron Millions of Americans who suffer from bipolar disorder depend on lithium. The medication has been prescribed for half a century to help stabilize patients’ moods and prevent manic or depressive episodes. Yet what it does in the brain—and why it does not work for some people—has remained largely mysterious. But last year San Diego–based researchers uncovered new details about how lithium may alter moods, thanks to an approach recently championed by a small number of scientists studying mental illness: The San Diego team used established lab techniques to reprogram patients’ skin cells into stem cells capable of becoming any other kind—and then chemically coaxed them into becoming brain cells. This process is now providing the first real stand-ins for brain cells from mentally ill humans, allowing for unprecedented direct experiments. Proponents hope studying these lab-grown neurons and related cells will eventually lead to more precise and effective treatment options for a variety of conditions. The San Diego team has already used this technique to show some bipolar cases may have more to do with protein regulation than genetic errors. And another lab discovered the activity of glial cells (a type of brain cell that supports neuron function) likely helps fuel schizophrenia—upending the theory that the disorder results mainly from faulty neurons. This new wave of research builds on Shinya Yamanaka’s Nobel-winning experiments on cellular reprogramming from a decade ago. His landmark findings about creating induced pluripotent stem cells (iPSCs) have only recently been applied to studying mental illness as the field has matured. “What’s really sparked that move now has been the ability to make patient-specific stem cells—and once you can do that, then all sorts of diseases become amenable to investigation,” says Steven Goldman, who specializes in cellular and gene therapy at the University of Rochester Medical Center. © 2018 Scientific American,

Keyword: Schizophrenia; Stem Cells
Link ID: 24703 - Posted: 02.27.2018

Lauren Smith As a shark biologist, I enjoy nothing more than going scuba diving with sharks in the wild. However, I realise it’s an immense privilege to do this as part of my work, and that for the vast majority of people, experiencing the underwater world in such a way is simply not possible. Nevertheless, even without the aid of an air tank, humans interact with fish on many levels and in greater numbers than they do with mammals and birds. A review published in the journal Animal Cognition in 2014 by Culum Brown, an associate professor at Macquarie University, Sydney, explains that fish are one of the vertebrate taxa most highly utilised by humans. But despite the fact that they are harvested from wild stocks as part of global fishing industries, grown under intensive aquaculture conditions, are the most common pet and are widely used for scientific research, fish are seldom afforded the same level of compassion or welfare as warm-blooded vertebrates. As Brown highlights in his review, part of the problem is the large gap between people’s perception of fish intelligence and the scientific reality. This is an important issue because public perception guides government policy. The perception of an animal’s intelligence often drives our decision on whether or not to include them in our moral circle. From a welfare perspective, most researchers would suggest that if an animal is sentient, then it can most likely suffer and should therefore be offered some form of formal protection.

Keyword: Consciousness; Evolution
Link ID: 24702 - Posted: 02.27.2018

By Aaron E. Carroll I remember the first time my daughter discovered her hand. The look of amazement on her face was priceless. It wasn’t long before she was putting that discovery to use, trying to put everything she could find into her mouth. Babies want to feed themselves. It sometimes feels as if parents spend more time trying to stop them than encouraging them. Over the last few years, however, some people have begun to ask if we are doing the right thing. Baby-led weaning is an approach to feeding that encourages infants to take control of their eating. It’s based on the premise that infants might be better self-regulators of their food consumption. It has even been thought that baby-led weaning might lead to reductions in obesity. While babies have been spoon-fed for a long time, the explosion of commercial foods for them might be making it too easy to overfeed them, an idea that the results from a cohort study in 2015 seemed to hint at. Those weaned in a baby-led approach seemed to be more responsive to being sated and were less likely to be overweight. A case-control study from 2012 also argued that baby-led weaning was associated with a lower body mass index (B.M.I.). Such observational studies cannot establish causality, however, and may be confounded in unmeasured ways. A recent randomized controlled trial accomplished what previous work could not. Pregnant women in New Zealand were recruited before they gave birth and randomly assigned to one of two groups. Both got standard midwifery and child care. But one group received eight more contacts, from pregnancy to the newborn’s ninth month. Five of these were with a lactation consultant, who encouraged the mothers to prolong breast-feeding and delay the introduction of solid foods until 6 months of age. 
The three other contacts were with research staffers who encouraged parents to read hunger and fullness cues from their infants and provide their babies (starting at 6 months) with foods that were high in energy and iron — easy to grab but hard to choke on. © 2018 The New York Times Company

Keyword: Obesity; Development of the Brain
Link ID: 24701 - Posted: 02.27.2018

Mike Shooter Sian was just 14, brought by her misery to the edge of self-harm, when I met her in a cafe at the top end of one of the old mining valleys. Neutral ground. She told me about her rugby-playing older brother and her bright little sister who had lots of pets and wanted to be a vet. She felt that her parents doted on them and that there could be no room in anyone’s heart for her. She told me about her only friend, who had been killed in a road accident just as they went up to big school. About the recent death of her grandmother, who had been the only person she could confide in. And about the GP who had said she was depressed and given her a course of pills. I thought about Sian again this week. The newspaper headlines across the world were welcoming a major study that confirmed the value of antidepressant medication in the treatment of depression in adults. And so did I. Depression was validated at long last as an illness every bit as serious as physical conditions, that could cause untold human suffering and economic devastation, but could be helped with a course of antidepressant pills. First things first, I heartily agree with what that survey was saying about adult treatment. After all, I have a recurrent depression myself that has needed frequent treatment over the years. I talked about it openly when I was president of the Royal College of Psychiatrists and have continued to do so from the public platform, in the media, and to anyone who will listen. I do this in the hope that it will help to dispel the stigma that surrounds mental illness and prevents people from seeking therapy until it is too late. The diagnosis made sense of what I was going through. It wasn’t my fault. And I was grateful for the medication.

Keyword: Depression; Development of the Brain
Link ID: 24700 - Posted: 02.27.2018

By MAYA SALAM and LIAM STACK President Trump said Thursday that violent video games and movies may play a role in school shootings, a claim that has been made — and rejected — many times since the increase in such attacks in the past two decades. Movies are “so violent,” Mr. Trump said at a meeting on school safety one day after he gathered with survivors of school shootings, including some from last week’s massacre at Marjory Stoneman Douglas High School, where, the authorities say, a former student, Nikolas Cruz, killed 17 people with a semiautomatic rifle. “We have to look at the internet because a lot of bad things are happening to young kids and young minds and their minds are being formed,” Mr. Trump said, “and we have to do something about maybe what they’re seeing and how they’re seeing it. And also video games. I’m hearing more and more people say the level of violence on video games is really shaping young people’s thoughts.” “And then you go the further step and that’s the movies,” he added. “You see these movies, they’re so violent, and yet a kid is able to see the movie if sex isn’t involved, but killing is involved.” A neighbor of Mr. Cruz’s told The Miami Herald that he played video games, often violent ones, for up to 15 hours a day. Media scholars say the claim — a common one in the wake of mass shootings — does not hold up to scrutiny. Mr. Trump is far from the first leader to argue that violence in video games or movies can lead to violence in the real world. A similar claim was made in the 1940s, when Mayor Fiorello La Guardia of New York argued that pinball — which was illegal in the city for over 30 years — was “dominated by interests heavily tainted with criminality.” © 2018 The New York Times Company

Keyword: Aggression
Link ID: 24699 - Posted: 02.26.2018

By Alexandra Rosati The shift to a cooked-food diet was a decisive point in human history. The main topic of debate is when, exactly, this change occurred. All known human societies eat cooked foods, and biologists generally agree cooking could have had major effects on how the human body evolved. For example, cooked foods tend to be softer than raw ones, so humans can eat them with smaller teeth and weaker jaws. Cooking also increases the energy they can get from the food they eat. Starchy potatoes and other tubers, eaten by people across the world, are barely digestible when raw. Moreover, when humans try to eat more like chimpanzees and other primates, we cannot extract enough calories to live healthily. Up to 50 percent of women who exclusively eat raw foods develop amenorrhea, or lack of menstruation, a sign the body does not have enough energy to support a pregnancy—a big problem from an evolutionary perspective. Such evidence suggests modern humans are biologically dependent on cooking. But at what point in our evolutionary history was this strange new practice adopted? Some researchers think cooking is a relatively recent innovation—at most 500,000 years old. Cooking requires control of fire, and there is not much archaeological evidence for hearths and purposefully built fires before this time. The archaeological record becomes increasingly fragile farther back in time, however, so others think fire may have been controlled much earlier. Anthropologist Richard Wrangham has proposed cooking arose before 1.8 million years ago, an invention of our evolutionary ancestors. If the custom emerged this early, it could explain a defining feature of our species: the increase in brain size that occurred around this time. © 2018 Scientific American,

Keyword: Evolution
Link ID: 24698 - Posted: 02.26.2018

Amelia Hill For a serious examination of the devastating and incurable disability that is narcolepsy, Henry Nicholls’s book, Sleepy Head, is a surprisingly funny account. There is the obvious, if somewhat cruel, humour to be found in stories of people falling asleep in surprising places: in a small boat sailing around the Farne Islands, with the freezing North Sea cascading over the gunwale; while scuba diving; on a rollercoaster; at the dentist’s; on the back of a horse; on a surfboard. But there are other extremely funny insights that Nicholls gives into the crepuscular world that narcoleptics inhabit: his laconic fretting over the etiquette of attending a CBT group for insomniacs, which he discovers he also suffers from while researching the book. “A narcoleptic attending an insomnia clinic could be seen as the height of insensitivity,” he deadpans. Then there’s the attempt to solve sleep apnoea by learning the didgeridoo. (Didgetherapy, since you ask. It involves acrylic didgeridoos and is, apparently, quite effective.) Misjudging his tone entirely, I arrive at our interview expecting a garrulous chat. I’m particularly excited that I opened Nicholls’s book thinking I was pretty special to be able to share with him the fact that my father also had narcolepsy – and close his book having realised that five of my closest family members (including myself) have had diagnosable sleep disorders ranging from sleep apnoea to night terrors to – my own thrilling self-realisation – an episode of hypnagogic hallucination and sleep paralysis. © 2018 Guardian News and Media Limited

Keyword: Narcolepsy; Sleep
Link ID: 24697 - Posted: 02.26.2018

By Natalie Crockett BBC News Older people in Wales are being urged to think about donating their brains after they die to help scientists researching dementia. Researchers at Cardiff University are not actively recruiting at the moment but are still keen to hear from people aged over 85 without a diagnosis. They also recruit donors with dementia, but healthy brains are needed for comparisons. Donor Ken Baxter, 75, said: "When I'm finished, it isn't any use to me." Since 2009, 460 people in Wales have signed up, with 79 successful donations made to the Brains for Dementia Research project so far. They are recruited through its team at the university, which is working to identify which genes contribute to a person's susceptibility to developing Alzheimer's disease. It is hoped they will then be able to predict which people are more likely to get it. But to do this they need to study human brain tissue, as looking at the distribution of protein deposits on the brain is the only way to get a definitive diagnosis of the disease. While donors who have dementia often find out about brain donation from medical professionals, it can be harder to attract those with healthy brains. Mr Baxter is one such donor and decided to donate his brain after seeing how dementia affected a friend. He saw it as a way to help others but admitted he does not always get a positive reaction to his plans. He said: "'[People say] are you sure? It's not something I want to do'. And some people are horrified when you tell them - I can't see a reason why but a lot of people take it the wrong way. "They think 'I've never thought of that' - but you're helping someone. If we can overcome these diseases, so much the better." © 2018 BBC.

Keyword: Alzheimers
Link ID: 24696 - Posted: 02.26.2018

Rhik Samadder The results of a comprehensive, six-year study confirmed last week what I’ve known a long time: antidepressants work. I know this because half the people I know are on them – and that’s only the half I know about. Antidepressants saved my life, they tell me, and I believe them. I don’t say: “The only thing you’ve swallowed is propaganda, mate, straight from Big Pharma’s chalky teat.” I would have to be a maniac to do that. And I’m not a maniac. At least, not in that way. I’ve been on antidepressants at various points in my life. And I’ve always been one of the 80% who come off them within a month, looking for another way. I quickly tire of the tweaking of drugs and dosages required to find the appropriate prescription. I freak out at the initial side-effects – the flaccidness in my brain, the lack of ideas in my underpants. More than that, I’ve always been uncomfortable accepting there is something medically wrong with me. To some extent, I stand by that. Our social structures perpetuate inequality, our media feeds feelings of inferiority, while our politics is an accelerated zoetrope of horror. I feel unnerved when I meet someone who isn’t depressed. What’s wrong with you, I want to ask. Still, while it’s not wrong to feel viscerally offended by many aspects of the modern world, when the strength of those feelings stops you living your life, it’s not a solution, either. What struck me from that study, below the headline, was another of its findings: that talking therapies are just as effective at treating moderate to severe depression.

Keyword: Depression
Link ID: 24695 - Posted: 02.26.2018

By JAMES GORMAN Recently someone (my boss, actually) mentioned to me that I wrote more articles about dogs than I did about cats and asked why. My first thought, naturally, was that it had nothing to do with the fact that I have owned numerous dogs and no cats, but rather reflected the amount of research done by scientists on the animals. After all, I’ll write about any interesting findings, and I like cats just fine, even if I am a dog person. Two of my adult children have cats, and I would hate for them to think I was paying them insufficient attention. (Hello Bailey! Hello Tawny! — Those are the cats, not the children.) But I figured I should do some reporting, so I emailed Elinor Karlsson at the Broad Institute and the University of Massachusetts. She is a geneticist who owns three cats, but does much of her research on dogs — the perfect unbiased observer. Her research, by the way, is about dog genomes. She gets dog DNA from owners who send in their pets’ saliva samples. The research I have been interested in and writing about involves evolution, domestication, current genetics and behavior. And the questions are of the What-is-a-dog-really? variety. Dogs and cats have also been used as laboratory animals in invasive experiments, but I wasn’t asking about which animal is more popular for those. I had gotten to know Dr. Karlsson a bit while reporting on research she was doing on wolves. I asked her whether there was indeed more research on dogs than cats, and if so, why? “Ooo, that is an interesting question!” she wrote back. “Way more interesting than the various grant-related emails that are filling up my inbox. “The research has lagged behind in cats. I think they’re taken less seriously than dogs, probably to do with societal biases. I have a vet in my group who thinks that many of the cancers in cats may actually be better models for human cancer, but there has been almost no research into them.” Better models than cancers in dogs, that is. 
Dogs do get many of the same cancers as humans, but in dogs the risk for these cancers often varies by breed, which narrows the target down when looking for the cause of a disease. © 2018 The New York Times Company

Keyword: Animal Rights
Link ID: 24694 - Posted: 02.26.2018

Emma Marris Neanderthals painted caves in what is now Spain before their cousins, Homo sapiens, even arrived in Europe, according to research published today in Science. The finding suggests that the extinct hominids, once assumed to be intellectually inferior to humans, may have been artists with complex beliefs. Ladder-like shapes, dots and handprints were painted and stenciled deep in caves at three sites in Spain. Their precise meaning may forever be unknowable, says Alistair Pike, an archaeologist at the University of Southampton, UK, who co-authored the study, but they were almost certainly meaningful to our lost kin. “It wasn’t simply decorating your living space,” Pike says. “People were making journeys into the darkness.” Humans are thought to have arrived in Europe from Africa around 40,000–45,000 years ago. The three caves in different parts of Spain yielded artworks that are at least 65,000 years old, according to uranium-thorium dating of calcium carbonate that had formed on top of the art. These mineral deposits develop slowly, as water containing calcium comes into contact with cave surfaces. The water also contains trace levels of uranium from the rock. After the calcium carbonate has precipitated out of the water, a clock of sorts begins to tick, as uranium decays into thorium at a steady, known rate. Uranium-thorium dating has been used in geology for decades, but has seldom been employed to estimate the age of cave art. Some archaeologists are sceptical of the approach. They suggest that the calcium carbonate could have dissolved and re-crystallized after it was first formed — a process that could have also washed away some uranium, making a sample of the mineral appear older than it is. © 2018 Macmillan Publishers Limited
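The decay clock described in that excerpt can be illustrated with a back-of-the-envelope calculation. This sketch is not the study authors' method: the function name, the rounded thorium-230 half-life, and the simplifying assumptions (no initial thorium, a closed system, and uranium-234 in secular equilibrium with uranium-238) are all mine, for illustration only.

```python
import math

# Rounded half-life of thorium-230, about 75,600 years. In this
# textbook simplification, the thorium "grows in" toward equilibrium
# with its uranium parent at a rate set by this half-life.
TH230_HALF_LIFE_YEARS = 75_600.0
LAMBDA_TH230 = math.log(2) / TH230_HALF_LIFE_YEARS


def min_age_years(th_u_activity_ratio: float) -> float:
    """Minimum age of a carbonate crust from its measured (230Th/234U)
    activity ratio, assuming no initial thorium and a closed system:
    ratio = 1 - exp(-lambda * t), solved for t."""
    if not 0.0 <= th_u_activity_ratio < 1.0:
        raise ValueError("activity ratio must lie in [0, 1)")
    return -math.log(1.0 - th_u_activity_ratio) / LAMBDA_TH230


# A crust whose thorium has grown to about 45% of equilibrium works out
# to roughly 65,000 years, about the minimum age reported for the art.
print(f"{min_age_years(0.45):,.0f} years")
```

Because the dated calcite formed on top of the paint, any such number is a minimum age for the art underneath, which is why even a simplified calculation like this is read as "at least ~65,000 years."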

Keyword: Evolution
Link ID: 24693 - Posted: 02.23.2018

Barbara J. King When humans talk to each other or walk alongside each other, we tend to match each other's subtle movements. Called interpersonal movement synchrony in the science literature and mirroring in the popular media, it's an often-unconscious process during which we match our gestures and pace to that of our social partner of the moment. Writing in the March issue of the journal Animal Cognition, Charlotte Duranton, Thierry Bedossa, and Florence Gaunet note that this process is "evolutionarily adaptive" for us: "It contributes to communication between individuals by signaling the convergence of their inner states and fostering social cohesion." Then, these three researchers present evidence to show that dogs synchronize their walking pace with their humans in a way that may also reflect an evolutionary adaptation. In an experiment, 36 pet dogs were brought to an open area in Maisons-Laffitte, France, with their owners. After a 15-minute free period, the owner-dog pairs experienced three testing conditions presented in random order. These were: stay-still (owner didn't move for 10 seconds), normal-walk (owners walked at normal speed for 10 seconds), and fast-walk (owner walked fast for 10 seconds). Importantly, the dogs were off-leash and, thus, not tethered in any way to the speed of the owners. The owners were told not to look at, or talk to, their dogs — or to show any evident emotion. The experimenters filmed the trials as they occurred. The dogs synchronized their pace closely with their owners, speeding up when the owners walked at an unnaturally fast pace. (The dogs in their regular routines were used to walking at a normal pace, with the owners often pausing to chat with other people). © 2018 npr

Keyword: Animal Communication; Emotions
Link ID: 24692 - Posted: 02.23.2018

By BENEDICT CAREY President Trump called again on Thursday for the opening of more mental hospitals to help prevent mass murders like the one at Marjory Stoneman Douglas High School in Parkland, Fla. Yet ramping up institutional care, experts say, likely would not have prevented most of the spree killings regularly making headlines in this country. “We’re going to be talking about mental institutions. And when you have some person like this, you can bring them into a mental institution, and they can see what they can do. But we’ve got to get them out of our communities,” the president said during a meeting at the White House with state and local officials. In the 1960s, states across the country began to close or shrink mental hospitals after a series of court decisions that limited the powers of state and local officials to commit people. The decline continued for decades, in part because of cuts in both state and federal budgets for mental health care. Those institutions housed people with severe mental disorders, like schizophrenia, who were deemed unable to care for themselves. And while spree killers may be angry and emotionally disordered, few have had the sorts of illnesses that would have landed them in hospital custody. The latest school shooter, Nikolas Cruz, 19, was clearly troubled and making threats, and he was stockpiling weapons. But he had no mental diagnosis. He has been described as angry, possibly depressed, perhaps isolated — not so different from millions of other teenagers. A full psychiatric evaluation, if he’d had one, might have resulted in a temporary commitment at best, but not full-time institutionalization, experts said. The idea that more such institutions would prevent this kind of violence “is ridiculous, because you can’t put half the people in the country with a mental disturbance in mental hospitals,” said Dr. Michael Stone, a forensic psychiatrist at Columbia University who has studied mass killers. © 2018 The New York Times Company

Keyword: Aggression; Schizophrenia
Link ID: 24691 - Posted: 02.23.2018

It was disappointing to read such an uncritical description of the latest analysis of antidepressant trials, one that does not address doubts about the widespread use of these drugs (The drugs do work, says study of antidepressants, 22 February). The analysis consists of comparing “response” rates between people on antidepressants and those on placebo. But “response” is an artificial category that has been arbitrarily constructed out of the data actually collected, which consist of scores on depression rating scales. Analysing categories inflates differences. When scores are compared, differences are trivial, and unlikely to be clinically relevant. Moreover, even these small differences are easily accounted for by the fact that antidepressants produce more or less subtle mental and physical alterations (eg nausea, dry mouth, drowsiness and emotional blunting) irrespective of whether or not they treat depression. These enable participants to guess whether they have been allocated to antidepressant or placebo, thus enhancing the placebo effect of the active drugs. This may explain why antidepressants that cause the most noticeable alterations, such as amitriptyline, appeared to be the most effective. “Real world” studies show that people treated with antidepressants have poor outcomes and fare worse than depressed people who do not receive antidepressants. Increased prescribing will do more harm than good. Adverse effects include sexual dysfunction, which may occasionally persist after the drugs are stopped; agitation; suicidal and aggressive behaviour among younger users; prolonged and severe withdrawal effects; and foetal abnormalities. The costs of encouraging more people to consider themselves flawed and diseased are hard to quantify. © 2018 Guardian News and Media Limited

Keyword: Depression
Link ID: 24690 - Posted: 02.23.2018