Chapter 16


Links 21 - 40 of 2373

Ewen Callaway. Ringo, a golden retriever born in 2003 in a Brazilian kennel, was never expected to live long. Researchers bred him and his littermates to inherit a gene mutation that causes severe muscular dystrophy. They hoped that the puppies would provide insight into Duchenne muscular dystrophy (DMD), an untreatable and ultimately fatal human disease caused by inactivation of the same gene. But Ringo’s muscles didn't waste away like his littermates', and researchers have now determined why: he was born with another mutation that seems to have protected him from the disease, according to a paper published in Cell. Scientists hope that by studying Ringo’s mutation — which has never before been linked to muscular dystrophy — they can find new treatments for the disease. As many as 1 in 3,500 boys inherit mutations that produce a broken version of a protein called dystrophin, causing DMD. (The disease appears in boys because the dystrophin gene sits on the X chromosome, so girls must inherit two copies of the mutated gene to develop DMD.) The protein helps to hold muscle fibres together, and its absence disrupts the regenerative cycle that rebuilds muscle tissue. Eventually, fat and connective tissue replace muscle, and people with DMD often become reliant on a wheelchair before their teens. Few survive past their thirties. Some golden retriever females carry dystrophin mutations that cause a similar disease when passed on to male puppies. Dog breeders can prevent this through genetic screening. But Mayana Zatz, a geneticist at the University of São Paulo in Brazil, and her colleagues set out to breed puppies with the mutation to model the human disease. © 2015 Nature Publishing Group

Keyword: Movement Disorders; Muscles
Link ID: 21632 - Posted: 11.14.2015

Alva Noë. For some time now, I've been skeptical about the neuroscience of consciousness. Not so much because I doubt that consciousness is affected by neural states and processes, but because of the persistent tendency on the part of some neuroscientists to think of consciousness itself as a neural phenomenon. Nothing epitomizes this tendency better than Francis Crick's famous claim — he called it his "astonishing hypothesis" — that you are your brain. At an interdisciplinary conference at Brown not so long ago, I heard a prominent neuroscientist blandly assert, as if voicing well-established scientific fact, that thoughts, feelings and beliefs are specific constellations of matter that are located (as it happens) inside the head. My own view — I laid this out in a book I wrote a few years back called Out of Our Heads — is that the brain is only part of the story, and that we can only begin to understand how the brain makes us conscious by realizing that the brain functions only in the setting of our bodies and our broader environmental (including our social and cultural) situation. The skull is not a magical membrane, my late collaborator, friend and teacher Susan Hurley used to say. And there is no reason to think the processes supporting consciousness are confined to what happens only on one side (the inside) of that boundary. There is a nice interview on the Oxford University Press website with Anil Seth, the editor of a new Oxford journal, Neuroscience of Consciousness. It's an informative discussion and makes the valuable point that the study of consciousness is interdisciplinary. © 2015 npr

Keyword: Consciousness
Link ID: 21631 - Posted: 11.14.2015

By Virginia Morell You and your partner are hungry, but your favorite pizza parlor will only let your mate in to dine. What do you do? If you’re a great tit (Parus major), a songbird found from Europe to Northern Asia, you wait by yourself, even though theoretically you would be better off looking for food elsewhere, scientists have discovered. To find out whether the small birds, pictured above, prefer food or hanging out with their mates, the researchers conducted a series of experiments with a long-studied population of wild great tits in the United Kingdom. They set up 12 feeding stations that would only open to great tits wearing particular radio frequency identification (RFID) tags. Half of the stations unlocked only to birds with even-numbered RFID tags; the others opened to great tits wearing odd-numbered tags. The scientists randomly outfitted 10 mated pairs of the birds with identical tags so that they could enter the stations and feed together; and seven pairs with incompatible tags, so that one was locked out. They followed the birds for 90 days, recording 66,184 visits to the feeders. The pairs with the incompatible tags spent almost four times longer at the prohibited feeders than did the compatible pairs—even though one bird was stuck outside, the scientists report today in Current Biology. Other studies have shown that birds may forage in flocks, despite having less to eat, because there are other benefits, such as having others to help watch for or defend against predators. But this is the first experimental study to show that wild birds will choose their mate over food—a decision that also determines where they travel and what other individuals they associate with, which could affect their social rank, the scientists say. Many of the locked-out birds learned a new trick, too. 
After a great tit with the correct RFID code entered a feeder, the door didn’t slam shut for 2 seconds—just enough time for one of the incompatible birds to slip in and join his sweetie. © 2015 American Association for the Advancement of Science.
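The even/odd gating the researchers used amounts to a simple parity check on each bird's tag number. A minimal sketch of that logic, with made-up tag numbers (the real stations were RFID hardware, not software like this):

```python
def feeder_opens(station_parity: str, tag_id: int) -> bool:
    """Return True if a station keyed to 'even' or 'odd' tags admits this bird."""
    is_even = tag_id % 2 == 0
    return is_even if station_parity == "even" else not is_even

# A compatible pair: both mates carry even-numbered tags, so both get in.
assert feeder_opens("even", 42) and feeder_opens("even", 108)

# An incompatible pair: the odd-tagged bird enters, its even-tagged mate is locked out.
assert feeder_opens("odd", 17) and not feeder_opens("odd", 42)
```

The incompatible pairs in the study effectively discovered the one loophole in this scheme: the 2-second delay before the door closed behind a legitimately admitted bird.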

Keyword: Sexual Behavior
Link ID: 21630 - Posted: 11.14.2015

By Emilie Reas What makes for a long-lasting memory? Research has shown that emotional or important events take root deeply, whereas neutral or mundane happenings create weak impressions that easily fade. But what about an experience that initially seemed forgettable but was later shown to be important? Animal research suggested that these types of older memories could be strengthened, but scientists had not been able to replicate this finding in humans—until now. New evidence suggests that our initially weak memories are maintained by the brain for a period, during which they can be enhanced. In the recent study published in Nature, psychologists at New York University showed 119 participants a series of images of tools and animals. A few minutes later the subjects saw a new set of images, with an electric shock paired with either the tools or the animals, to increase the salience of just one of those categories. The participants' memories for both sets of images were then tested either immediately, six hours later or the next day. Participants remembered images from the first neutral series better if they belonged to the same category (tool or animal) that was later paired with the shock. The findings suggest that even if an event does not seem meaningful when it occurs, a later cue that the experience was important can enhance the old memory. Although research has not yet demonstrated this effect outside the laboratory, the scientists speculate it happens often in daily life. For example, imagine you meet several new people at a networking event. During a job interview days later, you discover that one of those acquaintances is on the hiring committee, and suddenly the details of your conversation at the networking event become vivid and memorable—whereas the conversations you had with others at the event fade with time. © 2015 Scientific American

Keyword: Learning & Memory
Link ID: 21629 - Posted: 11.12.2015

Rachel England. Brussels sprouts, Marmite, stinky cheese … these are all foods guaranteed to create divisions around the dinner table – and sometimes extreme reactions. A friend once ordered a baked camembert at dinner and I had to physically remove myself from the vicinity, such was its overpowering stench. Yet foods that once turned my stomach – mushrooms and prawns, in particular – now make a regular appearance on my plate. How is it that my opinion of a juicy grilled mushroom has gone from yuk to yum after 30 years of steadfast objection? And why is it that certain foods leave some diners gagging theatrically while others tuck in with vigour? Taste is a complicated business. In evolutionary terms we’re programmed to prefer sweeter flavours to bitter tastes: sweet ripe fruits provide a good source of nutrients and energy, for example, while bitter flavours can be found in dangerous plant toxins, which we’re better off avoiding. We’re also more likely to go for fatty foods with a high calorie count, which would provide the energy needed for hunting our next meal. But now we live in a world where bitter vegetables such as kale reign supreme, kids salivate over eye-wateringly sour sweets and hunting dinner is as strenuous as picking up the phone. There are some environmental factors at play. When you eat something, molecules in the food hit your taste cells in such a way as to send a message to your brain causing one of five sensations: sweetness, saltiness, bitterness, sourness or umami (a loanword from Japanese meaning ‘pleasant savoury taste’). Mix up these taste cells and messages with external influences and the results can be dramatic. © 2015 Guardian News and Media Limited

Keyword: Chemical Senses (Smell & Taste)
Link ID: 21628 - Posted: 11.12.2015

Lauren Morello. When Fiona Ingleby took to Twitter last April to vent about a journal’s peer-review process, she didn’t expect much of a response. With only around 100 followers on the social-media network, Ingleby — an evolutionary geneticist at the University of Sussex near Brighton, UK — guessed that she might receive a few messages of support or commiseration from close colleagues. What she got was an overwhelming wave of reaction. In four pointed tweets, Ingleby detailed her frustration with a PLoS ONE reviewer who tried to explain away her findings on gender disparities in the transition from PhD to postdoc. He suggested that men had “marginally better health and stamina”, and that adding “one or two male biologists” as co-authors would improve the analysis. The response was a full-fledged ‘Twitterstorm’ that spawned more than 5,000 retweets, a popular hashtag — #addmaleauthorgate — and a public apology from the journal. “Things went really mental,” Ingleby says. “I had to turn off the Twitter notifications on my e-mail.” Yet her experience is not as unusual as it may seem. Social media has enabled an increasingly public discussion about the persistent problem of sexism in science. When a male scientist with the European Space Agency (ESA) wore a shirt patterned with half-naked women to a major media event in November 2014, Twitter blazed with criticism. The site was where the first reports surfaced in June of Nobel Prize-winning biologist Tim Hunt’s self-confessed “trouble with girls” in laboratories. And in mid-October, many astronomers took to Twitter to register their anger and disappointment when the news broke that Geoffrey Marcy, an exoplanet hunter at the University of California, Berkeley, was found to have sexually harassed female subordinates for at least a decade. © 2015 Nature Publishing Group

Keyword: Sexual Behavior
Link ID: 21627 - Posted: 11.12.2015

By Emily Underwood Researchers have found a way to increase how fast, and for how long, four paralyzed people can type using just their thoughts. The advance has to do with brain-machine interfaces (BCI), which are implanted in brain tissue and record hundreds of neurons firing as people imagine moving a computer cursor. The devices then use a computer algorithm to decode those signals and direct a real cursor toward words and letters on a computer screen. One of the biggest problems with BCIs is the brain itself: When the soft, squishy organ shifts in the skull, as it frequently does, it can displace the electrode implants. As a result, the movement signal extracted from neuronal firing is constantly being distorted, making it impossible for a patient to keep the cursor from drifting off course without a researcher recalibrating the instrument every 10 minutes or so. In the new study, part of a clinical trial of BCIs called BrainGate, researchers performed several software tweaks that allow the devices to self-correct in real time by calculating the writer’s intention based on the words they’ve already written. The devices can now also correct for neuronal background noise whenever a person stops typing. These improvements, demonstrated in the video above, allow BCI users to type faster and for longer periods of time, up to hours or days, the team reports today in Science Translational Medicine. Though the technology still needs to be miniaturized and wireless before it can be used outside of the lab, the new work is a big step towards BCIs that paralyzed people can use on their own at home, the scientists say. © 2015 American Association for the Advancement of Science
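The retrospective self-correction described above hinges on inferring what the user meant to type from what has already appeared on screen. As a loose, text-only illustration of that idea (the actual BrainGate decoder works on neural signals and is far more sophisticated; the word list and mismatch metric here are invented):

```python
# Hypothetical vocabulary; a real system would infer intended letters, not words.
VOCAB = ["hello", "help", "world", "word"]

def infer_intended(typed: str) -> str:
    """Snap a possibly mis-decoded string to the closest vocabulary word."""
    def distance(a: str, b: str) -> int:
        # Per-position mismatch count, padded to the longer length.
        n = max(len(a), len(b))
        return sum(1 for i in range(n)
                   if i >= len(a) or i >= len(b) or a[i] != b[i])
    return min(VOCAB, key=lambda w: distance(typed, w))
```

Once the intended output is inferred this way, the decoder's mapping from neural firing to cursor movement can be refit against it, which is what lets the device compensate for electrode drift without a researcher stepping in.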

Keyword: Robotics
Link ID: 21626 - Posted: 11.12.2015

By James Gallagher, Health editor, BBC News website. A mass vaccination programme against meningitis A in Africa has been a "stunning success", say experts. More than 220 million people were immunised across 16 countries in the continent's meningitis belt. In 2013 there were just four cases across the entire region, which once faced thousands of deaths each year. However, there are fresh warnings from the World Health Organization that "huge epidemics" could return unless a new vaccination programme is started. The meningitis belt stretches across sub-Saharan Africa from Gambia in the west to Ethiopia in the east. In the worst epidemic recorded, in 1996-97, the disease swept across the belt, infecting more than a quarter of a million people and leading to 25,000 deaths. Unlike other vaccines, MenAfriVac was designed specifically for Africa, and in 2010 a mass vaccination campaign was started. "The disease has virtually disappeared from this part of the world," said Dr Marie-Pierre Preziosi from the World Health Organization. The mass immunisation programme was aimed at people under 30. However, routine vaccination will be needed to ensure that newborns are not vulnerable to the disease. Projections, published in the journal Clinical Infectious Diseases, showed the disease could easily return. Dr Preziosi told the BBC News website: "What could happen is a huge epidemic that could sweep the entire area, that could target hundreds of thousands of people with 5-10% deaths at least." © 2015 BBC

Keyword: Miscellaneous
Link ID: 21624 - Posted: 11.11.2015

By Virginia Morell Plunge a live crab into a pot of boiling water, and it’s likely to try to scramble out. Is the crab’s behavior simply a reflex, or is it a sign of pain? Many scientists doubt that any invertebrate (or fish) feels pain because they lack the areas in the brain associated with human pain. Others argue this is an unfair comparison, noting that despite the major differences between vertebrate and invertebrate brains, their functions (such as seeing) are much the same. To get around this problem, researchers in 2014 argued that an animal could be classified as experiencing pain if, among other things, it changes its behavior in a way that indicates it’s trying to prevent further injury, such as through increased wariness, and if it shows a physiological change, such as elevated stress hormones. To find out whether crabs meet these criteria, scientists collected 40 European shore crabs (Carcinus maenas), shown in the photo above, in Northern Ireland. They placed the animals into individual tanks, and gave half 200-millisecond electrical shocks every 10 seconds for 2 minutes in their right and left legs. The other 20 crabs served as controls. Sixteen of the shocked crabs began walking in their tanks, and four tried to climb out. None of the control crabs attempted to clamber up the walls, but 14 walked, whereas six didn’t move at all. There was, however, one big physiological difference between the 16 shocked, walking crabs and the 14 control walkers, the scientists report in today’s issue of Biology Letters: Those that received electrical jolts had almost three times the amount of lactic acid in their haemolymph, a fluid that’s analogous to the blood of vertebrates—a clear sign of stress. Thus, crabs pass the bar scientists set for showing that an animal feels pain. © 2015 American Association for the Advancement of Science.

Keyword: Pain & Touch; Evolution
Link ID: 21623 - Posted: 11.11.2015

By Arlene Karidis Several years ago, Peggy Chenoweth began having excruciating cramping in her ankle. It felt severely sprained and as if her toe were twisting to the point where it was being ripped off her foot. “The pain is right here,” she told an orthopedic surgeon, “in my ankle and foot.” But the 41-year-old Gainesville, Va., resident no longer had that ankle and foot. Her leg had been amputated below the knee after a large piece of computer equipment fell off a cart, crushed her foot and caused nerve damage. Further, she insisted that since the amputation, she could feel her missing toes move. Chenoweth’s surgeon knew exactly what was going on: phantom pain. Lynn Webster, an anesthesiologist and past president of the American Academy of Pain Medicine, explains the phenomenon: “With ‘phantom pain,’ nerves that transmitted information from the brain to the now-missing body part continue to send impulses, which relay the message of pain.” It feels as if the removed part is still there and hurting, but pain is actually in the brain. The sensation ranges from annoying itching to red-hot burning. Physicians wrote about phantom pain as early as the 1860s, but U.S. research on this condition has increased recently, spurred by the surge of amputees returning from warfare in Iraq and Afghanistan and by increasing rates of diabetes. (Since 2003, nearly 1,650 service members have lost limbs, according to the Congressional Research Service. In 2010, about 73,000 amputations were performed on diabetics in the United States, according to the Centers for Disease Control and Prevention.)

Keyword: Pain & Touch
Link ID: 21620 - Posted: 11.10.2015

By Jason G. Goldman When a monkey has the sniffles or a headache, it doesn't have the luxury of popping a few painkillers from the medicine cabinet. So how does it deal with the common colds and coughs of the wildlife world? University of Georgia ecologist Ria R. Ghai and her colleagues observed a troop of more than 100 red colobus monkeys in Uganda's Kibale National Park for four years to figure out whether the rain forest provides a Tylenol equivalent. Monkeys infected with a whipworm parasite were found to spend more time resting and less time moving, grooming and having sex. The infected monkeys also ate twice as much tree bark as their healthy counterparts even though they kept the same feeding schedules. The findings were published in September in the journal Proceedings of the Royal Society B. The fibrous snack could help literally sweep the intestinal intruder out of the simians' gastrointestinal tracts, but Ghai suspects a more convincing reason. Seven of the nine species of trees and shrubs preferred by sick monkeys have known pharmacological properties, such as antisepsis and analgesia. Thus, the monkeys could have been self-medicating, although she cannot rule out other possibilities. The sick individuals were, however, using the very same plants that local people use to treat illnesses, including infection by whipworm parasites. And that “just doesn't seem like a coincidence,” Ghai says. © 2015 Scientific American,

Keyword: Drug Abuse; Evolution
Link ID: 21619 - Posted: 11.10.2015

By Katherine Ellison Last year, Sinan Sonmezler of Istanbul refused to keep going to school. His eighth-grade classmates called him “weird” and “stupid,” and his teachers rebuked him for his tendency to stare out the window during class. The school director told his parents he was “lazy.” Sinan has attention-deficit hyperactivity disorder, a condition still little understood in many parts of the world. “He no longer believes he can achieve anything, and has quit trying,” said Sinan’s father, Umit Sonmezler, a mechanical engineer. While global diagnoses of A.D.H.D. are on the rise, public understanding of the disorder has not kept pace. Debates about the validity of the diagnosis and the drugs used to treat it — the same that have long polarized Americans — are now playing out from Northern and Eastern Europe to the Middle East and South America. Data from various nations tell a story of rapid change. In Germany, A.D.H.D. diagnosis rates rose 381 percent from 1989 to 2001. In the United Kingdom, prescriptions for A.D.H.D. medications rose by more than 50 percent in five years to 657,000 in 2012, up from 420,000 in 2007. Consumption of A.D.H.D. medications doubled in Israel from 2005 to 2012. The surge in use of the medications has prompted skepticism that pharmaceutical firms, chasing profits in an $11 billion international market for A.D.H.D. drugs, are driving the global increase in diagnoses. In 2007, countries outside the United States accounted for only 17 percent of the world’s use of Ritalin. By 2012, that number had grown to 34 percent. © 2015 The New York Times Company

Keyword: ADHD
Link ID: 21618 - Posted: 11.10.2015

By Michelle Roberts, Health editor, BBC News online. An increasingly warped sense of humour could be an early warning sign of impending dementia, say UK experts. The University College London study involved patients with frontotemporal dementia, with the results appearing in the Journal of Alzheimer's Disease. Questionnaires from the friends and family of the 48 patients revealed many had noticed a change in humour years before the dementia had been diagnosed. This included laughing inappropriately at tragic events. Experts say more studies are now needed to understand how and when changes in humour could act as a red flag for dementia. There are many different types of dementia and frontotemporal dementia is one of the rarer ones. The area of the brain it affects is involved with personality and behaviour, and people who develop this form of dementia can lose their inhibition, become more impulsive and struggle with social situations. Dr Camilla Clark and colleagues recruited 48 patients from their dementia clinic at University College London. And they asked the friends or relatives of the patients to rate their loved one's liking for different kinds of comedy - slapstick comedy such as Mr Bean, satirical comedy such as Yes, Minister or absurdist comedy such as Monty Python - as well as any examples of inappropriate humour. Nearly all of the respondents said, with hindsight, that they had noticed a shift in the nine years before the dementia had been diagnosed. Many of the patients had developed a dark sense of humour - for example, laughing at tragic events in the news or in their personal lives. The dementia patients also tended to prefer slapstick to satirical humour, when compared with 21 healthy people of a similar age. © 2015 BBC.

Keyword: Alzheimers
Link ID: 21617 - Posted: 11.10.2015

Angus Chen. English bursts with consonants. We have words that string consonants one after another, like angst, diphthong and catchphrase. But other languages keep more vowels and open sounds. And that variability might be because they evolved in different habitats. Consonant-heavy syllables don't carry very well in places like windy mountain ranges or dense rainforests, researchers say. "If you have a lot of tree cover, for example, [sound] will reflect off the surface of leaves and trunks. That will break up the coherence of the transmitted sound," says Ian Maddieson, a linguist at the University of New Mexico. That can be a real problem for complicated consonant-rich sounds like "spl" in "splice" because of the series of high-frequency noises. In this case, there's a hiss, a sudden stop and then a pop. Where a simple, steady vowel sound like "e" or "a" can cut through thick foliage or the cacophony of wildlife, these consonant-heavy sounds tend to get scrambled. Hot climates might wreck a word's coherence as well, since sunny days create pockets of warm air that can punch into a sound wave. "You disrupt the way it was originally produced, and it becomes much harder to recognize what sound it was," Maddieson says. "In a more open, temperate landscape, prairies in the Midwest of the United States [or in Georgia] for example, you wouldn't have that. So the sound would be transmitted with fewer modifications." © 2015 npr
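One crude way to quantify how "consonant-heavy" a word is, in the spirit of the comparison being made here, is a consonant-to-vowel ratio. This orthographic sketch is only illustrative; the actual research compares phonological inventories across languages, not English spelling:

```python
VOWELS = set("aeiou")

def consonant_vowel_ratio(word: str) -> float:
    """Crude consonant-heaviness score: consonants per vowel (orthographic)."""
    letters = [c for c in word.lower() if c.isalpha()]
    vowels = sum(1 for c in letters if c in VOWELS)
    consonants = len(letters) - vowels
    return consonants / max(vowels, 1)

# Consonant-packed "angst" scores far higher than vowel-rich "aloha".
assert consonant_vowel_ratio("angst") > consonant_vowel_ratio("aloha")
```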

Keyword: Language; Evolution
Link ID: 21616 - Posted: 11.07.2015

By Rachel E. Gross For decades, Michael Jackson had struggled to fall asleep at night. But in 2009 the pop singer was preparing for his worldwide comeback tour, and he couldn’t afford to be at anything less than 100 percent. Desperate for sleep, he convinced an unscrupulous physician to give it to him synthetically in the form of an anesthetic so strong that it sent him almost immediately into a “druglike coma.” At first, Jackson would wake up feeling refreshed. But the nightly injections conferred only the shadow of true sleep, with none of the deep, dream-filled REM cycles that his body needed. Soon he was fading fast, his mind and mood slipping away. Within two months Jackson was dead of an overdose. If that hadn’t killed him, doctors later testified during his wrongful death trial, he would have died of sleep deprivation. Jackson’s is a particularly dramatic case. But his struggle for oblivion rings true to anyone who has dealt with insomnia. “I’m for anything that gets you through the night,” Frank Sinatra once said, “be it prayer, tranquilizers, or a bottle of Jack Daniel’s.” If you have insomnia, you’ll understand this sentiment, and you’re not alone: Regular sleep eludes up to 15 percent of the population, making insomnia the most commonly diagnosed sleep problem in America. Fortunately, the nighttime affliction is becoming steadily less mysterious—at least from the perspective of neuroscience. While insomniacs toss and turn, researchers are finally starting to understand this elusive disease. As it turns out, chronic insomnia may be more hard-wired into our brains than we had thought, and indicative of larger differences that separate the brains of the sleepless from those who so effortlessly enter the land of dreams. © 2015 The Slate Group LLC

Keyword: Sleep
Link ID: 21614 - Posted: 11.07.2015

By Erika Beras From the backseat of a cab, the moves a driver makes may at times seem, let’s say, daring. In fact, cabbies may actually be better, more agile drivers than the rest of us. Because they know their streets so well. Previous research found that the hippocampus in the brain of a typical cab driver is enlarged. That’s the part of the brain used in navigation. But now a study confirms that learning detailed navigation information does indeed cause that part of the brain to grow. The findings are in the journal NeuroImage. Researchers had young adults who were not regular gamers play a driving simulation game. Some practiced maneuvering the same route 20 times, while other players were confronted with 20 different routes. The participants’ brains were scanned before they performed the simulated driving and again after. Researchers found that subjects who kept repeating the same route increased their speed more than those driving multiple routes. The single-route drivers were also much better able to put in order a sequence of random pictures taken along the way and to draw a map of the route. The investigators also found increases in the single-route drivers in the functional connectivity between the hippocampus and other parts of the brain involved with navigation. And the amount of change was directly related to the amount of improvement each participant displayed. © 2015 Scientific American

Keyword: Learning & Memory
Link ID: 21612 - Posted: 11.07.2015

By Kelli Whitlock Burton More than half of Americans over the age of 70 have cataracts, caused by clumps of proteins collecting in the eye lens. The only way to remove them is surgery, an unavailable or unaffordable option for many of the 20 million people worldwide who are blinded by the condition. Now, a new study in mice suggests eye drops made with a naturally occurring steroid could reverse cataracts by teasing apart the protein clumps. “This is a game changer in the treatment of cataracts,” says Roy Quinlan, a molecular biologist at Durham University in the United Kingdom who was not part of the study. “It takes decades for the cataracts to get to that point, so if you can reverse that by a few drops in the eye over a couple of weeks, that’s amazing.” The proteins that make up the human lens are among the oldest in the body, forming at about 4 weeks after fertilization. The majority are crystallins, a family of proteins that allow the eye to focus and keep the lens clear. Two of the most abundant crystallins, CRYAA and CRYAB, are produced in response to stress or injury. They act as chaperones, identifying and binding to damaged and misfolded proteins in the lens, preventing them from aggregating. But over the years, as damaged proteins accumulate in the lens, these chaperones become overwhelmed. The mutated proteins then clump together, blocking light and producing the tell-tale cloudiness of cataracts. © 2015 American Association for the Advancement of Science

Keyword: Vision
Link ID: 21611 - Posted: 11.06.2015

THINK twice before you tell that fib. By watching courtroom videos, a computer has learned to predict if someone is telling the truth or a lie. A machine learning algorithm trained on the faces of defendants in recordings of real trials, including that of Andrea Sneiderman (above) who was convicted of lying, correctly identified truth-tellers about 75 per cent of the time. Humans managed just 59.5 per cent. The best interrogators can reach 65 per cent. “We’re actually pretty bad lie detectors,” says Rada Mihalcea at the University of Michigan in Ann Arbor. Mihalcea and her colleagues took 121 videos from sources such as the Innocence Project, a non-profit group in Texas dedicated to exonerating people with wrongful convictions. This is superior to simulated conversation because the speakers are more invested in what they are saying. Transcriptions of the videos that included the speaker’s gestures and expressions were fed into a machine learning algorithm, along with the trial’s outcome. To hone it further, the team plans to feed in even more data. Such a system could one day spot liars in real-time in court or at airport customs, says Mihalcea, who will present the work at the International Conference on Multimodal Interaction this month in Seattle, Washington. © Copyright Reed Business Information Ltd.
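At its core, the approach pairs annotated transcripts (words plus gesture labels) with trial outcomes and trains a classifier on them. A toy nearest-centroid sketch of that pipeline, with invented transcripts and labels (the real system used 121 court videos and a stronger machine-learning model):

```python
from collections import Counter

def featurize(transcript: str) -> Counter:
    """Bag-of-tokens features; bracketed tokens stand in for gesture annotations."""
    return Counter(transcript.lower().split())

def train_centroids(examples):
    """examples: list of (transcript, label). Returns per-label mean token counts."""
    sums, counts = {}, Counter()
    for text, label in examples:
        sums.setdefault(label, Counter()).update(featurize(text))
        counts[label] += 1
    return {label: {tok: n / counts[label] for tok, n in cent.items()}
            for label, cent in sums.items()}

def classify(transcript: str, centroids) -> str:
    """Assign the label whose centroid overlaps most with the transcript's features."""
    feats = featurize(transcript)
    def similarity(cent):
        return sum(feats[tok] * cent.get(tok, 0.0) for tok in feats)
    return max(centroids, key=lambda lbl: similarity(centroids[lbl]))

# Invented training data, purely for illustration.
training = [
    ("[shakes head] i did not do it", "deceptive"),
    ("[shakes head] never saw her i swear", "deceptive"),
    ("[nods] i was at home that evening", "truthful"),
    ("[nods] we spoke on the phone", "truthful"),
]
centroids = train_centroids(training)
assert classify("[shakes head] i did not see it", centroids) == "deceptive"
```

The reported 75 per cent accuracy would correspond to evaluating a model like this (held out from training) against the known trial outcomes.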

Keyword: Emotions
Link ID: 21610 - Posted: 11.06.2015

David Cyranoski. A Chinese neuroscientist has been sacked after reporting he had used magnetic fields to control neurons and muscle cells in nematode worms (pictured), using a protein that senses magnetism. Tsinghua University in Beijing has sacked a neuroscientist embroiled in a dispute over work on a long-sought protein that can sense magnetic fields. The university has not given a specific reason for the dismissal, however, and the scientist involved, Zhang Sheng-jia, says that he will contest their action. In September, Zhang reported in the journal Science Bulletin that he could manipulate neurons in worms by applying a magnetic field — a process that uses a magnetic-sensing protein. But a biophysicist at neighbouring Peking University, Xie Can, who claims to have discovered the protein’s magnetic-sensing capacity and to have a paper detailing his research under review, complained that Zhang should not have published his paper before Xie’s own work appeared. Xie said that by publishing, Zhang violated an agreement that the pair had reached — although the two scientists tell different versions about the terms of their agreement, and have different explanations of how Zhang came to be working with the protein. © 2015 Nature Publishing Group

Keyword: Animal Migration
Link ID: 21608 - Posted: 11.06.2015

Paul Ibbotson and Michael Tomasello. The natural world is full of wondrous adaptations such as camouflage, migration and echolocation. In one sense, the quintessentially human ability to use language is no more remarkable than these other talents. However, unlike these other adaptations, language seems to have evolved just once, in one out of 8.7 million species on earth today. The hunt is on to explain the foundations of this ability and what makes us different from other animals. The intellectual most closely associated with trying to pin down that capacity is Noam Chomsky. He proposed a universal grammatical blueprint that was unique to humans. This blueprint operated like a computer program. Instead of running Windows or Excel, this program performed “operations” on language – any language. Regardless of which of the 6,000+ human languages this code could be exposed to, it would guide the learner to the correct adult grammar. It was a bold claim: despite the surface variations we hear between Swahili, Japanese and Latin, they are all run on the same piece of underlying software. As ever, remarkable claims require remarkable evidence, and in the 50 years since some of these ideas were laid out, history has not been kind. First, it turned out that it is really difficult to state what is “in” universal grammar in a way that does justice to the sheer diversity of human languages. Second, it looks as if kids don’t learn language in the way predicted by a universal grammar; rather, they start with small pockets of reliable patterns in the language they hear, such as Where’s the X?, I wanna X, More X, It’s a X, I’m X-ing it, Put X here, Mommy’s X-ing it, Let’s X it, Throw X, X gone, I X-ed it, Sit on the X, Open X, X here, There’s a X, X broken … and gradually build their grammar on these patterns, from the “bottom up”. © 2015 Guardian News and Media Limited

Keyword: Language
Link ID: 21607 - Posted: 11.06.2015