Most Recent Links



Links 21–40 of 19,853

By Gary Stix Our site recently ran a great story about how brain training really doesn’t endow you instantly with genius IQ. The games you play just make you better at playing those same games. They aren’t a direct route to a Mensa membership. Just a few days before that story came out, the Proceedings of the National Academy of Sciences published a report suggesting that playing action video games (Call of Duty: Black Ops II and the like) actually lets gamers learn the essentials of a particular visual task (the orientation of a Gabor signal—don’t ask) more rapidly than non-gamers, a skill that has real-world relevance beyond the confines of the artificial reality of the game itself. As psychologists say, it has “transfer effects.” Gamers appear to have learned how to do stuff like home in quickly on a target or multitask better than those who inhabit the non-gaming world. Their skills might, in theory, make them great pilots or laparoscopic surgeons, not just high scorers among their peers. Action video games are not billed as brain training, but Call of Duty and nominally accredited training programs like Lumosity are both structured as computer games. So that leads to the question: what’s going on here? Every new finding that brain training is B.S. appears to be contradicted by another that points to the promise of cognitive exercise, if that’s what you call a session with Call of Duty. It may boil down to a realization that the whole story about exercising your neurons to keep the brain supple is a lot less simple than proponents make it out to be. © 2014 Scientific American

Keyword: Learning & Memory
Link ID: 20409 - Posted: 12.13.2014

By Nsikan Akpan Gut surgery is often the only option for life-threatening obesity and diabetes, but what if doctors could cut the pounds without using a knife? Scientists have engineered an antiobesity drug that rivals the dramatic benefits seen with surgery, dropping excess body weight by a third. Though the work was done only in rodents, the drug is the first to influence three obesity-related hormones in the gut at once. Bariatric surgery, including gastric bypass, typically involves limiting food intake by removing part of the stomach or intestines. Yet it does more than shrink the size of a patient’s stomach or intestines. It also changes the release of multiple gut-related hormones, explains clinical endocrinologist Stephen O'Rahilly of the University of Cambridge in the United Kingdom, who wasn’t involved with the study. That’s important, because years of eating a diet high in fat and sugar can throw a person’s metabolism into disarray. Cells undergo genetic reprogramming that negatively impacts how they process sugar and store fat, locking in obesity. This pattern makes it harder and harder to lose weight, even if a person changes their diet and begins exercising. Bariatric surgery interrupts that cycle by stimulating the production of several hormones that reduce blood sugar, burn fat, and curb appetite. (It may also change the composition of the gut’s microbes.) Three of these hormones are called glucagon-like peptide-1 (GLP-1), gastric inhibitory peptide (GIP), and glucagon. Cells in your gut release GLP-1 and GIP after a meal to keep your body’s blood sugar levels in a normal range. GLP-1 also curbs appetite, signaling to your brain that you are full. In type 2 diabetes, the body stops responding to GLP-1 and GIP, which contributes to hyperglycemia, or too much blood sugar. Hyperglycemia causes the devastating hallmarks of diabetes, such as kidney injury, cardiovascular disease, and nerve damage. © 2014 American Association for the Advancement of Science.

Keyword: Obesity
Link ID: 20408 - Posted: 12.10.2014

By ANDREW POLLACK It is either the most exciting new treatment for depression in years or it is a hallucinogenic club drug that is wrongly being dispensed to desperate patients in a growing number of clinics around the country. It is called ketamine — or Special K, in street parlance. While it has been used as an anesthetic for decades, small studies at prestigious medical centers like Yale, Mount Sinai and the National Institute of Mental Health suggest it can relieve depression in many people who are not helped by widely used conventional antidepressants like Prozac or Lexapro. And the depression seems to melt away within hours, rather than the weeks typically required for a conventional antidepressant. But some psychiatrists say the drug has not been studied enough to be ready for use outside of clinical trials, and they are alarmed that clinics are springing up to offer ketamine treatments, charging hundreds of dollars for sessions that must be repeated many times. “We don’t know what the long-term side effects of this are,” said Dr. Anthony J. Rothschild, a professor of psychiatry at the University of Massachusetts Medical School. Pharmaceutical companies hope to solve the problem by developing drugs that work like ketamine but without the side effects, which are often described as out-of-body experiences. © 2014 The New York Times Company

Keyword: Depression; Drug Abuse
Link ID: 20407 - Posted: 12.10.2014

by Helen Thomson Zapping your brain might make you better at maths tests – or worse. It depends how anxious you are about taking the test in the first place. A recent surge of studies has shown that brain stimulation can make people more creative and better at maths, and can even improve memory, but these studies tend to neglect individual differences. Now, Roi Cohen Kadosh at the University of Oxford and his colleagues have shown that brain stimulation can have completely opposite effects depending on your personality. Previous research has shown that a type of non-invasive brain stimulation called transcranial direct current stimulation (tDCS) – which enhances brain activity using an electric current – can improve mathematical ability when applied to the dorsolateral prefrontal cortex, an area involved in regulating emotion. To test whether personality traits might affect this result, Cohen Kadosh's team tried the technique on 25 people who find mental arithmetic highly stressful, and 20 people who do not. They found that, with stimulation, participants with high maths anxiety made correct responses more quickly and, after the test, showed lower levels of cortisol, an indicator of stress. On the other hand, individuals with low maths anxiety performed worse after tDCS. "It is hard to believe that all people would benefit similarly [from] brain stimulation," says Cohen Kadosh. He says that further research could shed light on how to optimise the technology and help to discover who is most likely to benefit from stimulation. © Copyright Reed Business Information Ltd.

Keyword: Brain imaging; Learning & Memory
Link ID: 20406 - Posted: 12.10.2014

Ian Sample, science editor Electrical brain stimulation equipment – which can boost cognitive performance and is easy to buy online – can also impair brain functioning, research from scientists at Oxford University has shown. A steady stream of reports of stimulators being able to boost brain performance, coupled with the simplicity of the devices, has led to a rise in DIY enthusiasts who cobble the equipment together themselves, or buy it assembled on the web, then zap themselves at home. In science laboratories, brain stimulators have long been used to explore cognition. The equipment uses electrodes to pass gentle electric pulses through the brain, to stimulate activity in specific regions of the organ. Roi Cohen Kadosh, who led the study, published in the Journal of Neuroscience, said: “It’s not something people should be doing at home at this stage. I do not recommend people buy this equipment. At the moment it’s not therapy, it’s an experimental tool.” The Oxford scientists used a technique called transcranial direct current stimulation (tDCS) to stimulate the dorsolateral prefrontal cortex in students as they did simple sums. The results of the test were surprising. Students who became anxious when confronted with sums became calmer and solved the problems faster than when they had sham stimulation (the stimulation itself lasted only 30 seconds of the half-hour study). The shock was that the students who did not fear maths performed worse with the same stimulation.

Keyword: Brain imaging; Learning & Memory
Link ID: 20405 - Posted: 12.10.2014

By Tina Rosenberg When Ebola ends, the people who have suffered, who have lost loved ones, will need many things. They will need ways to rebuild their livelihoods. They will need a functioning health system, which can ensure that future outbreaks do not become catastrophes. And they will need mental health care. Depression is the most important thief of productive life for women around the world, and the second-most important for men. We sometimes imagine it is a first-world problem, but depression is just as widespread, if not more so, in poor countries, where there is a good deal more to be depressed about. And it is more debilitating, as a vast majority of sufferers have no safety net. Health care for all must include mental health care. It’s hard to believe, but Liberia and Sierra Leone each have only a single psychiatrist. The Ebola crisis has exposed these countries’ malignant neglect of their health systems. People can’t get care for diarrhea and malaria. How will these countries take care of an epidemic of depression? This isn’t really a medical question. We know how to treat depression. What we don’t know yet is how to make effective treatment cheap, culturally appropriate, convenient and non-stigmatizing — all needed to get treatment out to millions and millions of people. But some researchers are finding out. They are doing so despite the fact that growing attention to this issue hasn’t been accompanied by money. The U.S. National Institute of Mental Health last year provided just $24.5 million for global mental health efforts, and the Canadian government’s Grand Challenges Canada, which is said to have the largest portfolio of mental health innovation in developing countries, has spent only $28 million on them since it began in 2010. © 2014 The New York Times Company

Keyword: Depression
Link ID: 20404 - Posted: 12.08.2014

By Lenny Bernstein There are 60 million epileptics on the planet, and while advances in medication and implantable devices have helped them, the ability to better detect and even predict when they will have debilitating seizures would be a significant improvement in their everyday lives. Imagine, for example, if an epileptic knew with reasonable certainty that his next seizure would not occur for an hour or a day or a week. That might allow him to run to the market or go out for the evening or plan a short vacation with less concern. Computers and even dogs have been tested in the effort to do this, but now a group of organizations battling epilepsy is employing "big data" to help. They sponsored an online competition that drew 504 entrants who tried to develop algorithms that would detect and predict epileptic seizures. Instead of the traditional approach of asking researchers in a handful of labs to tackle the problem, the groups put online huge amounts of data recorded from the brains of dogs and people as they had seizures over a number of months. They then challenged anyone interested to use the information to develop detection and prediction models. "Seizure detection and seizure prediction," said Walter J. Koroshetz, deputy director of the National Institute of Neurological Disorders and Stroke (NINDS), are "two fundamental problems in the field that are poised to take significant advantage of large data computation algorithms and benefit from the concept of sharing data and generating reproducible results."

Keyword: Epilepsy
Link ID: 20403 - Posted: 12.08.2014

By Quassim Cassam Most people wonder at some point in their lives how well they know themselves. Self-knowledge seems a good thing to have, but hard to attain. To know yourself would be to know such things as your deepest thoughts, desires and emotions, your character traits, your values, what makes you happy and why you think and do the things you think and do. These are all examples of what might be called “substantial” self-knowledge, and there was a time when it would have been safe to assume that philosophy had plenty to say about the sources, extent and importance of self-knowledge in this sense. Not any more. With few exceptions, philosophers of self-knowledge nowadays have other concerns. Here’s an example of the sort of thing philosophers worry about: suppose you are wearing socks and believe you are wearing socks. How do you know that that’s what you believe? Notice that the question isn’t: “How do you know you are wearing socks?” but rather “How do you know you believe you are wearing socks?” Knowledge of such beliefs is seen as a form of self-knowledge. Other popular examples of self-knowledge in the philosophical literature include knowing that you are in pain and knowing that you are thinking that water is wet. For many philosophers the challenge is to explain how these types of self-knowledge are possible. This is usually news to non-philosophers. Most of them certainly imagine that philosophy tries to answer the Big Questions, and “How do you know you believe you are wearing socks?” doesn’t sound much like one of them. If knowing that you believe you are wearing socks qualifies as self-knowledge at all — and even that isn’t obvious — it is self-knowledge of the most trivial kind. Non-philosophers find it hard to figure out why philosophers would be more interested in trivial than in substantial self-knowledge. © 2014 The New York Times Company

Keyword: Consciousness
Link ID: 20402 - Posted: 12.08.2014

By JOHN McWHORTER “TELL me, why should we care?” he asks. It’s a question I can expect whenever I do a lecture about the looming extinction of most of the world’s 6,000 languages, a great many of which are spoken by small groups of indigenous people. For some reason the question is almost always posed by a man seated in a row somewhere near the back. Asked to elaborate, he says that if indigenous people want to give up their ancestral language to join the modern world, why should we consider it a tragedy? Languages have always died as time has passed. What’s so special about a language? The answer I’m supposed to give is that each language, in the way it applies words to things and in the way its grammar works, is a unique window on the world. In Russian there’s no word just for blue; you have to specify whether you mean dark or light blue. In Chinese, you don’t say “next week” and “last week” but “the week below” and “the week above.” If a language dies, a fascinating way of thinking dies along with it. I used to say something like that, but lately I have changed my answer. Certainly, experiments do show that a language can have a fascinating effect on how its speakers think. Russian speakers are on average 124 milliseconds faster than English speakers at identifying when dark blue shades into light blue. A French person is a tad more likely than an Anglophone to imagine a table as having a high voice if it were a cartoon character, because the word is marked as feminine in his language. This is cool stuff. But the question is whether such infinitesimal differences, perceptible only in a laboratory, qualify as worldviews — cultural standpoints or ways of thinking that we consider important. I think the answer is no. Furthermore, extrapolating cognitive implications from language differences is a delicate business. In Mandarin Chinese, for example, you can express “If you had seen my sister, you’d have known she was pregnant” with the same sentence you would use to express the more basic “If you see my sister, you know she’s pregnant.” One psychologist argued some decades ago that this meant that Chinese makes a person less sensitive to such distinctions, which, let’s face it, is discomfitingly close to saying Chinese people aren’t as quick on the uptake as the rest of us. The truth is more mundane: Hypotheticality and counterfactuality are established more by context in Chinese than in English. © 2014 The New York Times Company

Keyword: Language
Link ID: 20401 - Posted: 12.08.2014

Carl Zimmer For thousands of years, fishermen knew that certain fish could deliver a painful shock, even though they had no idea how it happened. Only in the late 1700s did naturalists contemplate a bizarre possibility: These fish might release jolts of electricity — the same mysterious substance as in lightning. That possibility led an Italian physicist named Alessandro Volta in 1800 to build an artificial electric fish. He observed that electric rays had dense stacks of muscles, and he wondered if they allowed the animals to store electric charges. To mimic the muscles, he built a stack of metal disks, alternating between copper and zinc. Volta found that his model could store a huge amount of electricity, which he could unleash as shocks and sparks. Today, much of society runs on updated versions of Volta’s artificial electric fish. We call them batteries. Now a new study suggests that electric fish have anticipated other kinds of technology. The research, by Kenneth C. Catania, a biologist at Vanderbilt University, reveals a remarkable sophistication in the way electric eels deploy their shocks. Dr. Catania, who published the study on Thursday in the journal Science, found that the eels use short shocks like a remote control on their victims, flushing their prey out of hiding. And then they can deliver longer shocks that paralyze their prey at a distance, in precisely the same way that a Taser stops a person cold. “It shows how finely adapted eels are to attack prey,” said Harold H. Zakon, a biologist at the University of Texas at Austin, who was not involved in the study. He considered Dr. Catania’s findings especially impressive since scientists have studied electric eels for more than 200 years. © 2014 The New York Times Company

Keyword: Evolution
Link ID: 20400 - Posted: 12.06.2014

Kelly Servick Anesthesiologists and surgeons who operate on children have been dogged by a growing fear—that being under anesthesia can permanently damage the developing brain. Although the few studies of children knocked out for surgeries have been inconclusive, evidence of impaired development in nematodes, zebrafish, rats, guinea pigs, pigs, and monkeys given common anesthetics has piled up in recent years. Now, the alarm is reaching a tipping point. “Anything that goes from [the roundworm] C. elegans to nonhuman primates, I've got to worry about,” Maria Freire, co-chair of the U.S. Food and Drug Administration (FDA) science advisory board, told attendees at a meeting the agency convened here last month to discuss the issue. The gathering came as anesthesia researchers and regulators consider several moves to address the concerns: a clinical trial of anesthetics in children, a consensus statement about their possible risks, and an FDA warning label on certain drugs. But each step stirs debate. Many involved in the issue are reluctant to make recommendations to parents and physicians based on animal data alone. At the same time, more direct studies of anesthesia's risks in children are plagued by confounding factors, lack of funding, and ethical issues. “We have to generate—very quickly—an action item, because I don't think the status quo is acceptable,” Freire said at the 19 November meeting. “Generating an action item without having the data is where things become very, very tricky.” © 2014 American Association for the Advancement of Science

Keyword: Sleep; Development of the Brain
Link ID: 20399 - Posted: 12.06.2014

by Michael Slezak The elusive link between obesity and high blood pressure has been pinned down to the action of leptin in the brain, and we might be able to block it with drugs. We've known for more than 30 years that fat and high blood pressure are linked, but finding what ties them together has been difficult. One of the favourite candidates has been leptin – a hormone produced by fat cells. Under normal circumstances, when fat cells produce leptin, the hormone sends the message that you've had enough food. But in people with obesity, the body stops responding to this message, and large levels of leptin build up. Leptin is known to activate the regulatory network called the sympathetic nervous system, and it's the activation of sympathetic nerves on the kidneys that seems to be responsible for raising blood pressure. Leptin has thus been linked to blood pressure. However, conclusive evidence has been hard to come by. Michael Cowley of Monash University in Melbourne, Australia, and his colleagues have now conducted a string of experiments that provide some evidence. Through genetic and drug experiments in mice, they have pinpointed an area in the mouse brain that increases blood pressure when it is exposed to high leptin levels. This region is called the dorsomedial hypothalamus, and is thought to be involved in controlling energy consumption. Their findings show that high levels of leptin do indeed boost blood pressure, via this brain region. © Copyright Reed Business Information Ltd.

Keyword: Obesity
Link ID: 20398 - Posted: 12.06.2014

By Bret Stetka When University of Bonn psychologist Monika Eckstein designed her latest published study, the goal was simple: administer a hormone into the noses of 62 men in hopes that their fear would go away. And for the most part, it did. The hormone was oxytocin, often called our “love hormone” due to its crucial role in mother-child relationships, social bonding, and intimacy (levels soar during sex). But it also seems to have a significant antianxiety effect. Give oxytocin to people with certain anxiety disorders, and activity in the amygdala—the primary fear center in human and other mammalian brains, two almond-shaped bits of brain tissue sitting deep beneath our temples—falls. The amygdala normally buzzes with activity in response to potentially threatening stimuli. When an organism repeatedly encounters a stimulus that at first seemed frightening but turns out to be benign—like, say, a balloon popping—a brain region called the prefrontal cortex inhibits amygdala activity. But in cases of repeated presentations of an actual threat, or in people with anxiety who continually perceive a stimulus as threatening, amygdala activity doesn’t subside and fear memories are more easily formed. To study the effects of oxytocin on the development of these fear memories, Eckstein and her colleagues first subjected study participants to Pavlovian fear conditioning, in which neutral stimuli (photographs of faces and houses) were sometimes paired with electric shocks. Subjects were then randomly assigned to receive either a single intranasal dose of oxytocin or a placebo. Thirty minutes later they received functional MRI scans while undergoing simultaneous fear extinction therapy, a standard approach to anxiety disorders in which patients are continually exposed to an anxiety-producing stimulus until they no longer find it stressful. In this case they were again exposed to images of faces and houses, but this time minus the electric shocks. © 2014 Scientific American

Keyword: Learning & Memory; Emotions
Link ID: 20397 - Posted: 12.06.2014

By Neuroskeptic An important new study could undermine the concept of ‘endophenotypes’ – and thus derail one of the most promising lines of research in neuroscience and psychiatry. The findings are out now in Psychophysiology. Unusually, an entire special issue of the journal is devoted to presenting the various results of the study, along with commentary, but here’s the summary paper: Knowns and unknowns for psychophysiological endophenotypes by Minnesota researchers William Iacono, Uma Vaidyanathan, Scott Vrieze and Stephen Malone. In a nutshell, the researchers ran seven different genetic studies to try to find the genetic basis of a total of seventeen neurobehavioural traits, also known as ‘endophenotypes’. Endophenotypes are a hot topic in psychiatric neuroscience, although the concept is somewhat vague. The motivation behind interest in endophenotypes comes mainly from the failure of recent studies to pin down the genetic cause of most psychiatric syndromes. Essentially, an endophenotype is some trait, which could be almost anything, which is supposed to be related to (or part of) a psychiatric disorder or symptom, but which is “closer to genetics” or “more biological” than the disorder itself. Rather than thousands of genes all mixed together to determine the risk of a psychiatric disorder, each endophenotype might be controlled by only a handful of genes – which would thus be easier to find.

Keyword: Schizophrenia; Genes & Behavior
Link ID: 20396 - Posted: 12.06.2014

by Viviane Callier It's a fresh problem. People who smoke menthol cigarettes often smoke more frequently and can be less likely to quit – and it could be because fresh-tasting menthol is changing their brains to be more sensitive to nicotine. How menthol enhances nicotine addiction has been something of a mystery. Now, Brandon Henderson at the California Institute of Technology in Pasadena and his colleagues have shown that exposing mice to menthol alone causes them to develop more nicotinic receptors, the parts of the brain that are targeted by nicotine. Menthol can be used medically to relieve minor throat irritations, and menthol-flavoured cigarettes were first introduced in the 1920s. But smokers of menthol cigarettes can be less likely to quit. In one study of giving up smoking, 50 per cent of unflavoured-cigarette smokers were able to quit, while menthol smokers showed quitting rates as low as 23 per cent, depending on ethnicity. Over time, smokers of both menthol and unflavoured cigarettes acquire more receptors for nicotine, particularly in neurons involved in the body's neural pathways for reward and motivation. And research last year showed that smokers of menthol cigarettes develop even more of these receptors than smokers of unflavoured cigarettes. To understand how menthol may be altering the brain, Henderson's team exposed mice to either menthol with nicotine, or menthol alone. They found that, even without nicotine, menthol increased the numbers of brain nicotinic receptors. They saw a 78 per cent increase in one particular brain region – the ventral tegmental area – which is involved in the dopamine signalling pathway that mediates addiction. © Copyright Reed Business Information Ltd.

Keyword: Drug Abuse
Link ID: 20395 - Posted: 12.06.2014

Injections of a new drug may partially relieve paralyzing spinal cord injuries, based on indications from a study in rats, which was partly funded by the National Institutes of Health. The results demonstrate how fundamental laboratory research may lead to new therapies. “We’re very excited at the possibility that millions of people could, one day, regain movements lost during spinal cord injuries,” said Jerry Silver, Ph.D., professor of neurosciences, Case Western Reserve University School of Medicine, Cleveland, and a senior investigator of the study published in Nature. Every year, tens of thousands of people are paralyzed by spinal cord injuries. The injuries crush and sever the long axons of spinal cord nerve cells, blocking communication between the brain and the body and resulting in paralysis below the injury. On a hunch, Bradley Lang, Ph.D., the lead author of the study and a graduate student in Dr. Silver’s lab, came up with the idea of designing a drug that would help axons regenerate without having to touch the healing spinal cord, as current treatments may require. “Originally this was just a side project we brainstormed in the lab,” said Dr. Lang. After spinal cord injury, axons try to cross the injury site and reconnect with other cells but are stymied by scarring that forms after the injury. Previous studies suggested their movements are blocked when the protein tyrosine phosphatase sigma (PTP sigma), an enzyme found in axons, interacts with chondroitin sulfate proteoglycans, a class of sugary proteins that fill the scars.

Keyword: Regeneration
Link ID: 20394 - Posted: 12.04.2014

By recording from the brains of bats as they flew and landed, scientists have found that the animals have a "neural compass" - allowing them to keep track of exactly where and even which way up they are. These head-direction cells track bats in three dimensions as they manoeuvre. The researchers think a similar 3D internal navigation system is likely to be found throughout the animal kingdom. The findings are published in the journal Nature. Lead researcher Arseny Finkelstein, from the Weizmann Institute of Science in Rehovot, Israel, explained that this was the first time measurements had been taken from animals as they had flown around a space in any direction and even carried out their acrobatic upside-down landings. "We're the only lab currently able to conduct wireless recordings in flying animals," he told BBC News. "A tiny device attached to the bats allows us to monitor the activity of single neurons while the animal is freely moving." Decades of study of the brain's internal navigation system garnered three renowned neuroscientists this year's Nobel Prize for physiology or medicine. The research, primarily in rats, revealed how animals had "place" and "grid" cells - essentially building a map in the brain and coding for where on that map an animal was at any time. Mr Finkelstein and his colleagues' work in bats has revealed that their brains also have "pitch" and "roll" cells. These tell the animal whether it is pointing upwards or downwards and whether its head is tilted one way or the other. BBC © 2014

Keyword: Hearing; Learning & Memory
Link ID: 20393 - Posted: 12.04.2014

by Andy Coghlan How does this make you feel? Simply asking people to think about emotion-laden actions as their brains are scanned could become one of the first evidence-based tests for psychiatric illness. Assessing people in this way would be a step towards a more scientific approach to diagnosis, away from one based on how someone behaves or how they describe their symptoms. The US National Institute of Mental Health has had such a goal in mind since 2013. Marcel Just of Carnegie Mellon University in Pittsburgh, Pennsylvania, and his colleagues developed the brain scanning technique and used it to identify people with autism. "This gives us a whole new perspective to understanding psychiatric illnesses and disorders," says Just. "We've discovered a biological thought-marker for autism." The technique builds on work by the group showing that specific thoughts and emotions are represented in the brain by certain patterns of neural activation. The idea is that deviations from these patterns, what Just refers to as thought-markers, can be used to diagnose different psychiatric conditions. The team asked a group of adults to imagine 16 actions, some of which required emotional involvement, such as "hugging", "persuading" or "adoring", while they lay in an fMRI scanner. © Copyright Reed Business Information Ltd.

Keyword: Autism; Emotions
Link ID: 20392 - Posted: 12.04.2014

By Beth Winegarner When Beth Wankel’s son, Bowie, was a baby, he seemed pretty typical. But his “terrible twos” were more than terrible: In preschool, he would hit and push his classmates to a degree that worried his parents and teachers. As Bowie got a little older, he was able to tell his mom why he was so combative. “He would say things like, 'I thought they were going to step on me or push me,’” Wankel said. “He was overly uncomfortable going into smaller spaces; it was just too much for him.” Among other things, he refused to enter the school bathroom if another student was inside. When Bowie was 3, he was formally evaluated by his preschool teachers. They said he appeared to be having trouble processing sensory input, especially when it came to figuring out where his body is in relation to other people and objects. He’s also very sensitive to touch and to the textures of certain foods, said Wankel, who lives with her family in San Francisco. Bowie has a form of what’s known as sensory processing disorder. As the name suggests, children and adults with the disorder have trouble filtering sights, smells, sounds and more from the world around them. While so-called neurotypicals can usually ignore background noise, clothing tags or cluttered visual environments, people with SPD notice all of those and more — and quickly become overwhelmed by the effort. Rachel Schneider, a mental-health expert and a blogger for adults with SPD, describes it as a “neurological traffic jam” or “a soundboard, except the sound technician is terrible at his job.”

Keyword: Hearing; Development of the Brain
Link ID: 20391 - Posted: 12.04.2014

Ewen Callaway A shell found on Java in the late 1800s was recently found to bear markings that seem to have been carved intentionally half a million years ago (the photographed area is about 15 millimetres wide). A zigzag engraving on a shell from Indonesia is the oldest abstract marking ever found. But what is most surprising about the half-a-million-year-old doodle is its likely creator — the human ancestor Homo erectus. "This is a truly spectacular find and has the potential to overturn the way we look at early Homo," says Nick Barton, an archaeologist at the University of Oxford, UK, who was not involved in the discovery, which is described in a paper published online in Nature on 3 December. By 40,000 years ago, and probably much earlier, anatomically modern humans — Homo sapiens — were painting on cave walls in places as far apart as Europe and Indonesia. Simpler ochre engravings found in South Africa date to 100,000 years ago. Earlier this year, researchers reported a 'hashtag' engraving in a Gibraltar cave once inhabited by Neanderthals. That was the first evidence for drawing in any extinct species. But until the discovery of the shell engraving, nothing approximating art has been ascribed to Homo erectus. The species emerged in Africa about 2 million years ago and trekked as far as the Indonesian island of Java, before going extinct around 140,000 years ago. Most palaeoanthropologists consider the species to be the direct ancestor of both humans and Neanderthals. © 2014 Nature Publishing Group

Keyword: Evolution
Link ID: 20390 - Posted: 12.04.2014