Chapter 16.
By Emilie Reas

If you carried a gene that doubled your likelihood of getting Alzheimer's disease, would you want to know? What if there was a simple lifestyle change that virtually abolished that elevated risk?

People with a gene known as APOE e4 have a higher risk of cognitive impairment and dementia in old age. Even before behavioral symptoms appear, their brains show reduced metabolism, altered activity and more deterioration than those without the high-risk gene. Yet accumulating research is showing that carrying this gene is not necessarily a sentence for memory loss and confusion—if you know how to work it to your advantage with exercise.

Scientists have long known that exercise can help stave off cognitive decline. Over the past decade evidence has mounted suggesting that this benefit is even greater for those at higher genetic risk for Alzheimer's. For example, two studies by a team in Finland and Sweden found that exercising at least twice a week in midlife lowers one's chance of getting dementia more than 20 years later, and this protective effect is stronger in people with the APOE e4 gene. Several others reported that frequent exercise—at least three times a week in some studies; up to more than an hour a day in others—can slow cognitive decline only in those carrying the high-risk gene. Furthermore, for those who carry the gene, being sedentary is associated with increased brain accumulation of the toxic protein beta-amyloid, a hallmark of Alzheimer's.

More recent studies, including a 2012 paper published in Alzheimer's & Dementia and a 2011 paper in NeuroImage, found that high-risk individuals who exercise have greater brain activity and glucose uptake during a memory task compared with their less active counterparts or with those at low genetic risk.

© 2014 Scientific American
By Ingrid Wickelgren

Confusion is one symptom of a concussion. But confusion may also characterize decisions about how soon to let an athlete play after taking a hit to the head. Sizing up symptoms such as dizziness and nausea is subjective, after all. Now a study suggests that a blood test could objectively determine whether the damage is bad enough to put a player on the bench. The work is in the Journal of Neurotrauma. [Robert Siman et al., Serum SNTF Increases in Concussed Professional Ice Hockey Players and Relates to the Severity of Post Concussion Symptoms]

A strong blow to the head causes chemical changes within nerve cells that damage their structural proteins. Among the debris is a protein fragment called SNTF, which in more severe cases spills into the bloodstream.

The new study followed 20 professional hockey players who got concussions with symptoms that lasted six days or more. Their blood levels of SNTF, measured from one hour to six days after injury, were much higher than the levels in eight other athletes whose concussions cleared up within five days. Levels were also low in 45 non-concussed players tested during the pre-season.

A blood test for SNTF might thus forecast recovery time from a head injury. Combined with other neurological tests, levels of this molecule could help doctors tell athletes when it’s safe to suit up again.

© 2014 Scientific American
Keyword: Brain Injury/Concussion
Link ID: 20419 - Posted: 12.16.2014
By Bruce Bower

In the movie Roxanne, Steve Martin plays a lovesick guy who mocks his own huge schnoz by declaring: “It’s not the size of a nose that’s important. It’s what’s in it that matters.” Scientists demonstrated the surprising truth behind that joke this year: People can whiff an average of more than 1 trillion different odors, regardless of nose size (SN: 4/19/14, p. 6).

No one had systematically probed how many scents people can actually tell apart. So a team led by Leslie Vosshall of Rockefeller University in New York City asked 26 men and women to discriminate between pairs of scents created from mixes of 128 odor molecules. Volunteers easily discriminated between smells that shared as much as 51 percent of their odor molecules. Errors gradually rose as pairs of scents became chemically more alike. Vosshall’s group calculated that an average participant could tell apart a minimum of more than 1 trillion smells made up of different combinations of 30 odor molecules. Really good smellers could have detected way more than 1 trillion odor mixtures, the scientists said.

Smell lags behind sight and hearing as a sense that people need to find food, avoid dangers and otherwise succeed at surviving. Still, detecting the faint odor of spoiled food and other olfactory feats must have contributed to the success of Homo sapiens over the last 200,000 years. Perhaps many animals can whiff the difference between a trillion or more smells. For now, odor-detection studies modeled on Vosshall’s approach have been conducted only with humans.

© Society for Science & the Public 2000 - 2014.
Keyword: Chemical Senses (Smell & Taste)
Link ID: 20417 - Posted: 12.16.2014
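The trillion-odor figure is easier to believe once you look at the raw combinatorics: the number of distinct 30-component mixtures that can be drawn from a panel of 128 odor molecules dwarfs a trillion. The quick check below is illustrative only; the study's "more than 1 trillion" was a statistical lower bound inferred from discrimination performance, not this simple count.

```python
from math import comb

# Number of distinct mixtures of exactly 30 odor molecules
# drawn from a panel of 128 (the largest mixture size in the study).
mixtures_30 = comb(128, 30)

print(f"{mixtures_30:.3e}")   # on the order of 10**29
print(mixtures_30 > 10**12)   # vastly more than one trillion
```

So the combinatorial space is roughly seventeen orders of magnitude larger than the trillion-odor lower bound; the experiment's contribution was showing how much of that space human noses can actually resolve.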
by Andy Coghlan

It may not sound very appetising, but an edible powder made from waste excreted by bacteria in our guts may help people to avoid gaining weight. Stabilising a person's weight could have a major health impact, says Gary Frost of Imperial College London, because as people on Western diets grow older, they tend to put on between 0.3 and 0.8 kilograms per year on average.

A fatty acid called propionate is released when the bacteria in our gut digest fibre. Propionate makes people feel full by activating cells in the large intestine that produce the satiety hormones GLP-1 and PYY: these tell the brain that it's time to stop eating. But to trigger a big enough dose of this appetite-suppressing signal from gut bacteria alone, people would have to eat extremely large amounts of fibre. To get around that, Frost and his team made the molecule in a concentrated form called inulin-propionate ester (IPE). "That gives you eight times the amount of someone following a typical Western diet," he says.

To test its appetite-stemming properties, the team gave powdered IPE, mixed in with fruit juice or a milkshake, to a group of overweight volunteers every day for six months. A type of ordinary fibre was given to another set of people, who acted as controls. Only one of the 25 volunteers taking IPE put on more than 3 per cent of their body weight over that time, compared with six of the 24 controls. One reason for this might be that the IPE recipients ate around 9 per cent less over the six months.

© Copyright Reed Business Information Ltd.
Link ID: 20416 - Posted: 12.13.2014
By Lindsey Konkel

For 28 years, Bill Gilmore lived in a New Hampshire beach town, where he surfed and kayaked. “I’ve been in water my whole life,” he said. “Before the ocean, it was lakes. I’ve been a water rat since I was four.” Now Gilmore can no longer swim, fish or surf, let alone button a shirt or lift a fork to his mouth. Earlier this year, he was diagnosed with amyotrophic lateral sclerosis (ALS), or Lou Gehrig’s disease.

In New England, medical researchers are now uncovering clues that appear to link some cases of the lethal neurological disease to people’s proximity to lakes and coastal waters. About five years ago, doctors at a New Hampshire hospital noticed a pattern in their ALS patients—many of them, like Gilmore, lived near water. Since then, researchers at Dartmouth-Hitchcock Medical Center have identified several ALS hot spots in lake and coastal communities in New England, and they suspect that toxic blooms of blue-green algae—which are becoming more common worldwide—may play a role. Now scientists are investigating whether breathing a neurotoxin produced by the algae may raise the risk of the disease. They have a long way to go, however: While the toxin does seem to kill nerve cells, no research, even in animals, has confirmed the link to ALS.

As with all ALS patients, no one knows what caused Bill Gilmore’s disease. He was a big, strong guy—a carpenter by profession. One morning in 2011, his arms felt weak. “I couldn’t pick up my tools. I thought I had injured myself,” said Gilmore, 59, who lived half his life in Hampton and now lives in Rochester, N.H.

© 2014 Scientific American
by Colin Barras

It's not just great minds that think alike. Dozens of the genes involved in the vocal learning that underpins human speech are also active in some songbirds. And knowing this suggests that birds could become a standard model for investigating the genetics of speech production – and speech disorders.

Complex language is a uniquely human trait, but vocal learning – the ability to pick up new sounds by imitating others – is not. Some mammals, including whales, dolphins and elephants, share our ability to learn new vocalisations. So do three groups of birds: the songbirds, parrots and hummingbirds.

The similarities between vocal learning in humans and birds are not just superficial. We know, for instance, that songbirds have specialised vocal learning brain circuits that are similar to those that mediate human speech. What's more, a decade ago we learned that FOXP2, a gene known to be involved in human language, is also active in "area X" of the songbird brain – one of the brain regions involved in those specialised vocal learning circuits.

Andreas Pfenning at the Massachusetts Institute of Technology and his colleagues have now built on these discoveries. They compared maps of genetic activity – transcriptomes – in brain tissue taken from the zebra finch, budgerigar and Anna's hummingbird, representing the three groups of vocal-learning birds.

© Copyright Reed Business Information Ltd.
By Claudia Wallis

Touch a hot frying pan and the searing message of pain sprints up to your brain and back down to your hand so fast that the impulse to withdraw your fingers seems instantaneous. That rapid-fire signal begins in a heat-sensing molecule called a TRPV1 channel. This specialized protein is abundant on the surface of sensory nerve cells in our fingers and elsewhere, and it is a shape-shifter that can take an open or closed configuration. Heat opens a central pore in the molecule; so do certain spider toxins and capsaicin—the substance that gives chili peppers their burn. Once the pore is open, charged ions of sodium and calcium flow into the nerve cell, triggering the pain signal. Ouch!

As neuroscientist-journalist Stephani Sutherland explains in “Pain that Won’t Quit,” in the December Scientific American, researchers have long been interested in finding ways to moderate the action of this channel—and other ion channels—in patients who suffer from chronic pain. Shutting down the TRPV1 channel completely, however, is not an option because it plays a vital role in regulating body temperature.

In two papers published in Nature in December 2013, investigators at the University of California, San Francisco, gave pain researchers a big leg up in understanding TRPV1. They revealed, in exquisite atomic detail, the structure of the channel molecule (from a rat) using an electron cryomicroscope, an instrument designed to explore the 3-D structure of molecules at very low temperatures. One of those investigators, Yifan Cheng, also created a colorful animation showing how the molecule looks when the channel is open.

© 2014 Scientific American
Keyword: Pain & Touch
Link ID: 20411 - Posted: 12.13.2014
By Gary Stix

Our site recently ran a great story about how brain training really doesn’t endow you instantly with genius IQ. The games you play just make you better at playing those same games. They aren’t a direct route to a Mensa membership.

Just a few days before that story came out, Proceedings of the National Academy of Sciences published a report suggesting that playing action video games—Call of Duty: Black Ops II and the like—actually lets gamers learn the essentials of a particular visual task (the orientation of a Gabor signal—don’t ask) more rapidly than non-gamers, a skill that has real-world relevance beyond the confines of the artificial reality of the game itself. As psychologists say, it has “transfer effects.” Gamers appear to have learned how to do stuff like home in quickly on a target or multitask better than those who inhabit the non-gaming world. Their skills might, in theory, make them great pilots or laparoscopic surgeons, not just high scorers among their peers.

Action video games are not billed as brain training, but Call of Duty and nominally accredited training programs like Lumosity are both structured as computer games. So that leads to the question: what’s going on here? Every new finding that dismisses brain training as B.S. appears to be contradicted by another that points to the promise of cognitive exercise, if that’s what you call a session with Call of Duty. It may boil down to a realization that the whole story about exercising your neurons to keep the brain supple may be a lot less simple than proponents make it out to be.

© 2014 Scientific American
Keyword: Learning & Memory
Link ID: 20409 - Posted: 12.13.2014
By Nsikan Akpan

Gut surgery is often the only option for life-threatening obesity and diabetes, but what if doctors could cut the pounds without using a knife? Scientists have engineered an antiobesity drug that rivals the dramatic benefits seen with surgery, dropping excess body weight by a third. Though the work was done only in rodents, the drug is the first to influence three obesity-related hormones in the gut at once.

Bariatric surgery, including gastric bypass, typically involves limiting food intake by removing part of the stomach or intestines. Yet it does more than shrink the size of a patient’s stomach or intestines. It also changes the release of multiple gut-related hormones, explains clinical endocrinologist Stephen O'Rahilly of the University of Cambridge in the United Kingdom, who wasn’t involved with the study. That’s important, because years of eating a diet high in fat and sugar can throw a person’s metabolism into disarray. Cells undergo genetic reprogramming that negatively impacts how they process sugar and store fat, locking in obesity. This pattern makes it harder and harder to lose weight, even if a person changes their diet and begins exercising. Bariatric surgery interrupts that cycle by stimulating the production of several hormones that reduce blood sugar, burn fat, and curb appetite. (It may also change the composition of the gut’s microbes.)

Three of these hormones are called glucagon-like peptide-1 (GLP-1), gastric inhibitory peptide (GIP), and glucagon. Cells in your gut release GLP-1 and GIP after a meal to keep your body’s blood sugar levels in a normal range. GLP-1 also curbs appetite, signaling to your brain that you are full. In type 2 diabetes, the body stops responding to GLP-1 and GIP, which contributes to hyperglycemia, or too much blood sugar. Hyperglycemia causes the devastating hallmarks of diabetes, such as kidney injury, cardiovascular disease, and nerve damage.

© 2014 American Association for the Advancement of Science.
Link ID: 20408 - Posted: 12.10.2014
By Tina Rosenberg

When Ebola ends, the people who have suffered, who have lost loved ones, will need many things. They will need ways to rebuild their livelihoods. They will need a functioning health system, which can ensure that future outbreaks do not become catastrophes. And they will need mental health care.

Depression is the most important thief of productive life for women around the world, and the second-most important for men. We sometimes imagine it is a first-world problem, but depression is just as widespread, if not more so, in poor countries, where there is a good deal more to be depressed about. And it is more debilitating, as a vast majority of sufferers have no safety net. Health care for all must include mental health care. It’s hard to believe, but Liberia and Sierra Leone each have only a single psychiatrist. The Ebola crisis has exposed these countries’ malignant neglect of their health systems. People can’t get care for diarrhea and malaria. How will these countries take care of an epidemic of depression?

This isn’t really a medical question. We know how to treat depression. What we don’t know yet is how to make effective treatment cheap, culturally appropriate, convenient and non-stigmatizing — all needed to get treatment out to millions and millions of people. But some researchers are finding out. They are doing so despite the fact that growing attention to this issue hasn’t been accompanied by money. The U.S. National Institute of Mental Health last year provided just $24.5 million for global mental health efforts, and the Canadian government’s Grand Challenges Canada, which is said to have the largest portfolio of mental health innovation in developing countries, has spent only $28 million on them since it began in 2010.

© 2014 The New York Times Company
Link ID: 20404 - Posted: 12.08.2014
By Lenny Bernstein

There are 60 million epileptics on the planet, and while advances in medication and implantable devices have helped them, the ability to better detect and even predict when they will have debilitating seizures would be a significant improvement in their everyday lives. Imagine, for example, if an epileptic knew with reasonable certainty that his next seizure would not occur for an hour or a day or a week. That might allow him to run to the market or go out for the evening or plan a short vacation with less concern.

Computers and even dogs have been tested in the effort to do this, but now a group of organizations battling epilepsy is employing "big data" to help. They sponsored an online competition that drew 504 entrants who tried to develop algorithms that would detect and predict epileptic seizures. Instead of the traditional approach of asking researchers in a handful of labs to tackle the problem, the groups put online huge amounts of data recorded from the brains of dogs and people as they had seizures over a number of months. They then challenged anyone interested to use the information to develop detection and prediction models.

"Seizure detection and seizure prediction," said Walter J. Koroshetz, deputy director of the National Institute of Neurological Disorders and Stroke (NINDS), are "two fundamental problems in the field that are poised to take significant advantage of large data computation algorithms and benefit from the concept of sharing data and generating reproducible results."
Link ID: 20403 - Posted: 12.08.2014
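The detection half of such a competition boils down to classifying windows of a brain recording as seizure-like or not. As a rough illustration of that general idea (not any entrant's actual algorithm), a minimal detector can threshold a simple feature such as "line length," which grows during high-amplitude, high-frequency seizure-like activity; real entries layer far more sophisticated features and machine learning on top of this kind of windowing. All names and numbers below are illustrative.

```python
import numpy as np

def line_length(window):
    # Line length: the sum of absolute sample-to-sample differences.
    # It is a cheap feature that rises sharply during large, fast
    # oscillations of the kind seen in seizure recordings.
    return np.sum(np.abs(np.diff(window)))

def detect(signal, fs=256, win_sec=1.0, threshold=50.0):
    # Flag each non-overlapping 1-second window whose line length
    # exceeds a fixed threshold (chosen here by hand for the toy data).
    n = int(fs * win_sec)
    return [bool(line_length(signal[i:i + n]) > threshold)
            for i in range(0, len(signal) - n + 1, n)]

# Synthetic example: 3 s of quiet background noise, then a 1-s burst
# of large 20 Hz oscillations standing in for seizure-like activity.
rng = np.random.default_rng(0)
fs = 256
background = 0.1 * rng.standard_normal(fs * 3)
burst = 2.0 * np.sin(2 * np.pi * 20 * np.arange(fs) / fs)
signal = np.concatenate([background, burst])

print(detect(signal, fs=fs))  # only the final window is flagged
```

Prediction, as opposed to detection, is the much harder problem the competition also posed: flagging the subtle changes that precede a seizure, where simple thresholds like this one are nowhere near sufficient.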
By Quassim Cassam

Most people wonder at some point in their lives how well they know themselves. Self-knowledge seems a good thing to have, but hard to attain. To know yourself would be to know such things as your deepest thoughts, desires and emotions, your character traits, your values, what makes you happy and why you think and do the things you think and do. These are all examples of what might be called “substantial” self-knowledge, and there was a time when it would have been safe to assume that philosophy had plenty to say about the sources, extent and importance of self-knowledge in this sense. Not any more. With few exceptions, philosophers of self-knowledge nowadays have other concerns.

Here’s an example of the sort of thing philosophers worry about: suppose you are wearing socks and believe you are wearing socks. How do you know that that’s what you believe? Notice that the question isn’t “How do you know you are wearing socks?” but rather “How do you know you believe you are wearing socks?” Knowledge of such beliefs is seen as a form of self-knowledge. Other popular examples of self-knowledge in the philosophical literature include knowing that you are in pain and knowing that you are thinking that water is wet. For many philosophers the challenge is to explain how these types of self-knowledge are possible.

This is usually news to non-philosophers. Most of them imagine that philosophy tries to answer the Big Questions, and “How do you know you believe you are wearing socks?” doesn’t sound much like one of them. If knowing that you believe you are wearing socks qualifies as self-knowledge at all — and even that isn’t obvious — it is self-knowledge of the most trivial kind. Non-philosophers find it hard to figure out why philosophers would be more interested in trivial than in substantial self-knowledge.

© 2014 The New York Times Company
Link ID: 20402 - Posted: 12.08.2014
By JOHN McWHORTER

“TELL me, why should we care?” he asks. It’s a question I can expect whenever I do a lecture about the looming extinction of most of the world’s 6,000 languages, a great many of which are spoken by small groups of indigenous people. For some reason the question is almost always posed by a man seated in a row somewhere near the back. Asked to elaborate, he says that if indigenous people want to give up their ancestral language to join the modern world, why should we consider it a tragedy? Languages have always died as time has passed. What’s so special about a language?

The answer I’m supposed to give is that each language, in the way it applies words to things and in the way its grammar works, is a unique window on the world. In Russian there’s no word just for blue; you have to specify whether you mean dark or light blue. In Chinese, you don’t say next week and last week but the week below and the week above. If a language dies, a fascinating way of thinking dies along with it.

I used to say something like that, but lately I have changed my answer. Certainly, experiments do show that a language can have a fascinating effect on how its speakers think. Russian speakers are on average 124 milliseconds faster than English speakers at identifying when dark blue shades into light blue. A French person is a tad more likely than an Anglophone to imagine a table as having a high voice if it were a cartoon character, because the word is marked as feminine in his language. This is cool stuff. But the question is whether such infinitesimal differences, perceptible only in a laboratory, qualify as worldviews — cultural standpoints or ways of thinking that we consider important. I think the answer is no. Furthermore, extrapolating cognitive implications from language differences is a delicate business.
In Mandarin Chinese, for example, you can express If you had seen my sister, you’d have known she was pregnant with the same sentence you would use to express the more basic If you see my sister, you know she’s pregnant. One psychologist argued some decades ago that this meant that Chinese makes a person less sensitive to such distinctions, which, let’s face it, is discomfitingly close to saying Chinese people aren’t as quick on the uptake as the rest of us. The truth is more mundane: Hypotheticality and counterfactuality are established more by context in Chinese than in English. © 2014 The New York Times Company
Link ID: 20401 - Posted: 12.08.2014
Carl Zimmer

For thousands of years, fishermen knew that certain fish could deliver a painful shock, even though they had no idea how it happened. Only in the late 1700s did naturalists contemplate a bizarre possibility: These fish might release jolts of electricity — the same mysterious substance as in lightning.

That possibility led an Italian physicist named Alessandro Volta in 1800 to build an artificial electric fish. He observed that electric stingrays had dense stacks of muscles, and he wondered if they allowed the animals to store electric charges. To mimic the muscles, he built a stack of metal disks, alternating between copper and zinc. Volta found that his model could store a huge amount of electricity, which he could unleash as shocks and sparks. Today, much of society runs on updated versions of Volta’s artificial electric fish. We call them batteries.

Now a new study suggests that electric fish have anticipated other kinds of technology. The research, by Kenneth C. Catania, a biologist at Vanderbilt University, reveals a remarkable sophistication in the way electric eels deploy their shocks. Dr. Catania, who published the study on Thursday in the journal Science, found that the eels use short shocks like a remote control on their victims, flushing their prey out of hiding. And then they can deliver longer shocks that paralyze their prey at a distance, in precisely the same way that a Taser stops a person cold.

“It shows how finely adapted eels are to attack prey,” said Harold H. Zakon, a biologist at the University of Texas at Austin, who was not involved in the study. He considered Dr. Catania’s findings especially impressive since scientists have studied electric eels for more than 200 years.

© 2014 The New York Times Company
Link ID: 20400 - Posted: 12.06.2014
by Michael Slezak

The elusive link between obesity and high blood pressure has been pinned down to the action of leptin in the brain, and we might be able to block it with drugs. We've known for more than 30 years that fat and high blood pressure are linked, but finding what ties them together has been difficult.

One of the favourite candidates has been leptin – a hormone produced by fat cells. Under normal circumstances, when fat cells produce leptin, the hormone sends the message that you've had enough food. But in people with obesity, the body stops responding to this message, and large levels of leptin build up. Leptin is known to activate the regulatory network called the sympathetic nervous system, and it's the activation of sympathetic nerves on the kidneys that seems to be responsible for raising blood pressure. Leptin has thus been linked to blood pressure. However, conclusive evidence has been hard to come by.

Michael Cowley of Monash University in Melbourne, Australia, and his colleagues have now conducted a string of experiments that provide some of that evidence. Through genetic and drug experiments in mice, they have pinpointed an area in the mouse brain that increases blood pressure when it is exposed to high leptin levels. This region is called the dorsomedial hypothalamus, and is thought to be involved in controlling energy consumption. Their findings show that high levels of leptin do indeed boost blood pressure, via this brain region.

© Copyright Reed Business Information Ltd.
Link ID: 20398 - Posted: 12.06.2014
By Neuroskeptic

An important new study could undermine the concept of ‘endophenotypes’ – and thus derail one of the most promising lines of research in neuroscience and psychiatry. The findings are out now in Psychophysiology. Unusually, an entire special issue of the journal is devoted to presenting the various results of the study, along with commentary, but here’s the summary paper: Knowns and unknowns for psychophysiological endophenotypes by Minnesota researchers William Iacono, Uma Vaidyanathan, Scott Vrieze and Stephen Malone.

In a nutshell, the researchers ran seven different genetic studies to try to find the genetic basis of a total of seventeen neurobehavioural traits, also known as ‘endophenotypes’. Endophenotypes are a hot topic in psychiatric neuroscience, although the concept is somewhat vague. The motivation behind interest in endophenotypes comes mainly from the failure of recent studies to pin down the genetic cause of most psychiatric syndromes.

Essentially, an endophenotype is some trait, which could be almost anything, that is supposed to be related to (or part of) a psychiatric disorder or symptom, but which is “closer to genetics” or “more biological” than the disorder itself. Rather than thousands of genes all mixed together determining the risk of a psychiatric disorder, each endophenotype might be controlled by only a handful of genes – which would thus be easier to find.
by Viviane Callier

It's a fresh problem. People who smoke menthol cigarettes often smoke more frequently and can be less likely to quit – and it could be because fresh-tasting menthol is making their brains more sensitive to nicotine. How menthol enhances nicotine addiction has been something of a mystery. Now, Brandon Henderson at the California Institute of Technology in Pasadena and his colleagues have shown that exposing mice to menthol alone causes them to develop more nicotinic receptors, the parts of the brain that are targeted by nicotine.

Menthol can be used medically to relieve minor throat irritations, and menthol-flavoured cigarettes were first introduced in the 1920s. But smokers of menthol cigarettes can be less likely to quit. In one study of giving up smoking, 50 per cent of unflavoured-cigarette smokers were able to quit, while menthol smokers showed quitting rates as low as 23 per cent, depending on ethnicity. Over time, smokers of both menthol and unflavoured cigarettes acquire more receptors for nicotine, particularly in neurons involved in the body's neural pathways for reward and motivation. And research last year showed that smokers of menthol cigarettes develop even more of these receptors than smokers of unflavoured cigarettes.

To understand how menthol may be altering the brain, Henderson's team exposed mice to either menthol with nicotine, or menthol alone. They found that, even without nicotine, menthol increased the numbers of brain nicotinic receptors. They saw a 78 per cent increase in one particular brain region – the ventral tegmental area – which is involved in the dopamine signalling pathway that mediates addiction.

© Copyright Reed Business Information Ltd.
Keyword: Drug Abuse
Link ID: 20395 - Posted: 12.06.2014
Injections of a new drug may partially relieve paralyzing spinal cord injuries, based on indications from a study in rats, which was partly funded by the National Institutes of Health. The results demonstrate how fundamental laboratory research may lead to new therapies.

“We’re very excited at the possibility that millions of people could, one day, regain movements lost during spinal cord injuries,” said Jerry Silver, Ph.D., professor of neurosciences, Case Western Reserve University School of Medicine, Cleveland, and a senior investigator of the study published in Nature.

Every year, tens of thousands of people are paralyzed by spinal cord injuries. The injuries crush and sever the long axons of spinal cord nerve cells, blocking communication between the brain and the body and resulting in paralysis below the injury. On a hunch, Bradley Lang, Ph.D., the lead author of the study and a graduate student in Dr. Silver’s lab, came up with the idea of designing a drug that would help axons regenerate without having to touch the healing spinal cord, as current treatments may require. “Originally this was just a side project we brainstormed in the lab,” said Dr. Lang.

After spinal cord injury, axons try to cross the injury site and reconnect with other cells but are stymied by scarring that forms after the injury. Previous studies suggested their movements are blocked when the protein tyrosine phosphatase sigma (PTP sigma), an enzyme found in axons, interacts with chondroitin sulfate proteoglycans, a class of sugary proteins that fill the scars.
Link ID: 20394 - Posted: 12.04.2014
Ewen Callaway

A shell found on Java in the late 1800s was recently found to bear markings that seem to have been carved intentionally half a million years ago. The photograph is about 15 millimetres wide.

A zigzag engraving on a shell from Indonesia is the oldest abstract marking ever found. But what is most surprising about the half-a-million-year-old doodle is its likely creator — the human ancestor Homo erectus.

"This is a truly spectacular find and has the potential to overturn the way we look at early Homo," says Nick Barton, an archaeologist at the University of Oxford, UK, who was not involved in the discovery, which is described in a paper published online in Nature on 3 December.

By 40,000 years ago, and probably much earlier, anatomically modern humans — Homo sapiens — were painting on cave walls in places as far apart as Europe and Indonesia. Simpler ochre engravings found in South Africa date to 100,000 years ago. Earlier this year, researchers reported a 'hashtag' engraving in a Gibraltar cave once inhabited by Neanderthals. That was the first evidence for drawing in any extinct species. But until the discovery of the shell engraving, nothing approximating art had been ascribed to Homo erectus. The species emerged in Africa about 2 million years ago and trekked as far as the Indonesian island of Java, before going extinct around 140,000 years ago. Most palaeoanthropologists consider the species to be the direct ancestor of both humans and Neanderthals.

© 2014 Nature Publishing Group
Link ID: 20390 - Posted: 12.04.2014
Jia You

Ever wonder how cockroaches scurry around in the dark while you fumble to switch on the kitchen light? Scientists know the insect navigates with its senses of touch and smell, but now they have found a new piece to the puzzle: A roach can also see its environment in pitch darkness, by pooling visual signals from thousands of light-sensitive cells in each of its compound eyes, known as photoreceptors.

To test the sensitivity of roach vision, researchers created a virtual reality system for the bugs, knowing that when the environment around a roach rotates, the insect spins in the same direction to stabilize its vision. First, they placed the roach on a trackball, where it couldn’t navigate with its mouthpart or antennae. Then the scientists spun black and white gratings around the insect, illuminated by light at intensities ranging from a brightly lit room to a moonless night.

The roach responded to its rotating environment in light as dim as 0.005 lux, when each of its photoreceptors was picking up only one photon every 10 seconds, the researchers report online today in The Journal of Experimental Biology. They suggest that the cockroach must rely on unknown neural processing in the deep ganglia, an area in the base of the brain involved in coordinating movements, to process such complex visual information. Understanding this mechanism could help scientists design better imaging systems for night vision.

© 2014 American Association for the Advancement of Science.
Link ID: 20389 - Posted: 12.04.2014