Chapter 16




By James Gallagher, Health editor, BBC News website

The first hints a drug can slow the progression of Alzheimer's disease have emerged at a conference. Data from pharmaceutical company Eli Lilly suggests its solanezumab drug can cut the rate of the dementia's progression by about a third. The results are being met with cautious optimism, with a separate trial due to report next year. The death of brain cells in Alzheimer's is currently inexorable. Solanezumab may be able to keep them alive. Current medication, such as Aricept, can only manage the symptoms of dementia by helping the dying brain cells function. But solanezumab attacks the deformed proteins, called amyloid, that build up in the brain during Alzheimer's. It is thought the formation of sticky plaques of amyloid between nerve cells leads to damage and eventually brain cell death. Solanezumab has long been the great hope of dementia research, yet an 18-month trial of the drug seemingly ended in failure in 2012. But when Eli Lilly looked more closely at the data, there were hints it could be working for patients in the earliest stages of the disease. So the company asked just over 1,000 of the patients in the original trial with mild Alzheimer's to take the drug for another two years. And the results from this extension of the original trial have now been presented at the Alzheimer's Association International Conference. Dr Eric Siemers, from the Lilly Research Laboratories, in Indiana, told the BBC: "It's another piece of evidence that solanezumab does have an effect on the underlying disease pathology. We think there is a chance that solanezumab will be the first disease-modifying medication to be available." The company also started a completely separate trial in mild patients in 2012, and these results could prove to be the definitive moment for the drug. © 2015 BBC.

Keyword: Alzheimers
Link ID: 21203 - Posted: 07.22.2015

Ewen Callaway

A mysterious group of humans crossed the Bering land bridge from Siberia into the Americas thousands of years ago, genetic analyses reveal. Modern-day signatures of this ‘ghost population’ survive in people who live deep in the Brazilian Amazon, but the two research teams who have made the discovery have different ideas about when and how these migrants reached the Americas [1, 2]. "This is an unexpected finding," says Jennifer Raff, an anthropological geneticist at the University of Texas at Austin who was not involved in either study. "It’s honestly one of the most exciting results we’ve seen in a while." North and South America were the last continents that humans settled. Previous studies of DNA from modern and ancient Native Americans suggest that the trek was made at least 15,000 years ago (although the timing is not clear-cut) by a single group dubbed the ‘First Americans’, who crossed the Bering land bridge linking Asia and North America. “The simplest hypothesis would be that a single population penetrated the ice sheets and gave rise to most of the Americans,” says David Reich, a population geneticist at Harvard Medical School in Boston, Massachusetts. In 2012, his team found evidence for a single founding migration in the genomes from members of 52 Native American groups [3]. So Reich was flabbergasted when a colleague called Pontus Skoglund mentioned during a conference last year that he had found signs of a second ancient migration to the Americas lurking in the DNA of contemporary Native Amazonians. Reich wasted no time in verifying the discovery. “During the session afterward, he passed his laptop over the crowd, and he had corroborated the results,” says Skoglund, who is now a researcher in Reich’s lab. © 2015 Nature Publishing Group

Keyword: Genes & Behavior
Link ID: 21201 - Posted: 07.22.2015

By Smitha Mundasad, Health reporter

A type of diabetes drug may offer a glimmer of hope in the fight against Parkinson's disease, research in the journal PLOS Medicine suggests. Scientists found people taking glitazone pills were less likely to develop Parkinson's than patients on other diabetes drugs. But they caution the drugs can have serious side-effects and should not be given to healthy people. Instead, they suggest the findings should prompt further research.

'Unintended benefits'

There are an estimated 127,000 people in the UK with Parkinson's disease, which can lead to tremor, slow movement and stiff muscles. And charities say with no drugs yet proven to treat the condition, much more work is needed in this area. The latest study focuses solely on people with diabetes who did not have Parkinson's disease at the beginning of the project. Researchers scoured UK electronic health records to compare 44,597 people prescribed glitazone pills with 120,373 people using other anti-diabetic treatment. They matched participants to ensure their age and stage of diabetes treatment were similar. Scientists found fewer people developed Parkinson's in the glitazone group - but the drug did not have a long-lasting benefit. Any potential protection disappeared once patients switched to another type of pill. Dr Ian Douglas, lead researcher at the London School of Hygiene and Tropical Medicine, said: "We often hear about negative side-effects associated with medications, but sometimes there can also be unintended beneficial effects. Our findings provide unique evidence that we hope will drive further investigation into potential drug treatments for Parkinson's disease." © 2015 BBC

Keyword: Parkinsons
Link ID: 21199 - Posted: 07.22.2015

By BENEDICT CAREY

Bill Cosby stands accused of committing date rape long before drugs like GHB or Rohypnol were widely used for that purpose. Many of Mr. Cosby’s accusers believed they had been drugged — but with what? And how? In a recently obtained legal deposition, Mr. Cosby acknowledged giving quaaludes to some women with whom he had sex, but said consumption of the drug was consensual, “the same as a person would say, ‘Have a drink.’ ” In a transcript of the deposition, reported on Sunday in The New York Times, the comedian told lawyers he had obtained seven prescriptions for quaaludes. Originally approved and marketed as a “safer” sleeping pill, less addictive than barbiturates, the drug (known generically as methaqualone) was both sedating and hypnotic. Recreational use was common, but the federal government withdrew it from the market in 1982. “It was inevitable that it would be tried by people looking for a ‘better high,’ ” Dr. David Smith, medical director of the Haight-Ashbury Free Clinic, and Dr. Donald Wesson noted in The Journal of Psychedelic Drugs. Intoxication with quaaludes “soon developed a reputation for being especially pleasant.” Young people in the 1970s used quaaludes as they would a strong drink: to loosen up, to relax, to socialize. The pills also won a reputation for inducing periods of euphoria, as well as sexual arousal — “heroin for lovers,” some called it. By the middle of the decade, quaaludes were a staple of the club scene, often taken with alcohol. So embedded were quaaludes in the cultural scene that even years later the Dead Kennedys and Billy Idol were singing about the drug’s captivating effects. But reckless users risked overdose, especially when combining the pills with alcohol, which could lead to coma, convulsions and sometimes death. In a 1973 review of 252 hospital admissions for drug overdose, doctors in Edinburgh found that the third most common cause of “self-poisoning,” after barbiturates and LSD, was Mandrax — the British version of quaaludes, widely abused in South Africa as well. © 2015 The New York Times Company

Keyword: Drug Abuse
Link ID: 21198 - Posted: 07.22.2015

Carl Zimmer

An ant colony is an insect fortress: When enemies invade, soldier ants quickly detect the incursion and rip their foes apart with their oversize mandibles. But some invaders manage to slip in with ease, none more mystifyingly than the ant nest beetle. Adult beetles stride into an ant colony in search of a mate, without being harassed. They lay eggs, from which larvae hatch. As far as scientists can tell, workers feed the young beetles as if they were ants. When the beetles grow into adults, the ants swarm around them, grooming their bodies. In exchange for this hospitality, the beetles sink their jaws into ant larvae and freshly moulted adults in order to drink their body fluids. “They’re like vampire beetles wandering in the ant nests,” said Andrea Di Giulio, an entomologist at Roma Tre University in Rome. Dr. Di Giulio and his colleagues have now uncovered a remarkable trick that the beetles use to fool their hosts. It turns out they can perform uncanny impressions, mimicking a range of ant calls. Dr. Di Giulio and his colleagues study a species of ant nest beetle called Paussus favieri, which lives in the Atlas Mountains of Morocco, where it infiltrates the nests of Moroccan ants, known as Pheidole pallidula. Like many ant species, Pheidole pallidula makes noises by rubbing its legs against ridges on its body. The meanings of these signals vary from species to species: leaf-cutting ants summon bodyguards for the march back to the nest; in other species, a queen trills to her workers to attend to her. Scientists have found that Pheidole pallidula ants make three distinct sounds, each produced by a different caste: soldiers, workers and the queen. © 2015 The New York Times Company

Keyword: Evolution; Language
Link ID: 21193 - Posted: 07.20.2015

by Stephen Buchmann

Flowers, bugs and bees: Stephen Buchmann wanted to study them all when he was a kid. "I never grew out of my bug-and-dinosaur phase," he tells NPR's Arun Rath. "You know, since about the third grade, I decided I wanted to chase insects, especially bees." These days, he's living that dream. As a pollination ecologist, he's now taking a particular interest in how flowers attract insects. In his new book, The Reason for Flowers, he looks at more than just the biology of flowers — he dives into the ways they've laid down roots in human history and culture, too.

On the real 'reason for flowers'

The reason for flowers is actually one word: sex. So, flowers are literally living scented billboards that are advertising for sexual favors, whether those are from bees, flies, beetles, butterflies or us, because quite frankly most of the flowers in the world have gotten us to do their bidding. But that's only the first stage because flowers, if they're lucky, turn into fruits, and those fruits and seeds feed the world.

On the raucous secret lives of beetles

One of my favorite memories is roaming the Napa foothills as a UC Davis grad student. And I would go to the wineries, of course, and in between I would find western spice bush, which is this marvelous flower that kind of smells like a blend between a cabernet and rotten fruit. And when you find those flowers and open them up, you discover literally dozens of beetles in there, mating, defecating, pollinating — having a grand time. © 2015 NPR

Keyword: Sexual Behavior; Evolution
Link ID: 21192 - Posted: 07.20.2015

By NANCY L. SEGAL, AARON T. GOETZ and ALBERTO C. MALDONADO

Several years ago, while browsing the campus bookstore, one of us, Professor Segal, encountered a display table filled with Squirtles. A Squirtle is a plush-toy turtle manufactured by the company Russ Berrie. They were adorable and she couldn’t wait to take one home. Afterward, Professor Segal began wondering why this toy was so attractive and suspected that its large, round eyes played a major role. It’s well known that a preference for large eyes emerges in humans by 5 months of age. But the Squirtle was even more appealing than many of its big-eyed competitors. Was there something else about its eyes? Professor Segal consulted one of us, Professor Goetz, a colleague in evolutionary psychology, who suggested that because the Squirtle’s eyes were bordered in white, the cooperative eye hypothesis might have answers. This hypothesis, developed by the Japanese researchers Hiromi Kobayashi and Shiro Kohshima, holds that the opaque white outer coating of the human eye, or sclera, evolved to assist communication between people by signaling the direction of their gaze. The clear visibility of the sclera is a uniquely human characteristic. Other primates, such as the African great apes, also track the gaze direction of others, yet their sclera are pigmented or, if white, not visible. The great apes appear to use head direction more than other cues when following another’s gaze. Do humans have an instinctive preference for the whites of eyes, thus explaining the allure of the Squirtle? We conducted a study, to be published this year in the journal Evolution and Human Behavior, that suggested that the answer was yes. First we had to make some stuffed animals. We used six specially designed sets of three or four animals each (three cats, three dogs, three octopuses, four elephants, four snails and four turtles). The animals within each set were identical except for the eyes, which varied with respect to the size, color and presence of sclera. © 2015 The New York Times Company

Keyword: Emotions
Link ID: 21191 - Posted: 07.20.2015

By THE EDITORIAL BOARD

Scientific research has a gender gap, and not just among humans. In many disciplines, the animals used to study diseases and drugs are overwhelmingly male, which may significantly reduce the reliability of research and lead to drugs that won’t work in half the population. A new study published in the journal Nature Neuroscience suggests that research done on male animals may not hold up for women. Its authors reported that hypersensitivity to pain works differently in male and female mice. For males, immune cells called microglia appear to be required for pain hypersensitivity, and inhibiting their function also relieves the pain. But in female mice, different cells are involved, and targeting the microglia has no effect. If these differences occur in mice, they may occur in humans too. This means a pain drug targeting microglia might appear to work in male mice, but wouldn’t work on women. Failure to consider gender in research is very much the norm. According to one analysis of scientific studies that were published in 2009, male animals outnumbered females 5.5 to 1 in neuroscience, 5 to 1 in pharmacology, and 3.7 to 1 in physiology. Only 45 percent of animal studies involving depression or anxiety and only 38 percent involving strokes used females, even though these conditions are more common in women. In 1994, the National Institutes of Health confronted gender imbalance in clinical drug trials and began requiring that women and minorities be included in clinical studies; women now make up around half of clinical trial participants. In June, the N.I.H. announced that it would begin requiring researchers to take gender into account in preclinical research on animals as well. © 2015 The New York Times Company

Keyword: Sexual Behavior
Link ID: 21188 - Posted: 07.20.2015

By C. CLAIBORNE RAY

Q. Can you hear without an intact eardrum?

A. “When the eardrum is not intact, there is usually some degree of hearing loss until it heals,” said Dr. Ashutosh Kacker, an ear, nose and throat specialist at NewYork-Presbyterian Hospital and a professor at Weill Cornell Medical College, “but depending on the size of the hole, you may still be able to hear almost normally.” Typically, Dr. Kacker said, the larger an eardrum perforation is, the more severe the hearing loss it will cause. The eardrum, or tympanic membrane, is a thin, cone-shaped, pearly gray tissue separating the outer and middle ear canals, he explained. Sound waves hit the eardrum, which in turn vibrates the bones of the middle ear. The bones pass the vibration to the cochlea, which leads to a signal cascade culminating in the sound being processed by the brain and being heard. There are several ways an eardrum can be ruptured, Dr. Kacker said, including trauma, exposure to sudden or very loud noises, foreign objects inserted deeply into the ear canal, and middle-ear infection. “Usually, the hole will heal by itself and hearing will improve within about two weeks to a few months, especially in cases where the hole is small,” he said. Sometimes, when the hole is larger or does not heal well, surgery will be required to repair the eardrum. Most such operations are done by placing a patch over the hole to allow it to heal, and the surgery is usually very successful in restoring hearing, Dr. Kacker said. © 2015 The New York Times Company

Keyword: Hearing
Link ID: 21187 - Posted: 07.20.2015

by Sarah Schwartz

Brainlike cell bundles grown in a lab may expose some of the biological differences of autistic brains. Researchers chemically reprogrammed human stem cells into small bundles of functional brain cells that mimic the developing brain. These “organoids” built with cells from autistic patients appear to differ from those built with cells from the patients’ non-autistic family members, researchers report July 16 in Cell. The brainlike structures created from cells taken from autistic children showed increased activity in genes that control brain-cell growth and development. Too much activity in one of these genes led to an overproduction of a certain type of brain cell that suppresses the activity of other brain cells. At an early stage of development, the miniature organs grown from autistic patients’ stem cells also showed faster cell division rates than those grown from the cells of non-autistic relatives. Though the study was small, using cells from only four autistic patients and eight family members, the results may indicate common factors underlying autism, the scientists say. © Society for Science & the Public 2000 - 2015.

Keyword: Autism
Link ID: 21186 - Posted: 07.18.2015

It’s a good combination. Gene therapy to reverse blindness repairs damaged cells in the eye and also rearranges the brain to help process the new information. Visual pathways in the brain are made up of millions of interconnected neurons. When sensory signals are sent along them, the connections between neurons become strong. If underused – for example, as people lose their sight – the connections become weak and disorganised. Over the past few years, a type of gene therapy – injecting healthy genes into the eye to repair mutations – has emerged as a promising way to treat congenital and degenerative blindness. One of the first successful trials began in 2007. It involved 10 blind volunteers with a hereditary disease called Leber’s congenital amaurosis. The condition causes the retina to degenerate and leaves people completely blind early in life. Mutations in at least 19 genes can cause the disease, but all of the people in the trial had mutations in a gene called RPE65. The participants got an injection of a harmless virus in one of their eyes. The virus inserted healthy copies of RPE65 into their retina. Some of the volunteers went from straining to see a hand waving half a metre from their face to being able to read six lines on a sight chart. Others were able to navigate around an obstacle course in dim light – something that would have been impossible before the therapy. © Copyright Reed Business Information Ltd.

Keyword: Vision; Genes & Behavior
Link ID: 21185 - Posted: 07.18.2015

Austin Frakt

It’s a Catch-22 that even those with a common cold experience: Illness disrupts sleep. Poor sleep makes the symptoms of the illness worse. What’s true for a cold also holds for more serious conditions that co-occur with insomnia. Depression, post-traumatic stress disorder, alcohol dependence, fibromyalgia, cancer and chronic pain often give rise to insomnia, just as sleeplessness exacerbates the symptoms of these diseases. Historically, insomnia was considered a symptom of other diseases. Today it is considered an illness in its own right and recognized as an amplifier of other mental and physical ailments. When a person is chronically tired, pain can be more painful, depression deeper, anxiety heightened. What should doctors address first, insomnia or the co-occurring condition? How about both at the same time? A new study suggests that a therapy that improves sleep also reduces symptoms of other illnesses that often disrupt it. The study published in JAMA Internal Medicine examined the effect of cognitive behavioral therapy for insomnia in patients with serious mental and physical conditions. As its name suggests, C.B.T.-I. is a treatment that works through the mind. As I wrote about a few weeks ago, the therapy treats insomnia without medications, combining good sleep hygiene techniques with more consistent wake times, relaxation techniques and positive sleep attitudes and thoughts. Several clinical trials have shown that C.B.T.-I. provides as good or better relief of symptoms of insomnia than prescription drugs, with improvements in sleep that are more durable. C.B.T.-I. can usually be delivered relatively inexpensively through an online course costing about $40. Compared with those who didn’t receive C.B.T.-I., patients who did increased the time asleep in bed by about 12 percentage points, fell asleep about 25 minutes faster and decreased the amount of time awake in the middle of the night by about 45 minutes, according to Jade Wu, lead study author and a Boston University doctoral student in psychology. © 2015 The New York Times Company

Keyword: Sleep
Link ID: 21181 - Posted: 07.18.2015

By Emily Underwood

Glance at a runner's wrist or smartphone, and you'll likely find a GPS-enabled app or gadget ticking off miles and minutes as she tries to break her personal record. Long before FitBit or MapMyRun, however, the brain evolved its own system for tracking where we go. Now, scientists have discovered a key component of this ancient navigational system in rats: a group of neurons called "speed cells" that alter their firing rates with the pace at which the rodents run. The findings may help explain how the brain maintains a constantly updated map of our surroundings. In the 1970s, neuroscientist John O'Keefe, now at University College London, discovered neurons called place cells, which fire whenever a rat enters a specific location. Thirty-five years later, neuroscientists May-Britt and Edvard Moser, now at the Norwegian University of Science and Technology in Trondheim, Norway, discovered a separate group of neurons, called grid cells, which fire at regular intervals as rats traverse an open area, creating a hexagonal grid with coordinates similar to those in GPS. The Mosers and O'Keefe shared last year's Nobel Prize in Physiology or Medicine for their findings, which hint at how the brain constructs a mental map of an animal's environment. Still mysterious, however, is how grid and place cells obtain the information that every GPS system requires: the angle and speed of an object's movement relative to a known starting point, says Edvard Moser, co-author of the new study along with May-Britt Moser, his spouse and collaborator. If the brain does indeed contain a dynamic, internal map of the world, "there has to be a speed signal" that tells the network how far an animal has moved in a given period of time, he says. © 2015 American Association for the Advancement of Science.

Keyword: Learning & Memory
Link ID: 21178 - Posted: 07.16.2015

By Laura Sanders

Everybody knows people who seem to bumble through life with no sense of time — they dither for hours on a “quick” e-mail or expect an hour’s drive to take 20 minutes. These people are always late. But even for them, such minor lapses in timing are actually exceptions. We notice these flaws precisely because they’re out of the ordinary. Humans, like other animals, are quite good at keeping track of passing time. This talent does more than keep office meetings running smoothly. Almost everything our bodies and brains do requires precision clockwork — down to milliseconds. Without a sharp sense of time, people would be reduced to insensate messes, unable to move, talk, remember or learn. “We don’t think about it, but just walking down the street is an exquisitely timed operation,” says neuroscientist Lila Davachi of New York University. Muscles fire and joints steady themselves in a precisely orchestrated time series that masquerades as an unremarkable part of everyday life. A sense of time, Davachi says, is fundamental to how we move, how we act and how we perceive the world. Yet for something that forms the bedrock of nearly everything we do, time perception is incredibly hard to study. “It’s a quagmire,” says cognitive neuroscientist Peter Tse of Dartmouth College. The problem is thorny because there are thousands of possible intricate answers, all depending on what exactly scientists are asking. Their questions have begun to reveal an astonishingly complex conglomerate of neural timekeepers that influence each other. © Society for Science & the Public 2000 - 2015.

Keyword: Attention
Link ID: 21177 - Posted: 07.16.2015

By Fredrick Kunkle

A new study suggests that Alzheimer’s disease may affect the brain differently in black people compared with whites. The research, conducted by Lisa L. Barnes at the Rush University Medical Center, suggests that African Americans are less likely than Caucasians to have Alzheimer’s disease alone and more likely to have other pathologies associated with dementia. These include the presence of Lewy bodies, which are abnormal proteins found in the brain, and lesions arising from the hardening of tiny arteries in the brain, which is caused mainly by high blood pressure and other vascular conditions. The findings suggest that researchers should seek different strategies to prevent and treat Alzheimer’s disease in blacks. While many therapeutic strategies focus on removing or modifying beta amyloid – a key ingredient whose accumulation leads to the chain of events triggering the neurodegenerative disease – the study suggests that possible treatments should pursue additional targets, particularly for African Americans. But the study also points up the critical need to enroll more black people in clinical trials. Although Barnes said the research was the largest sample of its kind, she also acknowledged that the sample is still small. And that’s at least partially because blacks, for a variety of cultural and historical reasons, are less likely to participate in scientific research.

Keyword: Alzheimers
Link ID: 21176 - Posted: 07.16.2015

Nikki Stevenson

Autism may represent the last great prejudice we, as a society, must overcome. History is riddled with examples of intolerance directed at the atypical. We can sometimes fear that which diverges from the “norm”, and sometimes that fear leads us to frame those who are different as being in some way lesser beings than ourselves. Intolerances take generations to overcome. Racism is an obvious, ugly example. Other horrifying examples are easy to find: take, for instance, the intolerance faced by the gay community. Countless gay people were diagnosed with “sociopathic personality disturbance” based upon their natural sexuality. Many were criminalised and forced into institutions, the “treatments” to which they were subject akin to torture. How many believed they were sociopathic and hated themselves, wishing to be free from the label they had been given? How many wished to be “cured” so that they could live their lives in peace? The greatest crime was the damage perpetuated by the image projected upon them by those claiming to be professionals. Autism is framed as a disability, with mainstream theories presenting autism via deficit models. Popular theory is often passed off as fact with no mention of the morphic nature of research and scientific process. Most mainstream theory is silent regarding autistic strengths and atypical ability; indeed, what is in print often presents a damning image of autism as an “epidemic”. Hurtful words such as risk, disease, disorder, impairment, deficit, pedantic, obsession are frequently utilised. © 2015 Guardian News and Media Limited

Keyword: Autism
Link ID: 21175 - Posted: 07.16.2015

By Gretchen Reynolds

Would soccer be safer if young players were not allowed to head the ball? According to a new study of heading and concussions in youth soccer, the answer to that question is not the simple yes that many of us might have hoped. Soccer parents — and nowadays we are legion — naturally worry about head injuries during soccer, whether our child’s head is hitting the ball or another player. The resounding head-to-head collision between Alexandra Popp of Germany and Morgan Brian of the United States during the recent Women’s World Cup sent shivers down many of our spines. People’s concerns about soccer heading and concussions have grown so insistent in the past year or so that some doctors, parents and former professional players have begun to call for banning the practice outright among younger boys and girls, up to about age 14, and curtailing it at other levels of play. Ridding youth soccer of heading, many of these advocates say, would virtually rid the sport of severe head injuries. But Dawn Comstock, for one, was skeptical when she heard about the campaign. An associate professor of public health at the University of Colorado in Denver and an expert on youth sports injuries, she is also, she said, “a believer in evidence-based decision making.” And she said she wasn’t aware of any studies showing that heading causes the majority of concussions in the youth game. In fact, she and her colleagues could not find any large-scale studies examining the causes of concussions in youth soccer at all. So, for a study being published this week in JAMA Pediatrics, she and her colleagues decided to investigate the issue themselves. © 2015 The New York Times Company

Keyword: Brain Injury/Concussion
Link ID: 21172 - Posted: 07.15.2015

Tina Hesman Saey

The Earth has rhythm. Every 24 hours, the planet pirouettes on its axis, bathing its surface alternately in sunlight and darkness. Organisms from algae to people have evolved to keep time with the planet’s light/dark beat. They do so using the world’s most important timekeepers: daily, or circadian, clocks that allow organisms to schedule their days so as not to be caught off guard by sunrise and sunset. A master clock in the human brain appears to synchronize sleep and wake with light. But there are more. Circadian clocks tick in nearly every cell in the body. “There’s a clock in the liver. There’s a clock in the adipose [fat] tissue. There’s a clock in the spleen,” says Barbara Helm, a chronobiologist at the University of Glasgow in Scotland. Those clocks set sleep patterns and meal times. They govern the flow of hormones and regulate the body’s response to sugar and many other important biological processes (SN: 4/10/10, p. 22). Having timekeepers offers such an evolutionary advantage that species have developed them again and again throughout history, many scientists say. But as common and important as circadian clocks have become, exactly why such timepieces arose in the first place has been a deep and abiding mystery. Many scientists favor the view that multiple organisms independently evolved their own circadian clocks, each reinventing its own wheel. Creatures probably did this to protect their fragile DNA from the sun’s damaging ultraviolet rays. But a small group of researchers think otherwise. They say there had to be one mother clock from which all others came. That clock evolved to shield the cell from oxygen damage or perhaps provide other, unknown advantages. © Society for Science & the Public 2000 - 2015

Keyword: Biological Rhythms; Evolution
Link ID: 21171 - Posted: 07.15.2015

By Michael Balter

The human hand is a marvel of dexterity. It can thread a needle, coax intricate melodies from the keys of a piano, and create lasting works of art with a pen or a paintbrush. Many scientists have assumed that our hands evolved their distinctive proportions over millions of years of recent evolution. But a new study suggests a radically different conclusion: Some aspects of the human hand are actually anatomically primitive—more so even than those of many other apes, including our evolutionary cousin the chimpanzee. The findings have important implications for the origins of human toolmaking, as well as for what the ancestor of both humans and chimps might have looked like. Humans and chimps diverged from a common ancestor perhaps about 7 million years ago, and their hands now look very different. We have a relatively long thumb and shorter fingers, which allows us to touch our thumbs to any point along our fingers and thus easily grasp objects. Chimps, on the other hand, have much longer fingers and shorter thumbs, perfect for swinging in trees but much less handy for precision grasping. For decades the dominant view among researchers was that the common ancestor of chimps and humans had chimplike hands, and that the human hand changed in response to the pressures of natural selection to make us better toolmakers. But recently some researchers have begun to challenge the idea that the human hand fundamentally changed its proportions after the evolutionary split with chimps. The earliest human-made stone tools are thought to date back 3.3 million years, but new evidence has emerged that some of the earliest members of the human line—such as the 4.4-million-year-old Ardipithecus ramidus (“Ardi”)—had hands that resembled those of modern humans rather than chimps, even though it did not make tools. © 2015 American Association for the Advancement of Science

Keyword: Evolution
Link ID: 21170 - Posted: 07.15.2015

Emily M. Keeler

How smart are you? Would you be smarter if you ate more blueberries, played better video games, learned another language, or read the novels of Proust? What about if you did more crosswords? Took some pills? Electrically stimulated your brain? Or are you smart enough as is? Patricia Marx is, of course, pretty smart already. She’s a Guggenheim fellow and a New Yorker staff writer. She’s also funny as hell. Marx was the first woman elected to the Harvard Lampoon, and is a former writer for Saturday Night Live. Her new book, Let’s Be Less Stupid, takes readers on a chatty nosedive into her own neurological functioning, in the hopes that maybe, just maybe, we’ll all become a little smarter along the way. The book is the most recent entrant in the burgeoning field of pop-neuroscience, but with a liberal helping of humour. For four months, Marx did everything she could to add a few points to her IQ, including becoming adept with Lumosity, a video game app intended to improve cognitive function, and learning a little Cherokee in the hopes that multilingualism would give her brain a competitive advantage against the inevitable decline. When I called Marx to chat about her brain, she said she was sure her four months of compulsively chasing brain health hadn’t done her much good; in fact, she sheepishly admitted she’d already forgotten most of what she’d learned about the incredibly complex organ folded up inside our skulls. © 2015 National Post

Keyword: Miscellaneous
Link ID: 21169 - Posted: 07.15.2015