Chapter 16.
Matt Kaplan

Humans are among the very few animals that constitute a threat to elephants. Yet not all people are a danger — and elephants seem to know it. The giants have shown a remarkable ability to use sight and scent to distinguish between African ethnic groups that have a history of attacking them and groups that do not. Now a study reveals that they can even discern these differences from words spoken in the local tongues.

Biologists Karen McComb and Graeme Shannon at the University of Sussex in Brighton, UK, guessed that African elephants (Loxodonta africana) might be able to listen to human speech and make use of what they heard. To tease out whether this was true, they recorded the voices of men from two Kenyan ethnic groups calmly saying, “Look, look over there, a group of elephants is coming,” in their native languages. One of these groups was the semi-nomadic Maasai, some of whom periodically kill elephants during fierce competition for water or cattle-grazing space. The other was the Kamba, a crop-farming group that rarely has violent encounters with elephants.

The researchers played the recordings to 47 elephant family groups at Amboseli National Park in Kenya and monitored the animals' behaviour. The differences were remarkable. When the elephants heard the Maasai, they were much more likely to cautiously smell the air or huddle together than when they heard the Kamba. Indeed, the animals bunched together nearly twice as tightly when they heard the Maasai.

“We knew elephants could distinguish the Maasai and Kamba by their clothes and smells, but that they can also do so by their voices alone is really interesting,” says Fritz Vollrath, a zoologist at the University of Oxford, UK. © 2014 Nature Publishing Group
By TARA PARKER-POPE For a $14.95 monthly membership, the website Lumosity promises to “train” your brain with games designed to stave off mental decline. Users view a quick succession of bird images and numbers to test attention span, for instance, or match increasingly complex tile patterns to challenge memory. While Lumosity is perhaps the best known of the brain-game websites, with 50 million subscribers in 180 countries, the cognitive training business is booming. Happy Neuron of Mountain View, Calif., promises “brain fitness for life.” Cogmed, owned by the British education company Pearson, says its training program will give students “improved attention and capacity for learning.” The Israeli firm Neuronix is developing a brain stimulation and cognitive training program that the company calls a “new hope for Alzheimer’s disease.” And last month, in a move that could significantly improve the financial prospects for brain-game developers, the Centers for Medicare and Medicaid Services began seeking comments on a proposal that would, in some cases, reimburse the cost of “memory fitness activities.” Much of the focus of the brain fitness business has been on helping children with attention-deficit problems, and on improving cognitive function and academic performance in healthy children and adults. An effective way to stave off memory loss or prevent Alzheimer’s — particularly if it were a simple website or video game — is the “holy grail” of neuroscience, said Dr. Murali Doraiswamy, director of the neurocognitive disorders program at Duke Institute for Brain Sciences. The problem, Dr. Doraiswamy added, is that the science of cognitive training has not kept up with the hype. © 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 19346 - Posted: 03.11.2014
By ALBERT SUN

On a frigid night recently in Randolph, N.J., the Jersey Wildcats junior hockey team flew across the home rink during practice at Aspen Ice Arena, sending ice into the air. Hockey is known for its collisions, and concussions aren’t unusual, but the players didn’t seem particularly worried. On the backs of their heads were flashing green lights, signifying that all was well. “We’ll be behind the bench, and as soon as a player comes back we can look right down and it’ll be a nice light,” said the coach, Justin Stanlick. If the light changes color, “we can know that player needs to go see a trainer to get cleared.”

The light is part of a head impact sensor called the Checklight, made by Reebok. The device is a black skullcap with an electronic strip and three lights on the back. It blinks green when a player has sustained no head impact on the ice, yellow after a moderate impact and red after a severe one. The Checklight relies on an accelerometer and a gyroscope to measure the force of an impact. The Checklight flashes green for no impact, yellow for a moderate blow, red for a severe one. (Bryan Thomas for The New York Times) Coaches and parents have only to look to see if a player has taken a serious blow. And because the sensors are objective, Reebok executives say, they may lessen the pressure on young athletes to project toughness and play through a concussion.

Gage Malinowski, a 19-year-old defenseman for the Wildcats, recently returned to practice after suffering the latest in a series of concussions during a game in February. “There’s not a game where I don’t have at least 10 hits,” he said. © 2014 The New York Times Company
Keyword: Brain Injury/Concussion
Link ID: 19343 - Posted: 03.11.2014
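The Checklight's traffic-light scheme is, at its core, a thresholding function over sensor readings. Here is a minimal sketch in Python; the cutoff values and the `classify_impact` function are hypothetical, invented for illustration, since Reebok does not publish the device's actual thresholds.

```python
# Illustrative sketch of a Checklight-style impact classifier. The
# threshold values below are hypothetical; Reebok does not publish
# the device's actual cutoffs.

def classify_impact(linear_accel_g, angular_vel_rad_s):
    """Map one head impact, given peak linear acceleration (g) and
    peak angular velocity (rad/s), to a light colour."""
    if linear_accel_g >= 90 or angular_vel_rad_s >= 60:
        return "red"      # severe: player needs clearance from a trainer
    if linear_accel_g >= 50 or angular_vel_rad_s >= 30:
        return "yellow"   # moderate impact
    return "green"        # all clear

print(classify_impact(20, 10))   # green
print(classify_impact(60, 10))   # yellow
print(classify_impact(95, 70))   # red
```

A real device would also have to integrate readings over time and correct for sensor placement; classifying a single peak reading, as here, is a simplification.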
Why do some humans have lighter skin than others? Researchers have long chalked up the difference to tens of thousands of years of evolution, with darker skin protecting those who live nearer to the equator from the sun’s intense radiation. But a new study of ancient DNA concludes that European skin color has continued to change over the past 5000 years, suggesting that additional factors, including diet and sexual attraction, may also be at play. Our species, Homo sapiens, first arose in Africa about 200,000 years ago, and researchers assume that its first members were as dark-skinned as Africans are today, because dark skin is advantageous in Africa. Dark skin stems from higher levels of the pigment melanin, which blocks UV light and protects against its dangers, such as DNA damage—which can lead to skin cancer—and the breakdown of vitamin B. On the other hand, skin cells need exposure to a certain amount of UV light in order to produce vitamin D. Given these competing pressures, it makes sense that early humans’ skin lightened as they moved away from the equator. Recent research, however, has suggested that the picture is not so simple. For one thing, a number of genes control the synthesis of melanin (which itself comes in two different forms in humans), and each gene appears to have a different evolutionary history. Moreover, humans apparently did not begin to lighten up immediately after they migrated from Africa to Europe beginning about 40,000 years ago. In 2012, for example, a team led by Jorge Rocha, a geneticist at the University of Porto in Portugal, looked at variants of four pigmentation genes in modern Portuguese and African populations and calculated that at least three of them had only been strongly favored by evolution tens of thousands of years after humans left Africa.
In January, another team, led by geneticist Carles Lalueza-Fox of the University of Barcelona in Spain, sequenced the genome of an 8000-year-old male hunter-gatherer skeleton from the site of La Braña-Arintero in Spain and found that he was dark rather than light-skinned—again suggesting that natural selection for light skin acted relatively late in prehistory. © 2014 American Association for the Advancement of Science
By RON SUSKIND In our first year in Washington, our son disappeared. Just shy of his 3rd birthday, an engaged, chatty child, full of typical speech — “I love you,” “Where are my Ninja Turtles?” “Let’s get ice cream!” — fell silent. He cried, inconsolably. Didn’t sleep. Wouldn’t make eye contact. His only word was “juice.” I had just started a job as The Wall Street Journal’s national affairs reporter. My wife, Cornelia, a former journalist, was home with him — a new story every day, a new horror. He could barely use a sippy cup, though he’d long ago graduated to a big-boy cup. He wove about like someone walking with his eyes shut. “It doesn’t make sense,” I’d say at night. “You don’t grow backward.” Had he been injured somehow when he was out of our sight, banged his head, swallowed something poisonous? It was like searching for clues to a kidnapping. After visits to several doctors, we first heard the word “autism.” Later, it would be fine-tuned to “regressive autism,” now affecting roughly a third of children with the disorder. Unlike the kids born with it, this group seems typical until somewhere between 18 and 36 months — then they vanish. Some never get their speech back. Families stop watching those early videos, their child waving to the camera. Too painful. That child’s gone. In the year since his diagnosis, Owen’s only activity with his brother, Walt, is something they did before the autism struck: watching Disney movies. “The Little Mermaid,” “Beauty and the Beast,” “Aladdin” — it was a boom time for Disney — and also the old classics: “Dumbo,” “Fantasia,” “Pinocchio,” “Bambi.” They watch on a television bracketed to the wall in a high corner of our smallish bedroom in Georgetown. It is hard to know all the things going through the mind of our 6-year-old, Walt, about how his little brother, now nearly 4, is changing. 
They pile up pillows on our bed and sit close, Walt often with his arm around Owen’s shoulders, trying to hold him — and the shifting world — in place. © 2014 The New York Times Company
Link ID: 19341 - Posted: 03.10.2014
Alison Abbott

A simple blood test has the potential to predict whether a healthy person will develop symptoms of dementia within two or three years. If larger studies uphold the results, the test could fill a major gap in strategies to combat brain degeneration, which is thought to show symptoms only at a stage when it is too late to treat effectively.

The test was identified in a preliminary study involving 525 people aged over 70. The work identified a set of ten lipid metabolites in blood plasma that distinguished with 90% accuracy between people who would remain cognitively healthy and those who would go on to show signs of cognitive impairment. “These findings are potentially very exciting,” says Simon Lovestone, a neuroscientist at the University of Oxford, UK, and a coordinator of a major European public-private partnership seeking biomarkers for Alzheimer's. But he points out that only 28 participants developed symptoms similar to those of Alzheimer's disease during the latest work. “So the findings need to be confirmed in independent and larger studies.”

There is not yet a good treatment for Alzheimer’s disease, which affects 35 million people worldwide. Several promising therapies have been tested in clinical trials over the last few years, but all have failed. However, those trials involved people who had already developed symptoms. Many neuroscientists fear that any benefits of a treatment would be missed in such a study, because it could be impossible to halt the disease once it has manifested. “We desperately need biomarkers which would allow patients to be identified — and recruited into trials — before their symptoms begin,” says Lovestone. © 2014 Nature Publishing Group
Link ID: 19340 - Posted: 03.10.2014
By BENEDICT CAREY Jack Belliveau, a Harvard scientist whose quest to capture the quicksilver flare of thought inside a living brain led to the first magnetic resonance image of human brain function, died on Feb. 14 in San Mateo, Calif. He was 55. The cause was complications of a gastrointestinal disorder, said his wife, Brigitte Poncelet-Belliveau, a researcher who worked with him at the Athinoula A. Martinos Center for Biomedical Imaging at Massachusetts General Hospital. He lived in Boston. His wife said he died suddenly while visiting an uncle at his childhood home, which he owned. Dr. Belliveau was a 30-year-old graduate student at the Martinos Center when he hatched a scheme to “see” the neural trace of brain activity. Doctors had for decades been taking X-rays and other images of the brain to look for tumors and other lesions and to assess damage from brain injuries. Researchers had also mapped blood flow using positron emission tomography scans, but that required making and handling radioactive trace chemicals, whose signature vanished within minutes. Very few research centers had the technical knowledge or the machinery to pull it off. Dr. Belliveau tried a different approach. He had developed a technique to track blood flow, called dynamic susceptibility contrast, using an M.R.I. scanner that took split-second images, faster than was usual at the time. This would become a standard technique for assessing blood perfusion in stroke patients and others, but Dr. Belliveau thought he would try it to spy on a normal brain in the act of thinking or perceiving. “He went out to RadioShack and bought a strobe light, like you’d see in a disco,” said Dr. Bruce Rosen, director of the Martinos Center and one of Dr. Belliveau’s advisers at the time. “He thought the strobe would help image the visual areas of the brain, where there was a lot of interest.” © 2014 The New York Times Company
Keyword: Brain imaging
Link ID: 19337 - Posted: 03.10.2014
by Hal Hodson WHETHER striding ahead with pride or slouching sullenly, we all broadcast our emotions through body language. Now a computer has learned to interpret those unspoken cues as well as you or I. Antonio Camurri of the University of Genoa in Italy and colleagues have built a system which uses the depth-sensing, motion-capture camera in Microsoft's Kinect to determine the emotion conveyed by a person's body movements. Using computers to capture emotions has been done before, but typically focuses on facial analysis or voice recording. Reading someone's emotional state from the way they walk across a room or their posture as they sit at a desk means they don't have to speak or look into a camera. "It's a nice achievement," says Frank Pollick, professor of psychology at the University of Glasgow, UK. "Being able to use the Kinect for this is really useful." The system uses the Kinect camera to build a stick figure representation of a person that includes information on how their head, torso, hands and shoulders are moving. Software looks for body positions and movements widely recognised in psychology as indicative of certain emotional states. For example, if a person's head is bowed and their shoulders are drooping, that might indicate sadness or fear. Adding in the speed of movement – slow indicates sadness, while fast indicates fear – allows the software to determine how someone is feeling. In tests, the system correctly identified emotions in the stick figures 61.3 per cent of the time, compared with a 61.9 per cent success rate for 60 human volunteers (arXiv.org/1402.5047). Camurri is using the system to build games that teach children with autism to recognise and express emotions through full-body movements. Understanding how another person feels can be difficult for people with autism, and recognising fear is more difficult than happiness. © Copyright Reed Business Information Ltd.
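The mapping the article describes, from posture features plus movement speed to an emotional state, can be caricatured as a small rule table. The following is a toy sketch in Python; the joint coordinates, thresholds, and `estimate_emotion` function are invented for illustration, and Camurri's actual system extracts a far richer feature set from the Kinect skeleton.

```python
# Toy rule-based sketch of posture-to-emotion mapping. All names and
# thresholds here are invented for illustration; the real system uses
# many more features from the Kinect stick-figure representation.

def estimate_emotion(head_y, shoulder_y, neutral_head_y,
                     neutral_shoulder_y, movement_speed):
    """Guess an emotional state from vertical joint positions (metres)
    relative to a person's neutral posture, plus movement speed (m/s)."""
    head_bowed = head_y < neutral_head_y - 0.05
    shoulders_drooped = shoulder_y < neutral_shoulder_y - 0.03
    if head_bowed and shoulders_drooped:
        # Slow movement suggests sadness; fast movement suggests fear
        return "sadness" if movement_speed < 0.5 else "fear"
    return "neutral"

print(estimate_emotion(1.50, 1.30, 1.60, 1.40, 0.2))  # sadness
print(estimate_emotion(1.50, 1.30, 1.60, 1.40, 1.5))  # fear
print(estimate_emotion(1.60, 1.40, 1.60, 1.40, 0.2))  # neutral
```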
A man blind since birth is taking up a surprising new hobby: photography. His newfound passion is thanks to a system that turns images into sequences of sound. The technology not only gives “sight” to the blind, but also challenges the way neurologists think the brain is organized. In 1992, Dutch engineer Peter Meijer created vOICe, an algorithm that converts simple grayscale images into musical soundscapes. (The capitalized middle letters sound out “Oh, I see!”). The system scans images from left to right, converting shapes in the image into sound as it sweeps, with higher positions in the image corresponding to higher sound frequencies. For instance, a diagonal line stretching upward from left to right becomes a series of ascending musical notes. While more complicated images, such as a person sitting on a lawn chair, at first seem like garbled noise, with enough training users can learn to “hear” everyday scenes. In 2007, neuroscientist Amir Amedi and his colleagues at the Hebrew University of Jerusalem began training subjects who were born blind to use vOICe. Despite having no visual reference points, after just 70 hours of training, the individuals went from “hearing” simple dots and lines to “seeing” whole images such as faces and street corners composed of 4500 pixels. (For comparison, Nintendo’s Mario was made up of just 192 pixels in his first video game appearance.) By attaching a head-mounted camera to a computer and headphones, the blind users were even able to navigate around a room by the sound cues alone. Every few steps the system snaps a photo and converts it into sound, giving the users their bearings as they traverse tables and trashcans. One patient even took up photography, using the head-mounted system to frame his snapshots. © 2014 American Association for the Advancement of Science.
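The vOICe scan itself is straightforward to sketch: sweep the image left to right, map each row to a sine-wave frequency (top rows highest), and let pixel brightness set that sine's amplitude. Below is a minimal Python version, assuming a grayscale image given as a list of rows with values in 0 to 1; the parameter values are illustrative, not Meijer's actual choices.

```python
import math

def image_to_soundscape(image, sweep_time=1.0, rate=8000,
                        f_lo=500.0, f_hi=5000.0):
    """Sweep a grayscale image (rows of 0-1 brightness values) left to
    right; each row drives one sine wave, with higher rows at higher
    pitch and brightness controlling amplitude."""
    n_rows, n_cols = len(image), len(image[0])
    samples_per_col = int(sweep_time * rate / n_cols)
    # Row 0 (top of the image) gets the highest frequency
    freqs = [f_hi - (f_hi - f_lo) * r / max(n_rows - 1, 1)
             for r in range(n_rows)]
    audio = []
    for c in range(n_cols):                     # left-to-right sweep
        for s in range(samples_per_col):
            t = s / rate
            mix = sum(image[r][c] * math.sin(2 * math.pi * freqs[r] * t)
                      for r in range(n_rows))
            audio.append(mix / n_rows)          # keep samples in [-1, 1]
    return audio

# A diagonal line rising from left to right becomes ascending tones,
# matching the example in the article:
diag = [[1.0 if r == 3 - c else 0.0 for c in range(4)] for r in range(4)]
audio = image_to_soundscape(diag, sweep_time=0.1)
```

Writing `audio` out as PCM samples and playing it back would reproduce the left-to-right "musical soundscape" the article describes.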
by Bruce Bower Actor Philip Seymour Hoffman’s February death from a drug overdose triggered media reports blaming the terrible disease of addiction for claiming another life. But calling addiction a “disease” may be misguided, according to an alternative view with some scientific basis. Most people who are addicted to cigarette smoking, alcohol or other drugs manage to quit, usually on their own, after experiencing major attitude adjustments. Although relapses occur, successes ultimately outnumber fatalities. People can permanently walk away from addiction. Evidence that addiction is a solvable coping problem rather than a chronic, recurring disease seems like encouraging news. But it’s highly controversial. Neuroscientists and many clinicians regard drug addictions as brain illnesses best vanquished with the help of medications that fight cravings and withdrawal. From this perspective, drug-induced brain changes increase a person’s thirst for artificial highs and make quitting progressively more difficult. This conflict over addiction’s nature plays out in two lines of research: studies of remission and relapse among treated substance abusers and long-term studies of the general population. Follow-up investigations of people who attend treatment programs report that addicts never completely shake an urge to snort, inject, guzzle or otherwise consume their poisons of choice. Ongoing treatment in psychotherapy, rehab centers or 12-step groups encourages temporary runs of sobriety, but it’s easier to kick the bucket than to kick the habit. Surveys and long-term studies of the general population, however, observe that addicts typically spend their youth in a substance-induced haze but drastically cut back or quit using drugs altogether by early adulthood. Most of those who renounce the “high” life do so without formal treatment. © Society for Science & the Public 2000 - 2013.
Keyword: Drug Abuse
Link ID: 19332 - Posted: 03.08.2014
By SABRINA TAVERNISE Middle and high school students who used electronic cigarettes were more likely to smoke real cigarettes and less likely to quit than students who did not use the devices, a new study has found. They were also more likely to smoke heavily. But experts are divided about what the findings mean. The study’s lead author, Stanton Glantz, a professor of medicine at the University of California, San Francisco, who has been critical of the devices, said the results suggested that the use of e-cigarettes was leading to less quitting, not more. “The use of e-cigarettes does not discourage, and may encourage, conventional cigarette use among U.S. adolescents,” the study concluded. It was published online in the journal JAMA Pediatrics on Thursday. But other experts said the data did not support that interpretation. They said that just because e-cigarettes are being used by youths who smoke more and have a harder time quitting does not mean that the devices themselves are the cause of those problems. It is just as possible, they said, that young people who use the devices were heavier smokers to begin with, or would have become heavy smokers anyway. “The data in this study do not allow many of the broad conclusions that it draws,” said Thomas J. Glynn, a researcher at the American Cancer Society. The study is likely to stir the debate further over what electronic cigarettes mean for the nation’s 45 million smokers, about three million of whom are middle and high school students. Some experts worry that e-cigarettes are a gateway to smoking real cigarettes for young people, though most say the data is too skimpy to settle the issue. Others hope the devices could be a path to quitting. So far, the overwhelming majority of young people who use e-cigarettes also smoke real cigarettes, a large federal survey published last year found. 
Still, while e-cigarette use among youths doubled from 2011 to 2012, regular cigarette smoking for youths has continued to decline. The rate hit a record low in 2013 of 9.6 percent, down by two-thirds from its peak in 1997. © 2014 The New York Times Company
Keyword: Drug Abuse
Link ID: 19331 - Posted: 03.08.2014
by Graham Lawton In August 2013, professional rugby union player Andy Hazell received a massive blow to the head while playing for his club Gloucester. Six "horrendous" months later he retired from the game, stricken by dizziness, mood swings and a sense of detachment. Hazell isn't the first rugby player to experience concussion during a game, and probably won't be the last to have to retire as a result. According to a campaign launched this week, rugby union players don't know enough about the risks of concussion – and the governing bodies aren't doing enough to prevent it. The problem isn't so much one-off blows like the one that ended Hazell's career, but long-term damage caused by repeated concussions over many years. Studies of boxers and American footballers have shown that these can lead to a degenerative brain disease called Chronic Traumatic Encephalopathy (CTE). CTE leads to memory problems, personality change and slowness of movement. It usually shows up in middle age, long after a sporting career is over. CTE has been an issue in American Football for years. Thousands of ex-professionals sued the National Football League alleging that it knew about the risks but covered them up. Last year the NFL offered a $765 million settlement package. Neurologists have long suspected that other contact sports might also lead to CTE – particularly rugby union because of its emphasis on high-speed "hits". Concussion is the fourth most common injury in the professional game. © Copyright Reed Business Information Ltd.
Keyword: Brain Injury/Concussion
Link ID: 19330 - Posted: 03.08.2014
Londa Schiebinger. In Madrid a couple of years ago, I was interviewed for Spanish newspapers. When I later ran the text through Google Translate, I got a shock: I was referred to repeatedly as “he”. Like much science and technology, Google Translate has a male default. When I drive a car, the seatbelt is not designed to accommodate breast tissue. Any medicines I take are more likely to have been tested on male than on female animals. There are moral issues here: women pay taxes and buy products and should not be short-changed. But scientific objectivity is at stake, too. Because medical research is done mainly in males, there is a male bias in, for example, the choice of drug targets. Science is halving the potential field of innovation. This is not about active discrimination; the bias is largely unconscious. Google Translate defaults to the masculine pronoun because 'he' is more commonly found on the Web than 'she'. Yet that is changing: an analysis of American-English texts in Google Books shows that the ratio of masculine to feminine pronouns has fallen to around 2:1, from a peak of 4:1 in the 1960s. In the summer of 2012, I invited Google and several language-processing experts to a Gendered Innovations workshop at Harvard University in Cambridge, Massachusetts. They listened to the problem for about 20 minutes, then said: “We can fix that!” Although it is complicated, the search for solutions is on. Fixing the problem is great, but constantly retrofitting for women is not the best road forwards. A better way is to include gender at all relevant phases of research — when setting priorities, gathering and analysing data, evaluating results, developing patents and, finally, transferring ideas to markets. Science and technology should take into account the biological and social needs of both women and men. Unconscious sex and gender bias can be socially harmful and expensive. © 2014 Nature Publishing Group
Keyword: Sexual Behavior
Link ID: 19329 - Posted: 03.06.2014
Clara Moskowitz

When mathematicians describe equations as beautiful, they are not lying. Brain scans show that their minds respond to beautiful equations in the same way other people respond to great paintings or masterful music. The finding could bring neuroscientists closer to understanding the neural basis of beauty, a concept that is surprisingly hard to define.

In the study, researchers led by Semir Zeki of University College London asked 16 mathematicians to rate 60 equations on a scale ranging from "ugly" to "beautiful." Two weeks later, the mathematicians viewed the same equations and rated them again while lying inside a functional magnetic resonance imaging (fMRI) scanner. The scientists found that the more beautiful an equation was to the mathematician, the more activity his or her brain showed in an area called the A1 field of the medial orbitofrontal cortex. The orbitofrontal cortex is associated with emotion, and this particular region has been shown in previous tests to correlate with emotional responses to visual and musical beauty. The researchers wondered whether the trend would extend to mathematical beauty, which "has a much deeper intellectual source than visual or musical beauty, which are more 'sensible' and perceptually based," they wrote in a paper reporting their results published on 13 February in Frontiers in Human Neuroscience.

Investigating mathematical beauty allowed the researchers to test the role of culture and learning in aesthetic appreciation. The scientists hypothesized that while people with no musical or artistic training can still appreciate Beethoven’s and Michelangelo's works, only those who understand the meaning behind certain mathematical formulas would find them beautiful. © 2014 Nature Publishing Group
Link ID: 19327 - Posted: 03.06.2014
By Tara Bahrampour

Alzheimer’s disease likely plays a much larger role in the deaths of older Americans than is reported, according to a new study that says the disease may be the third-leading cause of death in the United States. The Centers for Disease Control and Prevention lists Alzheimer’s as the sixth-leading cause of death, far below heart disease and cancer. But the new report, published Wednesday in the medical journal of the American Academy of Neurology, suggests that the current system of relying on death certificates for causes misses the complexity of dying for many older people and underestimates the impact of Alzheimer’s. While the CDC attributed about 84,000 deaths in 2010 to Alzheimer’s, the report estimated that number to be 503,400 among people 75 and older. That puts it in a close third place, behind heart disease and cancer, and well above chronic lung disease, stroke and accidents, which rank third, fourth and fifth. Alzheimer’s is somewhat of a sleeping giant compared with other leading killers that have received more funding over the years. While deaths from these diseases have been going down thanks to better treatment and prevention, the number of people suffering from Alzheimer’s is quickly rising and the disease is always fatal. More than 5 million people in the United States are estimated to have Alzheimer’s. With the aging of the baby-boom generation, this number is expected to nearly triple by 2050 if there are no significant medical breakthroughs, according to the Alzheimer’s Association. © 1996-2014 The Washington Post
Link ID: 19326 - Posted: 03.06.2014
By GRETCHEN REYNOLDS Obesity may have harmful effects on the brain, and exercise may counteract many of those negative effects, according to sophisticated new neurological experiments with mice, even when the animals do not lose much weight. While it’s impossible to know if human brains respond in precisely the same way to fat and physical activity, the findings offer one more reason to get out and exercise. It’s been known for some time that obesity can alter cognition in animals. Past experiments with lab rodents, for instance, have shown that obese animals display poor memory and learning skills compared to their normal-weight peers. They don’t recognize familiar objects or recall the location of the exit in mazes that they’ve negotiated multiple times. But scientists hadn’t understood how excess weight affects the brain. Fat cells, they knew, manufacture and release substances into the bloodstream that flow to other parts of the body, including the heart and muscles. There, these substances jump-start biochemical processes that produce severe inflammation and other conditions that can lead to poor health. Many thought the brain, though, should be insulated from those harmful effects. It contains no fat cells and sits behind the protective blood-brain barrier that usually blocks the entry of undesirable molecules. However, recent disquieting studies in animals indicate that obesity weakens that barrier, leaving it leaky and permeable. In obese animals, substances released by fat cells can ooze past the barrier and into the brain. The consequences of that seepage became the subject of new neurological experiments conducted by researchers at Georgia Regents University in Augusta and published last month in The Journal of Neuroscience. © 2014 The New York Times Company
Link ID: 19323 - Posted: 03.05.2014
by Tom Siegfried

Max Planck, who shook the world with his discovery of quantum physics, also offered a warning. “One must be careful,” he said, “when using the word, real.” It was good advice. As physicists explored the quantum domain, they found that usual ideas about reality did not apply. Reality in the realm of atoms was nothing like the world of rocks and baseballs and planets, where Newton’s laws of motion ruled with rigor. Among atoms, the rules were more like Olympic ice skating judging, with unpredictable scores.

Gradually physicists, engineers and even screenwriters became familiar with quantum weirdness and used it in lasers, computers and movie plots. Quantum reality might be crazy, but it’s our reality, and most scientists, anyway, have become more or less used to it. Nevertheless, Planck’s warning still applies. Perhaps the quantum picture of reality is another illusion, just like Newton’s was. Human insight into nature may not yet have penetrated reality’s ultimate veil. In other words, maybe reality always dresses itself up in Newtonian or Einsteinian or quantum clothing, and science hasn’t yet seen what reality looks like naked.

And that might explain why nature has been able to protect so many of its mysteries from science’s prying eyes — mysteries like the identity of dark matter, the math describing quantum gravity, the mechanism underlying consciousness. And whether humans have free will. © Society for Science & the Public 2000 - 2013.
Link ID: 19319 - Posted: 03.04.2014
By Deborah Kotz Glaring gaps persist in medical researchers’ efforts to understand gender differences in common diseases, two decades after the passage of pivotal legislation mandating that more women be included in government-funded clinical trials, concludes a report being released Monday at a women’s health summit in Boston. The authors said research still lags on understanding how treatments for heart disease—the number one killer of women—affect the sexes differently, because women make up only one-third of the participants in clinical trials to test new drugs and medical devices, and most of these studies don’t report results for men and women separately. Women who don’t smoke are, for unknown reasons, three times more likely than non-smoking men to get lung cancer, but they’re still less likely than men to enroll in lung cancer studies, notes the report from Brigham and Women’s Hospital. And twice as many women suffer from depression as men, but fewer than 45 percent of animal studies to better understand anxiety and depression use female lab animals. “Women are now routinely included in clinical trials, but we are far from achieving equity in biomedical research,” said report leader Dr. Paula Johnson, executive director of the Brigham’s Connors Center for Women’s Health and Gender Biology. To address research disparities, the authors recommended that government agencies, drug manufacturers, hospital review boards that approve studies, and medical journal editors institute substantial changes to make women’s health a research priority. © 2014 Boston Globe Media Partners, LLC
Keyword: Sexual Behavior
Link ID: 19318 - Posted: 03.04.2014
By ANDREW POLLACK In the late 1980s, scientists at Osaka University in Japan noticed unusual repeated DNA sequences next to a gene they were studying in a common bacterium. They mentioned them in the final paragraph of a paper: “The biological significance of these sequences is not known.” Now their significance is known, and it has set off a scientific frenzy. The sequences, it turns out, are part of a sophisticated immune system that bacteria use to fight viruses. And that system, whose very existence was unknown until about seven years ago, may provide scientists with unprecedented power to rewrite the code of life. In the past year or so, researchers have discovered that the bacterial system can be harnessed to make precise changes to the DNA of humans, as well as other animals and plants. This means a genome can be edited, much as a writer might change words or fix spelling errors. It allows “customizing the genome of any cell or any species at will,” said Charles Gersbach, an assistant professor of biomedical engineering at Duke University. Already the molecular system, known as Crispr, is being used to make genetically engineered laboratory animals more easily than could be done before, with changes in multiple genes. Scientists in China recently made monkeys with changes in two genes. Scientists hope Crispr might also be used for genomic surgery, as it were, to correct errant genes that cause disease. Working in a laboratory — not, as yet, in actual humans — researchers at the Hubrecht Institute in the Netherlands showed they could fix a mutation that causes cystic fibrosis. But even as it is stirring excitement, Crispr is raising profound questions. Like other technologies that once wowed scientists — like gene therapy, stem cells and RNA interference — it will undoubtedly encounter setbacks before it can be used to help patients. © 2014 The New York Times Company
Keyword: Genes & Behavior
Link ID: 19317 - Posted: 03.04.2014
By LISA FELDMAN BARRETT CAN you detect someone’s emotional state just by looking at his face? It sure seems like it. In everyday life, you can often “read” what someone is feeling with the quickest of glances. Hundreds of scientific studies support the idea that the face is a kind of emotional beacon, clearly and universally signaling the full array of human sentiments, from fear and anger to joy and surprise. Increasingly, companies like Apple and government agencies like the Transportation Security Administration are banking on this transparency, developing software to identify consumers’ moods or training programs to gauge the intent of airline passengers. The same assumption is at work in the field of mental health, where illnesses like autism and schizophrenia are often treated in part by training patients to distinguish emotions by facial expression. But this assumption is wrong. Several recent and forthcoming research papers from the Interdisciplinary Affective Science Laboratory, which I direct, suggest that human facial expressions, viewed on their own, are not universally understood. The pioneering work in the field of “emotion recognition” was conducted in the 1960s by a team of scientists led by the psychologist Paul Ekman. Research subjects were asked to look at photographs of facial expressions (smiling, scowling and so on) and match them to a limited set of emotion words (happiness, anger and so on) or to stories with phrases like “Her husband recently died.” Most subjects, even those from faraway cultures with little contact with Western civilization, were extremely good at this task, successfully matching the photos most of the time. Over the following decades, this method of studying emotion recognition has been replicated by other scientists hundreds of times. In recent years, however, at my laboratory we began to worry that this research method was flawed. In particular, we suspected that by providing subjects with a preselected set of emotion words, these experiments had inadvertently “primed” the subjects — in effect, hinting at the answers — and thus skewed the results. © 2014 The New York Times Company
Link ID: 19316 - Posted: 03.03.2014