Chapter 16.
By GARY GUTTING

Sam Harris is a neuroscientist and prominent “new atheist,” who along with others like Richard Dawkins, Daniel Dennett and Christopher Hitchens helped put criticism of religion at the forefront of public debate in recent years. In two previous books, “The End of Faith” and “Letter to a Christian Nation,” Harris argued that theistic religion has no place in a world of science. In his latest book, “Waking Up,” his thought takes a new direction. While still rejecting theism, Harris nonetheless makes a case for the value of “spirituality,” which he bases on his experiences in meditation. I interviewed him recently about the book and some of the arguments he makes in it.

Gary Gutting: A common basis for atheism is naturalism — the view that only science can give a reliable account of what’s in the world. But in “Waking Up” you say that consciousness resists scientific description, which seems to imply that it’s a reality beyond the grasp of science. Have you moved away from an atheistic view?

Sam Harris: I don’t actually argue that consciousness is “a reality” beyond the grasp of science. I just think that it is conceptually irreducible — that is, I don’t think we can fully understand it in terms of unconscious information processing. Consciousness is “subjective” — not in the pejorative sense of being unscientific, biased or merely personal, but in the sense that it is intrinsically first-person, experiential and qualitative. The only thing in this universe that suggests the reality of consciousness is consciousness itself. Many philosophers have made this argument in one way or another — Thomas Nagel, John Searle, David Chalmers. And while I don’t agree with everything they say about consciousness, I agree with them on this point.

© 2014 The New York Times Company
Link ID: 20056 - Posted: 09.10.2014
By SOMINI SENGUPTA A coalition of political figures from around the world, including Kofi Annan, the former United Nations secretary general, and several former European and Latin American presidents, is urging governments to decriminalize a variety of illegal drugs and set up regulated drug markets within their own countries. The proposal by the group, the Global Commission on Drug Policy, goes beyond its previous call to abandon the nearly half-century-old American-led war on drugs. As part of a report scheduled to be released on Tuesday, the group goes much further than its 2011 recommendation to legalize cannabis. The former Brazilian president Fernando Henrique Cardoso, a member of the commission, said the group was calling for the legal regulation of “as many of the drugs that are currently illegal as possible, with the understanding that some drugs may remain too dangerous to decriminalize.” The proposal comes at a time when several countries pummeled by drug violence, particularly in Latin America, are rewriting their own drug laws, and when even the United States is allowing state legislatures to gingerly regulate cannabis use. The United Nations is scheduled to hold a summit meeting in 2016 to evaluate global drug laws. The commission includes former presidents like Mr. Cardoso of Brazil, Ernesto Zedillo of Mexico and Ruth Dreifuss of Switzerland, along with George P. Shultz, a former secretary of state in the Reagan administration, among others. The group stops short of calling on countries to legalize all drugs right away. It calls instead for countries to continue to pursue violent criminal gangs, to stop incarcerating users and to offer treatment for addicts. © 2014 The New York Times Company
Keyword: Drug Abuse
Link ID: 20052 - Posted: 09.10.2014
By Mo Costandi

The nerve endings in your fingertips can perform complex neural computations that were thought to be carried out by the brain, according to new research published in the journal Nature Neuroscience.

The processing of both touch and visual information involves computations that extract the geometrical features of objects we touch and see, such as edge orientation. Most of this processing takes place in the brain, which contains cells that are sensitive to the orientation of edges on the things we touch and see, and which pass this information on to cells in neighbouring regions that encode other features.

The brain has outsourced some aspects of visual processing, such as motion detection, to the retina, and the new research shows that something similar happens in the touch processing pathway. Delegating basic functions to the sense organs in this way could be an evolutionary mechanism that enables the brain to perform other, more sophisticated information processing tasks more efficiently.

Your fingertips are among the most sensitive parts of your body. They are densely packed with thousands of nerve endings, which produce complex patterns of nervous impulses that convey information about the size, shape and texture of objects, and your ability to identify objects by touch and manipulate them depends upon the continuous influx of this information.

© 2014 Guardian News and Media Limited
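As a purely illustrative sketch of the kind of computation described above, the toy model below shows how a handful of simulated afferents, each pooling several scattered receptive subfields, can jointly carry edge-orientation information that a simple decoder reads out. The layout, the numbers and the nearest-template decoder are assumptions made for illustration; they are not the method of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N_AFF, N_SUB, SIGMA = 8, 5, 0.08
# Each simulated afferent pools from a few randomly placed subfields
# on a unit square of "skin".
subfields = rng.uniform(0.0, 1.0, size=(N_AFF, N_SUB, 2))

def population_response(theta):
    """Response of each afferent to a straight edge at angle theta
    through the centre of the patch: a subfield fires more strongly
    the closer it lies to the edge, and the afferent sums its subfields."""
    d = ((subfields[..., 0] - 0.5) * np.sin(theta)
         - (subfields[..., 1] - 0.5) * np.cos(theta))   # distance to edge
    return np.exp(-(d / SIGMA) ** 2).sum(axis=1)        # shape (N_AFF,)

# Template responses for a bank of candidate orientations.
angles = np.deg2rad(np.arange(0, 180, 15))
templates = np.stack([population_response(a) for a in angles])

def decode(theta_probe, noise=0.05):
    """Read out edge orientation from a noisy population response
    by nearest-template matching."""
    r = population_response(theta_probe) + rng.normal(0.0, noise, N_AFF)
    return np.rad2deg(angles[np.argmin(np.linalg.norm(templates - r, axis=1))])

print(decode(np.deg2rad(45)))   # typically prints 45.0
```

The only point of the sketch is that orientation is already recoverable from the pattern of peripheral responses, before any central processing, which is the kind of delegated computation the study reports.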
Keyword: Pain & Touch
Link ID: 20051 - Posted: 09.09.2014
By Jena McGregor

We've all heard the conventional wisdom for better managing our time and organizing our professional and personal lives. Don't try to multitask. Turn the email and Facebook alerts off to help stay focused. Make separate to-do lists for tasks that require a few minutes, a few hours and long-term planning. But what's grounded in real evidence and what's not?

In his new book The Organized Mind, Daniel Levitin — a McGill University professor of psychology and behavioral neuroscience — explores how having a basic understanding of the way the brain works can help us think about organizing our homes, our businesses, our time and even our schools in an age of information overload. We spoke with Levitin about why multi-tasking never works, what images of good leaders' brains actually look like, and why email and Twitter are so incredibly addicting. The following transcript of our conversation has been edited for length and clarity.

Q. What was your goal in writing this book?

A. Neuroscientists have learned a lot in the last 10 or 15 years about how the brain organizes information, and why we pay attention to some things and forget others. But most of this information hasn't trickled down to the average reader. There are a lot of books about how to get organized and a lot of books about how to be better and more productive at business, but I don't know of one that grounds any of these in the science.
Link ID: 20049 - Posted: 09.09.2014
By Maggie Fox, Erika Edwards and Judy Silverman Here’s how you might be able to turn autism around in a baby: Carefully watch her cues, and push just a little harder with that game of peek-a-boo or “This little piggy.” But don’t push too hard — kids with autism are super-sensitive. That’s what Sally Rogers of the University of California, Davis has found in an intense experiment with the parents of infants who showed clear signs of autism. It’s one of the most hopeful signs yet that if you diagnose autism very early, you can help children rewire their brains and reverse the symptoms. It was a small study, and it’s very hard to find infants who are likely to have autism, which is usually diagnosed in the toddler years. But the findings, published in the Journal of Autism and Developmental Disorders, offer some hope to parents worried about their babies. “With only seven infants in the treatment group, no conclusions can be drawn,” they wrote. However, the effects were striking. Six out of the seven children in the study had normal learning and language skills by the time they were 2 to 3. Isobel was one of them. “She is 3 years old now and she is a 100 percent typical, normally developing child,” her mother, Megan, told NBC News. The family doesn’t want their last name used for privacy reasons. “We don’t have to do the therapy any more. It literally rewired her brain.” Autism is a very common diagnosis for children in the U.S. The latest survey by the Centers for Disease Control and Prevention shows a startling 30 percent jump among 8-year-olds diagnosed with the disorder in a two-year period, to one in every 68 children.
Link ID: 20047 - Posted: 09.09.2014
By Richard Farrell

Conventional thinking has long held that pelvic bones in whales and dolphins, evolutionary throwbacks to ancestors that once walked on land, are vestigial and will disappear millions of years from now. But researchers from the University of Southern California and the Natural History Museum of Los Angeles County (NHM) have upended that assumption.

The scientists argue in a paper just published in the journal Evolution that cetacean (whale and dolphin) pelvic bones certainly do have a purpose and that they're specifically targeted, by selection, for mating. The muscles that control a cetacean's penis are attached to the creature's pelvic bones. Matthew Dean, assistant professor at the USC Dornsife College of Letters, Arts and Sciences, and Jim Dines, collections manager of mammalogy at NHM, wanted to find out if pelvic bones could be evolutionarily advantageous by affecting the overall amount of control an individual creature has over its penis.

The pair spent four years examining whale and dolphin pelvic bones, using a 3D laser scanner to study the shape and size of the samples in extreme detail. Then they gathered as much data as they could find -- reaching back to whaler days -- on whale testis size relative to body mass. The testis data was important because in nature, species in "promiscuous," competitive mating environments (where females mate with multiple males) develop larger testes, relative to their body mass, in order to outdo the competition.

© 2014 Discovery Communications, LLC.
By BENEDICT CAREY

Imagine that on Day 1 of a difficult course, before you studied a single thing, you got hold of the final exam. The motherlode itself, full text, right there in your email inbox — attached mistakenly by the teacher, perhaps, or poached by a campus hacker. No answer key, no notes or guidelines. Just the questions.

Would that help you study more effectively? Of course it would. You would read the questions carefully. You would know exactly what to focus on in your notes. Your ears would perk up anytime the teacher mentioned something relevant to a specific question. You would search the textbook for its discussion of each question. If you were thorough, you would have memorized the answer to every item before the course ended. On the day of that final, you would be the first to finish, sauntering out with an A+ in your pocket. And you would be cheating.

But what if, instead, you took a test on Day 1 that was just as comprehensive as the final but not a replica? You would bomb the thing, for sure. You might not understand a single question. And yet as disorienting as that experience might feel, it would alter how you subsequently tuned into the course itself — and could sharply improve your overall performance.

This is the idea behind pretesting, one of the most exciting developments in learning science. Across a variety of experiments, psychologists have found that, in some circumstances, wrong answers on a pretest aren't merely useless guesses. Rather, the attempts themselves change how we think about and store the information contained in the questions. On some kinds of tests, particularly multiple-choice, we benefit from answering incorrectly by, in effect, priming our brain for what's coming later. That is: The (bombed) pretest drives home the information in a way that studying as usual does not. We fail, but we fail forward.

© 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 20043 - Posted: 09.08.2014
by Laura Beil The obesity crisis has given prehistoric dining a stardom not known since Fred Flintstone introduced the Bronto Burger. Last year, “Paleo diet” topped the list of most-Googled weight loss searches, as modern Stone Age dieters sought the advice of bestsellers like The Paleo Solution or The Primal Blueprint, which encourages followers to “honor your primal genes.” The assumption is that America has a weight problem because human metabolism runs on ancient genes that are ill equipped for contemporary eating habits. In this line of thinking, a diet true to the hunter-gatherers we once were — heavy on protein, light on carbs — will make us skinny again. While the fad has attracted skepticism from those who don’t buy the idea whole hog, there’s still plenty of acceptance for one common premise about the evolution of obesity: Our bodies want to stockpile fat. For most of human history, the theory goes, hunter-gatherers ate heartily when they managed to slay a fleeing mastodon. Otherwise, prehistoric life meant prolonged stretches of near starvation, surviving only on inner reserves of adipose. Today, modern humans mostly hunt and gather at the drive-thru, but our Pleistocene genes haven’t stopped fretting over the coming famine. The idea that evolution favored calorie-hoarding genes has long shaped popular and scientific thinking. Called the “thrifty gene” hypothesis, it has arguably been the dominant theory for evolutionary origins of obesity, and by extension diabetes. (Insulin resistance and diabetes so commonly accompany obesity that doctors have coined the term “diabesity.”) However, it’s not that difficult to find scientists who call the rise of the thrifty gene theory a feat of enthusiasm over evidence. Greg Gibson, director of the Center for Integrative Genomics at Georgia Tech in Atlanta, calls the data “somewhere between scant and nonexistent — a great example of crowd mentality in science.” © Society for Science & the Public 2000 - 2014
Link ID: 20042 - Posted: 09.06.2014
By Jeffrey Mervis

Embattled U.K. biomedical researchers are drawing some comfort from a new survey showing that a sizable majority of the public continues to support the use of animals in research. But there's another twist that should interest social scientists as well: The government's decision this year to field two almost identical surveys on the topic offers fresh evidence that the way you ask a question affects how people answer it.

Since 1999, the U.K. Department for Business, Innovation & Skills (BIS) has been funding a survey of 1000 adults about their attitudes toward animal experimentation. But this year the government asked the London-based pollsters, Ipsos MORI, to carry out a new survey, changing the wording of several questions. (The company also collected additional information, including public attitudes toward different animal species and current rules regarding their use.) For example, the phrase "animal experimentation" was replaced by "animal research" because the latter is "less inflammatory," notes Ipsos MORI Research Manager Jerry Latter. In addition, says Emma Brown, a BIS spokeswoman, the word research "more accurately reflects the range of procedures that animals may be involved in, including the breeding of genetically modified animals."

But government officials also value the information about long-term trends in public attitudes that can be gleaned from the current survey. So they told the company to conduct one last round—the 10th in the series—at the same time they deployed the new survey. Each survey went to a representative, but different, sample of U.K. adults.

© 2014 American Association for the Advancement of Science
Keyword: Animal Rights
Link ID: 20041 - Posted: 09.06.2014
Ewen Callaway

Caffeine's buzz is so nice it evolved twice. The coffee genome has now been published, and it reveals that the coffee plant makes caffeine using a different set of genes from those found in tea, cacao and other perk-you-up plants.

Coffee plants are grown across some 11 million hectares of land, with more than two billion cups of the beverage drunk every day. It is brewed from the fermented, roasted and ground berries of Coffea canephora and Coffea arabica, known as robusta and arabica, respectively. An international team of scientists has now identified more than 25,000 protein-making genes in the robusta coffee genome. The species accounts for about one-third of the coffee produced, much of it for instant-coffee brands such as Nescafe. Arabica contains less caffeine, but its lower acidity and bitterness make it more flavourful to many coffee drinkers. However, the robusta species was selected for sequencing because its genome is simpler than arabica’s.

Caffeine evolved long before sleep-deprived humans became addicted to it, probably to defend the coffee plant against predators and for other benefits. For example, coffee leaves contain the highest levels of caffeine of any part of the plant, and when they fall on the soil they stop other plants from growing nearby. “Caffeine also habituates pollinators and makes them want to come back for more, which is what it does to us, too,” says Victor Albert, a genome scientist at the University of Buffalo in New York, who co-led the sequencing effort. The results were published on 4 September in Science.

© 2014 Nature Publishing Group
By LISA SANDERS, M.D.

On Thursday, we challenged Well readers to take on the case of a 19-year-old man who suddenly collapsed at work after months of weakness and fatigue dotted with episodes of nausea and vomiting. More than 500 of you wrote in with suggested diagnoses. And more than 60 of you nailed it.

The cause of this man’s collapse, weakness, nausea and vomiting was…

Addisonian crisis because of Addison’s disease

Addison’s disease, named after Dr. Thomas Addison, the 19th-century physician who first described the disorder, occurs when the adrenal glands stop producing the fight-or-flight hormones, particularly cortisol and adrenaline, and a less well known but equally important hormone called aldosterone that helps the body manage salt. In Addison’s, the immune system mistakenly attacks the adrenal glands as if they were foreign invaders. Why this happens is not well understood, but without these glands and the essential hormones they make, the body cannot respond to biological stress.

The symptoms of Addison’s are vague. That’s one reason it’s so hard to diagnose. Patients complain of weakness and fatigue. They often crave salt. And when confronted with any stress — an infection or an injury — patients with Addison’s may go into adrenal crisis, characterized by nausea and vomiting, low blood pressure and, sometimes, physical collapse. Their blood pressure may drop so low that oxygen-carrying blood cannot reach the extremities, causing skin to turn blue; if blood fails to reach even more essential organs, it can lead to death.

© 2014 The New York Times Company
Keyword: Hormones & Behavior
Link ID: 20037 - Posted: 09.06.2014
by Sandrine Ceurstemont

Screening an instructional monkey movie in a forest reveals that marmosets do not only learn from family members: they also copy on-screen strangers. It is the first time such a video has been used for investigations in the wild.

Tina Gunhold at the University of Vienna, Austria, and her colleagues filmed a common marmoset retrieving a treat from a plastic device. They then took the device to the Atlantic Forest near Aldeia in Pernambuco, Brazil, and showed the movie to wild marmosets there. Although monkeys are known to learn from others in their social group, especially when they are young, little is known about their ability to learn from monkeys that do not belong to the same group. Marmosets are territorial, so the presence of an outsider – even a virtual one on a screen – could provoke an attack. "We didn't know if wild marmosets would be frightened of the video box but actually they were all attracted to it," says Gunhold.

Compared to monkeys shown a static image of the stranger, video-watching marmosets were more likely to manipulate the device, typically copying the technique shown. Young monkeys spent more time near the video box than older family members, suggesting that they found the movie more engaging – although as soon as one monkey mastered the task, it was impossible to tell whether the others were learning from the video or from their relative. "We think it's a combination of both," says Gunhold.

© Copyright Reed Business Information Ltd.
Yves Frégnac & Gilles Laurent Launched in October 2013, the Human Brain Project (HBP) was sold by charismatic neurobiologist Henry Markram as a bold new path towards understanding the brain, treating neurological diseases and building information technology. It is one of two 'flagship' proposals funded by the European Commission's Future and Emerging Technologies programme (see go.nature.com/icotmi). Selected after a multiyear competition, the project seemed like an exciting opportunity to bring together neuroscience and IT to generate practical applications for health and medicine (see go.nature.com/2eocv8). Contrary to public assumptions that the HBP would generate knowledge about how the brain works, the project is turning into an expensive database-management project with a hunt for new computing architectures. In recent months, the HBP executive board revealed plans to drastically reduce its experimental and cognitive neuroscience arm, provoking wrath in the European neuroscience community. The crisis culminated with an open letter from neuroscientists (including one of us, G.L.) to the European Commission on 7 July 2014 (see www.neurofuture.eu), which has now gathered more than 750 signatures. Many signatories are scientists in experimental and theoretical fields, and the list includes former HBP participants. The letter incorporates a pledge of non-participation in a planned call for 'partnering projects' that must raise about half of the HBP's total funding. This pledge could seriously lower the quality of the project's final output and leave the planned databases empty. © 2014 Nature Publishing Group
Keyword: Brain imaging
Link ID: 20033 - Posted: 09.04.2014
By GRETCHEN REYNOLDS Amyotrophic lateral sclerosis has been all over the news lately because of the ubiquitous A.L.S. ice bucket challenge. That attention has also reinvigorated a long-simmering scientific debate about whether participating in contact sports or even vigorous exercise might somehow contribute to the development of the fatal neurodegenerative disease, an issue that two important new studies attempt to answer. Ever since the great Yankees first baseman Lou Gehrig died of A.L.S. in 1941 at age 37, many Americans have vaguely connected A.L.S. with athletes and sports. In Europe, the possible linkage has been more overtly discussed. In the past decade, several widely publicized studies indicated that professional Italian soccer players were disproportionately prone to A.L.S., with about a sixfold higher incidence than would have been expected numerically. Players were often diagnosed while in their 30s; the normal onset is after 60. These findings prompted some small, follow-up epidemiological studies of A.L.S. patients in Europe. To the surprise and likely consternation of the researchers, they found weak but measurable associations between playing contact sports and a heightened risk for A.L.S. The data even showed links between being physically active — meaning exercising regularly — and contracting the disease, raising concerns among scientists that exercise might somehow be inducing A.L.S. in susceptible people, perhaps by affecting brain neurons or increasing bodily stress. But these studies were extremely small and had methodological problems. So to better determine what role sports and exercise might play in the risk for A.L.S., researchers from across Europe recently combined their efforts into two major new studies. The results should reassure those of us who exercise. The numbers showed that physical activity — whether at work, in sports or during exercise — did not increase people’s risk of developing A.L.S. © 2014 The New York Times Company
Keyword: ALS-Lou Gehrig's Disease
Link ID: 20031 - Posted: 09.03.2014
By Kate Wong

In 1871 Charles Darwin surmised that humans were evolutionarily closer to the African apes than to any other species alive. The recent sequencing of the gorilla, chimpanzee and bonobo genomes confirms that supposition and provides a clearer view of how we are connected: chimps and bonobos in particular take pride of place as our nearest living relatives, sharing approximately 99 percent of our DNA, with gorillas trailing at 98 percent. Yet that tiny portion of unshared DNA makes a world of difference: it gives us, for instance, our bipedal stance and the ability to plan missions to Mars.

Scientists do not yet know how most of the DNA that is uniquely ours affects gene function. But they can conduct whole-genome analyses—with intriguing results. For example, comparing the roughly one-third of our genome that lies within protein-coding genes (only a small fraction of which directly encodes protein) with our relatives' genomes reveals that although the sum total of our genetic differences is small, the individual differences pervade the genome, affecting each of our chromosomes in numerous ways.

© 2014 Scientific American
By Jonathan Webb Science reporter, BBC News Monkeys at the top and bottom of the social pecking order have physically different brains, research has found. A particular network of brain areas was bigger in dominant animals, while other regions were bigger in subordinates. The study suggests that primate brains, including ours, can be specialised for life at either end of the hierarchy. The differences might reflect inherited tendencies toward leading or following, or the brain adapting to an animal's role in life - or a little of both. Neuroscientists made the discovery, which appears in the journal Plos Biology, by comparing brain scans from 25 macaque monkeys that were already "on file" as part of ongoing research at the University of Oxford. "We were also looking at learning and memory and decision-making, and the changes that are going on in your brain when you're doing those things," explained Dr MaryAnn Noonan, the study's first author. The decision to look at the animals' social status produced an unexpectedly clear result, Dr Noonan said. "It was surprising. All our monkeys were of different ages and different genders - but with fMRI (functional magnetic resonance imaging) you can control for all of that. And we were consistently seeing these same networks coming out." BBC © 2014
By Madhuvanthi Kannan

We humans assume we are the smartest of all creations. In a world with over 8.7 million species, only we have the ability to understand the inner workings of our body while also unraveling the mysteries of the universe. We are the geniuses, the philosophers, the artists, the poets and savants. We are amused by a dog playing ball, a dolphin jumping through rings, or a monkey imitating a man, because we think of these as remarkable acts for animals that, we presume, aren't as smart as we are. But what is smart? Is it just about having ideas, or being good at language and math?

Scientists have shown, time and again, that many animals have an extraordinary intellect. Unlike the average human, who can barely recall a vivid scene from the last hour, chimps have a photographic memory and can memorize patterns they see in the blink of an eye. Sea lions and elephants can remember faces from decades ago. Animals also have a unique sense perception. Sniffer dogs can detect the first signs of colon cancer by the scents of patients, while doctors flounder in early diagnosis. So the point is animals are smart too. But that's not the upsetting realization. What happens when, for just once, a chimp or a dog challenges man to one of their feats? Well, for one, a precarious face-off – like the one Matt Reeves conceived in the Planet of the Apes – would seem a tad less unlikely than we thought. In a recent study by psychologists Colin Camerer and Tetsuro Matsuzawa, chimps and humans played a strategy game – and unexpectedly, the chimps outplayed the humans.

Chimps are a scientist's favorite model to understand the human brain and behavior. Chimp and human DNAs overlap by a whopping 99 percent, which makes us closer to chimps than horses are to zebras. Yet at some point, we evolved differently. Our behavior and personalities, molded to some extent by our distinct societies, are strikingly different from those of our fellow primates. Chimps are aggressive and status-hungry within their hierarchical societies, knit around a dominant alpha male. We are, perhaps, a little less so. So the question arises whether competitive behavior is hard-wired in them.

© 2014 Scientific American
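The article does not say which game was played. Studies of this kind, including the Camerer and Matsuzawa work it refers to, typically use matching-pennies-style games, in which the optimal play is a mixed strategy with specific choice frequencies. Below is a minimal sketch, assuming a 2x2 zero-sum game of that form; the payoff values are invented for illustration.

```python
import numpy as np

def mixed_equilibrium(A):
    """Mixed-strategy Nash equilibrium of a 2x2 zero-sum game, given the
    row player's payoff matrix A. Each player mixes so as to leave the
    opponent indifferent between their two options. Assumes the game has
    no pure-strategy equilibrium (denominator nonzero)."""
    (a, b), (c, d) = A
    denom = a - b - c + d
    p = (d - c) / denom   # probability the row player picks row 0
    q = (d - b) / denom   # probability the column player picks column 0
    return float(p), float(q)

# Plain matching pennies: the equilibrium is to mix 50/50.
print(mixed_equilibrium(np.array([[1.0, -1.0],
                                  [-1.0, 1.0]])))   # (0.5, 0.5)

# An asymmetric variant: raising the reward for one kind of match
# shifts the equilibrium mixing away from 50/50.
print(mixed_equilibrium(np.array([[2.0, -1.0],
                                  [-1.0, 1.0]])))   # (0.4, 0.4)
```

"Outplaying" here reportedly meant that the chimps' choice frequencies tracked such equilibrium predictions more closely than the humans' did.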
By Virginia Morell

Figaro, a Goffin's cockatoo (Cacatua goffini) housed at a research lab in Austria, stunned scientists a few years ago when he began spontaneously making stick tools from the wooden beams of his aviary. The Indonesian parrots are not known to use tools in the wild, yet Figaro confidently employed his sticks to rake in nuts outside his wire enclosure. Wondering if Figaro's fellow cockatoos could learn by watching his methods, scientists set up experiments for a dozen of them. One group watched as Figaro used a stick to reach a nut placed inside an acrylic box with a wire-mesh front panel; others saw "ghost demonstrators"—magnets that were hidden beneath a table and that the researchers controlled—displace the treats. Each bird was then placed in front of the box, with a stick just like Figaro's lying nearby. The group of three males and three females that had watched Figaro also picked up the sticks, and made some efforts reminiscent of his actions. But only those three males became proficient with the tool and successfully retrieved the nuts, the scientists report online today in the Proceedings of the Royal Society B. None of the females did so; nor did any of the birds, male or female, in the ghost demonstrator group. Because the latter group failed entirely, the study shows that the birds need living teachers, the scientists say. Intriguingly, the clever observers developed a better technique than Figaro's for getting the treat. Thus, the cockatoos weren't copying his exact actions, but emulating them—a distinction that implies some degree of creativity. Two of the successful cockatoos were later given a chance to make a tool of their own. One did so immediately, and the other succeeded after watching Figaro. It may be that by learning to use a tool, the birds are stimulated to make tools of their own, the scientists say.

© 2014 American Association for the Advancement of Science.
Keyword: Learning & Memory
Link ID: 20027 - Posted: 09.03.2014
Moheb Costandi Autism can be baffling, appearing in various forms and guises and thwarting our best attempts to understand the minds of people affected by it. Anything we know for sure about the disorder can probably be traced back to the pioneering research of the developmental psychologist Uta Frith. Frith was the first to propose that people with autism lack theory of mind, the ability to attribute beliefs, intentions and desires to others. She also recognized the superior perceptual abilities of many with the disorder — and their tendency to be unable to see the forest for the trees. Frith, now affiliated with the Institute of Cognitive Neuroscience at University College London (UCL), has shaped autism research for an entire generation of investigators. Meanwhile, her husband Chris Frith formulated a new view of schizophrenia, a mental illness marked by hallucinations, disordered thinking and apathy. His work explored how the disorder affects the experience of agency, the sense that we are in control of our bodies and responsible for our actions. And his innovations in brain imaging helped researchers examine the relationship between brain and mind. Independently, husband and wife explored the social and cognitive aspects of these psychiatric disorders. Together, they helped lay the foundations of cognitive neuroscience, the discipline that seeks to understand the biological basis of thought processes. Trevor Robbins, a cognitive neuroscientist at the University of Cambridge in the U.K., calls them “tremendously influential pioneers,” in particular because both brought a social perspective to cognitive neuroscience. © Copyright 2014 Simons Foundation
Link ID: 20019 - Posted: 09.02.2014
By ANAHAD O’CONNOR People who avoid carbohydrates and eat more fat, even saturated fat, lose more body fat and have fewer cardiovascular risks than people who follow the low-fat diet that health authorities have favored for decades, a major new study shows. The findings are unlikely to be the final salvo in what has been a long and often contentious debate about what foods are best to eat for weight loss and overall health. The notion that dietary fat is harmful, particularly saturated fat, arose decades ago from comparisons of disease rates among large national populations. But more recent clinical studies in which individuals and their diets were assessed over time have produced a more complex picture. Some have provided strong evidence that people can sharply reduce their heart disease risk by eating fewer carbohydrates and more dietary fat, with the exception of trans fats. The new findings suggest that this strategy more effectively reduces body fat and also lowers overall weight. The new study was financed by the National Institutes of Health and published in the Annals of Internal Medicine. It included a racially diverse group of 150 men and women — a rarity in clinical nutrition studies — who were assigned to follow diets for one year that limited either the amount of carbs or fat that they could eat, but not overall calories. “To my knowledge, this is one of the first long-term trials that’s given these diets without calorie restrictions,” said Dariush Mozaffarian, the dean of the Friedman School of Nutrition Science and Policy at Tufts University, who was not involved in the new study. “It shows that in a free-living setting, cutting your carbs helps you lose weight without focusing on calories. And that’s really important because someone can change what they eat more easily than trying to cut down on their calories.” © 2014 The New York Times Company
Link ID: 20018 - Posted: 09.02.2014