Chapter 16




By KEN BELSON The National Football League, which for years disputed evidence that its players had a high rate of severe brain damage, has stated in federal court documents that it expects nearly a third of retired players to develop long-term cognitive problems and that the conditions are likely to emerge at “notably younger ages” than in the general population. The findings are a result of data prepared by actuaries hired by the league and provided to the United States District Court judge presiding over the settlement between the N.F.L. and 5,000 former players who sued the league, alleging that it had hidden the dangers of concussions from them. “Thus, our assumptions result in prevalence rates by age group that are materially higher than those expected in the general population,” said the report, prepared by the Segal Group for the N.F.L. “Furthermore, the model forecasts that players will develop these diagnoses at notably younger ages than the general population.” The statements are the league’s most unvarnished admission yet that the sport’s professional participants sustain severe brain injuries at far higher rates than the general population. They also appear to confirm what scientists have said for years: that playing football increases the risk of developing neurological conditions like chronic traumatic encephalopathy, a degenerative brain disease that can be identified only in an autopsy. “This statement clears up all the confusion and doubt manufactured over the years questioning the link between brain trauma and long-term neurological impairment,” said Chris Nowinski, the executive director of the Sports Legacy Institute, who has for many years pressured the league to acknowledge the connection between football and brain diseases. © 2014 The New York Times Company

Keyword: Brain Injury/Concussion
Link ID: 20073 - Posted: 09.13.2014

By Smitha Mundasad Health reporter, BBC News Giving young people Botox treatment may restrict their emotional growth, experts warn. Writing in the Journal of Aesthetic Nursing, clinicians say there is a growing trend for under-25s to seek the wrinkle-smoothing injections. But the research suggests "frozen faces" could stop young people from learning how to express emotions fully. A leading body of UK plastic surgeons says injecting teenagers for cosmetic reasons is "morally wrong". Botox and other versions of the toxin work by temporarily paralysing muscles in the upper face to reduce wrinkling when people frown. Nurse practitioner Helen Collier, who carried out the research, says reality TV shows and celebrity culture are driving young people to idealise the "inexpressive frozen face." But she points to a well-known psychological theory, the facial feedback hypothesis, that suggests adolescents learn how best to relate to people by mimicking their facial expressions. She says: "As a human being our ability to demonstrate a wide range of emotions is very dependent on facial expressions. "Emotions such as empathy and sympathy help us to survive and grow into confident and communicative adults." But she warns that a "growing generation of blank-faced" young people could be harming their ability to correctly convey their feelings. "If you wipe those expressions out, this might stunt their emotional and social development," she says. The research calls for practitioners to use assessment tools to decide whether there are clear clinical reasons for Botox treatment. BBC © 2014

Keyword: Emotions
Link ID: 20070 - Posted: 09.13.2014

Corie Lok Tami Morehouse's vision was not great as a child, but as a teenager she noticed it slipping even further. The words she was trying to read began disappearing into the page and eventually everything faded to a dull, grey haze. The culprit was a form of Leber's congenital amaurosis (LCA), a group of genetic disorders in which light-sensing cells in the retina die off, usually resulting in total blindness by the time people reach their thirties or forties. But Morehouse got a reprieve. In 2009, at the age of 44, the social worker from Ashtabula, Ohio, became the oldest participant in a ground-breaking clinical trial to test a gene therapy for LCA. Now, she says, she can see her children's eyes, and the colours of the sunset seem brighter than before. Morehouse calls these improvements life-changing, but they are minor compared with the changes in some of the younger trial participants. Corey Haas was eight years old when he was treated in 2008 — the youngest person to receive the therapy. He went from using a white cane to riding a bicycle and playing softball. Morehouse often wonders what she would be able to see now if she had been closer to Haas's age when she had the therapy. “I was born a little too soon,” she says. Visual impairment affects some 285 million people worldwide, about 39 million of whom are considered blind, according to a 2010 estimate from the World Health Organization. Roughly 80% of visual impairment is preventable or curable, including operable conditions such as cataracts that account for much of the blindness in the developing world. But retinal-degeneration disorders — including age-related macular degeneration, the leading cause of blindness in the developed world — have no cure. © 2014 Nature Publishing Group

Keyword: Vision
Link ID: 20064 - Posted: 09.11.2014

By JOSHUA A. KRISCH PHILADELPHIA — McBaine, a bouncy black and white springer spaniel, perks up and begins his hunt at the Penn Vet Working Dog Center. His nose skims 12 tiny arms that protrude from the edges of a table-size wheel, each holding samples of blood plasma, only one of which is spiked with a drop of cancerous tissue. The dog makes one focused revolution around the wheel before halting, steely-eyed and confident, in front of sample No. 11. A trainer tosses him his reward, a tennis ball, which he giddily chases around the room, sliding across the floor and bumping into walls like a clumsy puppy. McBaine is one of four highly trained cancer detection dogs at the center, which trains purebreds to put their superior sense of smell to work in search of the early signs of ovarian cancer. Now, Penn Vet, part of the University of Pennsylvania’s School of Veterinary Medicine, is teaming with the university’s chemistry and physics departments to isolate cancer chemicals that only dogs can smell. They hope this will lead to the manufacture of nanotechnology sensors that are capable of detecting bits of cancerous tissue 1/100,000th the thickness of a sheet of paper. “We don’t ever anticipate our dogs walking through a clinic,” said the veterinarian Dr. Cindy Otto, the founder and executive director of the Working Dog Center. “But we do hope that they will help refine chemical and nanosensing techniques for cancer detection.” Since 2004, research has begun to accumulate suggesting that dogs may be able to smell the subtle chemical differences between healthy and cancerous tissue, including bladder cancer, melanoma and cancers of the lung, breast and prostate. But scientists debate whether the research will result in useful medical applications. © 2014 The New York Times Company

Keyword: Chemical Senses (Smell & Taste)
Link ID: 20063 - Posted: 09.11.2014

By Sarah Zielinski The marshmallow test is pretty simple: Give a child a treat, such as a marshmallow, and promise that if he doesn’t eat it right away, he’ll soon be rewarded with a second one. The experiment was devised by Stanford psychologist Walter Mischel in the late 1960s as a measure of self-control. When he later checked back in with kids he had tested as preschoolers, those who had been able to wait for the second treat appeared to be doing better in life. They tended to have fewer behavioral or drug-abuse problems, for example, than those who had given in to temptation. Most attempts to perform this experiment on animals haven’t worked out so well. Many animals haven’t been willing to wait at all. Dogs, primates, and some birds have done a bit better, managing to wait at least a couple of minutes before eating the first treat. The best any animal has managed has been 10 minutes—a record set earlier this year by a couple of crows. The African grey parrot is a species known for its intelligence. Animal psychologist Irene Pepperberg, now at Harvard, spent 30 years studying one of these parrots, Alex, and showed that the bird had an extraordinary vocabulary and capacity for learning. Alex even learned to add numerals before his death in 2007. Could an African grey pass the marshmallow test? Adrienne E. Koepke of Hunter College and Suzanne L. Gray of Harvard University tried the experiment on Pepperberg’s current star African grey, a 19-year-old named Griffin. In their test, a researcher took two treats, one of which Griffin liked slightly better, and put them into cups. Then she placed the cup with the less preferred food in front of Griffin and told him, “wait.” She took the other cup and either stood a few feet away or left the room. After a random amount of time, from 10 seconds to 15 minutes, she would return. If the food was still in the cup, Griffin got the nut he was waiting for. 
Koepke and colleagues presented their findings last month at the Animal Behavior Society meeting at Princeton. © 2014 The Slate Group LLC.

Keyword: Intelligence; Aggression
Link ID: 20061 - Posted: 09.11.2014

By Amy Nordrum If you were one of millions of children who completed the Drug Abuse Resistance Education program, or D.A.R.E., between 1983 and 2009, you may be surprised to learn that scientists have repeatedly shown that the program did not work. Despite being the nation’s most popular substance-abuse prevention program, D.A.R.E. did not make you less likely to become a drug addict or even to refuse that first beer from your friends. But over the past few years prevention scientists have helped D.A.R.E. America, the nonprofit organization that administers the program, replace the old curriculum with a course based on a few concepts that should make the training more effective for today’s students. The new course, called keepin’ it REAL, differs in both form and content from the former D.A.R.E.—replacing long, drug-fact-laden lectures with interactive lessons that present stories meant to help kids make smart decisions. Beginning in 2009 D.A.R.E. administrators required middle schools across the country that teach the program to switch over to the 10-week, researcher-designed curriculum for seventh graders. By 2013, they had ordered elementary schools to start teaching a version of those lessons to fifth and sixth graders, too. “It’s not an antidrug program,” says Michelle Miller-Day, co-developer of the new curriculum and a communications researcher at Chapman University. “It’s about things like being honest and safe and responsible.” Even so, keepin’ it REAL has reduced substance abuse and maintained antidrug attitudes over time among students in early trials—an achievement that largely eluded the former iteration of the program. D.A.R.E.’s original curriculum was not shaped by prevention specialists but by police officers and teachers in Los Angeles. They started D.A.R.E. in 1983 to curb the use of drugs, alcohol and tobacco among teens and to improve community–police relations. Fueled by word of mouth, the program quickly spread to 75 percent of U.S. schools.
© 2014 Scientific American

Keyword: Drug Abuse
Link ID: 20060 - Posted: 09.11.2014

By GARY GUTTING Sam Harris is a neuroscientist and prominent “new atheist,” who along with others like Richard Dawkins, Daniel Dennett and Christopher Hitchens helped put criticism of religion at the forefront of public debate in recent years. In two previous books, “The End of Faith” and “Letter to a Christian Nation,” Harris argued that theistic religion has no place in a world of science. In his latest book, “Waking Up,” his thought takes a new direction. While still rejecting theism, Harris nonetheless makes a case for the value of “spirituality,” which he bases on his experiences in meditation. I interviewed him recently about the book and some of the arguments he makes in it. Gary Gutting: A common basis for atheism is naturalism — the view that only science can give a reliable account of what’s in the world. But in “Waking Up” you say that consciousness resists scientific description, which seems to imply that it’s a reality beyond the grasp of science. Have you moved away from an atheistic view? Sam Harris: I don’t actually argue that consciousness is “a reality” beyond the grasp of science. I just think that it is conceptually irreducible — that is, I don’t think we can fully understand it in terms of unconscious information processing. Consciousness is “subjective”— not in the pejorative sense of being unscientific, biased or merely personal, but in the sense that it is intrinsically first-person, experiential and qualitative. The only thing in this universe that suggests the reality of consciousness is consciousness itself. Many philosophers have made this argument in one way or another — Thomas Nagel, John Searle, David Chalmers. And while I don’t agree with everything they say about consciousness, I agree with them on this point. © 2014 The New York Times Company

Keyword: Consciousness
Link ID: 20056 - Posted: 09.10.2014

By SOMINI SENGUPTA A coalition of political figures from around the world, including Kofi Annan, the former United Nations secretary general, and several former European and Latin American presidents, is urging governments to decriminalize a variety of illegal drugs and set up regulated drug markets within their own countries. The proposal by the group, the Global Commission on Drug Policy, goes beyond its previous call to abandon the nearly half-century-old American-led war on drugs. As part of a report scheduled to be released on Tuesday, the group goes much further than its 2011 recommendation to legalize cannabis. The former Brazilian president Fernando Henrique Cardoso, a member of the commission, said the group was calling for the legal regulation of “as many of the drugs that are currently illegal as possible, with the understanding that some drugs may remain too dangerous to decriminalize.” The proposal comes at a time when several countries pummeled by drug violence, particularly in Latin America, are rewriting their own drug laws, and when even the United States is allowing state legislatures to gingerly regulate cannabis use. The United Nations is scheduled to hold a summit meeting in 2016 to evaluate global drug laws. The commission includes former presidents like Mr. Cardoso of Brazil, Ernesto Zedillo of Mexico and Ruth Dreifuss of Switzerland, along with George P. Shultz, a former secretary of state in the Reagan administration, among others. The group stops short of calling on countries to legalize all drugs right away. It calls instead for countries to continue to pursue violent criminal gangs, to stop incarcerating users and to offer treatment for addicts. © 2014 The New York Times Company

Keyword: Drug Abuse
Link ID: 20052 - Posted: 09.10.2014

By Mo Costandi The nerve endings in your fingertips can perform complex neural computations that were thought to be carried out by the brain, according to new research published in the journal Nature Neuroscience. The processing of both touch and visual information involves computations that extract the geometrical features of objects we touch and see, such as edge orientation. Most of this processing takes place in the brain, which contains cells that are sensitive to the orientation of edges on the things we touch and see, and which pass this information on to cells in neighbouring regions that encode other features. The brain has outsourced some aspects of visual processing, such as motion detection, to the retina, and the new research shows that something similar happens in the touch-processing pathway. Delegating basic functions to the sense organs in this way could be an evolutionary mechanism that enables the brain to perform other, more sophisticated information-processing tasks more efficiently. Your fingertips are among the most sensitive parts of your body. They are densely packed with thousands of nerve endings, which produce complex patterns of nervous impulses that convey information about the size, shape and texture of objects, and your ability to identify objects by touch and manipulate them depends upon the continuous influx of this information. © 2014 Guardian News and Media Limited

Keyword: Pain & Touch
Link ID: 20051 - Posted: 09.09.2014

By Jena McGregor We've all heard the conventional wisdom for better managing our time and organizing our professional and personal lives. Don't try to multitask. Turn the email and Facebook alerts off to help stay focused. Make separate to-do lists for tasks that require a few minutes, a few hours and long-term planning. But what's grounded in real evidence and what's not? In his new book The Organized Mind, Daniel Levitin — a McGill University professor of psychology and behavioral neuroscience — explores how having a basic understanding of the way the brain works can help us think about organizing our homes, our businesses, our time and even our schools in an age of information overload. We spoke with Levitin about why multi-tasking never works, what images of good leaders' brains actually look like, and why email and Twitter are so incredibly addicting. The following transcript of our conversation has been edited for length and clarity. Q. What was your goal in writing this book? A. Neuroscientists have learned a lot in the last 10 or 15 years about how the brain organizes information, and why we pay attention to some things and forget others. But most of this information hasn't trickled down to the average reader. There are a lot of books about how to get organized and a lot of books about how to be better and more productive at business, but I don't know of one that grounds any of these in the science.

Keyword: Attention
Link ID: 20049 - Posted: 09.09.2014

By Maggie Fox, Erika Edwards and Judy Silverman Here’s how you might be able to turn autism around in a baby: Carefully watch her cues, and push just a little harder with that game of peek-a-boo or “This little piggy.” But don’t push too hard — kids with autism are super-sensitive. That’s what Sally Rogers of the University of California, Davis has found in an intense experiment with the parents of infants who showed clear signs of autism. It’s one of the most hopeful signs yet that if you diagnose autism very early, you can help children rewire their brains and reverse the symptoms. It was a small study, and it’s very hard to find infants who are likely to have autism, which is usually diagnosed in the toddler years. But the findings, published in the Journal of Autism and Developmental Disorders, offer some hope to parents worried about their babies. “With only seven infants in the treatment group, no conclusions can be drawn,” they wrote. However, the effects were striking. Six out of the seven children in the study had normal learning and language skills by the time they were 2 to 3. Isobel was one of them. “She is 3 years old now and she is a 100 percent typical, normally developing child,” her mother, Megan, told NBC News. The family doesn’t want their last name used for privacy reasons. “We don’t have to do the therapy any more. It literally rewired her brain.” Autism is a very common diagnosis for children in the U.S. The latest survey by the Centers for Disease Control and Prevention shows a startling 30 percent jump among 8-year-olds diagnosed with the disorder in a two-year period, to one in every 68 children.

Keyword: Autism
Link ID: 20047 - Posted: 09.09.2014

By Richard Farrell Conventional thinking has long held that pelvic bones in whales and dolphins, evolutionary throwbacks to ancestors that once walked on land, are vestigial and will disappear millions of years from now. But researchers from the University of Southern California and the Natural History Museum of Los Angeles County (NHM) have upended that assumption. The scientists argue in a paper just published in the journal Evolution that cetacean (whale and dolphin) pelvic bones certainly do have a purpose and that they’re specifically targeted, by selection, for mating. The muscles that control a cetacean’s penis are attached to the creature’s pelvic bones. Matthew Dean, assistant professor at the USC Dornsife College of Letters, Arts and Sciences, and Jim Dines, collections manager of mammalogy at NHM, wanted to find out if pelvic bones could be evolutionarily advantageous by impacting the overall amount of control an individual creature has with its penis. The pair spent four years examining whale and dolphin pelvic bones, using a 3D laser scanner to study the shape and size of the samples in extreme detail. Then they gathered as much data as they could find -- reaching back to whaler days -- on whale testis size relative to body mass. The testis data was important because in nature, species in “promiscuous,” competitive mating environments (where females mate with multiple males) develop larger testes, relative to their body mass, in order to outdo the competition. © 2014 Discovery Communications, LLC.

Keyword: Evolution; Aggression
Link ID: 20046 - Posted: 09.09.2014

By BENEDICT CAREY Imagine that on Day 1 of a difficult course, before you studied a single thing, you got hold of the final exam. The motherlode itself, full text, right there in your email inbox — attached mistakenly by the teacher, perhaps, or poached by a campus hacker. No answer key, no notes or guidelines. Just the questions. Would that help you study more effectively? Of course it would. You would read the questions carefully. You would know exactly what to focus on in your notes. Your ears would perk up anytime the teacher mentioned something relevant to a specific question. You would search the textbook for its discussion of each question. If you were thorough, you would have memorized the answer to every item before the course ended. On the day of that final, you would be the first to finish, sauntering out with an A+ in your pocket. And you would be cheating. But what if, instead, you took a test on Day 1 that was just as comprehensive as the final but not a replica? You would bomb the thing, for sure. You might not understand a single question. And yet as disorienting as that experience might feel, it would alter how you subsequently tuned into the course itself — and could sharply improve your overall performance. This is the idea behind pretesting, one of the most exciting developments in learning science. Across a variety of experiments, psychologists have found that, in some circumstances, wrong answers on a pretest aren’t merely useless guesses. Rather, the attempts themselves change how we think about and store the information contained in the questions. On some kinds of tests, particularly multiple-choice, we benefit from answering incorrectly by, in effect, priming our brain for what’s coming later. That is: The (bombed) pretest drives home the information in a way that studying as usual does not. We fail, but we fail forward. © 2014 The New York Times Company

Keyword: Learning & Memory
Link ID: 20043 - Posted: 09.08.2014

by Laura Beil The obesity crisis has given prehistoric dining a stardom not known since Fred Flintstone introduced the Bronto Burger. Last year, “Paleo diet” topped the list of most-Googled weight loss searches, as modern Stone Age dieters sought the advice of bestsellers like The Paleo Solution or The Primal Blueprint, which encourages followers to “honor your primal genes.” The assumption is that America has a weight problem because human metabolism runs on ancient genes that are ill equipped for contemporary eating habits. In this line of thinking, a diet true to the hunter-gatherers we once were — heavy on protein, light on carbs — will make us skinny again. While the fad has attracted skepticism from those who don’t buy the idea whole hog, there’s still plenty of acceptance for one common premise about the evolution of obesity: Our bodies want to stockpile fat. For most of human history, the theory goes, hunter-gatherers ate heartily when they managed to slay a fleeing mastodon. Otherwise, prehistoric life meant prolonged stretches of near starvation, surviving only on inner reserves of adipose. Today, modern humans mostly hunt and gather at the drive-thru, but our Pleistocene genes haven’t stopped fretting over the coming famine. The idea that evolution favored calorie-hoarding genes has long shaped popular and scientific thinking. Called the “thrifty gene” hypothesis, it has arguably been the dominant theory for evolutionary origins of obesity, and by extension diabetes. (Insulin resistance and diabetes so commonly accompany obesity that doctors have coined the term “diabesity.”) However, it’s not that difficult to find scientists who call the rise of the thrifty gene theory a feat of enthusiasm over evidence. Greg Gibson, director of the Center for Integrative Genomics at Georgia Tech in Atlanta, calls the data “somewhere between scant and nonexistent — a great example of crowd mentality in science.” © Society for Science & the Public 2000 - 2014

Keyword: Obesity
Link ID: 20042 - Posted: 09.06.2014

By Jeffrey Mervis Embattled U.K. biomedical researchers are drawing some comfort from a new survey showing that a sizable majority of the public continues to support the use of animals in research. But there’s another twist that should interest social scientists as well: The government’s decision this year to field two almost identical surveys on the topic offers fresh evidence that the way you ask a question affects how people answer it. Since 1999, the U.K. Department for Business, Innovation & Skills (BIS) has been funding a survey of 1000 adults about their attitudes toward animal experimentation. But this year the government asked the London-based pollsters, Ipsos MORI, to carry out a new survey, changing the wording of several questions. (The company also collected additional information, including public attitudes toward different animal species and current rules regarding their use.) For example, the phrase “animal experimentation” was replaced by “animal research” because the latter is “less inflammatory,” notes Ipsos MORI Research Manager Jerry Latter. In addition, says Emma Brown, a BIS spokeswoman, the word research “more accurately reflects the range of procedures that animals may be involved in, including the breeding of genetically modified animals.” But government officials also value the information about long-term trends in public attitudes that can be gleaned from the current survey. So they told the company to conduct one last round—the 10th in the series—at the same time they deployed the new survey. Each survey went to a representative, but different, sample of U.K. adults. © 2014 American Association for the Advancement of Science

Keyword: Animal Rights
Link ID: 20041 - Posted: 09.06.2014

Ewen Callaway Caffeine’s buzz is so nice it evolved twice. The coffee genome has now been published, and it reveals that the coffee plant makes caffeine using a different set of genes from those found in tea, cacao and other perk-you-up plants. Coffee plants are grown across some 11 million hectares of land, with more than two billion cups of the beverage drunk every day. It is brewed from the fermented, roasted and ground berries of Coffea canephora and Coffea arabica, known as robusta and arabica, respectively. An international team of scientists has now identified more than 25,000 protein-making genes in the robusta coffee genome. The species accounts for about one-third of the coffee produced, much of it for instant-coffee brands such as Nescafe. Arabica contains less caffeine, but its lower acidity and bitterness make it more flavourful to many coffee drinkers. However, the robusta species was selected for sequencing because its genome is simpler than arabica’s. Caffeine evolved long before sleep-deprived humans became addicted to it, probably to defend the coffee plant against predators and for other benefits. For example, coffee leaves contain the highest levels of caffeine of any part of the plant, and when they fall on the soil they stop other plants from growing nearby. “Caffeine also habituates pollinators and makes them want to come back for more, which is what it does to us, too,” says Victor Albert, a genome scientist at the University at Buffalo in New York, who co-led the sequencing effort. The results were published on 4 September in Science. © 2014 Nature Publishing Group

Keyword: Drug Abuse; Aggression
Link ID: 20040 - Posted: 09.06.2014

By LISA SANDERS, M.D. On Thursday, we challenged Well readers to take on the case of a 19-year-old man who suddenly collapsed at work after months of weakness and fatigue dotted with episodes of nausea and vomiting. More than 500 of you wrote in with suggested diagnoses. And more than 60 of you nailed it. The cause of this man’s collapse, weakness, nausea and vomiting was… Addisonian crisis because of Addison’s disease. Addison’s disease, named after Dr. Thomas Addison, the 19th-century physician who first described the disorder, occurs when the adrenal glands stop producing the fight-or-flight hormones, particularly cortisol and adrenaline, and a less well known but equally important hormone called aldosterone that helps the body manage salt. In Addison’s, the immune system mistakenly attacks the adrenal glands as if they were foreign invaders. Why this happens is not well understood, but without these glands and the essential hormones they make, the body cannot respond to biological stress. The symptoms of Addison’s are vague. That’s one reason it’s so hard to diagnose. Patients complain of weakness and fatigue. They often crave salt. And when confronted with any stress — an infection or an injury — patients with Addison’s may go into adrenal crisis, characterized by nausea and vomiting, low blood pressure and, sometimes, physical collapse. Their blood pressure may drop so low that oxygen-carrying blood cannot reach the extremities, causing skin to turn blue; if blood fails to reach even more essential organs, it can lead to death. © 2014 The New York Times Company

Keyword: Hormones & Behavior
Link ID: 20037 - Posted: 09.06.2014

by Sandrine Ceurstemont Screening an instructional monkey movie in a forest reveals that marmosets do not learn only from family members: they also copy on-screen strangers. It is the first time such a video has been used for investigations in the wild. Tina Gunhold at the University of Vienna, Austria, and her colleagues filmed a common marmoset retrieving a treat from a plastic device. They then took the device to the Atlantic Forest near Aldeia in Pernambuco, Brazil, and showed the movie to wild marmosets there. Although monkeys are known to learn from others in their social group, especially when they are young, little is known about their ability to learn from monkeys that do not belong to the same group. Marmosets are territorial, so the presence of an outsider – even a virtual one on a screen – could provoke an attack. “We didn’t know if wild marmosets would be frightened of the video box but actually they were all attracted to it,” says Gunhold. Compared to monkeys shown a static image of the stranger, video-watching marmosets were more likely to manipulate the device, typically copying the technique shown (see video). Young monkeys spent more time near the video box than older family members, suggesting that they found the movie more engaging – although as soon as one monkey mastered the task, it was impossible to tell whether the others were learning from the video or from their relative. “We think it’s a combination of both,” says Gunhold. © Copyright Reed Business Information Ltd.

Keyword: Learning & Memory; Aggression
Link ID: 20035 - Posted: 09.04.2014

Yves Frégnac & Gilles Laurent Launched in October 2013, the Human Brain Project (HBP) was sold by charismatic neurobiologist Henry Markram as a bold new path towards understanding the brain, treating neurological diseases and building information technology. It is one of two 'flagship' proposals funded by the European Commission's Future and Emerging Technologies programme (see go.nature.com/icotmi). Selected after a multiyear competition, the project seemed like an exciting opportunity to bring together neuroscience and IT to generate practical applications for health and medicine (see go.nature.com/2eocv8). Contrary to public assumptions that the HBP would generate knowledge about how the brain works, the project is turning into an expensive database-management project with a hunt for new computing architectures. In recent months, the HBP executive board revealed plans to drastically reduce its experimental and cognitive neuroscience arm, provoking wrath in the European neuroscience community. The crisis culminated in an open letter from neuroscientists (including one of us, G.L.) to the European Commission on 7 July 2014 (see www.neurofuture.eu), which has now gathered more than 750 signatures. Many signatories are scientists in experimental and theoretical fields, and the list includes former HBP participants. The letter incorporates a pledge of non-participation in a planned call for 'partnering projects' that must raise about half of the HBP's total funding. This pledge could seriously lower the quality of the project's final output and leave the planned databases empty. © 2014 Nature Publishing Group

Keyword: Brain imaging
Link ID: 20033 - Posted: 09.04.2014

By GRETCHEN REYNOLDS Amyotrophic lateral sclerosis has been all over the news lately because of the ubiquitous A.L.S. ice bucket challenge. That attention has also reinvigorated a long-simmering scientific debate about whether participating in contact sports or even vigorous exercise might somehow contribute to the development of the fatal neurodegenerative disease, an issue that two important new studies attempt to answer. Ever since the great Yankees first baseman Lou Gehrig died of A.L.S. in 1941 at age 37, many Americans have vaguely connected A.L.S. with athletes and sports. In Europe, the possible linkage has been more overtly discussed. In the past decade, several widely publicized studies indicated that professional Italian soccer players were disproportionately prone to A.L.S., with about a sixfold higher incidence than would have been expected numerically. Players were often diagnosed while in their 30s; the normal onset is after 60. These findings prompted some small, follow-up epidemiological studies of A.L.S. patients in Europe. To the surprise and likely consternation of the researchers, they found weak but measurable associations between playing contact sports and a heightened risk for A.L.S. The data even showed links between being physically active — meaning exercising regularly — and contracting the disease, raising concerns among scientists that exercise might somehow be inducing A.L.S. in susceptible people, perhaps by affecting brain neurons or increasing bodily stress. But these studies were extremely small and had methodological problems. So to better determine what role sports and exercise might play in the risk for A.L.S., researchers from across Europe recently combined their efforts into two major new studies. The results should reassure those of us who exercise. The numbers showed that physical activity — whether at work, in sports or during exercise — did not increase people’s risk of developing A.L.S. © 2014 The New York Times Company

Keyword: ALS-Lou Gehrig's Disease
Link ID: 20031 - Posted: 09.03.2014