Most Recent Links

Links 101 - 120 of 19774

By Paula Span First, an acknowledgment: Insomnia bites. S. Bliss, a reader from Albuquerque, comments that even taking Ativan, he or she awakens at 4:30 a.m., can’t get back to sleep and suffers “a state of sleep deprivation and eventually a kind of walking exhaustion.” Molly from San Diego bemoans “confusion, anxiety, exhaustion, depression, loss of appetite, frankly a loss of will to go on,” all consequences of her sleeplessness. She memorably adds, “Give me Ambien or give me death.” Marciacornute reports that she’s turned to vodka (prompting another reader to wonder if Medicare will cover booze). After several rounds of similar laments here (and not only here; insomnia is prevalent among older adults), I found the results of a study by University of Chicago researchers particularly striking. What if people who report sleep problems are actually getting enough hours of sleep, overall? What if they’re not getting significantly less sleep than people who don’t complain of insomnia? Maybe there’s something else going on. It has always been difficult to ascertain how much people sleep; survey questions are unreliable (how can you tell when you’ve dozed off?), and wiring people with electrodes creates such an abnormal situation that the results may bear little resemblance to ordinary nightlife. Enter the actigraph, a wrist-motion monitor. “The machines have gotten better, smaller, less clunky and more reliable,” said Linda Waite, a sociologist and a co-author of the study. By having 727 older adults across the United States (average age: almost 72) wear actigraphs for three full days, Dr. Waite and her colleagues could tell when subjects were asleep and when they weren’t. Then they could compare their reported insomnia to their actual sleep patterns. Overall, in this random sample, taken from an ongoing national study of older adults, people didn’t appear sleep-deprived. They fell asleep at 10:27 p.m. on average, and awakened at 6:22 a.m. After subtracting wakeful periods during the night, they slept an average seven and a quarter hours. But averages don’t tell us much, so let’s look more closely at their reported insomnia. “What was surprising to us is that there’s very little association between people’s specific sleep problems and what the actigraph shows,” Dr. Waite said. © 2014 The New York Times Company

Keyword: Sleep; Development of the Brain
Link ID: 20250 - Posted: 10.28.2014

By Eric Niiler Has our reliance on iPhones and other instant-info devices harmed our memories? Michael Kahana, a University of Pennsylvania psychology professor who studies memory, says maybe: “We don’t know what the long-lasting impact of this technology will be on our brains and our ability to recall.” Kahana, 45, who has spent the past 20 years looking at how the brain creates memories, is leading an ambitious four-year Pentagon project to build a prosthetic memory device that can be implanted into human brains to help veterans with traumatic brain injuries. He spoke by telephone with The Post about what we can do to preserve or improve memory. Practicing the use of your memory is helpful. The other thing which I find helpful is sleep, which I don’t get enough of. As a general principle, skills that one continues to practice are skills that one will maintain in the face of age-related changes in cognition. [As for all those brain games available], I am not aware of any convincing data that mental exercises have a more general effect other than maintaining the skills for those exercises. I think the jury is out on that. If you practice doing crossword puzzles, you will preserve your ability to do crossword puzzles. If you practice any other cognitive skill, you will get better at that as well. Michael Kahana once could name every student in a class of 100. Now, says the University of Pennsylvania psychology professor who studies memory, “I find it too difficult even with a class of 20.”

Keyword: Learning & Memory
Link ID: 20249 - Posted: 10.28.2014

With the death of Professor Allison Doupe from cancer on Friday, October 24, UCSF and biomedical science have lost a scholar of extraordinary intelligence and erudition and a campus leader. Allison Doupe was a psychiatrist and systems neuroscientist who became a leader of her field, the study of sensorimotor learning and its neural control. Allison was recruited to the Departments of Psychiatry and Physiology and the Neuroscience Graduate Program in 1993, rising to Professor in 2000. Her academic career was outstanding at every stage, including First Class Honors at McGill, an MD and PhD in Neurobiology from Harvard, and a prestigious Junior Fellowship from the Harvard University Society of Fellows. Her PhD work with Professor Paul Patterson definitively established the role of particular environmental factors in the development of autonomic neurons and was important in the molecular and cellular investigations of the roles of hormones and growth factors in that system. After internship at the Massachusetts General Hospital and residency in psychiatry at UCLA, she chose to pursue a postdoctoral fellowship at Caltech, studying song learning in birds with Professor Mark Konishi as a way of combining her clinical interests in behavior and development with research in cognitive neuroscience. The development of birdsong is in many important respects similar to language development in humans. The pioneering work of Peter Marler, on song sparrows in Golden Gate Park, showed that each baby songbird learns its father’s dialect but can readily learn the dialect of any singing bird of the same species placed in the role of tutor. Many birds, including the ones studied by Allison Doupe, learn their song by listening to their father sing during a period of life in which they are not themselves singing, and they later practice and perfect their own song by comparison with their memory of the father’s (or tutor’s) song.

Keyword: Animal Communication; Language
Link ID: 20248 - Posted: 10.28.2014

by Bethany Brookshire In many scientific fields, the study of the body is the study of boys. In neuroscience, for example, studies in male rats, mice, monkeys and other mammals outnumber studies in females 5.5 to 1. When scientists are hunting for clues, treatments or cures for a human population that is around 50 percent female, this boys-only club may miss important questions about how the other half lives. So in an effort to reduce this sex bias in biomedical studies, National Institutes of Health director Francis Collins and Office of Research on Women’s Health director Janine Clayton announced in May a new policy that will roll out practices promoting sex parity in research, beginning with a requirement that scientists state whether males, females or both were used in experiments, and moving on to mandate that both males and females are included in all future funded research. The end goal will be to make sure that NIH-funded scientists “balance male and female cells and animals in preclinical studies in all future [grant] applications” to the NIH. In 1993, the NIH Revitalization Act mandated the inclusion of women and minorities in clinical trials. This latest move extends that inclusion to cells and animals in preclinical research. Because NIH funds the work of more than 300,000 researchers in the United States and other countries, many of whom work on preclinical and basic biomedical science, the new policy has broad implications for the biomedical research community. And while some scientists are pleased with the effort, others are worried that the mandate is ill-conceived and underfunded. In the end, whether it succeeds or fails comes down to interpretation and future implementation. © Society for Science & the Public 2000 - 2014

Keyword: Sexual Behavior; Pain & Touch
Link ID: 20247 - Posted: 10.27.2014

By PAM BELLUCK Science edged closer on Sunday to showing that an antioxidant in chocolate appears to improve some memory skills that people lose with age. In a small study in the journal Nature Neuroscience, healthy people, ages 50 to 69, who drank a mixture high in antioxidants called cocoa flavanols for three months performed better on a memory test than people who drank a low-flavanol mixture. On average, the improvement of high-flavanol drinkers meant they performed like people two to three decades younger on the study’s memory task, said Dr. Scott A. Small, a neurologist at Columbia University Medical Center and the study’s senior author. They performed about 25 percent better than the low-flavanol group. “An exciting result,” said Craig Stark, a neurobiologist at the University of California, Irvine, who was not involved in the research. “It’s an initial study, and I sort of view this as the opening salvo.” He added, “And look, it’s chocolate. Who’s going to complain about chocolate?” The findings support recent research linking flavanols, especially epicatechin, to improved blood circulation, heart health and memory in mice, snails and humans. But experts said the new study, although involving only 37 participants and partly funded by Mars Inc., the chocolate company, goes further and was a well-controlled, randomized trial led by experienced researchers. Besides improvements on the memory test — a pattern recognition test involving the kind of skill used in remembering where you parked the car or recalling the face of someone you just met — researchers found increased function in an area of the brain’s hippocampus called the dentate gyrus, which has been linked to this type of memory. © 2014 The New York Times Company

Keyword: Learning & Memory
Link ID: 20246 - Posted: 10.27.2014

By Gary Stix Scott Small, a professor of neurology at Columbia University’s College of Physicians and Surgeons, researches Alzheimer’s, but he also studies the memory loss that occurs during the normal aging process. Research on the commonplace “senior moments” focuses on the hippocampus, an area of the brain involved with formation of new memories. In particular, one area of the hippocampus, the dentate gyrus, which helps distinguish one object from another, has lured researchers on age-related memory problems. In a study by Small and colleagues published Oct. 26 in Nature Neuroscience, naturally occurring chemicals in cocoa increased dentate gyrus blood flow. Psychological testing showed that the pattern recognition abilities of a typical 60-year-old on a high dose of the cocoa phytochemicals in the 37-person study matched those of a 30- or 40-year-old after three months. The study received support from the food company Mars, but Small cautions against going out to gorge on Snickers Bars, as most of the beneficial chemicals, or flavanols, are removed when processing cocoa. An edited transcript of an interview with Small follows: Can you explain what you found in your study? The main motive of the study was to causally establish an anatomical source of age-related memory loss. A number of labs have shown in the last 10 years that there’s one area of the brain called the dentate gyrus that is linked to the aging process. But no one has tested that concept. Until now the observations have been correlational. There is decreased function in that region and, to prove causation, we were trying to see if we could reverse that. © 2014 Scientific American

Keyword: Learning & Memory
Link ID: 20245 - Posted: 10.27.2014

By GABRIELE OETTINGEN Many people think that the key to success is to cultivate and doggedly maintain an optimistic outlook. This belief in the power of positive thinking, expressed with varying degrees of sophistication, informs everything from affirmative pop anthems like Katy Perry’s “Roar” to the Mayo Clinic’s suggestion that you may be able to improve your health by eliminating “negative self-talk.” But the truth is that positive thinking often hinders us. More than two decades ago, I conducted a study in which I presented women enrolled in a weight-reduction program with several short, open-ended scenarios about future events — and asked them to imagine how they would fare in each one. Some of these scenarios asked the women to imagine that they had successfully completed the program; others asked them to imagine situations in which they were tempted to cheat on their diets. I then asked the women to rate how positive or negative their resulting thoughts and images were. A year later, I checked in on these women. The results were striking: The more positively women had imagined themselves in these scenarios, the fewer pounds they had lost. My colleagues and I have since performed many follow-up studies, observing a range of people, including children and adults; residents of different countries (the United States and Germany); and people with various kinds of wishes — college students wanting a date, hip-replacement patients hoping to get back on their feet, graduate students looking for a job, schoolchildren wishing to get good grades. In each of these studies, the results have been clear: Fantasizing about happy outcomes — about smoothly attaining your wishes — didn’t help. Indeed, it hindered people from realizing their dreams. © 2014 The New York Times Company

Keyword: Attention; Emotions
Link ID: 20244 - Posted: 10.27.2014

By Neuroskeptic A new paper threatens to turn the world of autism neuroscience upside down. Its title is Anatomical Abnormalities in Autism?, and it claims that, well, there aren’t very many. Published in Cerebral Cortex by Israeli researchers Shlomi Haar and colleagues, the new research reports that there are virtually no differences in brain anatomy between people with autism and those without. What makes Haar et al.’s essentially negative claims so powerful is that their study had a huge sample size: they included structural MRI scans from 539 people diagnosed with high-functioning autism spectrum disorder (ASD) and 573 controls. This makes the paper an order of magnitude bigger than a typical structural MRI anatomy study in this field. The age range was 6 to 35. The scans came from the public Autism Brain Imaging Data Exchange (ABIDE) database, a data sharing initiative which pools scans from 18 different neuroimaging centers. Haar et al. examined the neuroanatomy of the cases and controls using the popular FreeSurfer software package. What did they find? Well… not much. First off, the ASD group had no differences in overall brain size (intracranial volume). Nor were there any group differences in the volumes of most brain areas; the only significant finding here was an increased ventricle volume in the ASD group, but even this had a small effect size (d = 0.34). Enlarged ventricles are not specific to ASD by any means – the same thing has been reported in schizophrenia, dementia, and many other brain disorders.

Keyword: Autism; Brain imaging
Link ID: 20243 - Posted: 10.27.2014

Sarah Boseley, health editor A record haul of “smart” drugs, sold to students to enhance their memory and thought processes, stay awake and improve concentration, has been seized from a UK website by the medicines regulator, which is alarmed about the recent rise of such sites. The seizure, worth £200,000, illustrates the increasing internet trade in cognitive enhancement drugs and suggests people who want to stay focused and sharp are moving on from black coffee and legally available caffeine tablets. Most of the seized drugs are medicines that should only be available on a doctor’s prescription. One, Sunifiram, is entirely experimental and has never been tested on humans in clinical trials. Investigators from the Medicines and Healthcare products Regulatory Agency (MHRA) are worried at what they see as a new phenomenon – the polished, plausible, commercial website targeting students and others who are looking for a mental edge over the competition. In addition to Ritalin, the drug that helps young people with attention deficit disorder (ADD) focus in class and while writing essays, and Modafinil (sold as Provigil), licensed in the US for people with narcolepsy, they are also offering experimental drugs and research chemicals. MHRA head of enforcement, Alastair Jeffrey, said the increase in people buying cognitive-enhancing drugs or “nootropics” is recent and very worrying. “The idea that people are willing to put their overall health at risk in order to attempt to get an intellectual edge over others is deeply troubling,” he said. © 2014 Guardian News and Media Limited

Keyword: Drug Abuse; ADHD
Link ID: 20242 - Posted: 10.27.2014

by Clare Wilson Call them the neuron whisperers. Researchers are eavesdropping on conversations going on between brain cells in a dish. Rather than hearing the chatter, they watch neurons that have been genetically modified so that the electrical impulses moving along their branched tendrils cause sparkles of red light (see video). Filming these cells at up to 100,000 frames a second is allowing researchers to analyse their firing in unprecedented detail. Until recently, a neuron's electrical activity could only be measured with tiny electrodes. As well as being technically difficult, such "patch clamping" only reveals the voltage at those specific points. The new approach makes the neuron's entire surface fluoresce as the impulse passes by. "Now we see the whole thing sweep through," says Adam Cohen of Harvard University. "We get much more information - like how fast and where does it start and what happens at a branch." The idea is a reverse form of optogenetics – where neurons are given a gene from bacteria that make a light-sensitive protein, so the cells fire when illuminated. The new approach uses genes that make the neurons do the opposite - glow when they fire. "It's pretty cool," says Dimitri Kullmann of University College London. "It's amazing that you can dispense with electrodes." Cohen's team is using the technique to compare cells from typical brains with those from people with disorders such as motor neuron disease or amyotrophic lateral sclerosis. Rather than taking a brain sample, they remove some of the person's skin cells and grow them alongside chemicals that rewind the cells into an embryonic-like state. Another set of chemicals is used to turn these stem cells into neurons. "You can recreate something reminiscent of the person's brain in the dish," says Cohen. © Copyright Reed Business Information Ltd.

Keyword: Brain imaging
Link ID: 20241 - Posted: 10.25.2014

By Michael Hedrick I have a hard time making friends. Getting to trust people well enough to call them a friend takes a lot of work. It’s especially hard when you are living with schizophrenia and think everyone is making fun of you. Schizophrenia is the devil on your shoulder that keeps whispering in your ear and, no matter what you try, the little demon won’t stop. He hasn’t stopped in the almost nine years I’ve lived with the illness, and he’s not about to stop now. He’s just quieted down a bit. I’d call him my companion but that would imply a degree of friendship, and there’s no way in hell I’m the little devil’s friend. I have plenty of acquaintances, and a couple hundred “friends” on Facebook. But real friends, mostly family, I can count on one hand. For me, making friends is like climbing a vertical rock wall with no ropes, requiring a degree of thrill-seeking, and a good deal of risk. For someone to be my friend, they have to accept that I’m crazy, and even getting to the point of telling them that is daunting when all you hear is the devil’s whispering that they’re making snap judgments about you or will be going back to their real friends and laughing about you. But interestingly, in my efforts to make friends, coffee shops have helped. The simple routine of going to get your fix of liquid energy every day provides a sort of breeding ground for community. You see these people every day, whether you like it or not and, over time, friendships form. I used to live in a small town called Niwot, about five miles down the highway from Boulder, where I now live. Every morning around 6 I would go to Winot Coffee, the small independent coffee shop, and every morning, without fail, there was a guy my age sitting outside with his computer smoking clove cigarettes. Given the regularity of seeing him every morning, and given that we were some of the only 20-somethings in town, we got to talking. © 2014 The New York Times Company

Keyword: Schizophrenia
Link ID: 20240 - Posted: 10.25.2014

by Neurobonkers A paper published in Nature Reviews Neuroscience last week addressed the prevalence of neuromyths among educators. The paper has been widely reported, but the lion's share of the coverage glossed over the impact that neuromyths have had in the real world. Your first thought after reading the neuromyths in the table below — which were widely believed by teachers — may well be, "so what?" It is true that some of the false beliefs are relatively harmless. For example, encouraging children to drink a little more water might perhaps result in the consumption of less sugary drinks. This may do little if anything to reduce hyperactivity but could encourage a more nutritious diet which might have impacts on problems such as Type II diabetes. So, what's the harm? The paper addressed a number of areas where neuromyths have had real-world impacts on educators and policymakers, which may have negatively affected the provision of education. The graph above, reprinted in the Nature Reviews Neuroscience paper, has been included as empirical data in educational policy documents to provide evidence for an "allegedly scientific argument for withdrawing public funding of university education." The problem? The data is made up. The graph is in fact a model that is based on the false assumption that investment before the age of three will have many times the benefit of investment made in education later in life. The myth of three — the belief that there is a critical window to educate children before the age of three, after which point the trajectory is fixed — is one of the most persistent neuromyths. Viewed on another level, while some might say investment in early education can never be a bad thing, how about the implication that the potential of a child is fixed at such an early point in their life, when in reality their journey has just begun. © Copyright 2014, The Big Think, Inc

Keyword: Development of the Brain; Learning & Memory
Link ID: 20239 - Posted: 10.25.2014

By CLIVE THOMPSON “You just crashed a little bit,” Adam Gazzaley said. It was true: I’d slammed my rocket-powered surfboard into an icy riverbank. This was at Gazzaley’s San Francisco lab, in a nook cluttered with multicolored skullcaps and wires that hooked up to an E.E.G. machine. The video game I was playing wasn’t the sort typically pitched at kids or even middle-aged, Gen X gamers. Indeed, its intended users include people over 60 — because the game might just help fend off the mental decline that accompanies aging. It was awfully hard to play, even for my Call of Duty-toughened brain. Project: Evo, as the game is called, was designed to tax several mental abilities at once. As I maneuvered the surfboard down winding river pathways, I was supposed to avoid hitting the sides, which required what Gazzaley said was “visual-motor tracking.” But I also had to watch out for targets: I was tasked with tapping the screen whenever a red fish jumped out of the water. The game increased in difficulty as I improved, making the river twistier and obliging me to remember turns I’d taken. (These were “working-memory challenges.”) Soon the targets became more confusing — I was trying to tap blue birds and green fish, but the game faked me out by mixing in green birds and blue fish. This was testing my “selective attention,” or how quickly I could assess a situation and react to it. The company behind Project: Evo is now seeking approval from the Food and Drug Administration for the game. If it gets that government stamp, it might become a sort of cognitive Lipitor or Viagra, a game that your doctor can prescribe for your aging mind. After only two minutes of play, I was making all manner of mistakes, stabbing frantically at the wrong fish as the game sped up. “It’s hard,” Gazzaley said, smiling broadly as he took back the iPad I was playing on. “It’s meant to really push it.” “Brain training” games like Project: Evo have become big business, with Americans spending an estimated $1.3 billion a year on them. They are also a source of controversy. © 2014 The New York Times Company

Keyword: Alzheimers; Learning & Memory
Link ID: 20238 - Posted: 10.23.2014

By Emily Underwood Aging baby boomers and seniors would be better off going for a hike than sitting down in front of one of the many video games designed to aid the brain, a group of nearly 70 researchers asserted this week in a critique of some of the claims made by the brain-training industry. With yearly subscriptions running as much as $120, an expanding panoply of commercial brain games promises to improve memory, processing speed, and problem-solving, and even, in some cases, to stave off Alzheimer’s disease. Many companies, such as Lumosity and Cogmed, describe their games as backed by solid scientific evidence and prominently note that neuroscientists at top universities and research centers helped design the programs. But the cited research is often “only tangentially related to the scientific claims of the company, and to the games they sell,” according to the statement released Monday by the Stanford Center on Longevity in Palo Alto, California, and the Max Planck Institute for Human Development in Berlin. Although the letter, whose signatories include many researchers outside those two organizations, doesn’t point to specific bad actors, it concludes that there is “little evidence that playing brain games improves underlying broad cognitive abilities, or that it enables one to better navigate a complex realm of everyday life.” A similar statement of concern was published in 2008 with a smaller number of signatories, says Ulman Lindenberger of the Max Planck Institute for Human Development, who helped organize both letters. Although Lindenberger says there was no particular trigger for the current statement, he calls it the “expression of a growing collective concern among a large number of cognitive psychologists and neuroscientists who study human cognitive aging.” © 2014 American Association for the Advancement of Science

Keyword: Alzheimers; Learning & Memory
Link ID: 20237 - Posted: 10.23.2014

By J. PEDER ZANE Striking it rich is the American dream, a magnetic myth that has drawn millions to this nation. And yet, a countervailing message has always percolated through the culture: Money can’t buy happiness. From Jay Gatsby and Charles Foster Kane to Tony Soprano and Walter White, the woefully wealthy are among the seminal figures of literature, film and television. A thriving industry of gossipy, star-studded magazines and websites combines these two ideas, extolling the lifestyles of the rich and famous while exposing the sadness of celebrity. All of which raises the question: Is the golden road paved with misery? Yes, in a lot of cases, according to a growing body of research exploring the connection between wealth and happiness. Studies in behavioral economics, cognitive psychology and neuroscience are providing new insights into how a changing American economy and the wiring of the human brain can make life on easy street feel like a slog. Make no mistake, it is better to be rich than poor — psychologically as well as materially. Levels of depression, anxiety and stress diminish as incomes rise. What has puzzled researchers is that the psychological benefits of wealth seem to stop accruing once people reach an income of about $75,000 a year. “The question is, What are the factors that dampen the rewards of income?” said Scott Schieman, a professor of sociology at the University of Toronto. “Why doesn’t earning even more money — beyond a certain level — make us feel even happier and more satisfied?” The main culprit, he said, is the growing demands of work. For millenniums, leisure was wealth’s bedfellow. The rich were different because they worked less. The tables began to turn in America during the 1960s, when inherited privilege gave way to educational credentials and advancement became more closely tied to merit. © 2014 The New York Times Company

Keyword: Emotions
Link ID: 20236 - Posted: 10.23.2014

by Helen Thomson For the first time, doctors have opened and closed the brain's protector – the blood-brain barrier – on demand. The breakthrough will allow drugs to reach diseased areas of the brain that are otherwise out of bounds. Ultimately, it could make it easier to treat conditions such as Alzheimer's and brain cancer. The blood-brain barrier (BBB) is a sheath of cells that wraps around blood vessels throughout the brain. It protects precious brain tissue from toxins in the bloodstream, but it is a major obstacle for treating brain disorders because it also blocks the passage of drugs. Several teams have opened the barrier in animals to sneak drugs through. Now Michael Canney at Paris-based medical start-up CarThera and his colleagues have managed it in people using an ultrasound brain implant and an injection of microbubbles. When ultrasound waves meet microbubbles in the blood, they make the bubbles vibrate. This pushes apart the cells of the BBB. With surgeon Alexandre Carpentier at Pitié-Salpêtrière Hospital in Paris, Canney tested the approach in people with a recurrence of glioblastoma, the most aggressive type of brain tumour. People with this cancer have surgery to remove the tumours and then chemotherapy drugs, such as Carboplatin, are used to try to kill any remaining tumour cells. Tumours make the BBB leaky, allowing in a tiny amount of chemo drugs: if more could get through, their impact would be greater, says Canney. © Copyright Reed Business Information Ltd.

Keyword: Brain imaging
Link ID: 20235 - Posted: 10.23.2014

James Hamblin People whose faces are perceived to look more "competent" are more likely to be CEOs of large, successful companies. Having a face that people deem "dominant" is a predictor of rank advancement in the military. People are more likely to invest money with people who look "trustworthy." These sorts of findings go on and on in recent studies that claim people can accurately guess a variety of personality traits and behavioral tendencies from portraits alone. The findings seem to elucidate either canny human intuition or absurd, misguided bias. There has been a recent boom in research on how people attribute social characteristics to others based on the appearance of faces—independent of cues about age, gender, race, or ethnicity. (At least, as independent as possible.) The results seem to offer some intriguing insight, claiming that people are generally pretty good at predicting who is, for example, trustworthy, competent, introverted or extroverted, based entirely on facial structure. There is strong agreement across studies as to what facial attributes mean what to people, as illustrated in renderings throughout this article. But it's, predictably, not at all so simple. Christopher Olivola, an assistant professor at Carnegie Mellon University, makes the case against face-ism today, in the journal Trends in Cognitive Sciences. In light of many recent articles touting people's judgmental abilities, Olivola and Princeton University's Friederike Funk and Alexander Todorov say that a careful look at the data really doesn't support these claims. And "instead of applauding our ability to make inferences about social characteristics from facial appearances," Olivola said, "the focus should be on the dangers."

Keyword: Emotions; Attention
Link ID: 20234 - Posted: 10.23.2014

By PAUL VITELLO Most adults do not remember anything before the age of 3 or 4, a gap that researchers had chalked up to the vagaries of the still-developing infant brain. By some accounts, the infant brain was just not equipped to remember much. Textbooks referred to the deficiency as infant amnesia. Carolyn Rovee-Collier, a developmental psychologist at Rutgers University who died on Oct. 2 at 72, challenged the theory, showing in a series of papers in the early 1980s that babies remember plenty. A 3-month-old can recall what he or she learned yesterday, she found, and a 9-month-old can remember a game for as long as a month and a half. She cited experiments suggesting that memory processes in adults and infants are virtually the same, and argued that infant memories were never lost. They just become increasingly harder to retrieve as the child grows, learns language and loses touch with the visual triggers that had kept those memories sharp — a view from between the bars of a crib, say, or the view of the floor as a crawler, not a toddler, sees it. Not all of Dr. Rovee-Collier’s theories won over the psychology establishment, which still uses the infant amnesia concept to explain why people do not remember life as a baby. But her insights about an infant’s short-term memory and ability to learn have been widely accepted, and have helped recast scientific thinking about the infant mind over the past 30 years. Since the first of her 200 papers was published, infant cognitive studies has undergone a boom in university programs around the country. It was a field that had been largely unexplored in any systematic way by the giants of psychological theory. Freud and Jean Piaget never directly addressed the subject of infant memory. William James, considered the father of American psychology, once hazarded a guess that the human baby’s mind was a place of “blooming, buzzing confusion.” © 2014 The New York Times Company

Keyword: Learning & Memory; Development of the Brain
Link ID: 20233 - Posted: 10.23.2014

David DiSalvo @neuronarrative One of the lively debates spawned from the neuroscience revolution has to do with whether humans possess free will, or merely feel as if we do. If we truly possess free will, then we each consciously control our decisions and actions. If we feel as if we possess free will, then our sense of control is a useful illusion—one that neuroscience will increasingly dispel as it gets better at predicting how brain processes yield decisions. For those in the free-will-as-illusion camp, the subjective experience of decision ownership is not unimportant, but it is predicated on neural dynamics that are scientifically knowable, traceable and—in time—predictable. One piece of evidence supporting this position has come from neuroscience research showing that brain activity underlying a given decision occurs before a person consciously apprehends the decision. In other words, thought patterns leading to conscious awareness of what we’re going to do are already in motion before we know we’ll do it. Without conscious knowledge of why we’re choosing as we’re choosing, the argument follows, we cannot claim to be exercising “free” will. Those supporting a purer view of free will argue that whether or not neuroscience can trace brain activity underlying decisions, making the decision still resides within the domain of an individual’s mind. In this view, parsing unconscious and conscious awareness is less important than the ultimate outcome – a decision, and subsequent action, emerging from a single mind. If free will is drained of its power by scientific determinism, free-will supporters argue, then we’re moving down a dangerous path where people can’t be held accountable for their decisions, since those decisions are triggered by neural activity occurring outside of conscious awareness. Consider how this might play out in a courtroom in which neuroscience evidence is marshalled to defend a murderer on grounds that he couldn’t know why he acted as he did.

Keyword: Consciousness
Link ID: 20232 - Posted: 10.23.2014

Carl Zimmer Scientists have reconstructed the genome of a man who lived 45,000 years ago, by far the oldest genetic record ever obtained from modern humans. The research, published on Wednesday in the journal Nature, provided new clues to the expansion of modern humans from Africa about 60,000 years ago, when they moved into Europe and Asia. And the genome, extracted from a fossil thighbone found in Siberia, added strong support to a provocative hypothesis: Early humans interbred with Neanderthals. “It’s irreplaceable evidence of what once existed that we can’t reconstruct from what people are now,” said John Hawks, a paleoanthropologist at the University of Wisconsin who was not involved in the study. “It speaks to us with information about a time that’s lost to us.” The discoveries were made by a team of scientists led by Svante Paabo, a geneticist at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. Over the past three decades, Dr. Paabo and his colleagues have developed tools for plucking out fragments of DNA from fossils and reading their sequences. Early on, the scientists were able only to retrieve tiny snippets of ancient genes. But gradually, they have invented better methods for joining the overlapping fragments together, assembling larger pieces of ancient genomes that have helped shed light on the evolution of humans and their relatives. In December, they published the entirety of a Neanderthal genome extracted from a single toe bone. Comparing Neanderthal to human genomes, Dr. Paabo and his colleagues found that we share a common ancestor, which they estimated lived about 600,000 years ago. © 2014 The New York Times Company

Keyword: Evolution
Link ID: 20231 - Posted: 10.23.2014