Chapter 1. An Introduction to Brain and Behavior




By Glendon Mellow

University and scientific research center programs are increasingly finding it useful to employ artists and illustrators to help them see things in a new way. Few works of art from the Renaissance have been studied and pored over as meticulously as Michelangelo’s frescoes in the Sistine Chapel. Yet the Master may still have some surprises hidden for an illustrator-scientist. Biomedical illustrator Ian Suk (BSc, BMC) and neurological surgeon Rafael Tamargo (MD, FACS), both of Johns Hopkins, proposed in a 2010 article in the journal Neurosurgery that the panel Dividing Light from the Darkness by Michelangelo actually depicts the brain stem of God. Comparing the unusual shadows and contours on God’s neck with photos of actual brain stems, they make a seemingly overwhelming case that Michelangelo used his own limited anatomical studies to depict the brain stem. It’s unlikely even the educated members of Michelangelo’s audience would recognize it. I encourage you to look over the paper and enlarge the images in the slideshow: Suk and Tamargo are utterly convincing. Unlike R. Douglas Fields in a previous blog post from 2010 on Scientific American, I don’t think there’s room to believe this is a case of pareidolia. I imagine the thrill of feeling Michelangelo communicating directly with the authors across the centuries was immense. © 2014 Scientific American

Keyword: Brain imaging
Link ID: 20067 - Posted: 09.12.2014

By Sarah Zielinski

The marshmallow test is pretty simple: Give a child a treat, such as a marshmallow, and promise that if he doesn’t eat it right away, he’ll soon be rewarded with a second one. The experiment was devised by Stanford psychologist Walter Mischel in the late 1960s as a measure of self-control. When he later checked back in with kids he had tested as preschoolers, those who had been able to wait for the second treat appeared to be doing better in life. They tended to have fewer behavioral or drug-abuse problems, for example, than those who had given in to temptation. Most attempts to perform this experiment on animals haven’t worked out so well. Many animals haven’t been willing to wait at all. Dogs, primates, and some birds have done a bit better, managing to wait at least a couple of minutes before eating the first treat. The best any animal has managed has been 10 minutes—a record set earlier this year by a couple of crows. The African grey parrot is a species known for its intelligence. Animal psychologist Irene Pepperberg, now at Harvard, spent 30 years studying one of these parrots, Alex, and showed that the bird had an extraordinary vocabulary and capacity for learning. Alex even learned to add numerals before his death in 2007. Could an African grey pass the marshmallow test? Adrienne E. Koepke of Hunter College and Suzanne L. Gray of Harvard University tried the experiment on Pepperberg’s current star African grey, a 19-year-old named Griffin. In their test, a researcher took two treats, one of which Griffin liked slightly better, and put them into cups. Then she placed the cup with the less preferred food in front of Griffin and told him, “wait.” She took the other cup and either stood a few feet away or left the room. After a random amount of time, from 10 seconds to 15 minutes, she would return. If the food was still in the cup, Griffin got the nut he was waiting for. Koepke and colleagues presented their findings last month at the Animal Behavior Society meeting at Princeton. © 2014 The Slate Group LLC.

Keyword: Intelligence; Aggression
Link ID: 20061 - Posted: 09.11.2014

Ewen Callaway

Researchers found 69 genes that correlate with higher educational attainment — and three of those also appear to have a direct link to slightly better cognitive abilities. Scientists looking for the genes underlying intelligence are in for a slog. One of the largest, most rigorous genetic studies of human cognition has turned up inconclusive findings, and experts concede that they will probably need to scour the genomes of more than 1 million people to confidently identify even a small genetic influence on intelligence and other behavioural traits. Studies of twins have repeatedly confirmed a genetic basis for intelligence, personality and other aspects of behaviour. But efforts to link IQ to specific variations in DNA have led to a slew of irreproducible results. Critics have alleged that some of these studies' methods were marred by wishful thinking and shoddy statistics. A sobering editorial in the January 2012 issue of Behavior Genetics declared that “it now seems likely that many of the published findings of the last decade are wrong or misleading and have not contributed to real advances in knowledge”. In 2011, an international collaboration of researchers launched an effort to bring more rigour to studies of how genes contribute to behaviour. The group, called the Social Sciences Genetic Association Consortium, aimed to do studies using practices borrowed from the medical genetics community, which emphasizes large numbers of participants, rigorous statistics and reproducibility. In a 2013 study comparing the genomes of more than 126,000 people, the group identified three gene variants associated with how many years of schooling a person had gone through or whether they had attended university. But the effect of these variants was small — each variant correlated with roughly one additional month of schooling in people who had it compared with people who did not. © 2014 Nature Publishing Group

Keyword: Intelligence; Aggression
Link ID: 20050 - Posted: 09.09.2014

By Jeffrey Mervis

Embattled U.K. biomedical researchers are drawing some comfort from a new survey showing that a sizable majority of the public continues to support the use of animals in research. But there’s another twist that should interest social scientists as well: The government’s decision this year to field two almost identical surveys on the topic offers fresh evidence that the way you ask a question affects how people answer it. Since 1999, the U.K. Department for Business, Innovation & Skills (BIS) has been funding a survey of 1000 adults about their attitudes toward animal experimentation. But this year the government asked the London-based pollsters, Ipsos MORI, to carry out a new survey, changing the wording of several questions. (The company also collected additional information, including public attitudes toward different animal species and current rules regarding their use.) For example, the phrase “animal experimentation” was replaced by “animal research” because the latter is “less inflammatory,” notes Ipsos MORI Research Manager Jerry Latter. In addition, says Emma Brown, a BIS spokeswoman, the word research “more accurately reflects the range of procedures that animals may be involved in, including the breeding of genetically modified animals.” But government officials also value the information about long-term trends in public attitudes that can be gleaned from the current survey. So they told the company to conduct one last round—the 10th in the series—at the same time they deployed the new survey. Each survey went to a representative, but different, sample of U.K. adults. © 2014 American Association for the Advancement of Science

Keyword: Animal Rights
Link ID: 20041 - Posted: 09.06.2014

One of the best things about being a neuroscientist used to be the aura of mystery around it. It was once so mysterious that some people didn’t even know it was a thing. When I first went to university and people asked what I studied, they thought I was saying I was a “Euroscientist”, which is presumably someone who studies the science of Europe. I’d get weird questions such as “what do you think of Belgium?” and I’d have to admit that, in all honesty, I never think of Belgium. That’s how mysterious neuroscience was, once. Of course, you could say this confusion was due to my dense Welsh accent, or the fact that I only had the confidence to talk to strangers after consuming a fair amount of alcohol, but I prefer to go with the mystery. It’s not like that any more. Neuroscience is “mainstream” now, to the point where the press coverage of it can be studied extensively. When there’s such a thing as Neuromarketing (well, there isn’t actually such a thing, but there’s a whole industry that would claim otherwise), it’s impossible to maintain that neuroscience is “cool” or “edgy”. It’s a bad time for us neurohipsters (who are the same as regular hipsters, except the designer beards are on the frontal lobes rather than the jaw-line). One way that we professional neuroscientists could maintain our superiority was by correcting misconceptions about the brain, but lately even that avenue looks to be closing to us. The recent film Lucy is based on the most classic brain misconception: that we only use 10% of our brain. But it’s had a considerable amount of flak for this already, suggesting that many people are wise to this myth. We also saw the recent release of Susan Greenfield’s new book Mind Change, all about how technology is changing (damaging?) our brains. This is a worryingly evidence-free but very common claim by Greenfield. Depressingly common, as this blog has pointed out many times. But now even the non-neuroscientist reviewers aren’t buying her claims. © 2014 Guardian News and Media Limited

Keyword: Miscellaneous
Link ID: 20011 - Posted: 08.30.2014

Sam McDougle

By now, perhaps you’ve seen the trailer for the new sci-fi thriller Lucy. It starts with a flurry of stylized special effects and Scarlett Johansson serving up a barrage of bad-guy beatings. Then comes Morgan Freeman, playing a professorial neuroscientist with the obligatory brown blazer, to deliver the film’s familiar premise to a full lecture hall: “It is estimated most human beings only use 10 percent of the brain’s capacity. Imagine if we could access 100 percent. Interesting things begin to happen.” Johansson as Lucy, who has been kidnapped and implanted with mysterious drugs, becomes a test case for those interesting things, which seem to include even more impressive beatings and apparently some kind of Matrix-esque time-warping skills. Of course, the idea that “you only use 10 percent of your brain” is, indeed, 100 percent bogus. Why has this myth persisted for so long, and when is it finally going to die? Unfortunately, not any time soon. A survey last year by The Michael J. Fox Foundation for Parkinson's Research found that 65 percent of Americans believe the myth is true, 5 percent more than those who believe in evolution. Even Mythbusters, which declared the statistic a myth a few years ago, further muddied the waters: The show merely increased the erroneous 10 percent figure and implied, incorrectly, that people use 35 percent of their brains. The idea that swaths of the brain are stagnant pudding while one section does all the work is silly. Like most legends, the origin of this fiction is unclear, though there are some clues. © 2014 by The Atlantic Monthly Group

Keyword: Brain imaging
Link ID: 19848 - Posted: 07.17.2014

By GARY MARCUS

Are we ever going to figure out how the brain works? After decades of research, diseases like schizophrenia and Alzheimer’s still resist treatment. Despite countless investigations into serotonin and other neurotransmitters, there is still no method to cure clinical depression. And for all the excitement about brain-imaging techniques, the limitations of fMRI studies are, as evidenced by popular books like “Brainwashed” and “Neuromania,” by now well known. In spite of the many remarkable advances in neuroscience, you might get the sinking feeling that we are not always going about brain science in the best possible way. This feeling was given prominent public expression on Monday, when hundreds of neuroscientists from all over the world issued an indignant open letter to the European Commission, which is funding the Human Brain Project, an approximately $1.6 billion effort that aims to build a complete computer simulation of the human brain. The letter charges that the project is “overly narrow” in approach and not “well conceived.” While no neuroscientist doubts that a faithful-to-life brain simulation would ultimately be tremendously useful, some have called the project “radically premature.” The controversy serves as a reminder that we scientists are not only far from a comprehensive explanation of how the brain works; we’re also not even in agreement about the best way to study it, or what questions we should be asking. The European Commission, like the Obama administration, which is promoting a large-scale research enterprise called the Brain Initiative, is investing heavily in neuroscience, and rightly so. (A set of new tools such as optogenetics, which allows neuroscientists to control the activity of individual neurons, gives considerable reason for optimism.) But neither project has grappled sufficiently with a critical question that is too often ignored in the field: What would a good theory of the brain actually look like? Different kinds of sciences call for different kinds of theories. Physicists, for example, are searching for a “grand unified theory” that integrates gravity, electromagnetism and the strong and weak nuclear forces into a neat package of equations. Whether or not they will get there, they have made considerable progress, in part because they know what they are looking for. © 2014 The New York Times Company

Keyword: Brain imaging
Link ID: 19818 - Posted: 07.12.2014

By ALEX HALBERSTADT

Dr. Vint Virga likes to arrive at a zoo several hours before it opens, when the sun is still in the trees and the lanes are quiet and the trash cans empty. Many of the animals haven’t yet slipped into their afternoon malaise, when they retreat, appearing to wait out the heat and the visitors and not do much of anything. Virga likes to creep to the edge of their enclosures and watch. He chooses a spot and tries not to vary it, he says, “to give the animals a sense of control.” Sometimes he watches an animal for hours, hardly moving. That’s because what to an average zoo visitor looks like frolicking or restlessness or even boredom looks to Virga like a lot more — looks, in fact, like a veritable Russian novel of truculence, joy, sociability, horniness, ire, protectiveness, deference, melancholy and even humor. The ability to interpret animal behavior, Virga says, is a function of temperament, curiosity and, mostly, decades of practice. It is not, it turns out, especially easy. Do you know what it means when an elephant lowers her head and folds her trunk underneath it? Or when a zebra wuffles, softly blowing air between her lips; or when a colobus monkey snuffles, sounding a little like a hog rooting in the mud; or when a red fox screams, sounding disconcertingly like an infant; or when red fox kits chatter at one another; or when an African wild dog licks and nibbles at the lips of another; or when a California sea lion resting on the water’s surface stretches a fore flipper and one or both rear flippers in the air, like a synchronized swimmer; or when a hippopotamus “dung showers” by defecating while rapidly flapping its tail? Virga knows, because it is his job to know. He is a behaviorist, and what he does, expressed plainly, is see into the inner lives of animals. The profession is an odd one: It is largely unregulated, and declaring that you are an expert is sometimes enough to be taken for one. Most behaviorists are former animal trainers; some come from other fields entirely. Virga happens to be a veterinarian, very likely the only one in the country whose full-time job is tending to the psychological welfare of animals in captivity. © 2014 The New York Times Company

Keyword: Animal Rights; Aggression
Link ID: 19796 - Posted: 07.04.2014

By EDWARD ROTHSTEIN

PHILADELPHIA — Clambering upward in dim violet light, stepping from one glass platform to another, you trigger flashes of light and polyps of sound. You climb through protective tubes of metallic mesh as you make your way through a maze of pathways. You are an electrical signal coursing through a neural network. You are immersed in the human brain. Well, almost. Here at the Franklin Institute, you’re at least supposed to get that impression. You pass through this realm (the climbing is optional) as part of “Your Brain” — the largest permanent exhibition at this venerable institution, and one of its best. That show, along with two other exhibitions, opens on Saturday in the new $41 million, 53,000-square-foot Nicholas and Athena Karabots Pavilion. This annex — designed by Saylor Gregg Architects, with an outer facade draped in a “shimmer wall” of hinged aluminum panels created by the artist Ned Kahn — expands the institution’s display space, educational facilities and convention possibilities. It also completes a transformation that began decades ago, turning one of the oldest hands-on science museums in the United States (as the Franklin puts it) into a contemporary science center, which typically combines aspects of a school, community center, amusement park, emporium, theater, international museum and interactive science lab — while also combining, as do many such institutions, those elements’ strengths and weaknesses. That brain immersion gallery gives a sense of this genre’s approach. It is designed more for amusement, effect and social interaction (cherished science center goals) than understanding. So I climb, but I’m not convinced. I hardly feel like part of a network of dendrites and axons as I weave through these pathways. I try, though, to imagine these tubes of psychedelically illuminated mesh filled with dozens of chattering children leaping around. That might offer a better inkling of the unpredictable, raucous complexity of the human brain. © 2014 The New York Times Company

Keyword: Miscellaneous
Link ID: 19730 - Posted: 06.14.2014

By ANNA NORTH

The “brain” is a powerful thing. Not the organ itself — though of course it’s powerful, too — but the word. Including it in explanations of human behavior might make those explanations sound more legitimate — and that might be a problem. Though neuroscientific examinations of everyday experiences have fallen out of favor somewhat recently, the word “brain” remains popular in the media. Ben Lillie, the director of the science storytelling series The Story Collider, drew attention to the phenomenon last week on Twitter, mentioning in particular a recent Atlantic article: “Your Kid’s Brain Might Benefit From an Extra Year in Middle School.” In the piece, Jessica Lahey, a teacher and education writer, examined the benefits of letting kids repeat eighth grade. Mr. Lillie told Op-Talk the word “brain” could take the emphasis off middle-school students as people: The piece, he said, was “not ignoring the fact that the middle schooler (in this case) is a person, but somehow taking a quarter-step away by focusing on a thing we don’t really think of as human.” The New York Times isn’t immune to “brain”-speak — in her 2013 project “Brainlines,” the artist Julia Buntaine collected all Times headlines using the word “brain” since 1851. She told Op-Talk in an email that “the number of headlines about the brain increased exponentially since around the year 2000, where some years before there were none at all, after that there were at least 30, 40, 80 headlines.” Adding “brain” to a headline may make it sound more convincing — some research shows that talking about the brain has measurable effects on how people respond to scientific explanations. In a 2008 study, researchers found that adding phrases like “brain scans indicate” to explanations of psychological concepts like attention made those explanations more satisfying to nonexpert audiences. Perhaps disturbingly, the effect was greatest when the explanations were actually wrong. © 2014 The New York Times Company

Keyword: Miscellaneous
Link ID: 19703 - Posted: 06.06.2014

By Matty Litwack

One year ago, I thought I was going to die. Specifically, I believed an amoeba was eating my brain. As I’ve done countless times before, I called my mother in a panic: “Mom, I think I’m dying.” As she has done countless times before, she laughed at me. She doesn’t really take me seriously anymore, because I’m a massive hypochondriac. If there exists a disease, I’ve probably convinced myself that I have it. Every time I have a cough, I assume it’s lung cancer. One time I thought I had herpes, but it was just a piece of candy stuck to my face. In the case of the brain amoeba, however, I had a legitimate reason to believe I was dying. Several days prior, I had visited a doctor to treat my nasal congestion. The doctor deemed my sickness not severe enough to warrant antibiotics and instead suggested I try a neti pot to clear up my congestion. A neti pot is a vessel shaped like a genie’s lamp that’s used to irrigate the sinuses with saline solution. My neti pot came with an instruction manual, which I immediately discarded. Why would I need instructions? Nasal irrigation seemed like a simple enough process: water goes up one nostril and flows down the other – that’s just gravity. I dumped a bottle of natural spring water into the neti pot, mixed in some salt, shoved it in my nostril and started pouring. If there was in fact a genie living in the neti pot, I imagine this was very unpleasant for him. The pressure in my sinuses was instantly reduced. It worked so well that over the next couple of days, I was raving about neti pots to anybody who would allow me to annoy them. It was honestly surprising how little people wanted to hear about nasal irrigation. Some nodded politely, others asked me to stop talking about it, but one friend had a uniquely interesting reaction: “Oh, you’re using a neti pot?” he asked. “Watch out for the brain-eating amoeba.” This was hands-down the strangest warning I had ever received. I assumed it was a joke, but I made a mental note to Google brain amoebas as soon as I was done proselytizing the masses on the merits of saltwater nose genies. © 2014 Scientific American

Keyword: Miscellaneous
Link ID: 19618 - Posted: 05.15.2014

BERLIN—A national ad campaign targeting the work and person of neuroscientist Andreas Kreiter has caused an uproar in the German scientific community. Today, the Alliance of Scientific Organizations in Germany published a sharply worded statement against the full-page ads, which appeared in regional and national newspapers in April. The ad “crudely hurts the personal rights” of the scientist, the organizations write, and “defames biomedical research as a whole.” “The ad aims for personal annihilation,” Kreiter says, “and it is not acceptable for a state founded on the rule of law.” A professor of animal physiology at the University of Bremen (UB), Kreiter studies the neurophysiology of the macaque brain. His work has met with fierce resistance since the 1990s, but Kreiter says hostility peaked after he won a series of protracted legal battles over his work. The most recent trial finished in February, when the Federal Administrative Court of Germany confirmed earlier decisions that the animal distress caused by Kreiter’s research is justified given its scientific significance. A group called Tierversuchsgegner Bundesrepublik Deutschland (the German Association for Opponents of Animal Research), whose proclaimed goal is to end animal experimentation in Germany, has used advertising as a weapon for several years. But the most recent one is the most personal and aggressive yet, Kreiter says; headlined “Kreiter continues in cold blood,” it features a photo of the researcher as well as a picture of a macaque, sitting immobilized in an experimental laboratory setup. It ran in many outlets on 16 and 17 April, including in leading newspapers such as the Frankfurter Allgemeine Zeitung and Die Zeit. © 2014 American Association for the Advancement of Science

Keyword: Animal Rights
Link ID: 19585 - Posted: 05.08.2014

Daniel Cressey

Organizations such as People for the Ethical Treatment of Animals (PETA) have been campaigning for the disclosure of more information on animal research in the United Kingdom. The government of the United Kingdom wants to jettison rules that prevent it releasing any confidential information it holds about animal research, as part of a continuing push towards openness about such work. Animal-rights groups have long complained about what they characterise as a “secrecy clause” that prevents details of animal research in the UK being made public. The Home Office collects huge amounts of information such as the type of work done, the people and institutions doing it, and the results of inspections at laboratories. However, it is currently prevented from revealing anything potentially considered confidential under ‘section 24’ of the rules governing animal research. Today the government said that it would like to repeal this blanket ban on information disclosure, as it has previously promised, and requested comment on its proposal. In place of section 24, it would like to introduce a new rule prohibiting disclosure only of information relating to “people, places and intellectual property”. Home Office minister Norman Baker said in the consultation document released today, “To maintain public trust we must be as open and transparent as possible about activities under the regulatory framework.” If implemented, the new rule would still keep names and locations of animal research out of the public domain — a key concern of many researchers who fear protests or even violent attacks from extremist animal rights protestors. © 2014 Nature Publishing Group

Keyword: Animal Rights
Link ID: 19572 - Posted: 05.05.2014

By David Grimm

“We did one study on cats—and that was enough!” Those words effectively ended my quest to understand the feline mind. I was a few months into writing Citizen Canine: Our Evolving Relationship With Cats and Dogs, which explores how pets are blurring the line between animal and person, and I was gearing up for a chapter on pet intelligence. I knew a lot had been written about dogs, and I assumed there must be at least a handful of studies on cats. But after weeks of scouring the scientific world for someone—anyone—who studied how cats think, all I was left with was this statement, laughed over the phone to me by one of the world’s top animal cognition experts, a Hungarian scientist named Ádám Miklósi. We are living in a golden age of canine cognition. Nearly a dozen laboratories around the world study the dog mind, and in the past decade scientists have published hundreds of articles on the topic. Researchers have shown that Fido can learn hundreds of words, may be capable of abstract thought, and possesses a rudimentary ability to intuit what others are thinking, a so-called theory of mind once thought to be uniquely human. Miklósi himself has written an entire textbook on the canine mind—and he’s a cat person. I knew I was in trouble even before I got Miklósi on the phone. After contacting nearly every animal cognition expert I could find (people who had studied the minds of dogs, elephants, chimpanzees, and other creatures), I was given the name of one man who might, just might, have done a study on cats. His name was Christian Agrillo, and he was a comparative psychologist at the University of Padova in Italy. When I looked at his website, I thought I had the wrong guy. A lot of his work was on fish. But when I talked to him he confirmed that, yes, he had done a study on felines. Then he laughed. “I can assure you that it’s easier to work with fish than cats,” he said. “It’s incredible.” © 2014 The Slate Group LLC.

Keyword: Intelligence; Aggression
Link ID: 19522 - Posted: 04.23.2014

Josh Fischman

Dogs and cats, historically, have been people’s property like a couch or a toaster. But as they’ve moved into our houses and our hearts, courts of law have begun to treat them as something more. They can inherit your estate, get an appointed lawyer if your relatives challenge that inheritance and are protected from cruel acts. Your toaster can’t do any of that. As these animals inch closer to citizens' rights, the trend is being watched with worried eyes by biomedical researchers who fear judges could extend these rights to lab animals like monkeys and rats, thereby curbing experimentation. It also disturbs veterinarians who fear a flood of expensive malpractice suits if pets are worth more than their simple economic value. David Grimm, deputy news editor for Science magazine, explores this movement in his book Citizen Canine: Our Evolving Relationship with Cats and Dogs (PublicAffairs Books, 2014), published this month. He explained to Scientific American why scientists and animal doctors have good reason to be concerned. An edited transcript of the interview follows.

In what way have dogs and cats moved beyond the status of property?

They can inherit money, for one thing. And since property cannot inherit property, that makes them different. Legal scholars say that is the biggest change. About 25 US states have adopted the Uniform Trust Code, which allows animals to inherit. Also judges have granted owners of slain animals awards of emotional damages. You cannot get emotional damages from the loss of a toaster. In 2004 a California jury awarded a man named Marc Bluestone $39,000 for the loss of his dog Shane; $30,000 of that was for Shane’s special and unique value to Bluestone. © 2014 Nature Publishing Group

Keyword: Animal Rights
Link ID: 19521 - Posted: 04.23.2014

By David Z. Hambrick and Christopher Chabris

The College Board—the standardized testing behemoth that develops and administers the SAT and other tests—has redesigned its flagship product again. Beginning in spring 2016, the writing section will be optional, the reading section will no longer test “obscure” vocabulary words, and the math section will put more emphasis on solving problems with real-world relevance. Overall, as the College Board explains on its website, “The redesigned SAT will more closely reflect the real work of college and career, where a flexible command of evidence—whether found in text or graphic [sic]—is more important than ever.” A number of pressures may be behind this redesign. Perhaps it’s competition from the ACT, or fear that unless the SAT is made to seem more relevant, more colleges will go the way of Wake Forest, Brandeis, and Sarah Lawrence and join the “test optional admissions movement,” which already boasts several hundred members. Or maybe it’s the wave of bad press that standardized testing, in general, has received over the past few years. Critics of standardized testing are grabbing this opportunity to take their best shot at the SAT. They make two main arguments. The first is simply that a person’s SAT score is essentially meaningless—that it says nothing about whether that person will go on to succeed in college. Leon Botstein, president of Bard College and longtime standardized testing critic, wrote in Time that the SAT “needs to be abandoned and replaced.” © 2014 The Slate Group LLC.

Keyword: Intelligence
Link ID: 19508 - Posted: 04.19.2014

James Gorman

Crows and their relatives, like jays and rooks, are definitely in the gifted class when it comes to the kinds of cognitive puzzles that scientists cook up. They recognize human faces. They make tools to suit a given problem. Sometimes they seem, as humans like to say, almost human. But the last common ancestor of humans and crows lived perhaps 300 million years ago, and was almost certainly no intellectual giant. So the higher levels of crow and primate intelligence evolved on separate tracks, but somehow reached some of the same destinations. And scientists are now asking what crows can’t do, as one way to understand how they learn and how their intelligence works. One very useful tool for this research comes from an ancient Greek (or perhaps Ethiopian), the fabulist known as Aesop. One of his stories is about a thirsty crow that drops pebbles into a pitcher to raise the level of water high enough that it can get a drink. Researchers have modified this task by adding a floating morsel of food to a tube with water and seeing which creatures solve the problem of using stones to raise the water enough to get the food. It can be used for a variety of species because it’s new to all of them. “No animal has a natural predisposition to drop stones to change water levels,” said Sarah Jelbert, a Ph.D. student at the University of Auckland in New Zealand, who works with crows. But in the latest experiment to test the crows, Ms. Jelbert, working with Alex Taylor and Russell Gray of Auckland and Lucy Cheke and Nicola Clayton of the University of Cambridge in England, found some clear limitations to what the crows can learn. And those limitations provide some hints to how they think. © 2014 The New York Times Company

Keyword: Intelligence; Aggression
Link ID: 19476 - Posted: 04.12.2014

For years, some biomedical researchers have worried that a push for more bench-to-bedside studies has meant less support for basic research. Now, the chief of one of the National Institutes of Health’s (NIH’s) largest institutes has added her voice—and hard data—to the discussion. Story Landis describes what she calls a “sharp decrease” in basic research at her institute, a trend she finds worrisome. In a blog post last week, Landis, director of the $1.6 billion National Institute of Neurological Disorders and Stroke (NINDS), says her staff started out asking why, in the mid-2000s, NINDS funding declined for R01s, the investigator-initiated grants that are the mainstay of most labs. After examining the aims and abstracts of grants funded between 1997 and 2012, her staff found that the portion of NINDS competing grant funding that went to basic research declined (from 87% to 71%) while applied research rose (from 13% to 29%). To dig deeper, the staffers divided the grants into four categories—basic/basic; basic/disease-focused; applied/translational; and applied/clinical. Here, the decline in basic/basic research was “striking”: It fell from 52% to 27% of new and competing grants, while basic/disease-focused has been rising. The same trend emerged when the analysts looked only at investigator-initiated grants, which are proposals based on a researcher’s own ideas, not a solicitation by NINDS for proposals in a specific area. The shift could reflect changes in science and “a natural progression of the field,” Landis writes. Or it could mean researchers “falsely believe” that NINDS is not interested in basic studies and they have a better shot at being funded if they propose disease-focused or applied studies. The tight NIH budget and new programs focused on translational research could be fostering this belief, she writes. When her staff compared applications submitted in 2008 and 2011, they found support for a shift to disease-focused proposals: There was a “striking” 21% decrease in the amount of funding requested for basic studies, even though those grants had a better chance of being funded. © 2014 American Association for the Advancement of Science.

Keyword: Movement Disorders
Link ID: 19440 - Posted: 04.02.2014

by Aviva Rutkin

Eureka! Like Archimedes in his bath, crows know how to displace water, showing that Aesop's fable The Crow and the Pitcher isn't purely fictional. To see if New Caledonian crows could handle some of the basic principles of volume displacement, Sarah Jelbert at the University of Auckland in New Zealand and her colleagues placed scraps of meat just out of a crow's reach, floating in a series of tubes that were part-filled with water. Objects potentially useful for bringing up the water level, like stones or heavy rubber erasers, were left nearby. The crows successfully figured out that heavy and solid objects would help them get a treat faster. They also preferred to drop objects in tubes where they could access a reward more easily, picking out tubes with higher water levels and choosing tubes of water over sand-filled ones. However, the crows failed at more challenging tasks that required an understanding of the effect of tube width or the ability to infer a hidden connection between two linked tubes. The crows displayed reasoning skills equivalent to those of an average 5-to-7-year-old human child, the researchers claim. Previously, Eurasian jays have shown some understanding of water displacement, as have chimpanzees and orang-utans; similar experiments could be used to assess and compare skill levels across species. "Any animal capable of picking up stones could potentially participate," write the researchers. © Copyright Reed Business Information Ltd.
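(For readers who want the arithmetic behind the trick, here is a minimal worked sketch of the displacement principle; the numbers are illustrative and do not come from the study itself. A fully submerged object raises the water level by its volume divided by the tube's cross-sectional area:

$$\Delta h = \frac{V_{\text{object}}}{A_{\text{tube}}}$$

For example, a stone of volume $V = 2\,\text{cm}^3$ dropped into a tube of cross-section $A = 4\,\text{cm}^2$ raises the water by $\Delta h = 0.5\,\text{cm}$, so lifting a floating treat by 3 cm would take six such stones. The same stone raises the level twice as much in a tube with half the cross-sectional area: the tube-width effect the crows reportedly failed to grasp.)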

Keyword: Intelligence; Aggression
Link ID: 19413 - Posted: 03.27.2014

By Dominic Basulto

In last weekend’s Wall Street Journal, two leading brain researchers conjectured that as a result of rapid breakthroughs in fields such as molecular biology and neuroscience, one day “brain implants” will be just about as common as getting a bit of plastic surgery is today. In short, today’s tummy tucks are tomorrow’s brain tucks. Similar to what you’d expect from watching science fiction films such as “The Matrix,” these brain implants would enable you to learn foreign languages effortlessly, upgrade your memory capabilities, and, yes, help you to know Kung Fu. Vinton Cerf argues that today’s Internet (think Google) is already a form of cognitive implant, helping us to learn the answer to just about anything within seconds. If computing power continues to increase at the same rate as it has for the past 50 years, it is likely that a single computer will have the computing capacity of a human brain by 2023. By 2045, a single computer could have the processing capability of all human brains put together. Just think what you’d be able to use Google to do then. You wouldn’t even need to type in a search query; your brain would already know the answer. Of course, the ability to create these brain implants raises a number of philosophical, ethical and moral questions. If you’re a young student having a tough time in a boring class, why not just buy a brain module that simulates the often repetitive nature of learning? If you’re a parent of a child looking to get into a top university, why not buy a brain implant as a way to gain an advantage over children from less privileged backgrounds, especially when it’s SAT time? Instead of the digital divide, we may be talking about the cognitive divide at some point in the next two decades. Some parents would be able to afford a 99th-percentile brain for their children, while others wouldn’t. © 1996-2014 The Washington Post

Keyword: Robotics
Link ID: 19391 - Posted: 03.21.2014