Chapter 1. An Introduction to Brain and Behavior
By EDWARD ROTHSTEIN PHILADELPHIA — Clambering upward in dim violet light, stepping from one glass platform to another, you trigger flashes of light and polyps of sound. You climb through protective tubes of metallic mesh as you make your way through a maze of pathways. You are an electrical signal coursing through a neural network. You are immersed in the human brain. Well, almost. Here at the Franklin Institute, you’re at least supposed to get that impression. You pass through this realm (the climbing is optional) as part of “Your Brain” — the largest permanent exhibition at this venerable institution, and one of its best. That show, along with two other exhibitions, opens on Saturday in the new $41 million, 53,000-square-foot Nicholas and Athena Karabots Pavilion. This annex — designed by Saylor Gregg Architects, with an outer facade draped in a “shimmer wall” of hinged aluminum panels created by the artist Ned Kahn — expands the institution’s display space, educational facilities and convention possibilities. It also completes a transformation that began decades ago, turning one of the oldest hands-on science museums in the United States (as the Franklin puts it) into a contemporary science center, which typically combines aspects of a school, community center, amusement park, emporium, theater, international museum and interactive science lab — while also combining, as do many such institutions, those elements’ strengths and weaknesses. That brain immersion gallery gives a sense of this genre’s approach. It is designed more for amusement, effect and social interaction (cherished science center goals) than understanding. So I climb, but I’m not convinced. I hardly feel like part of a network of dendrites and axons as I weave through these pathways. I try, though, to imagine these tubes of psychedelically illuminated mesh filled with dozens of chattering children leaping around. 
That might offer a better inkling of the unpredictable, raucous complexity of the human brain. © 2014 The New York Times Company
Link ID: 19730 - Posted: 06.14.2014
By ANNA NORTH The “brain” is a powerful thing. Not the organ itself — though of course it’s powerful, too — but the word. Including it in explanations of human behavior might make those explanations sound more legitimate — and that might be a problem. Though neuroscientific examinations of everyday experiences have fallen out of favor somewhat recently, the word “brain” remains popular in media. Ben Lillie, the director of the science storytelling series The Story Collider, drew attention to the phenomenon last week on Twitter, mentioning in particular a recent Atlantic article: “Your Kid’s Brain Might Benefit From an Extra Year in Middle School.” In the piece, Jessica Lahey, a teacher and education writer, examined the benefits of letting kids repeat eighth grade. Mr. Lillie told Op-Talk the word “brain” could take the emphasis off middle-school students as people: The piece, he said, was “not ignoring the fact that the middle schooler (in this case) is a person, but somehow taking a quarter-step away by focusing on a thing we don’t really think of as human.” The New York Times isn’t immune to “brain”-speak — in her 2013 project “Brainlines,” the artist Julia Buntaine collected all Times headlines using the word “brain” since 1851. She told Op-Talk in an email that “the number of headlines about the brain increased exponentially since around the year 2000, where some years before there were none at all, after that there were at least 30, 40, 80 headlines.” Adding “brain” to a headline may make it sound more convincing — some research shows that talking about the brain has measurable effects on how people respond to scientific explanations. In a 2008 study, researchers found that adding phrases like “brain scans indicate” to explanations of psychological concepts like attention made those explanations more satisfying to nonexpert audiences. Perhaps disturbingly, the effect was greatest when the explanations were actually wrong. © 2014 The New York Times Company
Link ID: 19703 - Posted: 06.06.2014
By Matty Litwack One year ago, I thought I was going to die. Specifically, I believed an amoeba was eating my brain. As I’ve done countless times before, I called my mother in a panic: “Mom, I think I’m dying.” As she has done countless times before, she laughed at me. She doesn’t really take me seriously anymore, because I’m a massive hypochondriac. If there exists a disease, I’ve probably convinced myself that I have it. Every time I have a cough, I assume it’s lung cancer. One time I thought I had herpes, but it was just a piece of candy stuck to my face. In the case of the brain amoeba, however, I had a legitimate reason to believe I was dying. Several days prior, I had visited a doctor to treat my nasal congestion. The doctor deemed my sickness not severe enough to warrant antibiotics and instead suggested I try a neti pot to clear up my congestion. A neti pot is a vessel shaped like a genie’s lamp that’s used to irrigate the sinuses with saline solution. My neti pot came with an instruction manual, which I immediately discarded. Why would I need instructions? Nasal irrigation seemed like a simple enough process: water goes up one nostril and flows down the other – that’s just gravity. I dumped a bottle of natural spring water into the neti pot, mixed in some salt, shoved it in my nostril and started pouring. If there was in fact a genie living in the neti pot, I imagine this was very unpleasant for him. The pressure in my sinuses was instantly reduced. It worked so well that over the next couple of days, I was raving about neti pots to anybody who would allow me to annoy them. It was honestly surprising how little people wanted to hear about nasal irrigation. Some nodded politely, others asked me to stop talking about it, but one friend had a uniquely interesting reaction: “Oh, you’re using a neti pot?” he asked. “Watch out for the brain-eating amoeba.” This was hands-down the strangest warning I had ever received. 
I assumed it was a joke, but I made a mental note to Google brain amoebas as soon as I was done proselytizing the masses on the merits of saltwater nose genies. © 2014 Scientific American
Link ID: 19618 - Posted: 05.15.2014
BERLIN—A national ad campaign targeting the work and person of neuroscientist Andreas Kreiter has caused an uproar in the German scientific community. Today, the Alliance of Scientific Organizations in Germany published a sharply worded statement against the full-page ads, which appeared in regional and national newspapers in April. The ad “crudely hurts the personal rights” of the scientist, the organizations write, and “defames biomedical research as a whole.” “The ad aims for personal annihilation,” Kreiter says, “and it is not acceptable for a state founded on the rule of law.” A professor of animal physiology at the University of Bremen (UB), Kreiter studies the neurophysiology of the macaque brain. His work has met with fierce resistance since the 1990s, but Kreiter says hostility peaked after he won a series of protracted legal battles over his work. The most recent trial finished in February, when the Federal Administrative Court of Germany confirmed earlier decisions that the animal distress caused by Kreiter’s research is justified given its scientific significance. A group called Tierversuchsgegner Bundesrepublik Deutschland (the German Association for Opponents of Animal Research), whose proclaimed goal is to end animal experimentation in Germany, has used advertising as a weapon for several years. But the most recent ad is the most personal and aggressive yet, Kreiter says; headlined “Kreiter continues in cold blood,” it features a photo of the researcher as well as a picture of a macaque, sitting immobilized in an experimental laboratory setup. It ran in many outlets on 16 and 17 April, including in leading newspapers such as the Frankfurter Allgemeine Zeitung and Die Zeit. © 2014 American Association for the Advancement of Science
Keyword: Animal Rights
Link ID: 19585 - Posted: 05.08.2014
Daniel Cressey Organizations such as People for the Ethical Treatment of Animals (PETA) have been campaigning for the disclosure of more information on animal research in the United Kingdom. The government of the United Kingdom wants to jettison rules that prevent it releasing any confidential information it holds about animal research, as part of a continuing push towards openness about such work. Animal-rights groups have long complained about what they characterise as a “secrecy clause” that prevents details of animal research in the UK being made public. The Home Office collects huge amounts of information such as the type of work done, the people and institutions doing it, and the results of inspections at laboratories. However it is currently prevented from revealing anything potentially considered confidential under ‘section 24’ of the rules governing animal research. Today the government said that it would like to repeal this blanket ban on information disclosure, as it has previously promised, and requested comment on its proposal. In place of section 24, it would like to introduce a new rule prohibiting disclosure only of information relating to “people, places and intellectual property”. Home Office minister Norman Baker said in the consultation document released today, “To maintain public trust we must be as open and transparent as possible about activities under the regulatory framework.” If implemented, the new rule would still keep names and locations of animal research out of the public domain — a key concern of many researchers who fear protests or even violent attacks from extremist animal rights protestors. © 2014 Nature Publishing Group
Keyword: Animal Rights
Link ID: 19572 - Posted: 05.05.2014
By David Grimm “We did one study on cats—and that was enough!” Those words effectively ended my quest to understand the feline mind. I was a few months into writing Citizen Canine: Our Evolving Relationship With Cats and Dogs, which explores how pets are blurring the line between animal and person, and I was gearing up for a chapter on pet intelligence. I knew a lot had been written about dogs, and I assumed there must be at least a handful of studies on cats. But after weeks of scouring the scientific world for someone—anyone—who studied how cats think, all I was left with was this statement, laughed over the phone to me by one of the world’s top animal cognition experts, a Hungarian scientist named Ádám Miklósi. We are living in a golden age of canine cognition. Nearly a dozen laboratories around the world study the dog mind, and in the past decade scientists have published hundreds of articles on the topic. Researchers have shown that Fido can learn hundreds of words, may be capable of abstract thought, and possesses a rudimentary ability to intuit what others are thinking, a so-called theory of mind once thought to be uniquely human. Miklósi himself has written an entire textbook on the canine mind—and he’s a cat person. I knew I was in trouble even before I got Miklósi on the phone. After contacting nearly every animal cognition expert I could find (people who had studied the minds of dogs, elephants, chimpanzees, and other creatures), I was given the name of one man who might, just might, have done a study on cats. His name was Christian Agrillo, and he was a comparative psychologist at the University of Padova in Italy. When I looked at his website, I thought I had the wrong guy. A lot of his work was on fish. But when I talked to him he confirmed that, yes, he had done a study on felines. Then he laughed. “I can assure you that it’s easier to work with fish than cats,” he said. “It’s incredible.” © 2014 The Slate Group LLC.
Josh Fischman Dogs and cats, historically, have been people’s property like a couch or a toaster. But as they’ve moved into our houses and our hearts, courts of law have begun to treat them as something more. They can inherit your estate, get an appointed lawyer if your relatives challenge that inheritance and are protected from cruel acts. Your toaster can’t do any of that. As these animals inch closer to citizens' rights, the trend is being watched with worried eyes by biomedical researchers who fear judges could extend these rights to lab animals like monkeys and rats, thereby curbing experimentation. It also disturbs veterinarians who fear a flood of expensive malpractice suits if pets are worth more than their simple economic value. David Grimm, deputy news editor for Science magazine, explores this movement in his book Citizen Canine: Our Evolving Relationship with Cats and Dogs (PublicAffairs Books, 2014), published this month. He explained to Scientific American why scientists and animal doctors have good reason to be concerned. An edited transcript of the interview follows. In what way have dogs and cats moved beyond the status of property? They can inherit money, for one thing. And since property cannot inherit property, that makes them different. Legal scholars say that is the biggest change. About 25 US states have adopted the Uniform Trust Code, which allows animals to inherit. Also judges have granted owners of slain animals awards of emotional damages. You cannot get emotional damages from the loss of a toaster. In 2004 a California jury awarded a man named Marc Bluestone $39,000 for the loss of his dog Shane; $30,000 of that was for Shane’s special and unique value to Bluestone. © 2014 Nature Publishing Group
Keyword: Animal Rights
Link ID: 19521 - Posted: 04.23.2014
By David Z. Hambrick and Christopher Chabris The College Board—the standardized testing behemoth that develops and administers the SAT and other tests—has redesigned its flagship product again. Beginning in spring 2016, the writing section will be optional, the reading section will no longer test “obscure” vocabulary words, and the math section will put more emphasis on solving problems with real-world relevance. Overall, as the College Board explains on its website, “The redesigned SAT will more closely reflect the real work of college and career, where a flexible command of evidence—whether found in text or graphic [sic]—is more important than ever.” A number of pressures may be behind this redesign. Perhaps it’s competition from the ACT, or fear that unless the SAT is made to seem more relevant, more colleges will go the way of Wake Forest, Brandeis, and Sarah Lawrence and join the “test optional admissions movement,” which already boasts several hundred members. Or maybe it’s the wave of bad press that standardized testing, in general, has received over the past few years. Critics of standardized testing are grabbing this opportunity to take their best shot at the SAT. They make two main arguments. The first is simply that a person’s SAT score is essentially meaningless—that it says nothing about whether that person will go on to succeed in college. Leon Botstein, president of Bard College and longtime standardized testing critic, wrote in Time that the SAT “needs to be abandoned and replaced,” © 2014 The Slate Group LLC.
Link ID: 19508 - Posted: 04.19.2014
James Gorman Crows and their relatives, like jays and rooks, are definitely in the gifted class when it comes to the kinds of cognitive puzzles that scientists cook up. They recognize human faces. They make tools to suit a given problem. Sometimes they seem, as humans like to say, almost human. But the last common ancestor of humans and crows lived perhaps 300 million years ago, and was almost certainly no intellectual giant. So the higher levels of crow and primate intelligence evolved on separate tracks, but somehow reached some of the same destinations. And scientists are now asking what crows can’t do, as one way to understand how they learn and how their intelligence works. One very useful tool for this research comes from an ancient Greek (or perhaps Ethiopian), the fabulist known as Aesop. One of his stories is about a thirsty crow that drops pebbles into a pitcher to raise the level of water high enough that it can get a drink. Researchers have modified this task by adding a floating morsel of food to a tube with water and seeing which creatures solve the problem of using stones to raise the water enough to get the food. It can be used for a variety of species because it’s new to all of them. “No animal has a natural predisposition to drop stones to change water levels,” said Sarah Jelbert, a Ph.D. student at Auckland University in New Zealand, who works with crows. But in the latest experiment to test the crows, Ms. Jelbert, working with Alex Taylor and Russell Gray of Auckland and Lucy Cheke and Nicola Clayton of the University of Cambridge in England, found some clear limitations to what the crows can learn. And those limitations provide some hints to how they think. © 2014 The New York Times Company
For years, some biomedical researchers have worried that a push for more bench-to-bedside studies has meant less support for basic research. Now, the chief of one of the National Institutes of Health’s (NIH’s) largest institutes has added her voice—and hard data—to the discussion. Story Landis describes what she calls a “sharp decrease” in basic research at her institute, a trend she finds worrisome. In a blog post last week, Landis, director of the $1.6 billion National Institute of Neurological Disorders and Stroke (NINDS), says her staff started out asking why, in the mid-2000s, NINDS funding declined for R01s, the investigator-initiated grants that are the mainstay of most labs. After examining the aims and abstracts of grants funded between 1997 and 2012, her staff found that the portion of NINDS competing grant funding that went to basic research has declined (from 87% to 71%) while applied research rose (from 13% to 29%). To dig deeper, the staffers divided the grants into four categories—basic/basic; basic/disease-focused; applied/translational; and applied/clinical. Here, the decline in basic/basic research was “striking”: It fell from 52% to 27% of new and competing grants, while basic/disease-focused has been rising (see graph). The same trend emerged when the analysts looked only at investigator-initiated grants, which are proposals based on a researcher’s own ideas, not a solicitation by NINDS for proposals in a specific area. The shift could reflect changes in science and “a natural progression of the field,” Landis writes. Or it could mean researchers “falsely believe” that NINDS is not interested in basic studies and they have a better shot at being funded if they propose disease-focused or applied studies. The tight NIH budget and new programs focused on translational research could be fostering this belief, she writes. 
When her staff compared applications submitted in 2008 and 2011, they found support for a shift to disease-focused proposals: There was a “striking” 21% decrease in the amount of funding requested for basic studies, even though those grants had a better chance of being funded. © 2014 American Association for the Advancement of Science.
Keyword: Movement Disorders
Link ID: 19440 - Posted: 04.02.2014
by Aviva Rutkin Eureka! Like Archimedes in his bath, crows know how to displace water, showing that Aesop's fable The Crow and the Pitcher isn't purely fictional. To see if New Caledonian crows could handle some of the basic principles of volume displacement, Sarah Jelbert at the University of Auckland in New Zealand and her colleagues placed scraps of meat just out of a crow's reach, floating in a series of tubes that were part-filled with water. Objects potentially useful for bringing up the water level, like stones or heavy rubber erasers, were left nearby. The crows successfully figured out that heavy and solid objects would help them get a treat faster. They also preferred to drop objects in tubes where they could access a reward more easily, picking out tubes with higher water levels and choosing tubes of water over sand-filled ones. However, the crows failed at more challenging tasks that required an understanding of the effect of tube width or the ability to infer a hidden connection between two linked tubes. The crows displayed reasoning skills equivalent to those of an average 5- to 7-year-old human child, the researchers claim. Previously, Eurasian jays have shown some understanding of water displacement, as have chimpanzees and orang-utans, and similar experiments could be used to assess and compare skill levels across species. "Any animal capable of picking up stones could potentially participate," write the researchers. © Copyright Reed Business Information Ltd.
By Dominic Basulto In last weekend’s Wall Street Journal, two leading brain researchers conjectured that as a result of rapid breakthroughs in fields such as molecular biology and neuroscience, one day “brain implants” will be just about as common as getting a bit of plastic surgery is today. In short, today’s tummy tucks are tomorrow’s brain tucks. Similar to what you’d expect from watching science fiction films such as “The Matrix,” these brain implants would enable you to learn foreign languages effortlessly, upgrade your memory capabilities, and, yes, help you to know Kung Fu. Vinton Cerf argues that today’s Internet (think Google) is already a form of cognitive implant, helping us to learn the answer to just about anything within seconds. If computing power continues to increase at the same rate as it has for the past 50 years, it is likely that a single computer will have the computing capacity of a human brain by 2023. By 2045, a single computer could have the processing capability of all human brains put together. Just think what you’d be able to use Google to do then. You wouldn’t even need to type in a search query; your brain would already know the answer. Of course, the ability to create these brain implants raises a number of philosophical, ethical and moral questions. If you’re a young student having a tough time in a boring class, why not just buy a brain module that simulates the often repetitive nature of learning? If you’re a parent of a child looking to get into a top university, why not buy a brain implant as a way to gain an advantage over children from less privileged backgrounds, especially when it’s SAT time? Instead of the digital divide, we may be talking about the cognitive divide at some point in the next two decades. Some parents would be able to afford a 99th-percentile brain for their children, while others wouldn’t. © 1996-2014 The Washington Post
Link ID: 19391 - Posted: 03.21.2014
Animal rights activists have dramatically shifted their tactics over the last decade, targeting individual researchers and the businesses that support them, instead of going after their universities. That’s the biggest revelation to come out of a report released today by the Federation of American Societies for Experimental Biology (FASEB), the largest coalition of biomedical research associations in the United States. The purpose of the report—The Threat of Extremism to Medical Research: Best Practices to Mitigate Risk through Preparation and Communication—is to provide guidance to scientists and institutions around the world in dealing with animal rights extremists. That includes individuals and groups that damage laboratories, send threatening e-mails, and even desecrate the graves of researchers’ relatives. In 2004, for example, Animal Liberation Front activists broke into psychology laboratories at the University of Iowa, where they smashed equipment, spray-painted walls, and removed hundreds of animals, causing more than $400,000 in damage. In 2009, extremists set fire to the car of a University of California, Los Angeles, neuroscientist who worked on rats and monkeys. And other researchers say activists have shown up at their homes in the middle of the night, threatening their families and children. “We wanted to create an international document to get people thinking about the potential of animal extremism,” says Michael Conn, a co-chair of the committee that created the report and the senior vice president for research at the Texas Tech University Health Sciences Center in Lubbock. “These activities can happen to anybody—no one is immune.” © 2014 American Association for the Advancement of Science
Keyword: Animal Rights
Link ID: 19355 - Posted: 03.13.2014
Penis envy. Repression. Libido. Ego. Few have left a legacy as enduring and pervasive as Sigmund Freud. Despite being dismissed long ago as pseudoscientific, Freudian concepts such as these not only permeate many aspects of popular culture, but also had an overarching influence on, and played an important role in the development of, modern psychology, leading Time magazine to name him as one of the most important thinkers of the 20th century. Before his rise to fame as the founding father of psychoanalysis, however, Freud trained and worked as a neurologist. He carried out pioneering neurobiological research, which was cited by Santiago Ramón y Cajal, the father of modern neuroscience, and helped to establish neuroscience as a discipline. The eldest of eight children, Freud was born on 6 May, 1856, in the Moravian town of Příbor, in what is now the Czech Republic. Four years later, Freud's father Jakob, a wool merchant, moved the family to Vienna in search of new business opportunities. Freud subsequently entered the university there, aged just 17, to study medicine and, in the second year of his degree, became preoccupied with scientific research. His early work was a harbinger of things to come – it focused on the sexual organs of the eel. The work was, by all accounts, satisfactory, but Freud was disappointed with his results and, perhaps dismayed by the prospect of dissecting more eels, moved to Ernst Brücke's laboratory in 1877. There, he switched to studying the biology of nervous tissue, an endeavour that would last for 10 years. © 2014 Guardian News and Media Limited
Link ID: 19350 - Posted: 03.12.2014
If you ever feel like your emotions are getting the best of you, you may want to try dimming the lights. According to researchers at the University of Toronto Scarborough, bright light can make us more emotional — for better or for worse — making us experience both positive and negative feelings more intensely. The findings seem to contradict commonly held notions that people feel happier and more optimistic on bright, sunny days and gloomier on dark, cloudy days. In fact, the idea for the study was spurred by findings that suicide rates peak in the late spring and summer, when sunshine is most abundant. “I was very surprised by this,” study author Alison Jing Xu told CBC News. Xu is an assistant professor of management at UTSC and the Rotman School of Management. “Normally I would say if brighter days generally increase people’s affect, then suicide rates should peak in winter — but actually it does not,” she said. Xu, along with the study’s co-author Aparna Labroo of Northwestern University in the U.S., conducted six experiments to explore the relationship between light and emotion. Their paper is published in the Journal of Consumer Psychology. Participants in each case were divided into two groups: Some were placed in a brightly lit room where fluorescent ceiling lights were turned on, while others were placed in a dimly lit room where the only light came from computer monitors. © CBC 2014
By BENEDICT CAREY BETHESDA, Md. — The police arrived at the house just after breakfast, dressed in full riot gear, and set up a perimeter at the front and back. Not long after, animal rights marchers began filling the street: scores of people, young and old, yelling accusations of murder and abuse, invoking Hitler, as neighbors stepped out onto their porches and stared. It was 1997, in Decatur, Ga. The demonstrators had clashed with the police that week, at the Yerkes National Primate Research Center at nearby Emory University, but this time, they were paying a personal call — on the house of the center’s director, inside with his wife and two teenage children. “I think it affected the three of them more than it did me, honestly,” said Dr. Thomas R. Insel, shaking his head at the memory. “But the university insisted on moving all of us to a safe place for a few days, to an ‘undisclosed location.’ “I’ll say this. I learned that if you’re going to take a stand, you’re going to make some people really angry — so you’d better believe in what you’re doing, and believe it completely.” For the past 11 years, Dr. Insel, a 62-year-old brain scientist, has run an equally contentious but far more influential outfit: the National Institute of Mental Health, the world’s leading backer of behavioral health research. The job comes with risk as well as power. Patient groups and scientists continually question the agency’s priorities, and politicians occasionally snipe at its decisions. Two previous directors resigned in the wake of inflammatory statements (one on marijuana laws, one comparing urban neighborhoods to jungles), and another stepped down after repeatedly objecting to White House decisions. © 2014 The New York Times Company
By Evelyn Boychuk Caleb is a 14-year-old who enjoys playing video games and reading any book he can get his hands on – and in his spare time, he edits neuroscience papers for a scientific journal. Frontiers for Young Minds is the first journal to bring kids into the middle of the scientific process by making them editors – and it’s free for everyone. The idea came “from the depths of my mind, in a moment when I was bored at a scientific meeting,” says Bob Knight, editor in chief of Frontiers for Young Minds and a professor of psychology and neuroscience at the University of California, Berkeley. This is one of many science outreach efforts that are trying to get youth excited about science, technology, engineering and math courses. A preview version with 15 articles was released at the Society for Neuroscience conference on Nov. 11. The official launch of the monthly journal is planned for the U.S.A. Science and Engineering Festival in Washington D.C. in April. “The kids have been great,” says Knight. “Their reviews are not filtered, they just tell you what they think.” In an e-mail, one of the young editors said, “'Hey Bob, I have to tell you, I didn’t understand anything in this article. The words are too big and it’s too confusing,'” Knight recounted. When Caleb was asked if he would edit an article for this preview, "it seemed like an interesting opportunity," he said, so he gave it a try. © CBC 2014
Link ID: 19145 - Posted: 01.18.2014
Imagine a couple of million years ago, a curious young alien from the planet Zantar — let's call him a grad student — lands on Earth, looks around and asks, "Who's the brainiest critter on this planet? Relative to body size, who's got the biggest brain?" The answer, back then, would not have been us. (Two million years ago, apes — even walking ones — had much smaller brains.) The brainiest weren't ancestral crows or parrots or magpies or ravens or elephants or colonies of ants or bees or termites. The Earthlings with the biggest brains back then were dolphins (and certain whales). The Zantarian grad student would have wanted to meet them. But had the grad student arrived earlier, dolphins wouldn't have been the champs, because evolution is always changing life. Lori Marino, at Emory University in Atlanta, has been studying fossilized brains. And looking back, she sees sudden spurts of brain growth in different animals. "[T]he most dramatic increase in brain-to-body ratio in dolphins and toothed whales occurred 35 million years ago," she tells Chris Impey, the astronomer and writer, in Talking About Life. Something happened to make their medium-sized brains bigger, Lori says, then bigger still. For 20 million years certain dolphin species kept their brains growing until — just as mysteriously as it started — about 15 million years ago, they stopped. Why? Had the dolphins answered some secret dolphin question? Figured out a puzzle? Adapted to an environmental change? Gotten tired? Hit a limit? What? Dolphin says, "Enough." ©2014 NPR
The battle over animal experimentation in Italy took a nasty turn this week when anonymous activists posted fliers showing photos, home addresses, and telephone numbers of scientists involved in animal research at the University of Milan and labeled them "murderers." The leaflets, which appeared during the night of 6–7 January, triggered widespread condemnation in academic and political circles. The posters targeted physiologist Edgardo D'Angelo, parasitologist Claudio Genchi, pharmacologist Alberto Corsini, and biologist Maura Francolini. The texts say they are “guilty” of performing animal experiments; Corsini is said to "have tortured and killed animals for more than 30 years.” His flier ends with his phone number and the suggestion to "call this executioner and tell him what you think of him." Although the fliers didn't contain a specific call to violence, the implicit threat is unmistakable, Italian scientists say. Pro-Test Italia, an organization that seeks to defend and explain animal research, has likened the campaign to a witch hunt. “It's unacceptable that those who work for the good of science and public health are called murderers by someone who publicly incites violence against them,” says Dario Padovan, a biologist and president of Pro-Test Italia. Many politicians condemned the new tactic as well. "I wish to express my deepest sympathy and support to the researchers in Milan for the intimidation and threats they suffered," Italy's minister of education, universities and research, Maria Chiara Carrozza, tweeted yesterday. The University of Milan has filed a complaint, and the city's police department has started an investigation. “We will strengthen our commitment to the defense of research as a tool to improve knowledge and care for sick people,” Gianluca Vago, the university's rector, told the newspaper Corriere della Sera. © 2014 American Association for the Advancement of Science
Keyword: Animal Rights
Link ID: 19116 - Posted: 01.11.2014
By Christian Jarrett. Christmas is over and the start of the movie awards season is only weeks away! This is my excuse for a post about cinema and the brain. Over the years I’ve been keeping note of actors who studied neuroscience and other similar factoids, and now I have the chance to share them with you. So here, in no particular order, are 10 surprising links between the worlds of Hollywood and brain research: 1. Actress Mayim Bialik is a neuroscientist. Bialik currently plays the character of neuroscientist Amy Farrah Fowler on The Big Bang Theory, which is neat because Bialik herself has a PhD in neuroscience. Her PhD thesis, completed at UCLA in 2007, has the title: “Hypothalamic regulation in relation to maladaptive, obsessive-compulsive, affiliative, and satiety behaviors in Prader-Willi syndrome.” “I don’t try and rub my neuroscience brain in people’s face[s],” Bialik says, “but when we have lab scenes … I have had to say that’s not where the tectum would be, we need it down here … or I’ve actually carved the fourth ventricle into slices … ’cause you know, why not have me do it.” Among her other acting roles, Bialik also featured in the short film for Michael Jackson’s Liberian Girl, and she played the child version of Bette Midler’s character in Beaches (1988). 2. Natalie Portman is a neuroscientist. Perform a Google Scholar search on her name and you won’t get very far. But under her original name of Natalie Hershlag, the Oscar-winning actress co-authored a paper in 2002 on the role of the frontal lobes in infants’ understanding of “object permanence” – recognizing that things still exist even when you can’t see them. According to the Mind Hacks blog, Ms. Portman contributed to this research while working as a research assistant at Harvard University. Her paper has now been cited in the literature over 100 times. © 2013 Condé Nast.
Link ID: 19079 - Posted: 12.31.2013