Links for Keyword: Language



Links 81 - 100 of 492

By Karen Weintraub Marjorie Nicholas, associate chairwoman of the department of communication sciences and disorders at the MGH Institute of Health Professions, is an expert in the language disorder aphasia, and has been treating former Arizona Representative Gabrielle Giffords, who has the condition.
Q. What is aphasia and how do people get it?
A. Aphasia affects your ability to speak, to understand language, to read and to write. It’s extremely variable. Some might have a severe problem in expression but really pretty good understanding of spoken language, and somebody else might have a very different profile. Typically, people get aphasia by having a stroke that damages parts of the left side of the brain, which is dominant for language. People can also get aphasia from other types of injuries like head injuries, or in Gabby’s case, a gunshot wound to the head that damages that same language area of the brain. It is more common than people realize.
Q. How does Giffords fit into the spectrum of symptoms you’ve described?
A. Her understanding of spoken language is really very good. Her difficulties are more in the expression.
Q. You obviously can’t violate her privacy, but what can you say about your work with Giffords?
A. I worked with her for two weeks last fall, and [colleague Nancy Helm-Estabrooks of the University of North Carolina] and I are planning to work with her again for a week this spring. We’ll need to see where she is again. I’m assuming she will have continued to improve and we’ll want to keep her going on that track. © 2012 NY Times Co

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16444 - Posted: 02.28.2012

By Ella Davies Reporter, BBC Nature Pygmy goats can develop "accents" as they grow older, according to scientists. The young animals, known as "kids", are raised in groups or "creches" with goats of a similar age. Researchers found that when young goats mixed in these social groups their calls became more similar. The animals join an elite group of mammals, including humans, bats and whales, known to adapt their vocal sounds in response to the environment. Dr Elodie Briefer and Dr Alan McElligott from Queen Mary's School of Biological and Chemical Sciences at the University of London, UK, published their results in the journal Animal Behaviour. To test the goats' vocal repertoire, they recorded the kids' calls at one week old and again at five weeks of age. "Five weeks corresponds to the time when, in the wild, they join their social group after spending some time hidden in vegetation to avoid predators," Dr Briefer explained. "We found that genetically-related kids produced similar calls... but the calls of kids raised in the same social groups were also similar to each other, and became more similar as the kids grew older." BBC © 2012

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16400 - Posted: 02.20.2012

By Bruce Bower By age 6 months, infants on the verge of babbling already know — at least in a budding sense — the meanings of several common nouns for foods and body parts, a new study finds. Vocabulary learning and advances in sounding out syllables and consonants go hand in hand starting at about age 6 months, say graduate student Elika Bergelson and psychologist Daniel Swingley of the University of Pennsylvania. Babies don’t blurt out their first words until around 1 year of age. Bergelson and Swingley’s evidence that 6-month-olds direct their gaze to images of bananas, noses and other objects named by their mothers challenges the influential view that word learning doesn’t start until age 9 months. “Our guess is that a special human desire for social connection, on the part of parents and their infants, is an important component of early word learning,” Bergelson says. The work is published online the week of February 13 in the Proceedings of the National Academy of Sciences. In the study, 33 infants ages 6 to 9 months and 50 kids ages 10 to 20 months sat on their mothers’ laps in front of a computer connected to an eye-tracking device. Even at 6 months, babies looked substantially longer, on average, at images of various foods and body parts named by their mothers when those items appeared with other objects. © Society for Science & the Public 2000 - 2012

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 13: Memory, Learning, and Development
Link ID: 16378 - Posted: 02.14.2012

by Gisela Telis The right turn of phrase can activate the brain's sensory centers, a new study suggests. Researchers have found that textural metaphors—phrases such as "soft-hearted"—turn on a part of the brain that's important to the sense of touch. The result may help resolve a long-standing controversy over how the brain understands metaphors and may offer scientists a new way to study how different brain regions communicate. Scientists have disagreed for decades about how the brain processes metaphors, those figures of speech that liken one thing to another without using "like" or "as." One camp claims that when we hear a metaphor—a friend tells us she's had a rough day—we understand the expression only because we've heard it so many times. The brain learns that "rough" means both "abrasive" and "bad," this camp says, and it toggles from one definition to the other. The other camp claims the brain calls on sensory experiences, such as what roughness feels like, to comprehend the metaphor. Researchers from both camps have scanned the brain for signs of sensory activity triggered by metaphors, but these past studies, which tested a variety of metaphors without targeting specific senses or regions of the brain, have come up dry. Neurologist Krish Sathian of Emory University in Atlanta wondered whether using metaphors specific to only one of the senses might be a better strategy. He and his colleagues settled on touch and asked seven college students to distinguish between different textures while their brains were scanned using functional magnetic resonance imaging. This enabled them to map the brain regions each subject used to feel and classify textures. Then they scanned the subjects' brains again as they listened to a torrent of textural metaphors and their literal counterparts: "he is wet behind the ears" versus "he is naïve," for example, or "it was a hairy situation" versus "it was a precarious situation." © 2010 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 5: The Sensorimotor System
Link ID: 16363 - Posted: 02.09.2012

by Helen Thomson When you read this sentence to yourself, it's likely that you hear the words in your head. Now, in what amounts to technological telepathy, others are on the verge of being able to hear your inner dialogue too. By peering inside the brain, it is possible to reconstruct speech from the activity that takes place when we hear someone talking. Because this brain activity is thought to be similar whether we hear a sentence or think the same sentence, the discovery brings us a step closer to broadcasting our inner thoughts to the world without speaking. The implications are enormous – people made mute through paralysis or locked-in syndrome could regain their voice. It might even be possible to read someone's mind. Imagine a musician watching a piano being played with no sound, says Brian Pasley at the University of California, Berkeley. "If a pianist were watching a piano being played on TV with the sound off, they would still be able to work out what the music sounded like because they know what key plays what note," Pasley says. His team has done something analogous with brain waves, matching neural areas to their corresponding noises. How the brain converts speech into meaningful information is a bit of a puzzle. The basic idea is that sound activates sensory neurons, which then pass this information to different areas of the brain where various aspects of the sound are extracted and eventually perceived as language. Pasley and colleagues wondered whether they could identify where some of the most vital aspects of speech are extracted by the brain. © Copyright Reed Business Information Ltd
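The decoding idea described above is, at its core, a regression problem: learn a mapping from recorded brain activity back to the sound spectrogram the listener heard. The following is only a minimal sketch of that idea on simulated data; the array sizes, the simulated signals and the ridge-regression decoder are illustrative assumptions, not the team's actual recordings or pipeline.

```python
# Illustrative sketch of spectrogram reconstruction from neural activity.
# All data below are simulated placeholders; the published work used real
# cortical recordings and a more elaborate decoding model.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Simulated experiment: at each time bin we have the heard speech spectrogram
# (n_freqs frequency bands) and activity on n_electrodes cortical channels.
n_bins, n_electrodes, n_freqs = 2000, 64, 32
spectrogram = rng.random((n_bins, n_freqs))            # "heard" speech features
true_map = rng.normal(size=(n_freqs, n_electrodes))    # unknown brain response
neural = spectrogram @ true_map + 0.5 * rng.normal(size=(n_bins, n_electrodes))

# Fit a linear decoder that predicts each spectrogram band from neural activity,
# then check how well it reconstructs held-out "speech".
X_train, X_test, y_train, y_test = train_test_split(
    neural, spectrogram, test_size=0.25, random_state=0)
decoder = Ridge(alpha=1.0).fit(X_train, y_train)
reconstruction = decoder.predict(X_test)

corr = np.corrcoef(reconstruction.ravel(), y_test.ravel())[0, 1]
print(f"correlation between reconstructed and actual spectrogram: {corr:.2f}")
```

On data generated this way the correlation is high by construction; with real recordings, a decoder like this recovers speech only to the extent that the electrodes actually carry spectral information.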

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 5: The Sensorimotor System
Link ID: 16329 - Posted: 02.02.2012

By Carrie Arnold Put on a pair of headphones and turn up the volume so that you can’t even hear yourself speak. For those who stutter, this is when the magic happens. Without the ability to hear their own voice, people with this speech impediment no longer stumble over their words—as was recently portrayed in the movie The King’s Speech. This simple trick works because of the unusual way the brains of people who stutter are organized—a neural setup that affects other actions besides speech, according to a new study. Normal speech requires the brain to control movement of the mouth and vocal cords using the sound of the speaker’s own voice as a guide. This integration of movement and hearing typically happens in the brain’s left hemisphere, in a region of the brain known as the premotor cortex. In those who stutter, however, the process occurs in the right hemisphere—probably because of a slight defect on the left side, according to past brain-imaging studies. Singing requires a similar integration of aural input and motor control, but the processing typically occurs in the right hemisphere, which may explain why those who stutter can sing as well as anyone else. (In a related vein, The King’s Speech also mentioned the common belief that people who stutter are often left-handed, but studies have found no such link.) In the new study, published in the September issue of Cortex, researchers found that the unusual neural organization underlying a stutter also includes motor tasks completely unrelated to speech. A group of 30 adults, half of whom stuttered and half of whom did not, tapped a finger in time to a metronome. When the scientists interfered with the function of their left hemisphere using transcranial magnetic stimulation, a noninvasive technique that temporarily dampens brain activity, nonstutterers found themselves unable to tap in time—but those who stuttered were unaffected. When the researchers interfered with the right hemisphere, the results were reversed: the stuttering group was impaired, and the nonstutterers were fine. © 2012 Scientific American

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 13: Memory, Learning, and Development
Link ID: 16289 - Posted: 01.24.2012

Sean O'Neill, contributor Sue Savage-Rumbaugh is a primatologist who works at the Great Ape Trust in Des Moines, Iowa, where she explores the mental and linguistic skills of our primate cousins.
As a graduate, you were all set to do a postdoc in psychology at Harvard University. What happened?
Yes, I was due to go to Harvard to work with behaviourist B. F. Skinner and his famous pigeons. But before I left I happened to sit in on a class by primate researcher Roger Fouts, who brought a chimpanzee named Booee to class. Roger held up objects like a hat, a key and a pair of shoes, and Booee would make what Roger said were signs for those objects. I saw a chimpanzee doing what seemed to be a symbolic task and I was hooked. I said to myself: "Wait a minute, people are teaching chimpanzees human language, and I'm going to Harvard to study pigeons? You need to stay here, this is where it's at if you are interested in the origins of the human mind." I have worked with apes ever since.
Your work on the linguistic capabilities of apes has taken you into uncharted territory...
Yes, for better or for worse, I have gone to a place that other researchers have not. If I had had any inkling of the huge degree of linguistic, conceptual and social similarity between ourselves and bonobos when I started the work I would have been scared to death to do it. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16277 - Posted: 01.21.2012

By Joan Raymond For years, the conventional wisdom was that babies learned how to talk by listening to their parents. But a new study in the Proceedings of the National Academy of Sciences shows that our little angels are using more than their ears to acquire language. They’re using their eyes, too, and are actually pretty good lip readers. The finding could lead to earlier diagnosis and intervention for autism spectrum disorders, estimated, on average, to affect 1 in 110 children in the United States alone. In the study, researchers from Florida Atlantic University tested groups of infants ranging from four to 12 months of age, and a group of adults for comparison. The babies watched videos of women speaking either in English, the native language used in the home, or in Spanish, a language foreign to them. Using an eye tracker device to study eye movements, the researchers looked at developmental changes in attention to the eyes and mouth. Results showed that at four months of age, babies focused almost solely on the women’s eyes. But by six to eight months of age, when the infants entered the so-called “babbling” stage of language acquisition and reached a milestone of cognitive development in which they can direct their attention to things they find interesting, their focus shifted to the women’s mouths. They continue to “lip read” until about 10 months of age, a point when they finally begin mastering the basic features of their native language. At this point, infants also begin to shift their attention back to the eyes. © 2012 msnbc.com

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 15: Language and Our Divided Brain
Link ID: 16265 - Posted: 01.17.2012

By Victoria Gill Science reporter, BBC Nature Chimpanzees appear to consider who they are "talking to" before they call out. Researchers found that wild chimps that spotted a poisonous snake were more likely to make their "alert call" in the presence of a chimp that had not seen the threat. This indicates that the animals "understand the mindset" of others. The insight into the primates' remarkable intelligence will be published in the journal Current Biology. The University of St Andrews scientists, who carried out the work, study primate communication to uncover some of the origins of human language. To find out how the animals "talked to each other" about potential threats, they placed plastic snakes - models of rhino and gaboon vipers - into the paths of wild chimpanzees and monitored the primates' reactions. "These [snake species] are well camouflaged and they have a deadly bite," explained Dr Catherine Crockford from University of St Andrews, who led the research. "They also tend to sit in one place for weeks. So if a chimp discovers a snake, it makes sense for that animal to let everyone else know where [it] is." The scientists put the snake on a path that the chimps were using regularly, secreting the plastic models in the leaves. "When [the chimps] saw the model, they would be quite close to it and would leap away, but they wouldn't call," she told BBC Nature. BBC © 2011

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16205 - Posted: 01.03.2012

by Robert Kowalski What is the relationship between language and thought? The quest to create artificial intelligence may have come up with some unexpected answers.
The idea of machines that think and act as intelligently as humans can generate strong emotions. This may explain why one of the most important accomplishments in the field of artificial intelligence has gone largely unnoticed: that some of the advances in AI can be used by ordinary people to improve their own natural intelligence and communication skills. Chief among these advances is a form of logic called computational logic. This builds and improves on traditional logic, and can be used both for the original purpose of logic - to improve the way we think - and, crucially, to improve the way we communicate in natural languages, such as English. Arguably, it is the missing link that connects language and thought. According to one school of philosophy, our thoughts have a language-like structure that is independent of natural language: this is what students of language call the language of thought (LOT) hypothesis. According to the LOT hypothesis, it is because human thoughts already have a linguistic structure that the emergence of common, natural languages was possible in the first place. The LOT hypothesis contrasts with the mildly contrary view that human thinking is actually conducted in natural language, and thus we could not think intelligently without it. It also contradicts the ultra-contrary view that human thinking does not have a language-like structure at all, implying that our ability to communicate in natural language is nothing short of a miracle. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 15: Language and Our Divided Brain
Link ID: 16131 - Posted: 12.10.2011

by Robert Krulwich Here's something you should know about yourself. Vowels control your brain. "I"s make you see things differently than "O"s. Here's how. Say these words out loud: Bean, Mint, Slim. These "I" and "E" vowels are formed by putting your tongue forward in the mouth. That's why they're called "front" vowels. Now, say: Large, Pod, Or, Ought. With these words, your tongue depresses and folds back a bit. So "O", "A" and "U" are called "back of the throat" vowels. OK, here's the weird part. When comparing words across language groups, says Stanford linguistics professor Dan Jurafsky, a curious pattern shows up: Words with front vowels ("I" and "E") tend to represent small, thin, light things. Back vowels ("O", "U" and some "A"s) show up in fat, heavy things. It's not always true, but it's a tendency that you can see in any of the stressed vowels in words like little, teeny or itsy-bitsy (all front vowels) versus humongous or gargantuan (back vowels). Or the i vowel in Spanish chico (front vowel meaning small) versus gordo (back vowel meaning fat). Or French petit (front vowel) versus grand (back vowel). Copyright 2011 NPR

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16126 - Posted: 12.10.2011

by Traci Watson Parrots have neither lips nor teeth, but that doesn't stop them from producing dead-on imitations of human speech. Now researchers have learned part of the reason: like humans, parrots use their tongues to form sounds. As they report today in The Journal of Experimental Biology, scientists took x-ray movies of monk parakeets, Myiopsitta monachus, South American natives that can be trained to speak but aren't star talkers. The parakeets lowered their tongues during loud, squeaky "contact calls" made when the birds can't see each other and during longer, trilling "greeting calls" made to show a social connection. As seen in the video, the parakeets also moved their tongues up and down while chattering. No other type of bird is known to move its tongue to vocalize. Parrots use their mobile, muscular tongues to explore their environment and manipulate food. Those capable organs, just by coincidence, also help parrots utter greetings in words that even humans can understand. © 2010 American Association for the Advancement of Science.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16125 - Posted: 12.08.2011

by Catherine de Lange Chimpanzee brains may be hard-wired to evolve language even though they can't talk. That is the suggestion of a study which found chimps link sounds and levels of brightness, something akin to synaesthesia in people. Such an association could help explain how our early ancestors took the first vital step from ape-like grunts to a proper vocabulary. Synaesthetes make unusual connections between different senses – they might sense certain tastes when they hear music, or "see" numbers as colours. This is less unusual than you might think: "The synaesthetic experience is a continuum," explains Roi Cohen Kadosh of University College London. "Most people have it at an implicit level, and some people have a stronger connection." Now, Vera Ludwig from the Charité University of Medicine in Berlin, Germany, and colleagues have shown for the first time that chimpanzees also make cross-sensory associations, suggesting they evolved early on. The team repeatedly flashed either black or white squares for 200 milliseconds at a time on screens in front of six chimpanzees (Pan troglodytes) and 33 humans. The subjects had to indicate whether the square was black or white by touching a button of the right colour. A high or low-pitched sound was randomly played in the background during each test. Chimps and humans were better at identifying white squares when they heard a high-pitched sound, and more likely to correctly identify dark squares when played a low-pitched sound. But performance was poor when the sounds were swapped: humans were slower to identify a white square paired with a low-pitched noise, or a black square with a high-pitched noise, and the chimps' responses became significantly less accurate. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16114 - Posted: 12.06.2011

By Charles Q. Choi Ravens use their beaks and wings much as humans rely on their hands to make gestures, such as pointing to an object, scientists now find. This is the first time researchers have seen gestures used in this way in the wild by animals other than primates. From the age of 9 to 12 months, human infants often use gestures to direct the attention of adults to objects, or to hold up items so that others can take them. These gestures, produced before children speak their first words, are seen as milestones in the development of human speech. Dogs and other animals are known to point out items using gestures, but humans trained these animals, and scientists had suggested the natural development of these gestures was normally confined only to primates, said researcher Simone Pika, a biologist at the Max Planck Institute for Ornithology in Seewiesen, Germany. Even then, comparable gestures are rarely seen in the wild in our closest living relatives, the great apes—for instance, chimpanzees in the Kibale National Park in Uganda employ so-called directed scratches to indicate distinct spots on their bodies they want groomed. Still, ravens and their relatives such as crows and magpies have been found to be remarkably intelligent over the years, surpassing most other birds in terms of smarts and even rivaling great apes on some tests. © 2011 Scientific American

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16098 - Posted: 12.01.2011

by Marion Long and Valerie Ross For centuries experts held that every language is unique. Then one day in 1956, a young linguistics professor gave a legendary presentation at the Symposium on Information Theory at MIT. He argued that every intelligible sentence conforms not only to the rules of its particular language but to a universal grammar that encompasses all languages. And rather than absorbing language from the environment and learning to communicate by imitation, children are born with the innate capacity to master language, a power imbued in our species by evolution itself. Almost overnight, linguists’ thinking began to shift. Avram Noam Chomsky was born in Philadelphia on December 7, 1928, to William Chomsky, a Hebrew scholar, and Elsie Simonofsky Chomsky, also a scholar and an author of children’s books. While still a youngster, Noam read his father’s manuscript on medieval Hebrew grammar, setting the stage for his work to come. By 1955 he was teaching linguistics at MIT, where he formulated his groundbreaking theories. Today Chomsky continues to challenge the way we perceive ourselves. Language is “the core of our being,” he says. “We are always immersed in it. It takes a strong act of will to try not to talk to yourself when you’re walking down the street, because it’s just always going on.” Chomsky also bucked against scientific tradition by becoming active in politics. He was an outspoken critic of American involvement in Vietnam and helped organize the famous 1967 protest march on the Pentagon. When the leaders of the march were arrested, he found himself sharing a cell with Norman Mailer, who described him in his book Armies of the Night as “a slim, sharp-featured man with an ascetic expression, and an air of gentle but absolute moral integrity.” © 2011, Kalmbach Publishing Co.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16094 - Posted: 12.01.2011

Ewen Callaway A mutation that appeared more than half a million years ago may have helped humans learn the complex muscle movements that are critical to speech and language. The claim stems from the finding that mice genetically engineered to produce the human form of the gene, called FOXP2, learn more quickly than their normal counterparts. The work was presented by Christiane Schreiweis, a neuroscientist at the Max Planck Institute (MPI) for Evolutionary Anthropology in Leipzig, Germany, at the Society for Neuroscience meeting in Washington DC this week. Scientists discovered FOXP2 in the 1990s by studying a British family known as 'KE' in which three generations suffered from severe speech and language problems [1]. Those with language problems were found to share an inherited mutation that inactivates one copy of FOXP2. Most vertebrates have nearly identical versions of the gene, which is involved in the development of brain circuits important for the learning of movement. The human version of FOXP2, the protein encoded by the gene, differs from that of chimpanzees at two amino acids, hinting that changes to the human form may have had a hand in the evolution of language [2]. A team led by Schreiweis’ colleague Svante Pääbo discovered that the gene is identical in modern humans (Homo sapiens) and Neanderthals (Homo neanderthalensis), suggesting that the mutation appeared before these two human lineages diverged around 500,000 years ago [3]. © 2011 Nature Publishing Group

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 13: Memory, Learning, and Development
Link ID: 16058 - Posted: 11.19.2011

by Nora Schultz Actions speak louder than words. Baby chimps, bonobos, gorillas and orang-utans – our four closest living relatives – quickly learn to use visual gestures to get their message across, providing the latest evidence that hand waving may have been a vital first step in the development of human language. After a long search for the origins of language in animal vocalisations, some evolutionary biologists have begun to change tack. The emerging "gesture theory" of language evolution has it that our ancestors' linguistic abilities may have begun with their hands rather than their vocal cords. Katja Liebal and colleagues at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, have found new evidence for the theory by studying how communication develops in our closest living relatives. They discovered that all four great apes – chimps, bonobos, gorillas and orang-utans – develop a complex repertoire of gestures during the first 20 months of life.
Look at me
Those gestures included the tactile pokes and nudges that are expected to effectively capture another's attention in any situation, but they also included visual gestures such as extending the arms towards another ape or head shaking. To be effective communication tools, these visual gestures require that a young ape be aware that another individual is paying attention before using them, if they want to get their message across. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16016 - Posted: 11.11.2011

By Ritchie S. King As plain-tailed wrens dart through Chusquea bamboo in the Andes, they can be heard singing a kind of song that no other bird is known to sing: a cooperative duet. New research shows that male-female pairs take turns producing notes, at a combined rate of three to six per second, to create what sounds like a single bird’s song. Each member of the duo reacts to what the other one does, adjusting the timing and pitch as needed to maintain the melody the two are trying to play together. The duet is like humans dancing, said Eric Fortune, a neuroscientist at Johns Hopkins University and an author of the study, which appeared in the journal Science. The cues between the birds are “continuous and subtle,” and brain scans show that each bird learns the entire duet — as a pair of ballroom dancers learns choreography — instead of only memorizing its individual part. In the world of plain-tailed wrens, it appears that females always lead, singing a simple backbone melody that the males fill in with something more variable, like a guitar solo. The research team suspects that a female engages in cooperative singing to put a male’s chirping prowess to the test and thereby determine his suitability as a mate. While alone, a female wren practices her section of a duet at full volume. But males make more mistakes during cooperative singing, so they tweet much more timidly when they rehearse their part. © 2011 The New York Times Company

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 12: Sex: Evolutionary, Hormonal, and Neural Bases
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 8: Hormones and Sex
Link ID: 16007 - Posted: 11.08.2011

by Jennifer Barone Language seems to set humans apart from other animals, but scientists cannot just hand monkeys and birds an interspecies SAT to determine which linguistic abilities are singularly those of Homo sapiens and which we share with other animals. In August neuroscientists Kentaro Abe and Dai Watanabe of Kyoto University announced that they had devised the next-best thing, a systematic test of birds’ grammatical prowess. The results suggest that Bengalese finches have strict rules of syntax: The order of their chirps matters. “It’s the first experiment to show that any animal has perceived the especially complex patterns that supposedly make human language unique,” says Timothy Gentner, who studies animal cognition and communication at the University of California, San Diego, and was not involved in the study. Finches cry out whenever they hear a new tune, so Abe and Watanabe started by having individual birds listen to an unfamiliar finch’s song. At first the listeners called out in reply, but after 200 playbacks, their responses died down. Then the researchers created three remixes by changing the order of the song’s component syllables. The birds reacted indifferently to two of the revised tunes; apparently the gist of the message remained the same. But one remix elicited a burst of calls, as if the birds had detected something wrong. Abe and Watanabe concluded that the birds were reacting like grumpy middle-school English teachers to a violation of their rules of syntax. © 2011, Kalmbach Publishing Co.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16003 - Posted: 11.08.2011

By Bruce Bower Talk is cheap, but scientific value lurks in all that gab. Words cascading out of countless flapping gums contain secrets about the evolution of language that a new breed of researchers plan to expose with statistical tools borrowed from genetics. For more than a century, traditional linguists have spent much of their time doing fieldwork — listening to native speakers to pick up on words with similar sounds, such as mother in English and madre in Spanish, and comparing how various tongues arrange subjects, verbs, objects and other grammatical elements into sentences. Such information has allowed investigators to group related languages into families and reconstruct ancestral forms of talk. But linguists generally agree that their methods can revive languages from no more than 10,000 years ago. Borrowing of words and grammar by speakers of neighboring languages, the researchers say, erases evolutionary signals from before that time. Now a small contingent of researchers, many of them evolutionary biologists who typically have nothing to do with linguistics, are looking at language from in front of their computers, using mathematical techniques imported from the study of DNA to wring scenarios of language evolution out of huge amounts of comparative speech data. These data analyzers assume that words and other language units change systematically as they are passed from one generation to the next, much the way genes do. Charles Darwin similarly argued in 1871 that languages, like biological species, have evolved into a series of related forms. © Society for Science & the Public 2000 - 2011
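The gene-style analyses described above typically start from a table of which languages use cognate (shared-ancestry) words for a set of meanings, then build a tree from the mismatches, much as geneticists compare DNA sites. Below is only a toy sketch of that idea: the language list, the cognate codings and the simple distance-based clustering are invented for illustration, standing in for the far more sophisticated statistical models these researchers actually use.

```python
# Toy illustration of tree-building from lexical data, in the spirit of the
# genetics-inspired methods described above. The cognate-class codings here
# are invented for illustration only, not real linguistic data.
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import pdist

languages = ["English", "German", "Spanish", "Italian"]

# One row per language, one column per meaning ("mother", "water", "dog", "hand");
# equal numbers mean the two languages use cognate words for that meaning.
cognate_classes = [
    [1, 1, 2, 1],   # English
    [1, 1, 2, 3],   # German
    [2, 2, 1, 2],   # Spanish
    [2, 2, 1, 2],   # Italian
]

# Distance = fraction of meanings where the words are NOT cognate,
# analogous to counting mismatched sites between two DNA sequences.
distances = pdist(cognate_classes, metric="hamming")

# Average-linkage clustering gives a rough family tree: closely related
# languages (few lexical mismatches) merge first.
tree = linkage(distances, method="average")
print(tree)  # each row records one merge and the distance at which it happened
```

Real studies replace this simple clustering with likelihood models of how words are gained and lost over time, but the input data and the tree-shaped output are the same in spirit.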

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 15990 - Posted: 11.05.2011