Links for Keyword: Language



Links 81 - 100 of 496

A personality profile marked by overly gregarious yet anxious behavior is rooted in abnormal development of a circuit hub buried deep in the front center of the brain, say scientists at the National Institutes of Health. They used three different types of brain imaging to pinpoint the suspect brain area in people with Williams syndrome, a rare genetic disorder characterized by these behaviors. Matching the scans to scores on a personality rating scale revealed that the more an individual with Williams syndrome showed these personality/temperament traits, the more abnormalities there were in the brain structure, called the insula. "Scans of the brain's tissue composition, wiring, and activity produced converging evidence of genetically-caused abnormalities in the structure and function of the front part of the insula and in its connectivity to other brain areas in the circuit," explained Karen Berman, M.D., of the NIH's National Institute of Mental Health (NIMH). Berman, Drs. Mbemba Jabbi and Shane Kippenhan, and colleagues report on their imaging study in Williams syndrome online in the journal Proceedings of the National Academy of Sciences. Williams syndrome is caused by the deletion of some 28 genes, many involved in brain development and behavior, in a particular section of chromosome 7. Among deficits characteristic of the syndrome are a lack of visual-spatial ability – such as is required to assemble a puzzle – and a tendency to be overly friendly with people, while overly anxious about non-social matters, such as spiders or heights. Many people with the disorder are also mentally challenged and learning disabled, but some have normal IQs.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 13: Memory, Learning, and Development
Link ID: 16569 - Posted: 03.24.2012

By Emily Sohn In his spare time, an otherwise ordinary 16-year-old boy from New York taught himself Hebrew, Arabic, Russian, Swahili, and a dozen other languages, the New York Times reported last week. And even though it's not entirely clear how close to fluent Timothy Doner is in any of his studied languages, the high school sophomore -- along with other polyglots like him -- is certainly different from most Americans, who speak one or maybe two languages. That raises the question: Is there something unique about certain brains that allows some people to speak and understand so many more languages than the rest of us? The answer, experts say, seems to be yes, no and it's complicated. For some people, genes may prime the brain to be good at language learning, according to some new research. And studies are just starting to pinpoint a few brain regions that are extra-large or extra-efficient in people who excel at languages. For others, though, it's more a matter of being determined and motivated enough to put in the hours and hard work necessary to learn new ways of communicating. "Kids do well in what they like," said Michael Paradis, a neurolinguist at McGill University in Montreal, who compared language learning to piano, sports or anything else that requires discipline. "Kids who love math do well in math. He loves languages and is doing well in languages." © 2012 Discovery Communications, LLC.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 13: Memory, Learning, and Development
Link ID: 16547 - Posted: 03.20.2012

By YUDHIJIT BHATTACHARJEE Speaking two languages rather than just one has obvious practical benefits in an increasingly globalized world. But in recent years, scientists have begun to show that the advantages of bilingualism are even more fundamental than being able to converse with a wider range of people. Being bilingual, it turns out, makes you smarter. It can have a profound effect on your brain, improving cognitive skills not related to language and even shielding against dementia in old age. This view of bilingualism is remarkably different from the understanding of bilingualism through much of the 20th century. Researchers, educators and policy makers long considered a second language to be an interference, cognitively speaking, that hindered a child’s academic and intellectual development. They were not wrong about the interference: there is ample evidence that in a bilingual’s brain both language systems are active even when he is using only one language, thus creating situations in which one system obstructs the other. But this interference, researchers are finding out, isn’t so much a handicap as a blessing in disguise. It forces the brain to resolve internal conflict, giving the mind a workout that strengthens its cognitive muscles. Bilinguals, for instance, seem to be more adept than monolinguals at solving certain kinds of mental puzzles. In a 2004 study by the psychologists Ellen Bialystok and Michelle Martin-Rhee, bilingual and monolingual preschoolers were asked to sort blue circles and red squares presented on a computer screen into two digital bins — one marked with a blue square and the other marked with a red circle. © 2012 The New York Times Company

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 13: Memory, Learning, and Development
Link ID: 16542 - Posted: 03.19.2012

By ANNIE MURPHY PAUL Brain scans are revealing what happens in our heads when we read a detailed description, an evocative metaphor or an emotional exchange between characters. Stories, this research is showing, stimulate the brain and even change how we act in life. Researchers have long known that the “classical” language regions, like Broca’s area and Wernicke’s area, are involved in how the brain interprets written words. What scientists have come to realize in the last few years is that narratives activate many other parts of our brains as well, suggesting why the experience of reading can feel so alive. Words like “lavender,” “cinnamon” and “soap,” for example, elicit a response not only from the language-processing areas of our brains, but also from those devoted to dealing with smells. In a 2006 study published in the journal NeuroImage, researchers in Spain asked participants to read words with strong odor associations, along with neutral words, while their brains were being scanned by a functional magnetic resonance imaging (fMRI) machine. When subjects looked at the Spanish words for “perfume” and “coffee,” their primary olfactory cortex lit up; when they saw the words that mean “chair” and “key,” this region remained dark. The way the brain handles metaphors has also received extensive study; some scientists have contended that figures of speech like “a rough day” are so familiar that they are treated simply as words and no more. Last month, however, a team of researchers from Emory University reported in Brain & Language that when subjects in their laboratory read a metaphor involving texture, the sensory cortex, responsible for perceiving texture through touch, became active. Metaphors like “The singer had a velvet voice” and “He had leathery hands” roused the sensory cortex, while phrases matched for meaning, like “The singer had a pleasing voice” and “He had strong hands,” did not. © 2012 The New York Times Company

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 14: Attention and Consciousness
Link ID: 16536 - Posted: 03.19.2012

By Karen Weintraub Marjorie Nicholas, associate chairwoman of the department of communication sciences and disorders at the MGH Institute of Health Professions, is an expert in the language disorder aphasia, and has been treating former Arizona Representative Gabrielle Giffords, who has the condition. Q. What is aphasia and how do people get it? A. Aphasia affects your ability to speak, to understand language, to read and to write. It’s extremely variable. Some might have a severe problem in expression but really pretty good understanding of spoken language, and somebody else might have a very different profile. Typically, people get aphasia by having a stroke that damages parts of the left side of the brain, which is dominant for language. People can also get aphasia from other types of injuries like head injuries, or in Gabby’s case, a gunshot wound to the head that damages that same language area of the brain. It is more common than people realize. Q. How does Giffords fit into the spectrum of symptoms you’ve described? A. Her understanding of spoken language is really very good. Her difficulties are more in the expression. Q. You obviously can’t violate her privacy, but what can you say about your work with Giffords? A. I worked with her for two weeks last fall, and [colleague Nancy Helm-Estabrooks of the University of North Carolina] and I are planning to work with her again for a week this spring. We’ll need to see where she is again. I’m assuming she will have continued to improve and we’ll want to keep her going on that track. © 2012 NY Times Co

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16444 - Posted: 02.28.2012

By Ella Davies, Reporter, BBC Nature. Pygmy goats can develop "accents" as they grow older, according to scientists. The young animals, known as "kids", are raised in groups or "creches" with goats of a similar age. Researchers found that when young goats mixed in these social groups their calls became more similar. The animals join an elite group of mammals known to adapt their vocal sounds in response to the environment – a group that includes humans, bats and whales. Dr Elodie Briefer and Dr Alan McElligott, from the School of Biological and Chemical Sciences at Queen Mary, University of London, UK, published their results in the journal Animal Behaviour. In order to test the goats' vocal repertoire they recorded the kids' calls at one week old and again at five weeks of age. "Five weeks corresponds to the time when, in the wild, they join their social group after spending some time hidden in vegetation to avoid predators," Dr Briefer explained. "We found that genetically-related kids produced similar calls... but the calls of kids raised in the same social groups were also similar to each other, and became more similar as the kids grew older." BBC © 2012

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16400 - Posted: 02.20.2012

By Bruce Bower By age 6 months, infants on the verge of babbling already know — at least in a budding sense — the meanings of several common nouns for foods and body parts, a new study finds. Vocabulary learning and advances in sounding out syllables and consonants go hand in hand starting at about age 6 months, say graduate student Elika Bergelson and psychologist Daniel Swingley of the University of Pennsylvania. Babies don’t blurt out their first words until around 1 year of age. Bergelson and Swingley’s evidence that 6-month-olds direct their gaze to images of bananas, noses and other objects named by their mothers challenges the influential view that word learning doesn’t start until age 9 months. “Our guess is that a special human desire for social connection, on the part of parents and their infants, is an important component of early word learning,” Bergelson says. The work is published online the week of February 13 in the Proceedings of the National Academy of Sciences. In the study, 33 infants ages 6 to 9 months and 50 kids ages 10 to 20 months sat on their mothers’ laps in front of a computer connected to an eye-tracking device. Even at 6 months, babies looked substantially longer, on average, at images of various foods and body parts named by their mothers when those items appeared with other objects. © Society for Science & the Public 2000 - 2012

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 13: Memory, Learning, and Development
Link ID: 16378 - Posted: 02.14.2012

by Gisela Telis The right turn of phrase can activate the brain's sensory centers, a new study suggests. Researchers have found that textural metaphors—phrases such as "soft-hearted"—turn on a part of the brain that's important to the sense of touch. The result may help resolve a long-standing controversy over how the brain understands metaphors and may offer scientists a new way to study how different brain regions communicate. Scientists have disagreed for decades about how the brain processes metaphors, those figures of speech that liken one thing to another without using "like" or "as." One camp claims that when we hear a metaphor—a friend tells us she's had a rough day—we understand the expression only because we've heard it so many times. The brain learns that "rough" means both "abrasive" and "bad," this camp says, and it toggles from one definition to the other. The other camp claims the brain calls on sensory experiences, such as what roughness feels like, to comprehend the metaphor. Researchers from both camps have scanned the brain for signs of sensory activity triggered by metaphors, but these past studies, which tested a variety of metaphors without targeting specific senses or regions of the brain, have come up dry. Neurologist Krish Sathian of Emory University in Atlanta wondered whether using metaphors specific to only one of the senses might be a better strategy. He and his colleagues settled on touch and asked seven college students to distinguish between different textures while their brains were scanned using functional magnetic resonance imaging. This enabled them to map the brain regions each subject used to feel and classify textures. Then they scanned the subjects' brains again as they listened to a torrent of textural metaphors and their literal counterparts: "he is wet behind the ears" versus "he is naïve," for example, or "it was a hairy situation" versus "it was a precarious situation." © 2010 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 5: The Sensorimotor System
Link ID: 16363 - Posted: 02.09.2012

by Helen Thomson When you read this sentence to yourself, it's likely that you hear the words in your head. Now, in what amounts to technological telepathy, others are on the verge of being able to hear your inner dialogue too. By peering inside the brain, it is possible to reconstruct speech from the activity that takes place when we hear someone talking. Because this brain activity is thought to be similar whether we hear a sentence or think the same sentence, the discovery brings us a step closer to broadcasting our inner thoughts to the world without speaking. The implications are enormous – people made mute through paralysis or locked-in syndrome could regain their voice. It might even be possible to read someone's mind. Imagine a musician watching a piano being played with no sound, says Brian Pasley at the University of California, Berkeley. "If a pianist were watching a piano being played on TV with the sound off, they would still be able to work out what the music sounded like because they know what key plays what note," Pasley says. His team has done something analogous with brain waves, matching neural areas to their corresponding noises. How the brain converts speech into meaningful information is a bit of a puzzle. The basic idea is that sound activates sensory neurons, which then pass this information to different areas of the brain where various aspects of the sound are extracted and eventually perceived as language. Pasley and colleagues wondered whether they could identify where some of the most vital aspects of speech are extracted by the brain. © Copyright Reed Business Information Ltd

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 5: The Sensorimotor System
Link ID: 16329 - Posted: 02.02.2012

By Carrie Arnold Put on a pair of headphones and turn up the volume so that you can’t even hear yourself speak. For those who stutter, this is when the magic happens. Without the ability to hear their own voice, people with this speech impediment no longer stumble over their words—as was recently portrayed in the movie The King’s Speech. This simple trick works because of the unusual way the brains of people who stutter are organized—a neural setup that affects other actions besides speech, according to a new study. Normal speech requires the brain to control movement of the mouth and vocal cords using the sound of the speaker’s own voice as a guide. This integration of movement and hearing typically happens in the brain’s left hemisphere, in a region of the brain known as the premotor cortex. In those who stutter, however, the process occurs in the right hemisphere—probably because of a slight defect on the left side, according to past brain-imaging studies. Singing requires a similar integration of aural input and motor control, but the processing typically occurs in the right hemisphere, which may explain why those who stutter can sing as well as anyone else. (In a related vein, The King’s Speech also mentioned the common belief that people who stutter are often left-handed, but studies have found no such link.) In the new study, published in the September issue of Cortex, researchers found that the unusual neural organization underlying a stutter also includes motor tasks completely unrelated to speech. A group of 30 adults, half of whom stuttered and half of whom did not, tapped a finger in time to a metronome. When the scientists interfered with the function of their left hemisphere using transcranial magnetic stimulation, a noninvasive technique that temporarily dampens brain activity, nonstutterers found themselves unable to tap in time—but those who stuttered were unaffected. When the researchers interfered with the right hemisphere, the results were reversed: the stuttering group was impaired, and the nonstutterers were fine. © 2012 Scientific American

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 13: Memory, Learning, and Development
Link ID: 16289 - Posted: 01.24.2012

Sean O'Neill, contributor. Sue Savage-Rumbaugh is a primatologist who works at the Great Ape Trust in Des Moines, Iowa, where she explores the mental and linguistic skills of our primate cousins. As a graduate, you were all set to do a postdoc in psychology at Harvard University. What happened? Yes, I was due to go to Harvard to work with behaviourist B. F. Skinner and his famous pigeons. But before I left I happened to sit in on a class by primate researcher Roger Fouts, who brought a chimpanzee named Booee to class. Roger held up objects like a hat, a key and a pair of shoes, and Booee would make what Roger said were signs for those objects. I saw a chimpanzee doing what seemed to be a symbolic task and I was hooked. I said to myself: "Wait a minute, people are teaching chimpanzees human language, and I'm going to Harvard to study pigeons? You need to stay here, this is where it's at if you are interested in the origins of the human mind." I have worked with apes ever since. Your work on the linguistic capabilities of apes has taken you into uncharted territory... Yes, for better or for worse, I have gone to a place that other researchers have not. If I had had any inkling into the huge degree of linguistic, conceptual and social similarity between ourselves and bonobos when I started the work I would have been scared to death to do it. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16277 - Posted: 01.21.2012

By Joan Raymond For years, the conventional wisdom was that babies learned how to talk by listening to their parents. But a new study in the Proceedings of the National Academy of Sciences shows that our little angels are using more than their ears to acquire language. They’re using their eyes, too, and are actually pretty good lip readers. The finding could lead to earlier diagnosis and intervention for autism spectrum disorders, estimated, on average, to affect 1 in 110 children in the United States alone. In the study, researchers from Florida Atlantic University tested groups of infants, ranging from four to 12 months of age, and a group of adults for comparison. The babies watched videos of women speaking either in English, the native language used in the home, or in Spanish, a language foreign to them. Using an eye tracker device to study eye movements, the researchers looked at developmental changes in attention to the eyes and mouth. Results showed that at four months of age, babies focused almost solely on the women’s eyes. But by six to eight months of age, when the infants entered the so-called “babbling” stage of language acquisition and reached a milestone of cognitive development in which they can direct their attention to things they find interesting, their focus shifted to the women’s mouths. They continue to “lip read” until about 10 months of age, a point when they finally begin mastering the basic features of their native language. At this point, infants also begin to shift their attention back to the eyes. © 2012 msnbc.com

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 15: Language and Our Divided Brain
Link ID: 16265 - Posted: 01.17.2012

By Victoria Gill, Science reporter, BBC Nature. Chimpanzees appear to consider who they are "talking to" before they call out. Researchers found that wild chimps that spotted a poisonous snake were more likely to make their "alert call" in the presence of a chimp that had not seen the threat. This indicates that the animals "understand the mindset" of others. The insight into the primates' remarkable intelligence will be published in the journal Current Biology. The University of St Andrews scientists, who carried out the work, study primate communication to uncover some of the origins of human language. To find out how the animals "talked to each other" about potential threats, they placed plastic snakes - models of rhino and Gaboon vipers - into the paths of wild chimpanzees and monitored the primates' reactions. "These [snake species] are well camouflaged and they have a deadly bite," explained Dr Catherine Crockford from the University of St Andrews, who led the research. "They also tend to sit in one place for weeks. So if a chimp discovers a snake, it makes sense for that animal to let everyone else know where [it] is." The scientists put the snake on a path that the chimps were using regularly, secreting the plastic models in the leaves. "When [the chimps] saw the model, they would be quite close to it and would leap away, but they wouldn't call," she told BBC Nature. BBC © 2011

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16205 - Posted: 01.03.2012

by Robert Kowalski What is the relationship between language and thought? The quest to create artificial intelligence may have come up with some unexpected answers. The idea of machines that think and act as intelligently as humans can generate strong emotions. This may explain why one of the most important accomplishments in the field of artificial intelligence has gone largely unnoticed: that some of the advances in AI can be used by ordinary people to improve their own natural intelligence and communication skills. Chief among these advances is a form of logic called computational logic. This builds and improves on traditional logic, and can be used both for the original purpose of logic - to improve the way we think - and, crucially, to improve the way we communicate in natural languages, such as English. Arguably, it is the missing link that connects language and thought. According to one school of philosophy, our thoughts have a language-like structure that is independent of natural language: this is what students of language call the language of thought (LOT) hypothesis. According to the LOT hypothesis, it is because human thoughts already have a linguistic structure that the emergence of common, natural languages was possible in the first place. The LOT hypothesis contrasts with the mildly contrary view that human thinking is actually conducted in natural language, and thus we could not think intelligently without it. It also contradicts the ultra-contrary view that human thinking does not have a language-like structure at all, implying that our ability to communicate in natural language is nothing short of a miracle. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 15: Language and Our Divided Brain
Link ID: 16131 - Posted: 12.10.2011

by Robert Krulwich Here's something you should know about yourself. Vowels control your brain. "I"s make you see things differently than "O"s. Here's how. Say these words out loud: Bean, Mint, Slim. These "I" and "E" vowels are formed by putting your tongue forward in the mouth. That's why they're called "front" vowels. Now, say: Large, Pod, Or, Ought. With these words, your tongue depresses and folds back a bit. So "O", "A" and "U" are called "back" of the throat vowels. OK, here's the weird part. When comparing words across language groups, says Stanford linguistics professor Dan Jurafsky, a curious pattern shows up: Words with front vowels ("I" and "E") tend to represent small, thin, light things. Back vowels ("O", "U" and some "A"s) show up in fat, heavy things. It's not always true, but it's a tendency that you can see in any of the stressed vowels in words like little, teeny or itsy-bitsy (all front vowels) versus humongous or gargantuan (back vowels). Or in the i of Spanish chico (front vowel, meaning small) versus gordo (back vowel, meaning fat). Or French petit (front vowel) versus grand (back vowel). Copyright 2011 NPR

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16126 - Posted: 12.10.2011

by Traci Watson Parrots have neither lips nor teeth, but that doesn't stop them from producing dead-on imitations of human speech. Now researchers have learned part of the reason: like humans, parrots use their tongues to form sounds. As they report today in The Journal of Experimental Biology, scientists took x-ray movies of monk parakeets, Myiopsitta monachus, South American natives that can be trained to speak but aren't star talkers. The parakeets lowered their tongues during loud, squeaky "contact calls" made when the birds can't see each other and during longer, trilling "greeting calls" made to show a social connection. As seen in the video, the parakeets also moved their tongues up and down while chattering. No other type of bird is known to move its tongue to vocalize. Parrots use their mobile, muscular tongues to explore their environment and manipulate food. Those capable organs, just by coincidence, also help parrots utter greetings in words that even humans can understand. © 2010 American Association for the Advancement of Science.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16125 - Posted: 12.08.2011

by Catherine de Lange Chimpanzee brains may be hard-wired to evolve language even though they can't talk. That is the suggestion of a study which found chimps link sounds and levels of brightness, something akin to synaesthesia in people. Such an association could help explain how our early ancestors took the first vital step from ape-like grunts to a proper vocabulary. Synaesthetes make unusual connections between different senses – they might sense certain tastes when they hear music, or "see" numbers as colours. This is less unusual than you might think: "The synaesthetic experience is a continuum," explains Roi Cohen Kadosh of University College London. "Most people have it at an implicit level, and some people have a stronger connection." Now, Vera Ludwig from the Charité University of Medicine in Berlin, Germany, and colleagues have shown for the first time that chimpanzees also make cross-sensory associations, suggesting they evolved early on. The team repeatedly flashed either black or white squares for 200 milliseconds at a time on screens in front of six chimpanzees (Pan troglodytes) and 33 humans. The subjects had to indicate whether the square was black or white by touching a button of the right colour. A high or low-pitched sound was randomly played in the background during each test. Chimps and humans were better at identifying white squares when they heard a high-pitched sound, and more likely to correctly identify dark squares when played a low-pitched sound. But performance was poor when the sounds were swapped: humans were slower to identify a white square paired with a low-pitched noise, or a black square with a high-pitched noise, and the chimps' responses became significantly less accurate. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16114 - Posted: 12.06.2011

By Charles Q. Choi Ravens use their beaks and wings much as humans rely on their hands to make gestures, such as for pointing to an object, scientists now find. This is the first time researchers have seen gestures used in this way in the wild by animals other than primates. From the age of 9 to 12 months, human infants often use gestures to direct the attention of adults to objects, or to hold up items so that others can take them. These gestures, produced before children speak their first words, are seen as milestones in the development of human speech. Dogs and other animals are known to point out items using gestures, but humans trained these animals, and scientists had suggested the natural development of these gestures was normally confined only to primates, said researcher Simone Pika, a biologist at the Max Planck Institute for Ornithology in Seewiesen, Germany. Even then, comparable gestures are rarely seen in the wild in our closest living relatives, the great apes—for instance, chimpanzees in the Kibale National Park in Uganda employ so-called directed scratches to indicate distinct spots on their bodies they want groomed. Still, ravens and their relatives such as crows and magpies have been found to be remarkably intelligent over the years, surpassing most other birds in terms of smarts and even rivaling great apes on some tests. © 2011 Scientific American

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16098 - Posted: 12.01.2011

by Marion Long and Valerie Ross For centuries experts held that every language is unique. Then one day in 1956, a young linguistics professor gave a legendary presentation at the Symposium on Information Theory at MIT. He argued that every intelligible sentence conforms not only to the rules of its particular language but to a universal grammar that encompasses all languages. And rather than absorbing language from the environment and learning to communicate by imitation, children are born with the innate capacity to master language, a power imbued in our species by evolution itself. Almost overnight, linguists’ thinking began to shift. Avram Noam Chomsky was born in Philadelphia on December 7, 1928, to William Chomsky, a Hebrew scholar, and Elsie Simonofsky Chomsky, also a scholar and an author of children’s books. While still a youngster, Noam read his father’s manuscript on medieval Hebrew grammar, setting the stage for his work to come. By 1955 he was teaching linguistics at MIT, where he formulated his groundbreaking theories. Today Chomsky continues to challenge the way we perceive ourselves. Language is “the core of our being,” he says. “We are always immersed in it. It takes a strong act of will to try not to talk to yourself when you’re walking down the street, because it’s just always going on.” Chomsky also bucked against scientific tradition by becoming active in politics. He was an outspoken critic of American involvement in Vietnam and helped organize the famous 1967 protest march on the Pentagon. When the leaders of the march were arrested, he found himself sharing a cell with Norman Mailer, who described him in his book Armies of the Night as “a slim, sharp-featured man with an ascetic expression, and an air of gentle but absolute moral integrity.” © 2011, Kalmbach Publishing Co.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16094 - Posted: 12.01.2011

Ewen Callaway A mutation that appeared more than half a million years ago may have helped humans learn the complex muscle movements that are critical to speech and language. The claim stems from the finding that mice genetically engineered to produce the human form of the gene, called FOXP2, learn more quickly than their normal counterparts. The work was presented by Christiane Schreiweis, a neuroscientist at the Max Planck Institute (MPI) for Evolutionary Anthropology in Leipzig, Germany, at the Society for Neuroscience meeting in Washington DC this week. Scientists discovered FOXP2 in the 1990s by studying a British family known as 'KE' in which three generations suffered from severe speech and language problems [1]. Those with language problems were found to share an inherited mutation that inactivates one copy of FOXP2. Most vertebrates have nearly identical versions of the gene, which is involved in the development of brain circuits important for the learning of movement. The human version of FOXP2, the protein encoded by the gene, differs from that of chimpanzees at two amino acids, hinting that changes to the human form may have had a hand in the evolution of language [2]. A team led by Schreiweis’ colleague Svante Pääbo discovered that the gene is identical in modern humans (Homo sapiens) and Neanderthals (Homo neanderthalensis), suggesting that the mutation appeared before these two human lineages diverged around 500,000 years ago [3]. © 2011 Nature Publishing Group

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 13: Memory, Learning, and Development
Link ID: 16058 - Posted: 11.19.2011