Links for Keyword: Language

Links 61 - 80 of 481

By Morgen E. Peck Like a musician tuning a guitar, adults subconsciously listen to their own voice to tune the pitch, volume and pronunciation of their speech. Young children just learning how to talk, however, do not, a new study suggests. The result offers clues about how kids learn language—and how parents can help. Past studies have shown that adults use aural feedback to tweak their pronunciation. Ewen MacDonald, a professor at the Center for Applied Hearing Research at the Technical University of Denmark, decided to see if toddlers could do this as well. He had adults and children play a video game in which they guided the actions of a robot by repeating the word “bed.” Through headphones, the players heard their own voice every time they spoke—but with the frequency spectrum shifted so they heard “bad” instead of “bed.” MacDonald found that adults and four-year-old kids tried to compensate for the error by pronouncing the word more like “bid,” but two-year-olds never budged from “bed,” suggesting that they were not using auditory feedback to monitor their speech. Although the toddlers may have been suppressing the feedback mechanism, MacDonald thinks they might not start listening to themselves until they are older. If that is the case, they may rely heavily on feedback from adults to gauge how they sound. Indeed, most parents and caregivers naturally repeat the words toddlers say, as praise and encouragement. “I think the real take-home message is that social interaction is important for the development of speech,” MacDonald says. “The general act of talking and interacting with the child in a normal way is the key.” © 2012 Scientific American

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 13: Memory, Learning, and Development
Link ID: 16871 - Posted: 06.05.2012

Analysis by Jennifer Viegas Monkeys smack their lips during friendly face-to-face encounters, and now a new study says that this seemingly simple behavior may be tied to human speech. Previously, experts thought the evolutionary origins of human speech came from primate vocalizations, such as chimpanzee hoots or monkey coos. But now scientists suspect that rapid, controlled movements of the tongue, lips and jaw -- all of which are needed for lip smacking -- were more important to the emergence of speech. For the study, published in the latest Current Biology, W. Tecumseh Fitch and colleagues used X-ray movies to investigate lip-smacking gestures in macaque monkeys. Mother monkeys do this a lot with their infants, so it seems to be kind of an endearing thing, perhaps like humans going goo-goo-goo in a baby's face while playing. (Monkeys will also vibrate their lips to make a raspberry sound.) Monkey lip-smacking, however, makes a quiet sound, similar to "p p p p". It's not accompanied by phonation, meaning sound produced by vocal cord vibration in the larynx. Fitch, who is head of the Department of Cognitive Biology at the University of Vienna, and his team determined that lip-smacking is a complex behavior that requires rapid, coordinated movements of the lips, jaw, tongue and the hyoid bone (which provides the supporting skeleton for the larynx and tongue). © 2012 Discovery Communications, LLC.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16869 - Posted: 06.02.2012

by Catherine de Lange WHEN I was just a newborn baby, my mother gazed down at me in her hospital bed and did something that was to permanently change the way my brain developed. Something that would make me better at learning, multitasking and solving problems. Eventually, it might even protect my brain against the ravages of old age. Her trick? She started speaking to me in French. At the time, my mother had no idea that her actions would give me a cognitive boost. She is French and my father English, so they simply felt it made sense to raise me and my brothers as bilingual. Yet as I've grown up, a mass of research has emerged to suggest that speaking two languages may have profoundly affected the way I think. Cognitive enhancement is just the start. According to some studies, my memories, values, even my personality, may change depending on which language I happen to be speaking. It is almost as if the bilingual brain houses two separate minds. All of which highlights the fundamental role of language in human thought. "Bilingualism is quite an extraordinary microscope into the human brain," says neuroscientist Laura Ann Petitto of Gallaudet University in Washington DC. The view of bilingualism has not always been this rosy. For many parents like mine, the decision to raise children speaking two languages was controversial. Since at least the 19th century, educators warned that it would confuse the child, making them unable to learn either language properly. At best, they thought the child would become a jack-of-all-trades and master of none. At worst, they suspected it might hinder other aspects of development, resulting in a lower IQ. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 13: Memory, Learning, and Development
Link ID: 16765 - Posted: 05.08.2012

By JAMES GORMAN First things first: The hyrax is not the Lorax. And it does not speak for the trees. It sings, on its own behalf. The hyrax is a bit Seussian, however. It looks something like a rabbit, something like a woodchuck. Its closest living relatives are elephants, manatees and dugongs. And male rock hyraxes have complex songs like those of birds, in the sense that males will go on for 5 or 10 minutes at a stretch, apparently advertising themselves. One might have expected that the hyrax would have some unusual qualities — the animals’ feet, if you know how to look at them, resemble elephants’ toes, the experts say. And their visible front teeth are actually very small tusks. But Arik Kershenbaum and colleagues at the University of Haifa and Tel Aviv University have found something more surprising. Hyraxes’ songs have something rarely found in mammals: syntax that varies according to where the hyraxes live, geographical dialects in how they put their songs together. The research was published online Wednesday in The Proceedings of the Royal Society B. Bird songs show syntax, the ordering of song components in different ways, but very few mammals make such orderly, arranged sounds. Whales, bats and some primates show syntax in their vocalizations, but nobody really expected such sophistication from the hyrax, and it was thought that the selection of sounds in the songs was relatively random. © 2012 The New York Times Company

Related chapters from BP7e: Chapter 12: Sex: Evolutionary, Hormonal, and Neural Bases; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 8: Hormones and Sex; Chapter 15: Language and Our Divided Brain
Link ID: 16685 - Posted: 04.21.2012

by Erin Loury Monkeys banging on typewriters might never reproduce the works of Shakespeare, but they may be closer to reading Hamlet than we thought. Scientists have trained baboons to distinguish English words from similar-looking nonsense words by recognizing common arrangements of letters. The findings indicate that visual word recognition, the most basic step of reading, can be learned without any knowledge of spoken language. The study builds on the idea that when humans read, our brains first have to recognize individual letters, as well as their order. "We're actually reading words much like we identify any kind of visual object, like we identify chairs and tables," says study author Jonathan Grainger, a cognitive psychologist at the French National Center for Scientific Research and Aix-Marseille University in Marseille, France. Our brains construct words from an assembly of letters much as they recognize tables as a surface connected to four legs, Grainger says. Much of the current reading research has stressed that readers first need to have familiarity with spoken language, so they can connect sounds (or hand signs for the hearing-impaired) with the letters they see. Grainger and his colleagues wanted to test whether it's possible to learn the letter patterns of words without any idea of what they mean or how they sound—that is, whether a monkey could do it. The scientists used a unique testing facility, consisting of a trailer with computers set up next to a baboon enclosure, which the animals could enter at will and perform trials on the touch-screen computers for as long as they pleased. The computers cued up the appropriate test for each of the six study baboons using microchips in their arms. When letters appeared on the monitor, the baboons got wheat rewards for touching the correct shape on the screen: an oval on the right of the screen if the word was real, and a cross on the left if it was nonsense. © 2010 American Association for the Advancement of Science.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16647 - Posted: 04.14.2012

A personality profile marked by overly gregarious yet anxious behavior is rooted in abnormal development of a circuit hub buried deep in the front center of the brain, say scientists at the National Institutes of Health. They used three different types of brain imaging to pinpoint the suspect brain area in people with Williams syndrome, a rare genetic disorder characterized by these behaviors. Matching the scans to scores on a personality rating scale revealed that the more an individual with Williams syndrome showed these personality/temperament traits, the more abnormalities there were in the brain structure, called the insula. "Scans of the brain's tissue composition, wiring, and activity produced converging evidence of genetically-caused abnormalities in the structure and function of the front part of the insula and in its connectivity to other brain areas in the circuit," explained Karen Berman, M.D., of the NIH's National Institute of Mental Health (NIMH). Berman and colleagues, including Drs. Mbemba Jabbi and Shane Kippenhan, report on their imaging study in Williams syndrome online in the journal Proceedings of the National Academy of Sciences. Williams syndrome is caused by the deletion of some 28 genes, many involved in brain development and behavior, in a particular section of chromosome 7. Among the deficits characteristic of the syndrome are a lack of visual-spatial ability, such as is required to assemble a puzzle, and a tendency to be overly friendly with people while overly anxious about non-social matters, such as spiders or heights. Many people with the disorder are also mentally challenged and learning disabled, but some have normal IQs.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 13: Memory, Learning, and Development
Link ID: 16569 - Posted: 03.24.2012

By Emily Sohn In his spare time, an otherwise ordinary 16-year-old boy from New York taught himself Hebrew, Arabic, Russian, Swahili, and a dozen other languages, the New York Times reported last week. And even though it's not entirely clear how close to fluent Timothy Doner is in any of his studied languages, the high school sophomore -- along with other polyglots like him -- is certainly different from most Americans, who speak one or maybe two languages. That raises the question: Is there something unique about certain brains that allows some people to speak and understand so many more languages than the rest of us? The answer, experts say, seems to be yes, no and it's complicated. For some people, genes may prime the brain to be good at language learning, according to some new research. And studies are just starting to pinpoint a few brain regions that are extra-large or extra-efficient in people who excel at languages. For others, though, it's more a matter of being determined and motivated enough to put in the hours and hard work necessary to learn new ways of communicating. "Kids do well in what they like," said Michael Paradis, a neurolinguist at McGill University in Montreal, who compared language learning to piano, sports or anything else that requires discipline. "Kids who love math do well in math. He loves languages and is doing well in languages." © 2012 Discovery Communications, LLC.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 13: Memory, Learning, and Development
Link ID: 16547 - Posted: 03.20.2012

By YUDHIJIT BHATTACHARJEE SPEAKING two languages rather than just one has obvious practical benefits in an increasingly globalized world. But in recent years, scientists have begun to show that the advantages of bilingualism are even more fundamental than being able to converse with a wider range of people. Being bilingual, it turns out, makes you smarter. It can have a profound effect on your brain, improving cognitive skills not related to language and even shielding against dementia in old age. This view of bilingualism is remarkably different from the understanding of bilingualism through much of the 20th century. Researchers, educators and policy makers long considered a second language to be an interference, cognitively speaking, that hindered a child’s academic and intellectual development. They were not wrong about the interference: there is ample evidence that in a bilingual’s brain both language systems are active even when he is using only one language, thus creating situations in which one system obstructs the other. But this interference, researchers are finding out, isn’t so much a handicap as a blessing in disguise. It forces the brain to resolve internal conflict, giving the mind a workout that strengthens its cognitive muscles. Bilinguals, for instance, seem to be more adept than monolinguals at solving certain kinds of mental puzzles. In a 2004 study by the psychologists Ellen Bialystok and Michelle Martin-Rhee, bilingual and monolingual preschoolers were asked to sort blue circles and red squares presented on a computer screen into two digital bins — one marked with a blue square and the other marked with a red circle. © 2012 The New York Times Company

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 13: Memory, Learning, and Development
Link ID: 16542 - Posted: 03.19.2012

By ANNIE MURPHY PAUL Brain scans are revealing what happens in our heads when we read a detailed description, an evocative metaphor or an emotional exchange between characters. Stories, this research is showing, stimulate the brain and even change how we act in life. Researchers have long known that the “classical” language regions, like Broca’s area and Wernicke’s area, are involved in how the brain interprets written words. What scientists have come to realize in the last few years is that narratives activate many other parts of our brains as well, suggesting why the experience of reading can feel so alive. Words like “lavender,” “cinnamon” and “soap,” for example, elicit a response not only from the language-processing areas of our brains, but also those devoted to dealing with smells. In a 2006 study published in the journal NeuroImage, researchers in Spain asked participants to read words with strong odor associations, along with neutral words, while their brains were being scanned by a functional magnetic resonance imaging (fMRI) machine. When subjects looked at the Spanish words for “perfume” and “coffee,” their primary olfactory cortex lit up; when they saw the words that mean “chair” and “key,” this region remained dark. The way the brain handles metaphors has also received extensive study; some scientists have contended that figures of speech like “a rough day” are so familiar that they are treated simply as words and no more. Last month, however, a team of researchers from Emory University reported in Brain & Language that when subjects in their laboratory read a metaphor involving texture, the sensory cortex, responsible for perceiving texture through touch, became active. Metaphors like “The singer had a velvet voice” and “He had leathery hands” roused the sensory cortex, while phrases matched for meaning, like “The singer had a pleasing voice” and “He had strong hands,” did not. © 2012 The New York Times Company

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 14: Attention and Consciousness
Link ID: 16536 - Posted: 03.19.2012

By Karen Weintraub Marjorie Nicholas, associate chairwoman of the department of communication sciences and disorders at the MGH Institute of Health Professions, is an expert in the language disorder aphasia, and has been treating former Arizona Representative Gabrielle Giffords, who has the condition. Q. What is aphasia and how do people get it? A. Aphasia affects your ability to speak, to understand language, to read and to write. It’s extremely variable. Some might have a severe problem in expression but really pretty good understanding of spoken language, and somebody else might have a very different profile. Typically, people get aphasia by having a stroke that damages parts of the left side of the brain, which is dominant for language. People can also get aphasia from other types of injuries like head injuries, or in Gabby’s case, a gunshot wound to the head that damages that same language area of the brain. It is more common than people realize. Q. How does Giffords fit into the spectrum of symptoms you’ve described? A. Her understanding of spoken language is really very good. Her difficulties are more in the expression. Q. You obviously can’t violate her privacy, but what can you say about your work with Giffords? A. I worked with her for two weeks last fall, and [colleague Nancy Helm-Estabrooks of the University of North Carolina] and I are planning to work with her again for a week this spring. We’ll need to see where she is again. I’m assuming she will have continued to improve and we’ll want to keep her going on that track. © 2012 NY Times Co

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16444 - Posted: 02.28.2012

By Ella Davies Reporter, BBC Nature Pygmy goats can develop "accents" as they grow older, according to scientists. The young animals, known as "kids", are raised in groups or "creches" with goats of a similar age. Researchers found that when young goats mixed in these social groups their calls became more similar. The animals join an elite group of mammals, including humans, bats and whales, known to adapt their vocal sounds in response to the environment. Dr Elodie Briefer and Dr Alan McElligott of the School of Biological and Chemical Sciences at Queen Mary, University of London, published their results in the journal Animal Behaviour. In order to test the goats' vocal repertoire they recorded calls at one week old and again when the animals were aged five weeks. "Five weeks corresponds to the time when, in the wild, they join their social group after spending some time hidden in vegetation to avoid predators," Dr Briefer explained. "We found that genetically-related kids produced similar calls... but the calls of kids raised in the same social groups were also similar to each other, and became more similar as the kids grew older." BBC © 2012

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16400 - Posted: 02.20.2012

By Bruce Bower By age 6 months, infants on the verge of babbling already know — at least in a budding sense — the meanings of several common nouns for foods and body parts, a new study finds. Vocabulary learning and advances in sounding out syllables and consonants go hand in hand starting at about age 6 months, say graduate student Elika Bergelson and psychologist Daniel Swingley of the University of Pennsylvania. Babies don’t blurt out their first words until around 1 year of age. Bergelson and Swingley’s evidence that 6-month-olds direct their gaze to images of bananas, noses and other objects named by their mothers challenges the influential view that word learning doesn’t start until age 9 months. “Our guess is that a special human desire for social connection, on the part of parents and their infants, is an important component of early word learning,” Bergelson says. The work is published online the week of February 13 in the Proceedings of the National Academy of Sciences. In the study, 33 infants ages 6 to 9 months and 50 kids ages 10 to 20 months sat on their mothers’ laps in front of a computer connected to an eye-tracking device. Even at 6 months, babies looked substantially longer, on average, at images of various foods and body parts named by their mothers when those items appeared with other objects. © Society for Science & the Public 2000 - 2012

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 13: Memory, Learning, and Development
Link ID: 16378 - Posted: 02.14.2012

by Gisela Telis The right turn of phrase can activate the brain's sensory centers, a new study suggests. Researchers have found that textural metaphors—phrases such as "soft-hearted"—turn on a part of the brain that's important to the sense of touch. The result may help resolve a long-standing controversy over how the brain understands metaphors and may offer scientists a new way to study how different brain regions communicate. Scientists have disagreed for decades about how the brain processes metaphors, those figures of speech that liken one thing to another without using "like" or "as." One camp claims that when we hear a metaphor—a friend tells us she's had a rough day—we understand the expression only because we've heard it so many times. The brain learns that "rough" means both "abrasive" and "bad," this camp says, and it toggles from one definition to the other. The other camp claims the brain calls on sensory experiences, such as what roughness feels like, to comprehend the metaphor. Researchers from both camps have scanned the brain for signs of sensory activity triggered by metaphors, but these past studies, which tested a variety of metaphors without targeting specific senses or regions of the brain, have come up dry. Neurologist Krish Sathian of Emory University in Atlanta wondered whether using metaphors specific to only one of the senses might be a better strategy. He and his colleagues settled on touch and asked seven college students to distinguish between different textures while their brains were scanned using functional magnetic resonance imaging. This enabled them to map the brain regions each subject used to feel and classify textures. Then they scanned the subjects' brains again as they listened to a torrent of textural metaphors and their literal counterparts: "he is wet behind the ears" versus "he is naïve," for example, or "it was a hairy situation" versus "it was a precarious situation." © 2010 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 5: The Sensorimotor System
Link ID: 16363 - Posted: 02.09.2012

by Helen Thomson When you read this sentence to yourself, it's likely that you hear the words in your head. Now, in what amounts to technological telepathy, others are on the verge of being able to hear your inner dialogue too. By peering inside the brain, it is possible to reconstruct speech from the activity that takes place when we hear someone talking. Because this brain activity is thought to be similar whether we hear a sentence or think the same sentence, the discovery brings us a step closer to broadcasting our inner thoughts to the world without speaking. The implications are enormous – people made mute through paralysis or locked-in syndrome could regain their voice. It might even be possible to read someone's mind. Imagine a musician watching a piano being played with no sound, says Brian Pasley at the University of California, Berkeley. "If a pianist were watching a piano being played on TV with the sound off, they would still be able to work out what the music sounded like because they know what key plays what note," Pasley says. His team has done something analogous with brain waves, matching neural areas to their corresponding noises. How the brain converts speech into meaningful information is a bit of a puzzle. The basic idea is that sound activates sensory neurons, which then pass this information to different areas of the brain where various aspects of the sound are extracted and eventually perceived as language. Pasley and colleagues wondered whether they could identify where some of the most vital aspects of speech are extracted by the brain. © Copyright Reed Business Information Ltd
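Pasley's piano analogy corresponds to a standard decoding recipe often called stimulus reconstruction: fit a model that predicts the spectrogram of heard speech from recorded brain activity, then run it on new recordings and compare. The Python sketch below illustrates only that general recipe on synthetic data; the array shapes, variable names, and the choice of ridge regression are assumptions for illustration, not details taken from the study.

```python
# Illustrative sketch of "stimulus reconstruction": predict the spectrogram
# of heard speech from simultaneous neural recordings, then test on unseen
# data. Synthetic data throughout; shapes, names, and the use of ridge
# regression are assumptions, not details from the study.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

n_samples = 2000   # time points
n_channels = 64    # recording electrodes (hypothetical)
n_freqs = 32       # spectrogram frequency bins (hypothetical)

# Stand-ins for real data: neural activity X and the spectrogram S of the
# speech the subject heard, aligned in time.
X = rng.standard_normal((n_samples, n_channels))
true_map = rng.standard_normal((n_channels, n_freqs))
S = X @ true_map + 0.1 * rng.standard_normal((n_samples, n_freqs))

# Fit one regularized linear map from channels to frequency bins on a
# training split, then reconstruct the spectrogram of held-out speech
# from neural activity alone.
train, test = slice(0, 1500), slice(1500, None)
decoder = Ridge(alpha=1.0).fit(X[train], S[train])
S_hat = decoder.predict(X[test])

# Score the reconstruction: correlation per frequency bin, then averaged.
corrs = [np.corrcoef(S[test][:, f], S_hat[:, f])[0, 1] for f in range(n_freqs)]
print(f"mean reconstruction correlation: {np.mean(corrs):.2f}")
```

In real experiments the predictors would be features such as band-limited power at several time lags rather than raw samples, but the fit, predict, and correlate loop has the same shape.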

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 5: The Sensorimotor System
Link ID: 16329 - Posted: 02.02.2012

By Carrie Arnold Put on a pair of headphones and turn up the volume so that you can’t even hear yourself speak. For those who stutter, this is when the magic happens. Without the ability to hear their own voice, people with this speech impediment no longer stumble over their words—as was recently portrayed in the movie The King’s Speech. This simple trick works because of the unusual way the brain of people who stutter is organized—a neural setup that affects other actions besides speech, according to a new study. Normal speech requires the brain to control movement of the mouth and vocal cords using the sound of the speaker’s own voice as a guide. This integration of movement and hearing typically happens in the brain’s left hemisphere, in a region of the brain known as the premotor cortex. In those who stutter, however, the process occurs in the right hemisphere—probably because of a slight defect on the left side, according to past brain-imaging studies. Singing requires a similar integration of aural input and motor control, but the processing typically occurs in the right hemisphere, which may explain why those who stutter can sing as well as anyone else. (In a related vein, The King’s Speech also mentioned the common belief that people who stutter are often left-handed, but studies have found no such link.) In the new study, published in the September issue of Cortex, researchers found that the unusual neural organization underlying a stutter also includes motor tasks completely unrelated to speech. A group of 30 adults, half of whom stuttered and half of whom did not, tapped a finger in time to a metronome. When the scientists interfered with the function of their left hemisphere using transcranial magnetic stimulation, a noninvasive technique that temporarily dampens brain activity, nonstutterers found themselves unable to tap in time—but those who stuttered were unaffected. When the researchers interfered with the right hemisphere, the results were reversed: the stuttering group was impaired, and the nonstutterers were fine. © 2012 Scientific American

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 13: Memory, Learning, and Development
Link ID: 16289 - Posted: 01.24.2012

Sean O'Neill, contributor Sue Savage-Rumbaugh is a primatologist who works at the Great Ape Trust in Des Moines, Iowa, where she explores the mental and linguistic skills of our primate cousins. As a graduate, you were all set to do a postdoc in psychology at Harvard University. What happened? Yes, I was due to go to Harvard to work with behaviourist B. F. Skinner and his famous pigeons. But before I left I happened to sit in on a class by primate researcher Roger Fouts, who brought a chimpanzee named Booee to class. Roger held up objects like a hat, a key and a pair of shoes, and Booee would make what Roger said were signs for those objects. I saw a chimpanzee doing what seemed to be a symbolic task and I was hooked. I said to myself: "Wait a minute, people are teaching chimpanzees human language, and I'm going to Harvard to study pigeons? You need to stay here, this is where it's at if you are interested in the origins of the human mind." I have worked with apes ever since. Your work on the linguistic capabilities of apes has taken you into uncharted territory... Yes, for better or for worse, I have gone to a place that other researchers have not. If I had had any inkling into the huge degree of linguistic, conceptual and social similarity between ourselves and bonobos when I started the work I would have been scared to death to do it. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16277 - Posted: 01.21.2012

By Joan Raymond For years, the conventional wisdom was that babies learned how to talk by listening to their parents. But a new study in the Proceedings of the National Academy of Sciences shows that our little angels are using more than their ears to acquire language. They’re using their eyes, too, and are actually pretty good lip readers. The finding could lead to earlier diagnosis and intervention for autism spectrum disorders, estimated, on average, to affect 1 in 110 children in the United States alone. In the study, researchers from Florida Atlantic University tested groups of infants, ranging from four to 12 months of age, and a group of adults for comparison. The babies watched videos of women speaking either in English, the native language used in the home, or in Spanish, a language foreign to them. Using an eye tracker device to study eye movements, the researchers looked at developmental changes in attention to the eyes and mouth. Results showed that at four months of age, babies focused almost solely on the women’s eyes. But by six to eight months of age, when the infants entered the so-called “babbling” stage of language acquisition and reached a milestone of cognitive development in which they can direct their attention to things they find interesting, their focus shifted to the women’s mouths. They continue to “lip read” until about 10 months of age, a point when they finally begin mastering the basic features of their native language. At this point, infants also begin to shift their attention back to the eyes. © 2012 msnbc.com

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 15: Language and Our Divided Brain
Link ID: 16265 - Posted: 01.17.2012

By Victoria Gill Science reporter, BBC Nature Chimpanzees appear to consider who they are "talking to" before they call out. Researchers found that wild chimps that spotted a poisonous snake were more likely to make their "alert call" in the presence of a chimp that had not seen the threat. This indicates that the animals "understand the mindset" of others. The insight into the primates' remarkable intelligence will be published in the journal Current Biology. The University of St Andrews scientists, who carried out the work, study primate communication to uncover some of the origins of human language. To find out how the animals "talked to each other" about potential threats, they placed plastic snakes - models of rhino and gaboon vipers - into the paths of wild chimpanzees and monitored the primates' reactions. "These [snake species] are well camouflaged and they have a deadly bite," explained Dr Catherine Crockford from University of St Andrews, who led the research. "They also tend to sit in one place for weeks. So if a chimp discovers a snake, it makes sense for that animal to let everyone else know where [it] is." The scientists put the snake on a path that the chimps were using regularly, secreting the plastic models in the leaves. "When [the chimps] saw the model, they would be quite close to it and would leap away, but they wouldn't call," she told BBC Nature. BBC © 2011

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16205 - Posted: 01.03.2012

by Robert Kowalski What is the relationship between language and thought? The quest to create artificial intelligence may have come up with some unexpected answers THE idea of machines that think and act as intelligently as humans can generate strong emotions. This may explain why one of the most important accomplishments in the field of artificial intelligence has gone largely unnoticed: that some of the advances in AI can be used by ordinary people to improve their own natural intelligence and communication skills. Chief among these advances is a form of logic called computational logic. This builds and improves on traditional logic, and can be used both for the original purpose of logic - to improve the way we think - and, crucially, to improve the way we communicate in natural languages, such as English. Arguably, it is the missing link that connects language and thought. According to one school of philosophy, our thoughts have a language-like structure that is independent of natural language: this is what students of language call the language of thought (LOT) hypothesis. According to the LOT hypothesis, it is because human thoughts already have a linguistic structure that the emergence of common, natural languages was possible in the first place. The LOT hypothesis contrasts with the mildly contrary view that human thinking is actually conducted in natural language, and thus we could not think intelligently without it. It also contradicts the ultra-contrary view that human thinking does not have a language-like structure at all, implying that our ability to communicate in natural language is nothing short of a miracle. © Copyright Reed Business Information Ltd.
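The computational logic Kowalski describes is most familiar as Horn-clause logic programming, where knowledge is written as if-then rules and conclusions are derived mechanically. As a small taste of that, here is a minimal forward-chaining engine in Python; the facts, rules, and engine are invented for illustration, not code from the article, and real systems such as Prolog answer queries by working backward from a goal instead.

```python
# Toy forward-chaining over Horn clauses: start from facts, apply if-then
# rules until nothing new can be derived. Invented for illustration; the
# clause format (conditions implying a conclusion) is the point.
import itertools
import re

def parse(atom):
    """'mother(ann, bob)' -> ('mother', ('ann', 'bob'))."""
    name, args = re.match(r"(\w+)\((.*)\)", atom).groups()
    return name, tuple(a.strip() for a in args.split(","))

def is_var(token):
    return token[0].isupper()  # convention: X, Y, Z are variables

def match(pattern, fact, env):
    """Bind the pattern's variables so it equals the fact, extending env."""
    (pn, pargs), (fn, fargs) = parse(pattern), parse(fact)
    if pn != fn or len(pargs) != len(fargs):
        return None
    env = dict(env)
    for p, f in zip(pargs, fargs):
        if is_var(p):
            if env.get(p, f) != f:
                return None
            env[p] = f
        elif p != f:
            return None
    return env

def substitute(pattern, env):
    name, args = parse(pattern)
    return f"{name}({', '.join(env.get(a, a) for a in args)})"

def forward_chain(facts, rules):
    """Derive the fixpoint of facts under rules of form (premises, conclusion)."""
    facts, changed = set(facts), True
    while changed:
        changed = False
        for premises, conclusion in rules:
            for combo in itertools.product(facts, repeat=len(premises)):
                env = {}
                for premise, fact in zip(premises, combo):
                    env = match(premise, fact, env)
                    if env is None:
                        break
                if env is not None and substitute(conclusion, env) not in facts:
                    facts.add(substitute(conclusion, env))
                    changed = True
    return facts

facts = {"mother(ann, bob)", "mother(bob, cal)"}
rules = [
    (["mother(X, Y)"], "parent(X, Y)"),                       # mothers are parents
    (["parent(X, Y)", "parent(Y, Z)"], "grandparent(X, Z)"),  # parent of a parent
]
print(sorted(forward_chain(facts, rules)))
# -> adds parent(ann, bob), parent(bob, cal), grandparent(ann, cal)
```

The same clause shape, explicit conditions implying a conclusion, is what the essay suggests can double as a guideline for writing clearer natural-language sentences.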

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 15: Language and Our Divided Brain
Link ID: 16131 - Posted: 12.10.2011

by Robert Krulwich Here's something you should know about yourself. Vowels control your brain. "I"s make you see things differently than "O"s. Here's how. Say these words out loud: "bean," "mint," "slim." These "I" and "E" vowels are formed by putting your tongue forward in the mouth. That's why they're called "front" vowels. Now, say: "large," "pod," "or," "ought." With these words, your tongue depresses and folds back a bit. So "O," "A" and "U" are called "back of the throat" vowels. OK, here's the weird part. When comparing words across language groups, says Stanford linguistics professor Dan Jurafsky, a curious pattern shows up: words with front vowels ("I" and "E") tend to represent small, thin, light things. Back vowels ("O," "U" and some "A"s) show up in fat, heavy things. It's not always true, but it's a tendency that you can see in any of the stressed vowels in words like little, teeny or itsy-bitsy (all front vowels) versus humongous or gargantuan (back vowels). Or the "i" vowel in Spanish chico (front vowel meaning small) versus gordo (back vowel meaning fat). Or French petit (front vowel) versus grand (back vowel). Copyright 2011 NPR

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 16126 - Posted: 12.10.2011