Chapter 15. Brain Asymmetry, Spatial Cognition, and Language




By Sarah Witman Nicole Dodds first noticed her son, Rowan, was having trouble using the right side of his body when he was about 6 months old. Babies typically use both hands to pick up toys and lift their chest off the floor at that age, but Rowan was mostly using his left arm and hand, keeping his right hand balled in a fist. That started a string of doctor visits. Around Rowan’s first birthday, doctors did an MRI and diagnosed his one-sided weakness as hemiplegia, probably caused by a stroke he sustained in utero. This surprised Dodds, since as far as she knew she’d had a totally normal pregnancy and birth. Perinatal stroke — when an infant loses blood supply to the brain in late pregnancy, during birth or in the first month of life — is one of the most common causes of hemiplegia in infants, affecting anywhere from 1 in 2,500 to 1 in 4,000 live births in the United States every year. Like adult stroke, perinatal stroke is usually caused by a blood clot that jams brain arteries, or else by bleeding in or around the infant’s brain. Babies with heart disease, bleeding or clotting disorders such as hemophilia, and bacterial infection, among other factors, have a higher risk of perinatal stroke, but the exact cause is often unknown. As was the case with Rowan, there are often no outward signs for up to a year that something is amiss, resulting in delayed or inconclusive diagnosis. It’s nearly impossible to detect a stroke in utero, or even in the first few weeks after birth, since the symptoms can seem within the norm for infants: favoring one side, extreme sleepiness, mild seizures that seem like shivering or sudden stiffening. More obvious behaviors such as trouble walking and talking don’t usually become apparent until the child turns 2, and are associated with other childhood problems.

Keyword: Stroke; Development of the Brain
Link ID: 27069 - Posted: 02.25.2020

By Katherine Kornei Imagine a frog call, but with a metallic twang—and the intensity of a chainsaw. That’s the “boing” of a minke whale. And it’s a form of animal communication in danger of being drowned out by ocean noise, new research shows. By analyzing more than 42,000 minke whale boings, scientists have found that, as background noise intensifies, the whales are losing their ability to communicate over long distances. This could limit their ability to find mates and engage in important social contact with other whales. Tyler Helble, a marine acoustician at the Naval Information Warfare Center Pacific, and colleagues recorded minke whale boings over a 1200-square-kilometer swathe of the U.S. Navy’s Pacific Missile Range Facility near the Hawaiian island of Kauai from 2012 to 2017. By measuring when a single boing arrived at various underwater microphones, the team pinpointed whale locations to within 10 to 20 meters. The researchers then used these positions, along with models of how sound propagates underwater, to calculate the intensity of each boing when it was emitted. The team compared these measurements with natural ambient noise, including waves, wind, and undersea earthquakes (no military exercises were conducted nearby during the study period). They found that minke whale boings grew louder in louder conditions. That’s not surprising—creatures across the animal kingdom up their volume when there’s background noise. (This phenomenon, dubbed the Lombard effect, holds true for humans, too—think of holding a conversation at a loud concert.) © 2019 American Association for the Advancement of Science.
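The localization step described above, comparing when the same boing reaches several hydrophones, can be illustrated with a small least-squares multilateration sketch. This is only an illustration of the general technique, not the study's code: the hydrophone layout, sound speed, noise level, and solver choice are all assumptions.

```python
# Minimal sketch (not the study's actual pipeline): locating a sound source
# from arrival times at several hydrophones by least-squares multilateration.
# Hydrophone positions, sound speed, and the solver choice are assumptions.
import numpy as np
from scipy.optimize import least_squares

SOUND_SPEED = 1500.0  # m/s, nominal speed of sound in seawater (assumption)

# Hypothetical hydrophone coordinates (x, y) in metres
hydrophones = np.array([[0.0, 0.0], [800.0, 0.0], [0.0, 900.0], [850.0, 950.0]])

def arrival_times(source_xy, emit_time=0.0):
    """Time each hydrophone would record a call emitted at source_xy."""
    dists = np.linalg.norm(hydrophones - source_xy, axis=1)
    return emit_time + dists / SOUND_SPEED

def residuals(params, observed_times):
    """Difference between predicted and observed arrival times."""
    source_xy, emit_time = params[:2], params[2]
    return arrival_times(source_xy, emit_time) - observed_times

# Simulate a call from an unknown position, then recover it from noisy timings
true_source = np.array([430.0, 610.0])
observed = arrival_times(true_source) + np.random.normal(0, 1e-4, 4)

fit = least_squares(residuals, x0=[400.0, 400.0, 0.0], args=(observed,))
print("estimated source position (m):", fit.x[:2])
```

With more hydrophones and a three-dimensional layout the same residual-minimization idea applies; accuracy then depends mainly on timing precision and how well the sound-propagation model matches the real ocean.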

Keyword: Animal Communication; Hearing
Link ID: 27051 - Posted: 02.19.2020

By Laura Sanders Injecting a swarm of nanoparticles into the blood of someone who has suffered a brain injury may one day help to limit the damage — if experimental results in mice can be translated to humans. In mice, these nanoparticles seemed to reduce dangerous swelling by distracting immune cells from rushing to an injured brain. The results, described online January 10 in the Annals of Neurology, hint that the inflammation-fighting nanoparticles might someday make powerful medicine, says John Kessler, a neurologist at Northwestern Medicine in Chicago. “All the data we have now suggest that they’re going to be safe, and they’re likely to work” for people, Kessler says. “But we don’t know that yet.” After an injury, tissue often swells as immune cells flock to the damage. Swelling of the brain can be dangerous because the brain is contained within the skull and “there’s no place to go,” Kessler says. The resulting pressure can be deadly. But nanoparticles might serve as an immune-cell distraction, the results in mice suggest. Two to three hours after a head injury, mice received injections of tiny biodegradable particles made of an FDA-approved polymer — the same sort that’s used in some dissolving sutures. Instead of rushing toward the brain, immune cells called monocytes began turning their sights on these invaders. These monocytes engulfed the nanoparticles, and the cells and their cargo got packed off to the spleen for elimination, the researchers found. Because these nanoparticles are quickly taken out of circulation, the researchers injected the mice again one and two days later, in an effort to ease inflammation that might crop back up in the days after the injury. © Society for Science & the Public 2000–2020

Keyword: Brain Injury/Concussion
Link ID: 27022 - Posted: 02.05.2020

Joanna McKittrick, Jae-Young Jung Slamming a beak against the trunk of a tree would seem like an activity that would cause headaches, jaw aches and serious neck and brain injuries. Yet woodpeckers can do this 20 times per second and suffer no ill effects. Woodpeckers are found in forested areas worldwide, except in Australia. These birds have the unusual ability to use their beaks to hammer into the trunks of trees to make holes to extract insects and sap. Even more impressive, they do this without hurting themselves. We are materials scientists who study biological substances like bones, skins, feathers and shells found in nature. We are interested in the skull and tongue bone structure of woodpeckers, because we think their unusual anatomy could yield insights that could help researchers develop better protective head gear for humans. Woodpeckers endure many high-impact shocks to their heads as they peck. They have strong tail feathers and claws that help them keep their balance as their head moves toward the tree trunk at 7 meters – 23 feet – per second. Then, when their beak strikes, their heads slow down at about 1,200 times the force of gravity (g). All of this occurs without the woodpecker sustaining concussions or brain damage. A concussion is a form of traumatic brain injury caused by a blow to the head. It is a common occurrence and happens frequently during contact sports like football or hockey. Repeated traumatic brain injury eventually causes a progressive brain disorder, chronic traumatic encephalopathy (CTE), which is irreversible and results in symptoms such as memory loss, depression, impulsivity, aggressiveness and suicidal behavior. The National Football League says concussions in football players occur at 80 g. So how do woodpeckers survive repeated 1,200 g impacts without harming their brain? © 2010–2020, The Conversation US, Inc.
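To get a feel for what those figures imply, a back-of-envelope calculation (assuming constant deceleration, which the article does not state) converts the 7 m/s impact speed and roughly 1,200 g deceleration into a stopping time and distance:

```python
# Back-of-envelope check (assumes constant deceleration; input numbers from the article)
g = 9.8                      # m/s^2
v = 7.0                      # impact speed of the woodpecker's head, m/s
a = 1200 * g                 # deceleration, roughly 1.18e4 m/s^2
t_stop = v / a               # time to stop: about 0.6 milliseconds
d_stop = v**2 / (2 * a)      # stopping distance: about 2 millimeters
print(f"stop time ~{t_stop * 1e3:.2f} ms, stopping distance ~{d_stop * 1e3:.1f} mm")
```

On these simplified assumptions the head stops in well under a millisecond over roughly two millimeters, which helps explain why the energy-absorbing skull and tongue-bone anatomy described above interests materials scientists.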

Keyword: Brain Injury/Concussion; Evolution
Link ID: 27014 - Posted: 02.01.2020

By Elizabeth Pennisi It’s been a bad couple of weeks for behavioral ecologist Jonathan Pruitt—the holder of one of the prestigious Canada 150 Research Chairs—and it may get a lot worse. What began with questions about data in one of Pruitt’s papers has flared into a social media–fueled scandal in the small field of animal personality research, with dozens of papers on spiders and other invertebrates being scrutinized by scores of students, postdocs, and other co-authors for problematic data. Already, two papers co-authored by Pruitt, now at McMaster University, have been retracted for data anomalies; Biology Letters is expected to expunge a third within days. And the more Pruitt’s co-authors look, the more potential data problems they find. All papers using data collected or curated by Pruitt, a highly productive researcher who specialized in social spiders, are coming under scrutiny, and those in his field predict there will be many retractions. The furor has even earned a Twitter hashtag—#PruittData. Yet even one of the researchers who initially probed Pruitt’s data cautions that what has happened remains unclear. “There is no hard evidence that [Pruitt’s] data are fabricated,” says behavioral ecologist Niels Dingemanse of Ludwig Maximilian University of Munich (LMU). © 2019 American Association for the Advancement of Science.

Keyword: Emotions; Evolution
Link ID: 27012 - Posted: 02.01.2020

By Leah Shaffer Football’s concussion crisis has been part of the NFL for almost two decades. But the pros aren’t the only ones reevaluating their relationship with the game. Now, studies are finding that parents of younger children are increasingly concerned about the long-term impacts of playing football. A national survey from 2015 found that 25 percent of parents do not let their kids play contact sports due to fear of concussions, while an Aspen Institute report recently found that participation in tackle football declined by 12 percent among children ages 6 to 12 between 2016 and 2017. The research into the risks of youth football is still taking shape, and there’s disagreement about just how universal and severe the risks are. Some researchers think football is dangerous for everybody; others are finding evidence that some kids might be more predisposed to health consequences than others. In the last two years, some researchers have shown that head hits in youth sports increase the risk of developing chronic traumatic encephalopathy, or CTE, an untreatable degenerative brain disease with symptoms ranging from memory loss to progressive dementia. Other studies have shown that the longer a person plays football, the higher the risk they have for developing symptoms associated with CTE. So, case closed, right? No — football is not the only risk factor in developing symptoms of CTE. The same study that found an association between repetitive head impact and dementia in CTE also found that cardiovascular disease and dementia in CTE were correlated. And a separate study of some 10,000 people found no association between participation in contact sports and later cognitive decline or increase in symptoms of depression. © 2020 ABC News Internet Ventures

Keyword: Brain Injury/Concussion; Development of the Brain
Link ID: 27010 - Posted: 01.31.2020

By Aimee Cunningham A concussion diagnosis depends upon a careful assessment of symptoms. Now the largest study to date of sports-related concussion points to a potential medical assist when evaluating a college athlete for this injury. Certain proteins in the blood are elevated after a concussion, researchers report online January 24 in JAMA Network Open. That discovery may one day help with distinguishing athletes who have suffered this brain injury from those who haven’t. Researchers took blood samples pre- and post-injury from 264 college athletes who had concussions while playing football, rugby and other contact sports from 2015 to mid-2018. Blood levels for three proteins were higher than they were before the injury occurred, the researchers found. Each of the three proteins can serve as a sign that damage has occurred to a different type of brain cell, says Michael McCrea, a neuropsychologist at the Medical College of Wisconsin in Milwaukee. Glial fibrillary acidic protein is released in response to injury to glial cells, which provide support to nerve cells in the brain. Ubiquitin C-terminal hydrolase-L1 signals that nerve cells have been injured, and tau is a sign of damage to axons, which transmit nerve impulses. These proteins have been evaluated in past research as potential markers of more severe traumatic brain injury. McCrea’s team also measured these proteins in 138 athletes who played contact sports but were not concussed, and in 102 athletes who did not have the injury and played noncontact sports. The protein levels for these two groups remained steady throughout the study. If there had been large variability in the protein levels in non-concussed athletes, McCrea says, that would have undermined the association between the proteins and concussion. © Society for Science & the Public 2000–2020

Keyword: Brain Injury/Concussion
Link ID: 26997 - Posted: 01.27.2020

By Will Hobson In 2017, Bennet Omalu traveled the globe to accept a series of honors and promote his autobiography, “Truth Doesn’t Have A Side.” In a visit to an Irish medical school, he told students he was a “nobody” who “discovered a disease in America’s most popular sport.” In an appearance on a religious cable TV show, he said he named the disease chronic traumatic encephalopathy, or CTE, because “it sounded intellectually sophisticated, with a very good acronym.” And since his discovery, Omalu told Sports Illustrated, researchers have uncovered evidence that shows adolescents who participate in football, hockey, wrestling and mixed martial arts are more likely to drop out of school, become addicted to drugs, struggle with mental illness, commit violent crimes and kill themselves. A Nigerian American pathologist portrayed by Will Smith in the 2015 film, “Concussion,” Omalu is partly responsible for the most important sports story of the 21st century. Since 2005, when Omalu first reported finding widespread brain damage in a former NFL player, concerns about CTE have inspired a global revolution in concussion safety and fueled an ongoing existential crisis for America’s most popular sport. Omalu’s discovery — initially ignored and then attacked by NFL-allied doctors — inspired an avalanche of scientific research that forced the league to acknowledge a link between football and brain disease. Nearly 15 years later, Omalu has withdrawn from the CTE research community and remade himself as an evangelist, traveling the world selling his frightening version of what scientists know about CTE and contact sports. In paid speaking engagements, expert witness testimony and in several books he has authored, Omalu portrays CTE as an epidemic and himself as a crusader, fighting against not just the NFL but also the medical science community, which he claims is too corrupted to acknowledge clear-cut evidence that contact sports destroy lives.

Keyword: Brain Injury/Concussion
Link ID: 26986 - Posted: 01.23.2020

Hannah Devlin Science correspondent The death in 2002 of the former England and West Bromwich Albion striker Jeff Astle from degenerative brain disease placed the spotlight firmly on the possibility of a link between heading footballs and the risk of dementia. The coroner at the inquest ruled that Astle, 59, died from an “industrial disease” brought on by the repeated trauma of headers, and a later examination of Astle’s brain appeared to bear out this conclusion. At that time there was sparse scientific data on the issue, but since then the balance of evidence has steadily tipped further in favour of a link. It has been shown that even single episodes of concussion can have lifelong consequences. A 2016 study based on health records of more than 100,000 people in Sweden found that after a single diagnosed concussion people were more likely to have mental health problems and less likely to graduate from high school and college. Other research has shown that people who are in prison or homeless are more likely to have had a past experience of concussion. In 2017, researchers from University College London examined postmortem the brains of six former footballers who had developed dementia. They found signs of brain injury called chronic traumatic encephalopathy (CTE) in four cases. Last year a study by a team at Glasgow University found that former professional footballers were three and a half times more likely to die from dementia and other serious neurological diseases. The study was the largest ever, based on the health records of 7,676 ex-players and 23,000 members of the public, and was possibly the trigger for the Scottish FA’s plan to follow US soccer in banning heading the ball for young players. © 2020 Guardian News & Media Limited

Keyword: Brain Injury/Concussion
Link ID: 26969 - Posted: 01.17.2020

There are differences in the way English and Italian speakers are affected by dementia-related language problems, a small study suggests. While English speakers had trouble pronouncing words, Italian speakers came out with shorter, simpler sentences. The findings could help ensure accurate diagnoses for people from different cultures, the researchers said. Diagnostic criteria are often based on English-speaking patients. In the University of California study of 20 English-speaking patients and 18 Italian-speaking patients, all had primary progressive aphasia - a neurodegenerative disease which affects areas of the brain linked to language. It is a feature of Alzheimer's disease and other dementia disorders. Brain scans and tests showed similar levels of cognitive function in people in both language groups. But when the researchers asked participants to complete a number of linguistic tests, they picked up obvious differences between the two groups in the challenges they faced. "We think this is specifically because the consonant clusters that are so common in English pose a challenge for a degenerating speech-planning system," said study author Maria Luisa Gorno-Tempini, professor of neurology and psychiatry. "In contrast, Italian is easier to pronounce, but has much more complex grammar, and this is how Italian speakers with [primary progressive aphasia] tend to run into trouble." As a result, the English speakers tended to speak less while the Italian speakers had fewer pronunciation problems, but simplified what they did say. English is a Germanic language while Italian is a Romance language, derived from Latin along with French, Spanish and Portuguese. The researchers, writing in Neurology, are concerned that many non-native English speakers may not be getting the right diagnosis "because their symptoms don't match what is described in clinical manuals based on studies of native English speakers". The San Francisco research team says it now wants to repeat the research in larger groups of patients, and look for differences between speakers of other languages, such as Chinese and Arabic. © 2020 BBC

Keyword: Alzheimers; Language
Link ID: 26954 - Posted: 01.13.2020

By Kelly Servick The dark, thumping cavern of an MRI scanner can be a lonely place. How can scientists interested in the neural activity underlying social interactions capture an engaged, conversing brain while its owner is so isolated? Two research teams are advancing a curious solution: squeezing two people into one scanner. One such MRI setup is under development with new funding from the U.S. National Science Foundation (NSF), and another has undergone initial testing described in a preprint last month. These designs have yet to prove that their scientific payoff justifies their cost and complexity, plus the requirement that two people endure a constricted almost-hug, in some cases for 1 hour or more. But the two groups hope to open up new ways to study how brains exchange subtle social and emotional cues bound up in facial expressions, eye contact, and physical touch. The tool could “greatly expand the range of investigations possible,” says Winrich Freiwald, a neuroscientist at Rockefeller University. “This is really exciting.” Functional magnetic resonance imaging (fMRI), which measures blood oxygenation to estimate neural activity, is already a common tool for studying social processes. But compared with real social interaction, these experiments are “reduced and artificial,” says Lauri Nummenmaa, a neuroscientist at the University of Turku in Finland. Participants often look at static photos of faces or listen to recordings of speech while lying in a scanner. But photos can’t show the subtle flow of emotions across people’s faces, and recordings don’t allow the give and take of real conversation. © 2019 American Association for the Advancement of Science

Keyword: Brain imaging
Link ID: 26949 - Posted: 01.10.2020

By Lisa Sanders, M.D. The 67-year-old woman had just flown back to her old hometown, Eugene, Ore., to pick up one more load of boxes to move them to her new hometown, Homer, Alaska. As usual, the shuttle to long-term parking was nowhere in sight, so she pulled out the handles of her bags and wheeled them down the now-familiar airport road. It was a long walk — maybe half a mile — but it was a beautiful afternoon for it. A lone woman walking down this rarely used road in the airport caught the attention of Diana Chappell, an off-duty emergency medical technician, on her way to catch her own flight. She watched as the woman approached a building where some airport E.M.T.s were stationed. Suddenly the woman stopped. She rose to her toes and turned gracefully, then toppled over like a felled tree and just lay there. Chappell jumped out of the car and ran to the woman. She was awake but couldn’t sit up. Chappell helped her move to the side of the road and took a quick visual survey. The woman had a scrape over her left eye where her glasses had smashed into her face. Her left knee was bleeding, and her left wrist was swelling. She’d dropped the handle of one of her rolling bags, the woman explained. When she tried to pick it up, she fell. But she felt fine now. As she spoke, Chappell noticed that her speech was slightly slurred and that the left side of her mouth wasn’t moving normally. “I don’t know you, but your speech sounds a little slurred,” she said. “Have you been drinking?” Not at all, the woman answered — surprised by the question. Chappell introduced herself, then asked the woman if she could do a few quick tests to make sure she was O.K. Chappell asked her to smile, but the left side of the patient’s mouth did not cooperate; she asked her to shrug her shoulders, and the left side wouldn’t stay up. You need to go to the hospital, she told the woman. The woman protested; she felt fine. At least let me call my E.M.T. pals to check your blood pressure, Chappell insisted. After a fall like that, it could be high. The woman reluctantly agreed, and Chappell called her colleagues. The woman on the ground was embarrassed by the flashing lights of the emergency vehicle but allowed her blood pressure to be taken. It was sky-high. She really did need to go to the hospital. © 2020 The New York Times Company

Keyword: Stroke
Link ID: 26927 - Posted: 01.02.2020

By Catherine Matacic Falling in love is never easy. But do it in a foreign language, and complications pile up quickly, from your first fumbling attempts at deep expression to the inevitable quarrel to the family visit punctuated by remarks that mean so much more than you realize. Now, a study of two dozen terms related to emotion in nearly 2500 languages suggests those misunderstandings aren’t all in your head. Instead, emotional concepts like love, shame, and anger vary in meaning from culture to culture, even when we translate them into the same words. “I wish I had thought of this,” says Lisa Feldman Barrett, a neuroscientist and psychologist at Northeastern University in Boston. “It’s a very, very well-reasoned, clever approach.” People have argued about emotions since the ancient Greeks. Aristotle suggested they were essential to virtue. The stoics called them antithetical to reason. And in his “forgotten” masterpiece, The Expression of the Emotions in Man and Animals, Charles Darwin wrote that they likely had a single origin. He thought every culture the world over shared six basic emotions: happiness, sadness, fear, anger, surprise, and disgust. Since then, psychologists have looked for traces of these emotions in scores of languages. And although one common experiment, which asks participants to identify emotions from photographs of facial expressions, has led to many claims of universality, critics say an overreliance on concepts from Western, industrialized societies dooms such attempts from the start. © 2019 American Association for the Advancement of Science.

Keyword: Emotions; Language
Link ID: 26907 - Posted: 12.21.2019

Thomas R. Sawallis and Louis-Jean Boë Sound doesn’t fossilize. Language doesn’t either. Even when writing systems have developed, they’ve represented full-fledged and functional languages. Rather than preserving the first baby steps toward language, they’re fully formed, made up of words, sentences and grammar carried from one person to another by speech sounds, like any of the perhaps 6,000 languages spoken today. So if you believe, as we linguists do, that language is the foundational distinction between humans and other intelligent animals, how can we study its emergence in our ancestors? Happily, researchers do know a lot about language – words, sentences and grammar – and speech – the vocal sounds that carry language to the next person’s ear – in living people. So we should be able to compare language with less complex animal communication. And that’s what we and our colleagues have spent decades investigating: How do apes and monkeys use their mouth and throat to produce the vowel sounds in speech? Spoken language in humans is an intricately woven string of syllables with consonants appended to the syllables’ core vowels, so mastering vowels was a key to speech emergence. We believe that our multidisciplinary findings push back the date for that crucial step in language evolution by as much as 27 million years. The sounds of speech Say “but.” Now say “bet,” “bat,” “bought,” “boot.” The words all begin and end the same. It’s the differences among the vowel sounds that keep them distinct in speech. © 2010–2019, The Conversation US, Inc.

Keyword: Language; Evolution
Link ID: 26893 - Posted: 12.12.2019

By Nicholas Bakalar Sleeping a lot may increase the risk for stroke, a new study has found. Chinese researchers followed 31,750 men and women, whose average age was 62, for an average of six years, using physical examinations and self-reported data on sleep. They found that compared with sleeping (or being in bed trying to sleep) seven to eight hours a night, sleeping nine or more hours increased the relative risk for stroke by 23 percent. Sleeping less than six hours a night had no effect on stroke incidence. The study, in Neurology, also found that midday napping for more than 90 minutes a day was associated with a 25 percent increased risk for stroke compared with napping 30 minutes or less. And people who both slept more than nine hours and napped more than 90 minutes were 85 percent more likely to have a stroke. The study controlled for smoking, drinking, exercise, family history of stroke, body mass index and other health and behavioral characteristics. The reason for the association is unclear, but long sleep duration is associated with increased inflammation, unfavorable lipid profiles and increased waist circumference, factors known to increase cardiovascular risk. © 2019 The New York Times Company

Keyword: Stroke; Sleep
Link ID: 26890 - Posted: 12.12.2019

By Viorica Marian Psycholinguistics is a field at the intersection of psychology and linguistics, and one of its recent discoveries is that the languages we speak influence our eye movements. For example, English speakers who hear candle often look at a candy because the two words share their first syllable. Research with speakers of different languages revealed that bilingual speakers not only look at words that share sounds in one language but also at words that share sounds across their two languages. When Russian-English bilinguals hear the English word marker, they also look at a stamp, because the Russian word for stamp is marka. Even more stunning, speakers of different languages differ in their patterns of eye movements when no language is used at all. In a simple visual search task in which people had to find a previously seen object among other objects, their eyes moved differently depending on what languages they knew. For example, when looking for a clock, English speakers also looked at a cloud. Spanish speakers, on the other hand, when looking for the same clock, looked at a present, because the Spanish names for clock and present—reloj and regalo—overlap at their onset. The story doesn’t end there. Not only do the words we hear activate other, similar-sounding words—and not only do we look at objects whose names share sounds or letters even when no language is heard—but the translations of those names in other languages become activated as well in speakers of more than one language. For example, when Spanish-English bilinguals hear the word duck in English, they also look at a shovel, because the translations of duck and shovel—pato and pala, respectively—overlap in Spanish. © 2019 Scientific American

Keyword: Language; Attention
Link ID: 26875 - Posted: 12.06.2019

By Virginia Morell Say “sit!” to your dog, and—if he’s a good boy—he’ll likely plant his rump on the floor. But would he respond correctly if the word were spoken by a stranger, or someone with a thick accent? A new study shows he will, suggesting dogs perceive spoken words in a sophisticated way long thought unique to humans. “It’s a very solid and interesting finding,” says Tecumseh Fitch, an expert on vertebrate communication at the University of Vienna who was not involved in the research. The way we pronounce words changes depending on our sex, age, and even social rank. Some as-yet-unknown neural mechanism enables us to filter out differences in accent and pronunciation, helping us understand spoken words regardless of the speaker. Animals like zebra finches, chinchillas, and macaques can be trained to do this, but until now only humans were shown to do this spontaneously. In the new study, Holly Root-Gutteridge, a cognitive biologist at the University of Sussex in Brighton, U.K., and her colleagues ran a test that others have used to show dogs can recognize other dogs from their barks. The researchers filmed 42 dogs of different breeds as they sat with their owners near an audio speaker that played six monosyllabic, noncommand words with similar sounds, such as “had,” “hid,” and “who’d.” The words were spoken—not by the dog’s owner—but by several strangers, men and women of different ages and with different accents. © 2019 American Association for the Advancement of Science.

Keyword: Language; Evolution
Link ID: 26866 - Posted: 12.04.2019

Nicola Davis Dolphins, like humans, have a dominant right-hand side, according to research. About 90% of humans are right-handed, but we are not the only animals that show such preferences: gorillas tend to be right-handed, kangaroos are generally southpaws, and even cats have preferences for a particular side – although which is favoured appears to depend on their sex. Now researchers have found common bottlenose dolphins appear to have an even stronger right-side bias than humans. “I didn’t expect to find it in that particular behaviour, and I didn’t expect to find such a strong example,” said Dr Daisy Kaplan, co-author of the study from the Dolphin Communication Project, a non-profit organisation in the US. Researchers studying common bottlenose dolphins in the Bahamas say the preference shows up in crater feeding, whereby dolphins swim close to the ocean floor, echolocating for prey, before shoving their beaks into the sand to snaffle a meal. Writing in the journal Royal Society Open Science, Kaplan and colleagues say the animals make a sharp and sudden turn before digging in with their beaks. Crucially, however, they found this turn is almost always to the left, with the same direction taken in more than 99% of the 709 turns recorded between 2012 and 2018. The researchers say the findings indicate a right-side bias, since a left turn keeps a dolphin’s right eye and right side close to the ocean floor. The team found only four turns were made to the right and all of these were made by the same dolphin, which had an oddly shaped right pectoral fin. However, Kaplan said it was unlikely this fin was behind the right turns: two other dolphins had an abnormal or missing right fin yet still turned left.

Keyword: Laterality; Evolution
Link ID: 26861 - Posted: 12.02.2019

By James Gorman TEMPE, Ariz. — Xephos is not the author of “Dog Is Love: Why and How Your Dog Loves You,” one of the latest books to plumb the nature of dogs, but she helped inspire it. And as I scratched behind her ears, it was easy to see why. First, she fixed on me with imploring doggy eyes, asking for my attention. Then, every time I stopped scratching she nudged her nose under my hand and flipped it up. I speak a little dog, but the message would have been clear even if I didn’t: Don’t stop. We were in the home office of Clive Wynne, a psychologist at Arizona State University who specializes in dog behavior. He belongs to Xephos, a mixed breed that the Wynne family found in a shelter in 2012. Dr. Wynne’s book is an extended argument about what makes dogs special — not how smart they are, but how friendly they are. Xephos’ shameless and undiscriminating affection affected both his heart and his thinking. As Xephos nose-nudged me again, Dr. Wynne was describing genetic changes that occurred at some point in dog evolution that he says explain why dogs are so sociable with members of other species. “Hey,” Dr. Wynne said to her as she tilted her head to get the maximum payoff from my efforts, “how long have you had these genes?” No one disputes the sociability of dogs. But Dr. Wynne doesn’t agree with the scientific point of view that dogs have a unique ability to understand and communicate with humans. He thinks they have a unique capacity for interspecies love, a word that he has decided to use, throwing aside decades of immersion in scientific jargon. © 2019 The New York Times Company

Keyword: Sexual Behavior; Language
Link ID: 26848 - Posted: 11.23.2019

Jon Hamilton When we hear a sentence, or a line of poetry, our brains automatically transform the stream of sound into a sequence of syllables. But scientists haven't been sure exactly how the brain does this. Now, researchers from the University of California, San Francisco, think they've figured it out. The key is detecting a rapid increase in volume that occurs at the beginning of a vowel sound, they report Wednesday in Science Advances. "Our brain is basically listening for these time points and responding whenever they occur," says Yulia Oganian, a postdoctoral scholar at UCSF. The finding challenges a popular idea that the brain monitors speech volume continuously to detect syllables. Instead, it suggests that the brain periodically "samples" spoken language looking for specific changes in volume. The finding is "in line" with a computer model designed to simulate the way a human brain decodes speech, says Oded Ghitza, a research professor in the biomedical engineering department at Boston University who was not involved in the study. Detecting each rapid increase in volume associated with a syllable gives the brain, or a computer, an efficient way to deal with the "stream" of sound that is human speech, Ghitza says. And syllables, he adds, are "the basic Lego blocks of language." Oganian's study focused on a part of the brain called the superior temporal gyrus. "It's an area that has been known for about 150 years to be really important for speech comprehension," Oganian says. "So we knew if you can find syllables somewhere, it should be there." The team studied a dozen patients preparing for brain surgery to treat severe epilepsy. As part of the preparation, surgeons had placed electrodes over the area of the brain involved in speech. © 2019 npr
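The acoustic landmark described here, a rapid rise in loudness near a vowel onset, can be sketched with a few lines of signal processing. This is only an illustration of the general idea, not the study's analysis code; the file name, smoothing window, and peak thresholds are assumptions.

```python
# Minimal sketch (an illustration, not the study's code): find points where
# speech loudness rises fastest, the landmarks the article describes near
# vowel onsets. File name, smoothing window, and thresholds are assumptions.
import numpy as np
from scipy.io import wavfile
from scipy.signal import hilbert, find_peaks

rate, audio = wavfile.read("speech_sample.wav")        # hypothetical mono file
audio = audio.astype(float) / np.max(np.abs(audio))

# Amplitude envelope: magnitude of the analytic signal, then light smoothing
envelope = np.abs(hilbert(audio))
win = int(0.02 * rate)                                 # ~20 ms smoothing window
envelope = np.convolve(envelope, np.ones(win) / win, mode="same")

# Rate of change of the envelope; sharp positive peaks sit near vowel onsets
env_rate = np.gradient(envelope) * rate                # amplitude change per second
peaks, _ = find_peaks(env_rate,
                      height=np.percentile(env_rate, 95),
                      distance=int(0.1 * rate))        # >= 100 ms apart (assumption)

print("candidate syllable-onset times (s):", peaks / rate)
```

Peaks in the envelope's rate of change yield roughly one candidate landmark per syllable, which is the kind of discrete time point the researchers suggest the superior temporal gyrus is listening for.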

Keyword: Language; Hearing
Link ID: 26841 - Posted: 11.21.2019