Chapter 15. Language and Lateralization
By Jennifer Couzin-Frankel Even a mild concussion can cause disconcerting and sometimes lasting symptoms, such as trouble concentrating and dizziness. But can it make someone more likely to commit a crime? After all, a disproportionate number of people in the criminal justice system previously suffered a traumatic brain injury (TBI). But according to new research into the medical and juvenile justice records of Danish teenagers who suffered a blow to the head as children, such injuries don’t cause criminal behavior. Although TBI and criminality often travel together, the researchers found that in this Danish population it’s a case of correlation, not causation. “I think this study very clearly indicates that you can’t just [say], ‘Hey, my kid has a mild TBI, he or she is screwed,’” says Joseph Schwartz, a criminologist at Florida State University who has studied the issue in juveniles and adults. At the same time, he cautions that there are important variables this study wasn’t designed to capture, such as the treatment received, the effect of repeat TBIs, and the circumstances surrounding the injury. All of these, he says, could influence criminal behavior in some people. Beyond showing high rates of past TBI among those charged with or convicted of crimes, research into this topic has been limited. Studies have found that mild TBI is associated with later behavioral problems, including impulsivity and inattentiveness, which are also linked with criminal behavior. At the same time, it’s well known that “the risk factors in the child and the family for TBIs are the same as the risk factors for delinquency,” including poverty and parental substance abuse, says Sheilagh Hodgins, a clinical psychologist at the University of Montreal. She notes, too, that impulsivity and attention and conduct disorders heighten the risk of sustaining a mild TBI in the first place. © 2024 American Association for the Advancement of Science.
Keyword: Brain Injury/Concussion; Aggression
Link ID: 29503 - Posted: 10.02.2024
By Joanna Thompson, Hakai Magazine From January to May each year, Qeqertarsuaq Tunua, a large bay on Greenland’s west coast, teems with plankton. Baleen whales come to feast on the bounty, and in 2010, two bowhead whales entered the bay to gorge. As the pair came within 100 kilometers (about 60 miles) of one another, they were visually out of range, but could likely still hear one another. That’s when something extraordinary happened: They began to synchronize their dives. Researchers had never scientifically documented this behavior before, and the observation offers potential proof for a 53-year-old theory. Baleen whales are often thought of as solitary — islands unto themselves. However, some scientists believe they travel in diffuse herds, communicating over hundreds of kilometers. Legendary biologist Roger Payne and oceanographer Douglas Webb first floated the concept of acoustic herd theory (or should it be heard theory?) in 1971. This story is from Hakai Magazine, an online publication about science and society in coastal ecosystems, and is republished here with permission. Payne, who helped discover and record humpback whale song a few years prior, was struck by the fact that many toothed cetaceans such as killer whales and dolphins are highly social and move together in tight-knit family groups. These bands provide safety from predators and allow the animals to raise their young communally. Payne speculated that the larger baleen whales might travel in groups, too, but on a broader geographic scale. And perhaps the behemoths signaled acoustically to keep in touch across vast distances. Webb and Payne’s original paper on acoustic herd theory demonstrated that fin whale vocalizations — low-frequency sounds that carry long distances — could theoretically travel an astonishing 700 kilometers (over 400 miles) in certain areas of the ocean. 
However, it’s been easier to show that a whale is making a call than to prove the recipient is a fellow cetacean hundreds of kilometers away, says Susan Parks, a behavioral ecologist at Syracuse University in New York who studies animal acoustics.
Keyword: Animal Communication; Evolution
Link ID: 29502 - Posted: 10.02.2024
By Katarina Zimmer If we could talk with whales, should we? When scientists in Alaska recently used pre-recorded whale sounds to engage in a 20-minute back-and-forth with a local humpback whale, some hailed it as the first “conversation” with the cetaceans. But the interaction between an underwater speaker mounted on the research boat and the whale, which was described last year in the journal PeerJ, also stimulated a broader discussion around the ethics of communicating with other species. After the whale circled the boat for a while, the puffs from her blowhole sounded wheezier than usual, suggesting to the scientists aboard that she was aroused in some way—perhaps curious, frustrated, or bored. Nevertheless, Twain—as scientists had nicknamed her—continued to respond to the speaker’s calls until they stopped. Twain called back three more times, but the speaker on the boat had fallen silent. She swam away. Scientists have used recorded calls to study animal behavior and communication for decades. But new efforts—and technology such as artificial intelligence—are striving not just to deftly mimic animal communication, but also to more deeply understand it. And while the potential extension of this research that has most captured public excitement—producing our own coherent whale sounds and meaningfully communicating with them—is still firmly in the realm of science fiction, this kind of research might just bring us a small step closer. The work to decipher whale vocalizations was inspired by the research on humpback whale calls by the biologist Roger Payne and played an important role in protecting the species. In the 1960s, Payne discovered that male humpbacks sing—songs so intricate and powerful it was hard to imagine they have no deeper meaning. His album of humpback whale songs became an anthem to the “Save the Whales” movement and helped motivate the creation of the Marine Mammal Protection Act in 1972 in the United States. © 2024 NautilusNext Inc.
Keyword: Animal Communication; Evolution
Link ID: 29501 - Posted: 10.02.2024
By Emily Anthes The common marmoset is a certified chatterbox. The small, South American monkey uses an array of chirps, whistles and trills to defend its territory, flag the discovery of food, warn of impending danger and find family members hidden by dense forest foliage. Marmosets also use distinct calls to address different individuals, in much the same way that people use names, new research suggests. The findings make them the first nonhuman primates known to use name-like vocal labels for individuals. Until this year, only humans, dolphins and parrots were known to use names when communicating. In June, however, scientists reported that African elephants appeared to use names, too; researchers made the discovery by using artificial intelligence-powered software to detect subtle patterns in the elephants’ low-pitched rumbles. In the new study, which was published in Science last month, a different team of researchers also used A.I. to uncover name-like labels hiding in the calls of common marmosets. The discovery, which is part of a burgeoning scientific effort to use sophisticated computational tools to decode animal communication, could help shed light on the origins of language. And it raises the possibility that name-bestowing behavior may be more widespread in the animal kingdom than scientists once assumed. “I think what it’s telling us is that it’s likely that animals actually have names for each other a lot more than maybe we ever conceived,” said George Wittemyer, a conservation biologist at Colorado State University who led the recent elephant study but was not involved in the marmoset research. “We just never were really looking properly.” Marmosets are highly social, forming long-term bonds with their mates and raising their offspring cooperatively in small family groups. They produce high-pitched, whistle-like “phee calls” to communicate with other marmosets who might be hidden among the treetops. 
“They start to exchange phee calls when they lose eyesight of each other,” said David Omer, a neuroscientist at the Hebrew University of Jerusalem who led the new study. © 2024 The New York Times Company
Keyword: Animal Communication; Language
Link ID: 29480 - Posted: 09.14.2024
By Darren Incorvaia Imagine being a male firefly when suddenly the telltale flashing of a female catches your eye. Enthralled, you speed toward love’s embrace — only to fly headfirst into a spider’s web. That flashy female was in fact another male firefly, himself trapped in the web, and the spider may have manipulated his light beacon to lure you in. This high-stakes drama plays out nightly in the Jiangxia District of Wuhan, China. There, researchers have found that male fireflies caught in the webs of the orb-weaver spider Araneus ventricosus flash their light signals more like females do, which leads other males to get snagged in the same web. And weirdly, the spiders might be making them do this, almost like hunters blowing a duck call to attract prey. “The idea that a spider can manipulate the signaling of a prey species is very intriguing,” said Dinesh Rao, a spider biologist at the University of Veracruz in Mexico. “They show clearly that a trapped firefly in the web attracts more fireflies.” Dr. Rao was not involved in the research, but served as a peer reviewer of the paper published Monday in the journal Current Biology. Xinhua Fu, a zoologist at Huazhong Agricultural University in Wuhan, was in the field surveying firefly diversity when he first noticed that male fireflies seemed to end up ensnared in orb-weaver spider webs more often than females. Wondering if the spiders were somehow specifically attracting males, he teamed up with Daiqin Li and Shichang Zhang, animal behavior experts from nearby Hubei University, to get to the bottom of this sticky mystery. Working near paddy fields and ponds, the researchers observed the flashing of trapped male fireflies and saw that it more closely resembled that of females than of free-flying males. 
Trapped males flashed using only one of their two bioluminescent lantern organs, and they made one flash at a time rather than multiple flashes in quick succession, the same lighting signals females send when trying to attract males. © 2024 The New York Times Company
Keyword: Animal Communication; Sexual Behavior
Link ID: 29443 - Posted: 08.21.2024
Julia Kollewe Oran Knowlson, a British teenager with a severe type of epilepsy called Lennox-Gastaut syndrome, became the first person in the world to trial a new brain implant last October, with phenomenal results – his daytime seizures were reduced by 80%. “It’s had a huge impact on his life and has prevented him from having the falls and injuring himself that he was having before,” says Martin Tisdall, a consultant paediatric neurosurgeon at Great Ormond Street Hospital (Gosh) in London, who implanted the device. “His mother was talking about how he’s had such an improvement in his quality of life, but also in his cognition: he’s more alert and more engaged.” Oran’s neurostimulator sits under the skull and sends constant electrical signals deep into his brain with the aim of blocking abnormal impulses that trigger seizures. The implant, called a Picostim and about the size of a mobile phone battery, is recharged via headphones and operates differently between day and night. “The device has the ability to record from the brain, to measure brain activity, and that allows us to think about ways in which we could use that information to improve the efficacy of the stimulation that the kids are getting,” says Tisdall. “What we really want to do is to deliver this treatment on the NHS.” As part of a pilot, three more children with Lennox-Gastaut syndrome will be fitted with the implant in the coming weeks, followed by a full trial with 22 children early next year. If this goes well, the academic sponsors – Gosh and University College London – will apply for regulatory approval. Tim Denison – a professor of engineering science at Oxford University and co-founder and chief engineer of London-based Amber Therapeutics, which developed the implant with the university – hopes the device will be available on the NHS in four to five years’ time, and around the world.
© 2024 Guardian News & Media Limited
Keyword: Robotics; Epilepsy
Link ID: 29442 - Posted: 08.19.2024
By Sara Talpos Nervous system disorders are among the leading causes of death and disability globally. Conditions such as paralysis and aphasia, which affects the ability to understand and produce language, can be devastating to patients and families. Significant investment has been put toward brain research, including the development of new technologies to treat some conditions, said Saskia Hendriks, a bioethicist at the U.S. National Institutes of Health. These technologies may very well improve lives, but they also raise a host of ethical issues. That’s in part because of the unique nature of the brain, said Hendriks. It’s “the seat of many functions that we think are really important to ourselves, like consciousness, thoughts, memories, emotions, perceptions, actions, perhaps identity.” In a June essay in The New England Journal of Medicine, Hendriks and a co-author, Christine Grady, outlined some of the thorny ethical questions related to brain research: What is the best way to protect the long-term interests of people who receive brain implants as part of a clinical trial? As technology gets better at decoding thoughts, how can researchers guard against violations of mental privacy? And what is the best way to prepare for the far-off possibility that consciousness may one day arise from work derived from human stem cells? Hendriks spoke about the essay in a Zoom interview. Our conversation has been edited for length and clarity.
Keyword: Robotics
Link ID: 29441 - Posted: 08.19.2024
By Carl Zimmer After analyzing decades-old videos of captive chimpanzees, scientists have concluded that the animals could utter a human word: “mama.” It’s not exactly the expansive dialogue in this year’s “Kingdom of the Planet of the Apes.” But the finding, published on Thursday in the journal Scientific Reports, may offer some important clues as to how speech evolved. The researchers argue that our common ancestors with chimpanzees had brains already equipped with some of the building blocks needed for talking. Adriano Lameira, an evolutionary psychologist at the University of Warwick in Britain and one of the authors of the study, said that the ability to speak is perhaps the most important feature that sets us apart from other animals. Talking to each other allowed early humans to cooperate and amass knowledge over generations. “It is the only trait that explains why we’ve been able to change the face of the earth,” Dr. Lameira said. “We would be an unremarkable ape without it.” Scientists have long wondered why we can speak and other apes cannot. Beginning in the early 1900s, that curiosity led to a series of odd — and cruel — experiments. A few researchers tried raising apes in their own homes to see if living with humans could lead the young animals to speak. In 1947, for example, the psychologist Keith Hayes and his wife, Catherine, adopted an infant chimpanzee. They named her Viki, and, when she was five months old, they started teaching her words. After two years of training, the couple later claimed, Viki could say “papa,” “mama,” “up” and “cup.” By the 1980s, many scientists had dismissed the experiences of Viki and other adopted apes. For one, separating babies from their mothers was likely traumatic. “It’s not the sort of thing you could fund anymore, and with good reason,” said Axel Ekstrom, a speech scientist at the KTH Royal Institute of Technology in Stockholm. © 2024 The New York Times Company
Keyword: Language; Evolution
Link ID: 29408 - Posted: 07.27.2024
By Cathleen O’Grady Human conversations are rapid-fire affairs, with mere milliseconds passing between one person’s utterance and their partner’s response. This speedy turn taking is universal across cultures—but now it turns out that chimpanzees do it, too. By analyzing thousands of gestures from chimpanzees in five different communities in East Africa, researchers found that the animals take turns while communicating, and do so as quickly as we do. The speedy gestural conversations are also seen across chimp communities, just like in humans, the authors report today in Current Biology. The finding is “very exciting,” says Maël Leroux, an evolutionary biologist at the University of Rennes who was not involved with the work. “Language is the hallmark of our species … and a central feature of language is our ability to take turns.” Finding a similar behavior in our closest living relative, he says, suggests we may have inherited this ability from our shared common ancestor. When chimps gesture—such as reaching out an arm in a begging gesture—they are most often making a request, says Gal Badihi, an animal communication researcher at the University of St Andrews. This can include things such as “groom me,” “give me,” or “travel with me.” Most of the time, the chimp’s partner does the requested behavior. But sometimes, the second chimp will respond with its own gestures instead—for instance, one chimp requesting grooming, and the other indicating where they would like to be groomed, essentially saying “groom me first.” To figure out whether these interactions resemble human turn taking, Badihi and colleagues combed through hundreds of hours of footage from a massive database of chimpanzee gestural interactions recorded by multiple researchers across decades of fieldwork in East Africa. The scientists studied the footage, describing the precise movements each chimp made when gesturing, the response of other chimps, the duration of the gestures, and other details.
© 2024 American Association for the Advancement of Science.
Keyword: Language; Evolution
Link ID: 29403 - Posted: 07.23.2024
By Sara Reardon By eavesdropping on the brains of living people, scientists have created the highest-resolution map yet of the neurons that encode the meanings of various words. The results hint that, across individuals, the brain uses the same standard categories to classify words — helping us to turn sound into sense. The study is based on words only in English. But it’s a step along the way to working out how the brain stores words in its language library, says neurosurgeon Ziv Williams at the Massachusetts Institute of Technology in Cambridge. By mapping the overlapping sets of brain cells that respond to various words, he says, “we can try to start building a thesaurus of meaning”. The brain area called the auditory cortex processes the sound of a word as it enters the ear. But it is the brain’s prefrontal cortex, a region where higher-order brain activity takes place, that works out a word’s ‘semantic meaning’ — its essence or gist. Previous research has studied this process by analysing images of blood flow in the brain, which is a proxy for brain activity. This method allowed researchers to map word meaning to small regions of the brain. But Williams and his colleagues found a unique opportunity to look at how individual neurons encode language in real time. His group recruited ten people about to undergo surgery for epilepsy, each of whom had had electrodes implanted in their brains to determine the source of their seizures. The electrodes allowed the researchers to record activity from around 300 neurons in each person’s prefrontal cortex. © 2024 Springer Nature Limited
Keyword: Language; Brain imaging
Link ID: 29383 - Posted: 07.06.2024
By Dave Philipps David Metcalf’s last act in life was an attempt to send a message — that years as a Navy SEAL had left his brain so damaged that he could barely recognize himself. He died by suicide in his garage in North Carolina in 2019, after nearly 20 years in the Navy. But just before he died, he arranged a stack of books about brain injury by his side, and taped a note to the door that read, in part, “Gaps in memory, failing recognition, mood swings, headaches, impulsiveness, fatigue, anxiety, and paranoia were not who I was, but have become who I am. Each is worsening.” Then he shot himself in the heart, preserving his brain to be analyzed by a state-of-the-art Defense Department laboratory in Maryland. The lab found an unusual pattern of damage seen only in people exposed repeatedly to blast waves. The vast majority of blast exposure for Navy SEALs comes from firing their own weapons, not from enemy action. The damage pattern suggested that years of training intended to make SEALs exceptional was leaving some barely able to function. But the message Lieutenant Metcalf sent never got through to the Navy. No one at the lab told the SEAL leadership what the analysis had found, and the leadership never asked. It was not the first time, or the last. At least a dozen Navy SEALs have died by suicide in the last 10 years, either while in the military or shortly after leaving. A grass-roots effort by grieving families delivered eight of their brains to the lab, an investigation by The New York Times has found. And after careful analysis, researchers discovered blast damage in every single one. It is a stunning pattern with important implications for how SEALs train and fight. But privacy guidelines at the lab and poor communication in the military bureaucracy kept the test results hidden. Five years after Lieutenant Metcalf’s death, Navy leaders still did not know. 
Until The Times told the Navy of the lab’s findings about the SEALs who died by suicide, the Navy had not been informed, the service confirmed in a statement. © 2024 The New York Times Company
Keyword: Brain Injury/Concussion; Depression
Link ID: 29378 - Posted: 07.03.2024
By Carl Zimmer For thousands of years, philosophers have argued about the purpose of language. Plato believed it was essential for thinking. Thought “is a silent inner conversation of the soul with itself,” he wrote. Many modern scholars have advanced similar views. Starting in the 1960s, Noam Chomsky, a linguist at M.I.T., argued that we use language for reasoning and other forms of thought. “If there is a severe deficit of language, there will be severe deficit of thought,” he wrote. As an undergraduate, Evelina Fedorenko took Dr. Chomsky’s class and heard him describe his theory. “I really liked the idea,” she recalled. But she was puzzled by the lack of evidence. “A lot of things he was saying were just stated as if they were facts — the truth,” she said. Dr. Fedorenko went on to become a cognitive neuroscientist at M.I.T., using brain scanning to investigate how the brain produces language. And after 15 years, her research has led her to a startling conclusion: We don’t need language to think. “When you start evaluating it, you just don’t find support for this role of language in thinking,” she said. When Dr. Fedorenko began this work in 2009, studies had found that the same brain regions required for language were also active when people reasoned or carried out arithmetic. But Dr. Fedorenko and other researchers discovered that this overlap was a mirage. Part of the trouble with the early results was that the scanners were relatively crude. Scientists made the most of their fuzzy scans by combining the results from all their volunteers, creating an overall average of brain activity. © 2024 The New York Times Company
Keyword: Language; Consciousness
Link ID: 29376 - Posted: 07.03.2024
Elephants call out to each other using individual names that they invent for their fellow pachyderms, according to a new study. While dolphins and parrots have been observed addressing each other by mimicking the sound of others from their species, elephants are the first non-human animals known to use names that do not involve imitation, the researchers suggested. For the new study published on Monday, a team of international researchers used an artificial intelligence algorithm to analyse the calls of two wild herds of African savanna elephants in Kenya. The research “not only shows that elephants use specific vocalisations for each individual, but that they recognise and react to a call addressed to them while ignoring those addressed to others”, the lead study author, Michael Pardo, said. “This indicates that elephants can determine whether a call was intended for them just by hearing the call, even when out of its original context,” the behavioural ecologist at Colorado State University said in a statement. The researchers sifted through elephant “rumbles” recorded at Kenya’s Samburu national reserve and Amboseli national park between 1986 and 2022. Using a machine-learning algorithm, they identified 469 distinct calls, which included 101 elephants issuing a call and 117 receiving one. Elephants make a wide range of sounds, from loud trumpeting to rumbles so low they cannot be heard by the human ear. Names were not always used in the elephant calls. But when names were called out, it was often over a long distance, and when adults were addressing young elephants. Adults were also more likely to use names than calves, suggesting it could take years to learn this particular talent. The most common call was “a harmonically rich, low-frequency sound”, according to the study in the journal Nature Ecology & Evolution. © 2024 Guardian News & Media Limited
Keyword: Animal Communication; Language
Link ID: 29352 - Posted: 06.11.2024
Ian Sample Science editor Five children who were born deaf now have hearing in both ears after taking part in an “astounding” gene therapy trial that raises hopes for further treatments. The children were unable to hear because of inherited genetic mutations that disrupt the body’s ability to make a protein needed to ensure auditory signals pass seamlessly from the ear to the brain. Doctors at Fudan University in Shanghai treated the children, aged between one and 11, in both ears in the hope they would gain sufficient 3D hearing to take part in conversations and work out which direction sounds were coming from. Within weeks of receiving the therapy, the children had gained hearing, could locate the sources of sounds, and recognised speech in noisy environments. Two of the children were recorded dancing to music, the researchers reported in Nature Medicine. Dr Zheng-Yi Chen, a scientist at Massachusetts Eye and Ear, a Harvard teaching hospital in Boston that co-led the trial, said the results were “astounding”, adding that researchers continued to see the children’s hearing ability “dramatically progress”. The therapy uses an inactive virus to smuggle working copies of the affected gene, Otof, into the inner ear. Once inside, cells in the ear use the new genetic material as a template to churn out working copies of the crucial protein, otoferlin. Video footage of the patients shows a two-year-old boy responding to his name three weeks after the treatment and dancing to music after 13 weeks, having shown no response to either before receiving the injections. © 2024 Guardian News & Media Limited
Keyword: Hearing; Genes & Behavior
Link ID: 29347 - Posted: 06.06.2024
By Gemma Conroy Researchers have developed biodegradable, wireless sensors that can monitor changes in the brain following a head injury or cancer treatment, without invasive surgery. In rats and pigs, the soft sensors performed just as well as conventional wired sensors for up to a month after being injected under the skull. The gel-based sensors measure key health markers, including temperature, pH and pressure. “It is quite likely this technology will be useful for people in medical settings,” says study co-author Yueying Yang, a biomedical engineer at Huazhong University of Science and Technology (HUST) in Wuhan, China. The findings were published today in Nature. “It’s a very comprehensive study,” says Christopher Reiche, who develops implantable microdevices at the University of Utah in Salt Lake City. For years, scientists have been developing brain sensors that can be implanted inside the skull. But many of these devices rely on wires to transmit data to clinicians. The wires are difficult to insert and remove, and create openings in the skin for viruses and bacteria to enter the body. Wireless sensors offer a solution to this problem, but are thwarted by their limited communication range and relatively large size. Developing sensors that can access and monitor the brain is “extremely difficult”, says Omid Kavehei, a biomedical engineer who specializes in neurotechnology at the University of Sydney in Australia. To overcome these challenges, Yang and her colleagues created a set of 2-millimetre cube-shaped sensors out of hydrogel, a soft, flexible material that’s often used in tissue regeneration and drug delivery. The gel sensors change shape under different temperatures, pressures and pH conditions, and respond to vibrations caused by variations in blood flow in the brain.
When the sensors are implanted under the skull and scanned with an ultrasound probe — a tool that is already used to image the human brain in clinics — these changes are detectable in the form of ultrasonic waves that pass through the skull. The tiny gel-cubes completely dissolve in saline solution after around four months, and begin to break down in the brain after five weeks. © 2024 Springer Nature Limited
Keyword: Brain Injury/Concussion; Brain imaging
Link ID: 29346 - Posted: 06.06.2024
By George Musser Had you stumbled into a certain New York University auditorium in March 2023, you might have thought you were at a pure neuroscience conference. In fact, it was a workshop on artificial intelligence—but your confusion could have been readily forgiven. Speakers talked about “ablation,” a procedure of creating brain lesions, as commonly done in animal model experiments. They mentioned “probing,” like using electrodes to tap into the brain’s signals. They presented linguistic analyses and cited long-standing debates in psychology over nature versus nurture. Plenty of the hundred or so researchers in attendance probably hadn’t worked with natural brains since dissecting frogs in seventh grade. But their language choices reflected a new milestone for their field: The most advanced AI systems, such as ChatGPT, have come to rival natural brains in size and complexity, and AI researchers are studying them almost as if they were studying a brain in a skull. As part of that, they are drawing on disciplines that traditionally take humans as their sole object of study: psychology, linguistics, philosophy of mind. And in return, their own discoveries have started to carry over to those other fields. These various disciplines now have such closely aligned goals and methods that they could unite into one field, Grace Lindsay, assistant professor of psychology and data science at New York University, argued at the workshop. She proposed calling this merged science “neural systems understanding.” “Honestly, it’s neuroscience that would benefit the most, I think,” Lindsay told her colleagues, noting that neuroscience still lacks a general theory of the brain. “The field that I come from, in my opinion, is not delivering. Neuroscience has been around for over 100 years. I really thought that, when people developed artificial neural systems, they could come to us.” © 2024 Simons Foundation
Keyword: Consciousness; Language
Link ID: 29344 - Posted: 06.06.2024
By Amorina Kingdon Like most humans, I assumed that sound didn’t work well in water. After all, Jacques Cousteau himself called the ocean the “silent world.” I thought, beyond whales, aquatic animals must not use sound much. How wonderfully wrong I was. In water a sound wave travels four and a half times faster, and loses less energy, than in air. It moves farther and faster and carries information better. In the ocean, water exists in layers and swirling masses of slightly different densities, depending on depth, temperature, and saltiness. The physics-astute reader will know that the density of the medium in which sound travels influences its speed. So, as sound waves spread through the sea, their speed changes, causing complex reflection or refraction and bending of the sound waves into “ducts” and “channels.” Under the right circumstances, these ducts and channels can carry sound waves hundreds and even thousands of kilometers. What about other sensory phenomena? Touch and taste work about the same in water as in air. But the chemicals that tend to carry scent move slower in water than in air. And water absorbs light very easily, greatly diminishing visibility. Even away from murky coastal waters, in the clearest seas, light vanishes below several hundred meters and visibility below several dozen. So sound is often the best, if not only, way for ocean and freshwater creatures to signal friends, detect enemies, and monitor the world underwater. And there is much to monitor: Earthquakes, mudslides, and volcanic activity rumble through the oceans, beyond a human’s hearing range. Ice cracks, booms, and scrapes the seafloor. Waves hiss and roar. Raindrops plink. If you listen carefully, you can tell wind speed, rainfall, even drop size, by listening to the ocean as a storm passes. Even snowfall makes a sound. © 2024 NautilusNext Inc.,
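The “four and a half times faster” figure can be checked with nominal textbook speeds of sound (the specific values below are assumptions; as the passage notes, the true speed in seawater varies with depth, temperature, and salinity):

```python
# Nominal speeds of sound (illustrative values, not from the article itself).
SPEED_AIR = 343.0       # m/s, dry air at 20 °C
SPEED_SEAWATER = 1540.0  # m/s, warm surface seawater

ratio = SPEED_SEAWATER / SPEED_AIR
print(f"Sound travels about {ratio:.1f}x faster in seawater than in air")
```

With these values the ratio comes out to roughly 4.5, matching the article's figure; colder or fresher water gives a somewhat lower speed and ratio.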
Keyword: Animal Communication; Sexual Behavior
Link ID: 29341 - Posted: 06.04.2024
By Sumeet Kulkarni As spring turns to summer in the United States, warming conditions have started to summon enormous numbers of red-eyed periodical cicadas out of their holes in the soil across the east of the country. This year sees an exceptionally rare joint emergence of two cicada broods: one that surfaces every 13 years and another with a 17-year cycle. They last emerged together in 1803, when Thomas Jefferson was US president. This year, billions or even trillions of cicadas from these two broods — each including multiple species of the genus Magicicada — are expected to swarm forests, fields and urban neighbourhoods. To answer readers’ cicada questions, Nature sought help from three researchers. Katie Dana is an entomologist affiliated with the Illinois Natural History Survey at the University of Illinois at Urbana-Champaign. John Lill is an insect ecologist at George Washington University in Washington DC. Fatima Husain is a cognitive neuroscientist at the University of Illinois at Urbana-Champaign. Their answers have been edited for length and clarity. Why do periodical cicadas have red eyes? JL: We’re not really sure. We do know that cicadas’ eyes turn red in the winter before the insects come out. The whole coloration pattern in periodical cicadas is very bright: red eyes, black and orange wings. They’re quite different from the annual cicadas, which are green and black, and more camouflaged. It’s a bit of an enigma why the periodical ones are so brightly coloured, given that it just makes them more obvious to predators. There are no associated defences with being brightly coloured — it kind of flies in the face of what we know about bright coloration in a lot of other animals, where usually it’s some kind of signal for toxicity. There also exist mutants with brown, orange, golden or even blue eyes. People hunt for blue-eyed ones; it’s like trying to find a four-leaf clover. © 2024 Springer Nature Limited
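The rarity of the joint emergence follows directly from the two cycle lengths: 13 and 17 are both prime, so the broods coincide only every lcm(13, 17) = 13 × 17 = 221 years, which is why the last co-emergence was in 1803:

```python
import math

# The two broods surface together every lcm(13, 17) years.
period = math.lcm(13, 17)
print(period)          # 221
print(1803 + period)   # 2024 -- this year's joint emergence
```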
Keyword: Animal Communication; Sexual Behavior
Link ID: 29339 - Posted: 06.04.2024
By Sacha Pfeiffer A few weeks ago, at about 6:45 in the morning, I was at home, waiting to talk live on the air with Morning Edition host Michel Martin about a story I'd done, when I suddenly heard a loud metallic hammering. It sounded like a machine was vibrating my house. It happened again about 15 seconds later. And again after that. This rhythmic clatter seemed to be coming from my basement utility closet. Was my furnace breaking? Or my water heater? I worried that it might happen while I was on the air. Luckily, the noise stopped while I spoke with Michel, but restarted later. This time I heard another sound, a warbling or trilling, possibly inside my chimney. Was there an animal in there? I ran outside, looked up at my roof — and saw a woodpecker drilling away at my metal chimney cap. I've seen and heard plenty of woodpeckers hammer on trees. But never on metal. So to find out why the bird was doing this, I called an expert: Kevin McGowan, an ornithologist at the Cornell Lab of Ornithology who recently created a course called "The Wonderful World of Woodpeckers." McGowan said woodpeckers batter wood to find food, make a home, mark territory and attract a mate. But when they bash away at metal, "what the birds are trying to do is make as big a noise as possible," he said, "and a number of these guys have found that — you know what? If you hammer on metal, it's really loud!" Woodpeckers primarily do this during the springtime breeding season, and their metallic racket has two purposes, "basically summarized as: All other guys stay away, all the girls come to me," McGowan said. "And the bigger the noise, the better." © 2024 npr
Keyword: Sexual Behavior; Animal Communication
Link ID: 29333 - Posted: 06.02.2024
By Liqun Luo The brain is complex; in humans it consists of about 100 billion neurons, making on the order of 100 trillion connections. It is often compared with another complex system that has enormous problem-solving power: the digital computer. Both the brain and the computer contain a large number of elementary units—neurons and transistors, respectively—that are wired into complex circuits to process information conveyed by electrical signals. At a global level, the architectures of the brain and the computer resemble each other, consisting of largely separate circuits for input, output, central processing, and memory [1]. Which has more problem-solving power—the brain or the computer? Given the rapid advances in computer technology in the past decades, you might think that the computer has the edge. Indeed, computers have been built and programmed to defeat human masters in complex games, such as chess in the 1990s and recently Go, as well as encyclopedic knowledge contests, such as the TV show Jeopardy! As of this writing, however, humans triumph over computers in numerous real-world tasks—ranging from identifying a bicycle or a particular pedestrian on a crowded city street to reaching for a cup of tea and moving it smoothly to one’s lips—let alone conceptualization and creativity. So why is the computer good at certain tasks whereas the brain is better at others? Comparing the computer and the brain has been instructive to both computer engineers and neuroscientists. This comparison started at the dawn of the modern computer era, in a small but profound book entitled The Computer and the Brain, by John von Neumann, a polymath who in the 1940s pioneered the design of a computer architecture that is still the basis of most modern computers today [2]. Let’s look at some of these comparisons in numbers (Table 1). © 2024 NautilusNext Inc.,
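The two headline figures imply a third: with roughly 10^11 neurons sharing roughly 10^14 connections, the average neuron makes on the order of a thousand connections. A quick back-of-envelope check of the passage's numbers:

```python
# Figures quoted in the passage, as round orders of magnitude.
neurons = 100e9       # ~100 billion neurons
connections = 100e12  # ~100 trillion connections (synapses)

per_neuron = connections / neurons
print(f"average connections per neuron: {per_neuron:.0f}")  # ~1000
```

By contrast, a transistor in a digital circuit typically connects to only a handful of others, one of the structural differences the brain-versus-computer comparison turns on.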
Keyword: Stroke
Link ID: 29331 - Posted: 05.29.2024