Chapter 16.
By Gareth Cook

What are the most intelligent creatures on the planet? Humans come first. (Though there are days when we have to wonder.) After Homo sapiens, most people might answer chimpanzees, and then maybe dogs and dolphins. But what of birds? The science writer Jennifer Ackerman offers a lyrical testimony to the wonders of avian intelligence in her new book, "The Genius of Birds." There have long been hints of bird smarts, but it's become an active field of scientific inquiry, and Ackerman serves as tour guide. She answered questions from Mind Matters editor Gareth Cook.

What drew you to birds?

I've watched birds for most of my life. I admire all the usual things about them. Their plumage and song. Their intense way of living. Their flight. I also admire their resourcefulness and pluck. I've always been intrigued by their apparently smart behavior, whether learned or innate.

I grew up in Washington, D.C. — the second youngest in a gaggle of five girls. My parents had precious little time for one-on-one attention, especially my dad, who had a demanding government job. So when he asked me if I wanted to go birdwatching with him one spring morning when I was seven or eight, I jumped at the chance. It was magical, going out in the dark woods along the C&O Canal and listening for bird song. My father had learned his calls and songs in Boy Scout camp from an expert, an elderly Greek man named Apollo, so he was pretty good at identifying birds, even the shy woodland species. Eventually he gave me my own copy of Peterson's Field Guide, along with a small pair of binoculars. I've loved birds ever since.

My first run-in with a clever bird was on our dining room table. We had a pet parakeet, a budgerigar named Gre-Gre, who was allowed to fly around the dining room and perch on our heads or shoulders. He had a kind of social genius. He made you love him. But at breakfast, it was impossible to eat your cereal without his constant harassment. He liked to perch on the edge of my bowl and peck at the cereal, flapping his wings frantically to keep his balance, splashing my milk. I'd build a barricade of boxes around my place setting, but he always found a way in, moving a box or popping over the top. He was a good problem-solver.

© 2016 Scientific American
By Virginia Morell

Moths have an almost fatal attraction to lights—so much so that we say people are drawn to bad ends "like moths to a flame." But in this age of global light pollution, that saying has a new poignancy: Moths, which are typically nocturnal insects, are dying in droves at artificial lights. The high levels of mortality should have evolutionary consequences, biologists say, leading to moths that avoid lights.

To find out, two scientists tested the flight-to-light behavior of 1048 adult ermine moths (Yponomeuta cagnagella) in Europe. The researchers collected the insects in 2007 as larvae that had just completed their first molt. Three hundred and twenty came from populations that lived where the skies were largely dark; 728 were gathered in light-polluted areas. They were raised in a lab with 16 hours of daylight and 8 hours of darkness daily while they completed their life stages. Two to three days after emerging as moths, they were released in a flight cage with a fluorescent tube at one side.

Moths from high-light-pollution areas were significantly less attracted to the light than those from the darker zones, the scientists report in today's issue of Biology Letters. Overall, moths from the light-polluted populations showed a 30% reduction in flight-to-light behavior, indicating that this species is evolving, as predicted, to stay away from artificial lights. That change should increase these city moths' reproductive success. But it comes at a cost: To avoid the lights, the moths are likely flying less, the scientists say, so they aren't pollinating as many flowers or feeding as many spiders and bats.

© 2016 American Association for the Advancement of Science.
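The 30% figure is a relative reduction in the rate at which moths flew to the light, not an absolute count. A quick sketch of the arithmetic, with the caveat that the baseline attraction rate below is a made-up assumption for illustration (only the two sample sizes come from the study):

```python
# Hypothetical arithmetic behind a "30% reduction in flight-to-light behavior".
# The 50% baseline attraction rate is an assumption for illustration only;
# the sample sizes (320 dark-sky moths, 728 light-polluted moths) are from the study.

dark_n, urban_n = 320, 728

dark_rate = 0.50                     # assumed fraction of dark-sky moths flying to the light
urban_rate = dark_rate * (1 - 0.30)  # a 30% relative reduction -> 0.35

# Expected counts of moths flying to the fluorescent tube under these rates:
dark_expected = round(dark_rate * dark_n)     # 160 of 320
urban_expected = round(urban_rate * urban_n)  # 255 of 728

print(dark_expected, urban_expected)
```

Under any assumed baseline, the comparison works the same way: the light-polluted population's rate is the dark-sky rate scaled by 0.7.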
Link ID: 22100 - Posted: 04.13.2016
Sam Doernberg and Joe DiPietro

It's the first day of class, and we—a couple of instructors from Cornell—sit around a table with a few of our students as the rest trickle in. Anderson, one of the students seated across from us, smiles and says, "I'm going to get an A+ in your class." "No," VanAntwerp retorts, "I'm getting the A+."

You might think that this scene is typical of classes at a school like Cornell University, where driven students compete for top marks. But this didn't happen on a college campus: It took place in a maximum-security prison. To the outside world, they are inmates, but in the classroom, they are students enrolled in the Cornell Prison Education Program, or "CPEP." Per New York State Department of Corrections rules, we have permission to use the inmates' last names only—which is also often how we know them best. Those who graduate from the program—taught by Cornell instructors—will receive an associate's degree from Cayuga Community College.

Before teaching neuroscience to prison inmates, we taught it to Cornell undergraduates as part of the teaching staff for Cornell's Introduction to Neuroscience course. Most Cornell neuroscience students are high-achieving biology majors and premeds who are well prepared to succeed in a demanding course. They generally have gone from one academic success to another, and it is no secret that they expect a similar level of success in a neuroscience class.

© 2016 by The Atlantic Monthly Group
Keyword: Learning & Memory
Link ID: 22093 - Posted: 04.12.2016
By FRANS de WAAL

TICKLING a juvenile chimpanzee is a lot like tickling a child. The ape has the same sensitive spots: under the armpits, on the side, in the belly. He opens his mouth wide, lips relaxed, panting audibly in the same "huh-huh-huh" rhythm of inhalation and exhalation as human laughter. The similarity makes it hard not to giggle yourself.

The ape also shows the same ambivalence as a child. He pushes your tickling fingers away and tries to escape, but as soon as you stop he comes back for more, putting his belly right in front of you. At this point, you need only to point to a tickling spot, not even touching it, and he will throw another fit of laughter.

Laughter? Now wait a minute! A real scientist should avoid any and all anthropomorphism, which is why hard-nosed colleagues often ask us to change our terminology. Why not call the ape's reaction something neutral, like, say, vocalized panting? That way we avoid confusion between the human and the animal.

The term anthropomorphism, which means "human form," comes from the Greek philosopher Xenophanes, who protested in the fifth century B.C. against Homer's poetry because it described the gods as though they looked human. Xenophanes mocked this assumption, reportedly saying that if horses had hands they would "draw their gods like horses."

Nowadays the term has a broader meaning. It is typically used to censure the attribution of humanlike traits and experiences to other species. Animals don't have "sex," but engage in breeding behavior. They don't have "friends," but favorite affiliation partners. Given how partial our species is to intellectual distinctions, we apply such linguistic castrations even more vigorously in the cognitive domain. By explaining the smartness of animals either as a product of instinct or simple learning, we have kept human cognition on its pedestal under the guise of being scientific. Everything boiled down to genes and reinforcement. To think otherwise opened you up to ridicule, which is what happened to Wolfgang Köhler, the German psychologist who, a century ago, was the first to demonstrate flashes of insight in chimpanzees.

© 2016 The New York Times Company
By Neuroskeptic

Do you want to be more successful? Happier? More intelligent? Don't despair. The answer, we're told, is right in front of your nose—or rather, right behind it. It's your own brain. Thanks to neuroscience, you can hack your gray matter. According to the sales pitch, almost anything is possible, if you can master your brain—and if you can afford to buy the products that promise to help you do that. But how many of these neuroproducts are neurobullshit? And what makes neuroscience so attractive to people with something to sell?

I'm a neuroscientist who has been blogging about the brain for the past eight years. Over this time I've noticed a steady increase in the number of neuroscience-themed commercial products. There are brain pills to optimize your mental focus. There are futuristic-looking headbands that promise to measure or stimulate your neural activity in order to make you smarter, or help you sleep better, or even meditate better. There is no end of "brain training" apps and neuroscience-themed self-help books. These products tend to have names based around "Neuro" or "Brain," and they come advertised as being "created by neuroscientists," "based on the latest brain research," or at least endorsed by some leading brain expert.

Once you look beyond the "neuro" gloss, however, you'll see that many of these products aren't new at all, but just old products in new packaging. A recent, and notorious, example of this was "Fifth Quarter Fresh," a brand of chocolate milk.
Link ID: 22090 - Posted: 04.11.2016
Carl Zimmer

Five days a week, you can tune into "Paternity Court," a television show featuring couples embroiled in disputes over fatherhood. It's entertainment with a very old theme: Uncertainty over paternity goes back a long way in literature. Even Shakespeare and Chaucer cracked wise about cuckolds, who were often depicted wearing horns. But a number of recent genetic studies suggest that our obsession with cuckolded fathers is seriously overblown, challenging the notion that mistaken paternity is commonplace. "It's absolutely ridiculous," said Maarten H.D. Larmuseau, a geneticist at the University of Leuven in Belgium who has led much of this new research.

The term cuckold traditionally refers to the husband of an adulteress, but Dr. Larmuseau and other researchers focus on those cases that produce a child, which scientists politely call "extra-pair paternity." Until the 20th century, it was difficult to prove that a particular man was the biological father of a particular child. In 1304, a British husband went to court to dispute the paternity of his wife's child, born while he was abroad for three years. Despite the obvious logistical challenges, the court rejected the husband's objection. "The privity between a man and his wife cannot be known," the judge ruled.

Modern biology lifted the veil from this mystery, albeit slowly. In the early 1900s, researchers discovered that people have distinct blood types inherited from their parents. In a 1943 lawsuit, Charlie Chaplin relied on blood-type testing to prove that he was not the father of the actress Joan Barry's child. (The court refused to accept the evidence and forced Chaplin to pay child support anyway.)

© 2016 The New York Times Company
Keyword: Sexual Behavior
Link ID: 22089 - Posted: 04.09.2016
Modern humans diverged from Neanderthals some 600,000 years ago – and a new study shows the Y chromosome might be what kept the two species separate. It seems we were genetically incompatible with our ancient relatives – and male fetuses conceived through sex with Neanderthal males would have miscarried.

We knew that some cross-breeding between us and Neanderthals happened more recently – around 100,000 to 60,000 years ago. Neanderthal genes have been found in our genomes, on X chromosomes, and have been linked to traits such as skin colour, fertility and even depression and addiction. Now, an analysis of a Y chromosome from a 49,000-year-old male Neanderthal found in El Sidrón, Spain, suggests the chromosome has gone extinct, seemingly without leaving any trace in modern humans. This could simply be because it drifted out of the human gene pool or, as the new study suggests, it could be because genetic differences meant that hybrid offspring who carried this chromosome were infertile – a genetic dead end.

Fernando Mendez of Stanford University and his colleagues compared the Neanderthal Y chromosome with those of chimps and of ancient and modern humans. They found mutations in four genes that could have prevented the passage of the Y chromosome down the paternal line to hybrid children. "Some of these mutations could have played a role in the loss of Neanderthal Y chromosomes in human populations," says Mendez.

© Copyright Reed Business Information Ltd.
For decades, it was thought that scar-forming cells called astrocytes were responsible for blocking neuronal regrowth across the level of a spinal cord injury, but recent findings challenge this idea. According to a new mouse study, astrocyte scars may actually be required for repair and regrowth following spinal cord injury. The research was funded by the National Institutes of Health and published in Nature.

"At first, we were completely surprised when our early studies revealed that blocking scar formation after injury resulted in worse outcomes. Once we began looking specifically at regrowth, though, we became convinced that scars may actually be beneficial," said Michael V. Sofroniew, M.D., Ph.D., professor of neurobiology at the University of California, Los Angeles, and senior author of the study. "Our results suggest that scars may be a bridge and not a barrier towards developing better treatments for paralyzing spinal cord injuries."

Neurons communicate with one another by sending messages down long extensions called axons. When axons in the brain or spinal cord are severed, they do not grow back automatically; damaged axons in the spinal cord, for example, can result in paralysis. When an injury occurs, astrocytes become activated and migrate to the injury site, along with cells from the immune system, to form a scar. Scars have immediate benefits, decreasing inflammation at the injury site and preventing the spread of tissue damage. In the long term, however, the scars were thought to interfere with axon regrowth.
By Catherine Matacic

How does sign language develop? A new study shows that it takes less than five generations for people to go from simple, unconventional pantomimes—essentially telling a story with your hands—to stable signs.

Researchers asked a group of volunteers to invent their own signs for a set of 24 words in four separate categories: people, locations, objects, and actions. Examples included "photographer," "darkroom," and "camera." After an initial group made up the signs—pretending to shoot a picture with an old-fashioned camera for "photographer," for example—they taught the signs to a new generation of learners. That generation then played a game in which they tried to guess what sign another player in their group was making. When they got the answer right, they taught that sign to a new generation of volunteers.

After a few generations, the volunteers stopped acting out the words with inconsistent gestures and started making them in ways that were more systematic and efficient. What's more, they added markers for the four categories—pointing to themselves if the category was "person" or making the outline of a house if the category was "location," for example—and they stopped repeating gestures, the researchers reported last month at the Evolution of Language conference in New Orleans, Louisiana. In video from the study, the first version of "photographer" is unpredictable and long, compared with the final version, which uses the person marker and takes just half the time.

The researchers say their finding supports the work of researchers in the field, who have found similar patterns of development in newly emerging sign languages. The results also suggest that learning and social interaction are crucial to this development.

© 2016 American Association for the Advancement of Science
Link ID: 22084 - Posted: 04.09.2016
JUST say no. That's supposed to be our reaction to recreational drugs. The trouble is, lots of people say yes please. As a result, the world's governments have been waging a war on drugs for more than a century. Since 1961, the battle has been orchestrated via international treaties targeting all parts of the supply chain, from the producers to the smugglers, the sellers to the buyers.

Yet this supposedly united front has developed some conspicuous cracks. Now those countries backing a different approach have called a UN meeting later this month to make the case for change. The question is whether the UN is ready to soften its stance or whether it will plough on despite mountains of evidence suggesting its zero-tolerance approach has failed. As the reformers collate this evidence to present at the meeting, New Scientist looks at how the approaches taken by different countries stack up and asks what can happen next.

Some nations are already taking change into their own hands. Portugal allows personal use of any drug – including cocaine and heroin – and several South and Central American countries are moving in the same direction. As for cannabis, the number of places where its open sale has been decriminalised in some form grows ever larger.

© Copyright Reed Business Information Ltd.
Keyword: Drug Abuse
Link ID: 22082 - Posted: 04.07.2016
By Chris Brown and Chris Corday

Canada's infatuation with getting a legal high may soon lead straight to Mary Jean Dunsdon's Vancouver kitchen. The self-described diva of cooking with cannabis has been baking and selling intoxicating edibles for the better part of 20 years. "I've easily sold 700,000 to one million cookies," she told CBC News recently in her kitchen. To her customers, Dunsdon, best known by her nickname Watermelon, is a trusted brand. "I've done it all: 'nice cream cones', marijuana bacon, I've made 'weedish meatballs'," she said.

With legalization on the way in Canada, Dunsdon is hoping her underground bakery and the goodies she sells to a loyal base of medical and recreational customers will finally emerge from the shadows and capture a slice of a new market for marijuana edibles. She has good reason to be optimistic about her future in the business of bud. In the U.S. states where recreational marijuana is already legal, edibles — basically any food or drinks containing marijuana — are the fastest-growing segment of the market. New Frontier Financials, which tracks the growth of the U.S. marijuana industry, says Washington state's sale of about 280,000 units of edible marijuana in March is double what it was just 10 months ago. For Canada, it's a trend line that offers a glimpse into the future and also a cautionary tale. "Edibles will be more popular. Way more popular than smoking," said Dunsdon.

During our visit, Dunsdon ground up marijuana leaf and bud and sprinkled the herb mixture over a fillet of wild B.C. chinook salmon. The topping bears a striking resemblance to pesto. "If you eat it, and eat just the right amount, it's probably the nicest thing you've ever felt," she said.

©2016 CBC/Radio-Canada.
Keyword: Drug Abuse
Link ID: 22081 - Posted: 04.07.2016
By DAN BILEFSKY

LONDON — The model in the Gucci ad is young and waiflike, her frail body draped in a geometric-pattern dress as she leans back in front of a wall painted with a tree branch that appears to mimic the angle of her silhouette.

On Wednesday, the Advertising Standards Authority of Britain ruled that the ad was "irresponsible" and that the model looked "unhealthily thin," fanning a perennial debate in the fashion industry over when thin is too thin. The regulator said that the way the woman in the image had posed elongated her torso and accentuated her waist, so that it appeared to be very small. It said her "somber facial expression and dark makeup, particularly around her eyes, made her face look gaunt." It said the offending image — a still photograph of the model that appeared in an online video posted on the website of The Times of London in December — should not appear again in its current form. The specific image was removed from the video on Gucci's YouTube channel, though the model still appears in the ad, directed by Glen Luchford.

The Italian fashion brand, for its part, had defended the ad, saying it was part of a video that portrayed a dance party and that was aimed at an older and sophisticated audience. Nowhere in the ads were any models' bones visible, it said, and they were all "toned and slim." It noted that "it was, to some extent, a subjective issue as to whether a model looked unhealthily thin," according to the authority.

The decision by the advertising authority, an independent industry regulatory group, barred Gucci from using the image in advertisements in Britain. The ruling comes amid a longstanding debate on both sides of the Atlantic about the perils of overly thin models projecting an unhealthy body image for women. As when critics lashed out against idealized images of "heroin chic" in the early 1990s, some have voiced concern that fashion houses are encouraging potentially hazardous behaviors by glamorizing models who are rail-thin.

© 2016 The New York Times Company
Keyword: Anorexia & Bulimia
Link ID: 22080 - Posted: 04.07.2016
by Sarah Zielinski

Spring has finally arrived, and birds' nests all over the country will soon be filling up with eggs and then nestlings. Watch a nest long enough (the Science News staff is partial to the DC Eagle Cam) and you'll see itty bitty baby birds begging for a meal. But mama birds don't always reward that begging with food. In some species, like the tree swallow, birds that beg more will get more food. But in others, like the hoopoe, mom ignores who is begging and gives more food to the biggest chicks, researchers have found. This lack of an overall pattern has confounded ornithologists, but it seems that they may have been missing a key piece of the puzzle. A new study finds that the quality of the birds' environment determines whether a mama bird can afford to feed all of her kids or whether she has to ignore some to make sure the others survive. The study appears March 29 in Nature Communications.

Stuart West of the University of Oxford and colleagues compiled data from 306 studies that looked at 143 bird species. When the birds were living in a good environment — one that had plenty of resources or a high amount of predictability — mom would feed the chicks that begged the most, which were often the ones that needed the most help. But when the environment was poor in quality or unpredictable, mama bird responded less to begging.

© Society for Science & the Public 2000 - 2016.
Keyword: Sexual Behavior
Link ID: 22079 - Posted: 04.07.2016
Laura Sanders

NEW YORK — Lip-readers' minds seem to "hear" the words their eyes see being formed. And the better a person is at lipreading, the more neural activity there is in the brain's auditory cortex, scientists reported April 4 at the annual meeting of the Cognitive Neuroscience Society.

Earlier studies have found that auditory brain areas are active during lipreading. But most of those studies focused on small bits of language — simple sentences or even single words, said study coauthor Satu Saalasti of Aalto University in Finland. In contrast, Saalasti and colleagues studied lipreading in more natural situations: Twenty-nine people read the silent lips of a person who spoke Finnish for eight minutes in a video. "We can all lip-read to some extent," Saalasti said, and the participants, who had no lipreading experience, varied widely in their comprehension of the eight-minute story. In the best lip-readers, activity in the auditory cortex was quite similar to that evoked when the story was read aloud, brain scans revealed. The results suggest that lipreading success depends on a person's ability to "hear" the words formed by moving lips, Saalasti said.

Citation: J. Alho et al. Similar brain responses to lip-read, read and listened narratives. Cognitive Neuroscience Society annual meeting, New York City, April 4, 2016.

© Society for Science & the Public 2000 - 2016.
Link ID: 22077 - Posted: 04.07.2016
Emily Anthes

Type 'depression' into the Apple App Store and a list of at least a hundred programs will pop up on the screen. There are apps that diagnose depression (Depression Test), track moods (Optimism) and help people to "think more positive" (Affirmations!). There's Depression Cure Hypnosis ("The #1 Depression Cure Hypnosis App in the App Store"), Gratitude Journal ("the easiest and most effective way to rewire your brain in just five minutes a day"), and dozens more. And that's just for depression. There are apps pitched at people struggling with anxiety, schizophrenia, post-traumatic stress disorder (PTSD), eating disorders and addiction.

This burgeoning industry may meet an important need. Estimates suggest that about 29% of people will experience a mental disorder in their lifetime. Data from the World Health Organization (WHO) show that many of those people — up to 55% in developed countries and 85% in developing ones — are not getting the treatment they need. Mobile health apps could help to fill the gap. Given the ubiquity of smartphones, apps might serve as a digital lifeline — particularly in rural and low-income regions — putting a portable therapist in every pocket. "We can now reach people that up until recently were completely unreachable to us," says Dror Ben-Zeev, who directs the mHealth for Mental Health Program at the Dartmouth Psychiatric Research Center in Lebanon, New Hampshire.

Public-health organizations have been buying into the concept. In its Mental Health Action Plan 2013–2020, the WHO recommended "the promotion of self-care, for instance, through the use of electronic and mobile health technologies." And the UK National Health Service (NHS) website NHS Choices carries a short list of online mental-health resources, including a few apps, that it has formally endorsed.

© 2016 Nature Publishing Group
By KJ Dell'Antonia

If you tell your child's pediatrician that your child is having trouble sleeping, she might respond by asking you how well you sleep yourself. A team of Finnish researchers found that parents with poor sleep quality tended to report more sleep-related difficulties in their children than parents who slept well. But when the researchers looked at an objective measure of the children's sleep — a bracelet, similar to a commercial fitness tracker, that monitored movement acceleration as a proxy for sleep quality — they found that the parents were often reporting sleep problems in their children that didn't seem to be there.

"The only thing that was associated with sleeping problems, as reported by the parents, was their own reported sleeping problems," said Marko Elovainio, a professor of psychology at the University of Helsinki and one of the authors of the study, which was published this month in the journal Pediatrics. The study was relatively small, involving 100 families with children aged 2 to 6. But the findings suggest that parents' reports of sleep problems in their children are influenced by their own attitudes and behaviors surrounding sleep.

The researchers were inspired to do their study, in part, by research showing that mothers with depression over-report behavioral problems in their children, seeing issues that teachers do not see. In pediatrics, the researchers noted, doctors rely heavily on parental reports for information — and if that information is biased by a parent's own experience, diagnosis becomes more difficult. "Sleep is a good measure of stress," said Dr. Elovainio, and it is one tool doctors use to evaluate how much stress a child is experiencing. But when making a diagnosis involving a child's sleeping patterns, "we can't rely on reports of parents. We need to use more objective measures." One reason to look at sleep in this context, he said, is that unlike other possible markers of stress, it can be measured objectively.

© 2016 The New York Times Company
Link ID: 22073 - Posted: 04.06.2016
by Daniel Galef

Footage from a revolutionary behavioural experiment showed non-primates making and using tools just like humans. In the video, a crow is trying to get food out of a narrow vessel, but its beak is too short for it to reach through the container. Nearby, the researchers placed a straight wire, which the crow bent against a nearby surface into a hook. Then, holding the hook in its beak, it fished the food from the bottle.

Corvids—the family of birds that includes crows, ravens, rooks, jackdaws, and jays—are pretty smart overall. Although not to the level of parrots and cockatoos, ravens can also mimic human speech. They also have a highly developed system of communication and are believed to be among the most intelligent non-primate animals in existence.

McGill Professor Andrew Reisner recalls meeting a graduate student studying corvid intelligence at Oxford University when these results were first published in 2015. "I had read early in the year that some crows had been observed making tools, and I mentioned this to him," Reisner explained. "He said that he knew about that, as it had been he who had first observed it happening. Evidently the graduate students took turns watching the 'bird box,' […] and the tool making first occurred there on his shift."
By Roni Caryn Rabin

Alzheimer's disease is a progressive brain disorder that causes dementia, destroying memory, cognitive skills, and the ability to care for oneself, speak and walk, said Ruth Drew, director of family and information services at the Alzheimer's Association. "And since the brain affects everything, Alzheimer's ultimately affects everything," she said, "including the ability to swallow, cough and breathe."

Once patients reach the advanced stages of Alzheimer's, they may stop eating and become weak and susceptible to infections, said Dr. Jason Karlawish, a professor of medicine at the University of Pennsylvania. Unable to swallow or cough, they are at high risk of choking, aspirating food particles or water into the lungs and developing pneumonia, which is often the immediate cause of death, he said. "You see a general decline in the contribution the brain makes, not just in thinking, but in maintaining the body's homeostasis," Dr. Karlawish said. Using a feeding tube to nourish patients and hospitalizing them for infections does not significantly extend life at the advanced stages of the disease and is discouraged because it can prolong suffering with no hope of recovery, he said.

Alzheimer's is the sixth leading cause of death in the United States, according to the Centers for Disease Control and Prevention, but that figure may underestimate the actual number of cases, Dr. Karlawish said, since some deaths may be attributed to other causes like pneumonia.

© 2016 The New York Times Company
Link ID: 22071 - Posted: 04.06.2016
Philip Ball

James Frazer's classic anthropological study The Golden Bough contains a harrowing chapter on human sacrifice in rituals of crop fertility and harvest among historical cultures around the world. Frazer describes sacrificial victims being crushed under huge toppling stones, slow-roasted over fires and dismembered alive. Frazer's methods of analysis wouldn't all pass muster among anthropologists today (his work was first published in 1890), but it is hard not to conclude from his descriptions that what industrialized societies today would regard as the most extreme psychopathy has in the past been seen as normal — and indeed sacred — behaviour.

In almost all societies, killing within a tribe or clan has been strongly taboo; exemption is granted only to those with great authority. Anthropologists have suspected that ritual human sacrifice serves to cement power structures — that is, it signifies who sits at the top of the social hierarchy. The idea makes intuitive sense, but until now there has been no clear evidence to support it.

In a study published in Nature, Joseph Watts, a specialist in cultural evolution at the University of Auckland in New Zealand, and his colleagues have analysed 93 traditional cultures in Austronesia (the region that loosely embraces the many small and island states in the Pacific and Indonesia) as they were before they were influenced by colonization and major world religions (generally in the late 19th and early 20th centuries).

© 2016 Nature Publishing Group
Feel like you haven't slept in ages? If you're one of the 5 per cent of the population who has severe insomnia – trouble sleeping for more than a month – then your brain's white matter might be to blame.

The cell bodies and synapses of our brain cells make up our brain's grey matter, while bundles of their tails that connect one brain region to another make up the white matter. These nerve cell tails – axons – are cloaked in a fatty myelin sheath that helps transmit signals. Radiologist Shumei Li from Guangdong No. 2 Provincial People's Hospital in Guangzhou, China, and her team scanned the brains of 30 healthy sleepers and 23 people with severe insomnia using diffusion tensor imaging MRI, an imaging technique that lights up the white matter circuitry.

They found that in the brains of the people with severe insomnia, the regions in the right hemisphere that support learning, memory, smell and emotion were less well connected than in healthy sleepers. They attribute this breakdown in circuitry to the loss of the myelin sheath in the white matter; a study in November suggested that smoking could be one cause of myelin loss. The team also found that the insomniacs had poorer connections in the white matter of the thalamus, a brain region that regulates consciousness, alertness and sleep.

The study proposes a potential mechanism for insomnia, but there could be other factors, says Max Wintermark, a radiologist at Stanford. He says it's not possible to say whether the poor connections are the cause or the result of insomnia.

© Copyright Reed Business Information Ltd.
Link ID: 22069 - Posted: 04.05.2016