Chapter 16
You may have read that having a male brain will earn you more money. Or maybe that female brains are better at multitasking. But there is no such thing as a female or male brain, according to the first search for sex differences across the entire human brain. It reveals that most people have a mix of male and female brain features. And it also supports the idea that gender is non-binary, and that gender classifications in many situations are meaningless. “This evidence that human brains cannot be categorised into two distinct classes is new, convincing, and somehow radical,” says Anelis Kaiser at the University of Bern, Switzerland. The idea that people have either a “female” or “male” brain is an old one, says Daphna Joel at Tel Aviv University in Israel. “The theory goes that once a fetus develops testicles, they secrete testosterone which masculinises the brain,” she says. “If that were true, there would be two types of brain.” To test the theory, Joel and her colleagues looked for differences in brain scans taken from 1400 people aged between 13 and 85. The team looked for variations in the size of brain regions as well as the connections between them. In total, the group identified 29 brain regions that generally seem to be different sizes in self-identified males and females. These include the hippocampus, which is involved in memory, and the inferior frontal gyrus, which is thought to play a role in risk aversion. When the group looked at each individual brain scan, however, they found that very few people had all of the brain features they might be expected to have, based on their sex. Across the sample, between 0 and 8 per cent of people had “all-male” or “all-female” brains, depending on the definition. “Most people are in the middle,” says Joel. © Copyright Reed Business Information Ltd.
Keyword: Sexual Behavior
Link ID: 21670 - Posted: 12.01.2015
By Kelli Whitlock Burton Evolutionarily speaking, we are born to make babies. Our bodies—and brains—don’t fall apart until we come to the end of our child-bearing years. So why are grandmothers, who don’t reproduce and who contribute little to food production, still around and still mentally sound? A new study offers an intriguing genetic explanation. Scientists have proposed several explanations for why our species lives as long and as healthily as it does. One idea is that grandmothers help out with child rearing. A 1998 study found, for example, that a Hadza group of hunter-gatherers in Tanzania had more babies if grandmothers helped feed their newly weaned young grandchildren. The researchers speculated this kind of care freed up young mothers to reproduce, and ensured that the caregiver grandmother’s genes were passed on to more young. They called their theory the “grandmother hypothesis.” But grandmothers need to have all their wits about them to help out in this way, and the new study may explain how this happens. Physician-scientist Ajit Varki and evolutionary biologist Pascal Gagneux of the University of California, San Diego, arrived at the findings accidentally. The pair was studying a gene that helps control the body’s inflammatory and immune response to injury or infection. Previous studies have linked two forms of the gene—CD33—to Alzheimer’s disease. While one CD33 variant, or allele, predisposes a person to the disease, the other appears to protect against it by preventing the formation of protein clumps in the brain. © 2015 American Association for the Advancement of Science.
By Diana Kwon The human brain is unique: Our remarkable cognitive capacity has allowed us to invent the wheel, build the pyramids and land on the moon. In fact, scientists sometimes refer to the human brain as the “crowning achievement of evolution.” But what, exactly, makes our brains so special? Some leading arguments have been that our brains have more neurons and expend more energy than would be expected for our size, and that our cerebral cortex, which is responsible for higher cognition, is disproportionately large—accounting for over 80 percent of our total brain mass. Suzana Herculano-Houzel, a neuroscientist at the Institute of Biomedical Science in Rio de Janeiro, debunked these well-established beliefs in recent years when she discovered a novel way of counting neurons—dissolving brains into a homogeneous mixture, or “brain soup.” Using this technique, she found that the number of neurons relative to brain size is consistent with that of other primates, and that the cerebral cortex holds only around 20 percent of all our brain’s neurons, a proportion similar to that found in other mammals. In light of these findings, she argues that the human brain is actually just a linearly scaled-up primate brain that grew in size as we started to consume more calories, thanks to the advent of cooked food. Other researchers have found that traits once believed to belong solely to humans also exist in other members of the animal kingdom. Monkeys have a sense of fairness. Chimps engage in war. Rats show altruism and exhibit empathy. In a study published last week in Nature Communications, neuroscientist Christopher Petkov and his group at Newcastle University found that macaques and humans share brain areas responsible for processing the basic structures of language. © 2015 Scientific American
In Greek mythology, the Hydra was a gigantic, snake-like monster with nine heads and poisonous blood and breath, which lurked in the swamps of Lerna. Heracles was sent to destroy the beast as one of his twelve labours, but when he decapitated one of its heads, two more grew back in its place. With the help of his trusty nephew Iolaus, however, he eventually defeated it by burning out the severed roots with firebrands to prevent regrowth, then decapitating its one immortal head and burying it under a heavy rock. The real Hydra has regenerative capacities that surpass those of its mythological namesake. When it is dismembered, any fragment of its body can regenerate to form a completely new individual, and it can even remain alive after its entire nervous system has been lost. Researchers in Switzerland now report that it does so by adapting its skin cells to make them behave more like neurons. Their findings provide clues about how nerve cells first evolved, billions of years ago. Hydra is a small freshwater polyp with a tubular body consisting of just two layers of cells, and a network of nerves that controls its movements, feeding, and its light-sensitive stinging tentacles. The central region of its body contains specialized, multi-purpose skin cells which can contract and detect mechanical stimuli. These so-called ‘i-cells’ also act as stem cells, continuously renewing themselves, while also producing immature nerve cells that migrate out to the extremities, where they differentiate to form the dense nerve net. © 2015 Guardian News and Media Limited
Keyword: Development of the Brain
Link ID: 21663 - Posted: 11.28.2015
By David Noonan The 63-year-old chief executive couldn't do his job. He had been crippled by migraine headaches throughout his adult life and was in the middle of a new string of attacks. “I have but a little moment in the morning in which I can either read, write or think,” he wrote to a friend. After that, he had to shut himself up in a dark room until night. So President Thomas Jefferson, in the early spring of 1807, during his second term in office, was incapacitated every afternoon by the most common neurological disability in the world. The co-author of the Declaration of Independence never vanquished what he called his “periodical head-ach,” although his attacks appear to have lessened after 1808. Two centuries later 36 million American migraine sufferers grapple with the pain the president felt. Like Jefferson, who often treated himself with a concoction brewed from tree bark that contained quinine, they try different therapies, ranging from heart drugs to yoga to herbal remedies. Their quest goes on because modern medicine, repeatedly baffled in attempts to find the cause of migraine, has struggled to provide reliable relief. Now a new chapter in the long and often curious history of migraine is being written. Neurologists believe they have identified a hypersensitive nerve system that triggers the pain and are in the final stages of testing medicines that soothe its overly active cells. These are the first ever drugs specifically designed to prevent the crippling headaches before they start, and they could be approved by the U.S. Food and Drug Administration next year. If they deliver on the promise they have shown in studies conducted so far, which have involved around 1,300 patients, millions of headaches may never happen. © 2015 Scientific American
Keyword: Pain & Touch
Link ID: 21662 - Posted: 11.28.2015
by Sarah Zielinski Call someone a “bird brain” and they are sure to be offended. After all, it’s just another way of calling someone “stupid.” But it’s probably time to retire the insult because scientists are finding more and more evidence that birds can be pretty smart. Consider these five species: We may call pigeons “flying rats” for their penchant for hanging out in cities and grabbing an easy meal. (Long before there was “pizza rat,” you know there had to be “pizza pigeons” flying around New York City.) But there may be more going on in their brains than just where to find a quick bite. Richard Levenson of the University of California, Davis Medical Center and colleagues trained pigeons to recognize images of human breast cancers. In tests, the birds proved capable of sorting images of benign and malignant tumors. In fact, they were just as good as humans, the researchers report November 18 in PLOS ONE. In keeping with the pigeons’ reputation, though, food was the reward for their performance. No one would suspect the planet’s second-best toolmakers would be small black birds flying through mountain forests on an island chain east of Australia. But New Caledonian crows have proven themselves not only keen toolmakers but also pretty good problem-solvers, passing some tests that even dogs (and pigeons) fail. For example, when scientists present an animal with a bit of meat on a long string dangling down, many animals don’t ever figure out how to get the meat. Pull it up with one yank, and the meat is still out of reach. Some animals will figure out how to get it through trial and error, but a wild New Caledonian crow solved the problem — pull, step on string, pull some more — on its first try. © Society for Science & the Public 2000 - 2015
By Lenny Bernstein BALTIMORE — Deep into a three-day heroin binge at a local hotel, Samantha told the newbie he was shooting too much. He wasn’t accustomed to heroin, she said, and hadn’t waited long enough since his last injection. “But he didn’t listen,” she said. Sure enough, he emerged from a visit to the bathroom, eyes glazed, and collapsed from an overdose. Samantha, who declined to give her last name to avoid trouble with her bosses at a nearby strip club, said she grabbed her naloxone, the fast-acting antidote to opioid overdoses. She was too panicked to place the atomizer on the end of the syringe, but her boyfriend wasn’t. He sprayed the mist into the nose of the unconscious drug user, who awoke minutes later. “I always have it because I’m scared to death,” said Samantha, who said she has been shooting heroin for 22 years. “I don’t want to be helpless.” As the opioid epidemic has exploded in small towns and suburbs in recent years, officials have scrambled to put naloxone in the hands of drug users’ families and friends, and to make it more widely available by equipping police officers with the drug. At the same time, thousands of lives are being saved by giving the antidote to drug users. More than 80 percent of overdose victims revived by “laypeople” were rescued by other users, most of them in the past few years, according to one national survey published in June.
Keyword: Drug Abuse
Link ID: 21654 - Posted: 11.24.2015
By Karen Weintraub Essential tremor is involuntary shaking – usually of the hands, but sometimes also of the neck, jaw, voice or legs. “Any fine tasks with the hands can be very difficult when the tremor is pronounced,” said Dr. Albert Hung, center director of the Massachusetts General Hospital National Parkinson Foundation Center of Excellence. Essential tremor can affect balance, walking, hearing and cognition, and can get worse over time, said Dr. Elan Louis, chief of the division of movement disorders at Yale School of Medicine. People with essential tremor run almost twice the risk of developing Alzheimer’s as the general population. Essential tremor appears with movement; if people let their hands sit still, they don’t tremble. That is the big difference between an essential tremor and the tremor of Parkinson’s disease, which can occur while at rest, Dr. Louis said. Essential tremor also tends to strike both hands, while Parkinson’s is more one-sided at first, said Dr. Hung. The cause of essential tremor remains a mystery, though it seems to run in families. People of any age or sex can have the condition, though it is more common as people grow older. Roughly 4 percent of 40-year-olds have essential tremor, compared with about 20 percent of 90-year-olds, Dr. Louis said. Available treatments “aren’t great,” Dr. Louis said. Two medications – the beta blocker propranolol and the epilepsy drug primidone, sold under the brand name Mysoline – can reduce tremors by 10 to 30 percent, he said, but they work only in about half of patients. Deep brain stimulation – implanting electrodes into the brain to override faulty electrical signals – has been shown to markedly reduce hand tremor severity, he said. But the treatment can worsen cognitive and balance problems and “doesn’t cure the underlying disease. It merely and temporarily lessens a single symptom, which is the tremor.” © 2015 The New York Times Company
Keyword: Movement Disorders
Link ID: 21653 - Posted: 11.24.2015
By Nicholas Bakalar Several studies have shown that there is an association between shift work and an increased risk for heart disease and diabetes. Now a new study, in the Journal of Clinical Endocrinology & Metabolism, has found a similar association in people whose sleeping schedules change on the weekend. For seven days, 447 men and women ages 30 to 54 wore devices that measured movement and tracked when they fell asleep and woke. Almost 85 percent of the group went to sleep and woke later on their days off than during the workweek. The researchers found that the greater the mismatch in sleep timing between weekdays and weekends, the higher the metabolic risk. Sleeping late on days off was linked to lower HDL (good) cholesterol, higher triglycerides, higher insulin resistance and higher body mass index. The associations persisted after controlling for physical activity, caloric intake, alcohol use and other factors. “It’s not clear yet that this is a long-term effect,” said the lead author, Patricia M. Wong, a graduate student at the University of Pittsburgh. “But we think of this as people having to sleep and work out of sync with their internal clock, and that having to be out of sync may be having these health effects.” © 2015 The New York Times Company
Link ID: 21651 - Posted: 11.21.2015
Human DNA is 1 to 2% Neandertal, or more, depending on where your ancestors lived. Svante Pääbo, founder of the field of paleogenetics and winner of a 2016 Breakthrough Prize, explains why that matters © 2015 Scientific American
Ian Sample Science editor Humans buy flowers. Capuchins throw stones. Giant tortoises bellow. But the blue-capped cordon bleu, a small finch found in Africa, really knows how to win over a mate. The three-inch-high omnivores perform energetic cabaret acts to woo their partners, rattling through routines that feature head-bobbing, singing and tap dance, and often all three at once. The birds were known to sing and nod their heads to impress the opposite sex, but high-speed video footage has now revealed that they spice up their displays with nifty footwork that adds percussion to their repertoire and sends vibrations racing down their perches. Scientists at Hokkaido University filmed the birds as they tried their luck with cagemates, and found that both males and females turned to tap to seduce their targets. The steps have not been seen before because they are too fast for the naked eye to spot. “Like humans, males and females of cordon-bleus are mutually choosy and both sexes need to show off,” said Masayo Soma, who led the research. “They show tap dancing throughout the courtship display, and they sometimes add songs to tap dancing.” Whether the steps and songs are coordinated is the focus of ongoing research. Footage of the birds in cabaret mode showed that an entire routine could include more than 200 steps in bursts of anything from five seconds to more than a minute. Both males and females danced more vigorously when their mate was on the same perch. Males danced more often and tapped their feet faster, but apart from that, the sexes had similar moves. © 2015 Guardian News and Media Limited
By Christopher Intagliata Back in ancient times, philosophers like Aristotle were already speculating about the origins of taste, and how the tongue sensed elemental tastes like sweet, bitter, salty and sour. "What we discovered just a few years ago is that there are regions of the brain—regions of the cortex—where particular fields of neurons represent these different tastes, so there's a sweet field, a bitter field, a salty field, etcetera," says Nick Ryba, a sensory neuroscientist at the National Institutes of Health. Ryba and his colleagues found that you can actually taste without a tongue at all, simply by stimulating the "taste" part of the brain—the insular cortex. They ran the experiment in mice with a special sort of brain implant—a fiber-optic cable that turns neurons on with a pulse of laser light. And by switching on the "bitter" sensing part of the brain, they were able to make mice pucker up, as if they were tasting something bitter—even though absolutely nothing bitter was touching the tongues of the mice. In another experiment, the researchers fed the mice a bitter flavoring on their tongues—but then made it more palatable by switching on the "sweet" zone of the brain. "What we were doing here was adding the sweetness, but only adding it in the brain, not in what we were giving to the mouse." Think adding sugar to your coffee—but doing it only in your mind. The findings appear in the journal Nature. © 2015 Scientific American
Keyword: Chemical Senses (Smell & Taste)
Link ID: 21648 - Posted: 11.20.2015
The town of Yarumal in Colombia is famous for all the wrong reasons: it has the world’s largest population of people with Alzheimer’s disease. In Yarumal and the surrounding state of Antioquia, 5000 people carry a gene mutation which causes early-onset Alzheimer’s – half of them will be diagnosed by the age of 45, and the other half will succumb by the time they are 65. Locals call the disease La Bobera, “the foolishness”, and the village bears uncanny parallels with the fictional Macondo in Gabriel Garcia Marquez’s novel One Hundred Years of Solitude, where people suffer memory disorders and hallucinations. But while Yarumal’s “curse” is well known, no one knew how the mutation first appeared. Now researchers have traced the ancestry of the mutation, concluding that it was probably introduced by a Spanish conquistador early in the 17th century. Ken Kosik at the University of California, Santa Barbara, and colleagues collected blood samples from 102 people in Antioquia and sequenced their genomes. The mutation causing this form of early-onset Alzheimer’s is called E280A and is found in a gene on chromosome 14 – 74 people had the mutation. Because Kosik’s team had information on the genome sequence around the mutation, they could use something called identity-by-descent analysis to determine how the people in the study were related. The analysis suggested the mutation arose from a common ancestor around 375 years ago. © Copyright Reed Business Information Ltd.
Susan Milius Certain species of the crawling lumps of mollusk called chitons polka-dot their armor-plated backs with hundreds of tiny black eyes. But mixing protection and vision can come at a price. The lenses are rocky nuggets formed mostly of aragonite, the same mineral that pearls and abalone shells are made of. New analyses of these eyes support previous evidence that they form rough images instead of just sensing overall lightness or darkness, says materials scientist Ling Li of Harvard University. Adding eyes to armor does introduce weak spots in the shell. Yet the positioning of the eyes and their growth habits show how chitons compensate for that, Li and his colleagues report in the November 20 Science. Li and coauthor Christine Ortiz of MIT have been studying such trade-offs in biological materials that serve multiple functions. Human designers often need substances that multitask, and the researchers have turned to evolution’s solutions in chitons and other organisms for inspiration. Biologists had known that dozens of chiton species sprinkle their armored plates with simple-seeming eye spots. (The armor has other sensory organs: pores even tinier than the eyes.) But in 2011, a research team showed that the eyes of the West Indian fuzzy chiton (Acanthopleura granulata) were much more remarkable than anyone had realized. Their unusual aragonite lens can detect the difference between a looming black circle and a generally gray field of vision. Researchers could tell because chitons clamped their shells defensively to the bottom when a scary circle appeared but not when an artificial sky turned overall shadowy. © Society for Science & the Public 2000 - 2015
By Seth Fletcher To solve the mysteries of the brain, scientists need to delicately, precisely monitor neurons in living subjects. Brain probes, however, have generally been brute-force instruments. A team at Harvard University led by chemist Charles Lieber hopes that silky soft polymer mesh implants will change this situation. So far the researchers have tested the mesh, which is embedded with electronic sensors, in living mice. Once it has been proved safe, it could be used in people to study how cognition arises from the action of individual neurons and to treat diseases such as Parkinson's. © 2015 Scientific American
Keyword: Brain Imaging
Link ID: 21645 - Posted: 11.20.2015
By Jonathan Webb Science reporter, BBC News A study of 153 brain scans has linked a particular furrow, near the front of each hemisphere, to hallucinations in schizophrenia. This fold tends to be shorter in those patients who hallucinate, compared with those who do not. It is an area of the brain that appears to have a role in distinguishing real perceptions from imagined ones. Researchers say the findings, published in Nature Communications, might eventually help with early diagnosis. The brain wrinkle, called the paracingulate sulcus or PCS, varies considerably in shape between individuals. It is one of the final folds to develop, appearing in the brain only just before birth. "The brain develops throughout life, but aspects such as whether the PCS is going to be a particularly prominent fold - or not - may be apparent in the brain at an early stage," said Jon Simons, a neuroscientist at the University of Cambridge, UK. "It might be that a reduction in this brain fold gives somebody a predisposition towards developing something like hallucinations later on in life." If further work shows that the difference can be detected before the onset of symptoms, for example, Dr Simons said it might be possible to offer extra support to people who face that elevated risk. But he stressed that schizophrenia is a complicated phenomenon. Hallucinations are one of the main symptoms, but some patients are diagnosed on the basis of other irregular thought processes. "We've known for some time that disorders like schizophrenia are not down to a single region of the brain. Changes are seen throughout various different areas. To be able to pin such a key symptom to a relatively specific part of the brain is quite unusual." © 2015 BBC.
Link ID: 21644 - Posted: 11.18.2015
by Bethany Brookshire Many people perceive cocaine as one of the most intense stimulant drugs available: It’s illegal, highly addictive and dangerous. Caffeine, in contrast, is the kinder, cuddlier stimulant. It’s legal, has mild effects and, in forms such as coffee, might even be good for your health. But caffeine in combination with cocaine is another story. In South America, drug distributors have started “cutting” their cocaine with caffeine. This cheaper substitute might, at first glance, seem to make the cocaine less potent. After all, there’s less of the drug there. But new data shows that when combined, cocaine and caffeine make a heck of a drug. Coca paste is a popular form of cocaine in South American countries. A smoked form of cocaine, coca paste is the intermediate product in the extraction process used to get pure cocaine out of coca leaves. Because it is smoked, the cocaine in the coca paste hits the brain very quickly, making the drug highly addictive, explains Jose Prieto, a neurochemist at the Biological Research Institute Clemente Stable in Montevideo, Uruguay. Much of the time, coca paste isn’t acting alone, however. In a 2011 study published in Behavioral Brain Research, Prieto and his colleagues examined the contents of coca paste from police seizures. “Nearly 80 percent of the coca paste samples” were adulterated, Prieto says, “most with caffeine.” Caffeine adulteration ranged from 1 to 15 percent of the drug volume. © Society for Science & the Public 2000 - 2015.
Keyword: Drug Abuse
Link ID: 21643 - Posted: 11.18.2015
Angus Chen If you peek into classrooms around the world, a bunch of bespectacled kids peek back at you. In some countries such as China, as much as 80 percent of children are nearsighted. As those kids grow up, their eyesight gets worse, requiring stronger and thicker eyeglasses. But a diluted daily dose of an ancient drug might slow that process. The drug is atropine, one of the toxins in deadly nightshade and jimsonweed. In the 19th and early 20th centuries, atropine was known as belladonna, and fancy Parisian ladies used it to dilate their pupils, since big pupils were considered alluring at the time. A few decades later, people started using atropine to treat amblyopia, or lazy eye, since it blurs the stronger eye's vision and forces the weaker eye to work harder. As early as the 1990s, doctors had some evidence that atropine can slow the progression of nearsightedness. In some countries, notably in Asia, a 1 percent solution of atropine eyedrops is commonly prescribed to children with myopia. It's not entirely clear how atropine works. Because people become nearsighted when their eyeballs get too elongated, it's generally thought that atropine must be interfering with that unwanted growth. But as Parisians discovered long ago, the drug can have some inconvenient side effects. © 2015 npr
Laura Sanders Faced with a shortage of the essential nutrient selenium, the brain and the testes duke it out. In selenium-depleted male mice, testes hog the trace element, leaving the brain in the lurch, scientists report in the Nov. 18 Journal of Neuroscience. The results are some of the first to show competition between two organs for trace nutrients, says analytical neurochemist Dominic Hare of the University of Technology Sydney and the Florey Institute of Neuroscience and Mental Health in Melbourne. In addition to uncovering this brain-testes scuffle, the study “highlights that selenium in the brain is something we can’t continue to ignore,” he says. About two dozen proteins in the body contain selenium, a nonmetallic chemical element. Some of these proteins are antioxidants that keep harmful molecules called free radicals from causing trouble. Male mice without enough selenium have brain abnormalities that lead to movement problems and seizures, neuroscientist Matthew Pitts of the University of Hawaii at Manoa and colleagues found. In some experiments, Pitts and his colleagues depleted selenium by interfering with genes. Male mice engineered to lack two genes that produce proteins required for the body to properly use selenium had trouble balancing on a rotating rod and moving in an open field. In their brains, a particular group of nerve cells called parvalbumin interneurons didn’t mature normally. © Society for Science & the Public 2000 - 2015.
Link ID: 21640 - Posted: 11.18.2015
Jon Hamilton Patterns of gene expression in human and mouse brains suggest that cells known as glial cells may have helped us evolve brains that can acquire language and solve complex problems. Scientists have been dissecting human brains for centuries. But nobody can explain precisely what allows people to use language, solve problems or tell jokes, says Ed Lein, an investigator at the Allen Institute for Brain Science in Seattle. "Clearly we have a much bigger behavioral repertoire and cognitive abilities that are not seen in other animals," he says. "But it's really not clear what elements of the brain are responsible for these differences." Research by Lein and others provides a hint though. The difference may involve brain cells known as glial cells, once dismissed as mere support cells for neurons, which send and receive electrical signals in the brain. Lein and a team of researchers made that finding after studying which genes are expressed, or switched on, in different areas of the brain. The effort analyzed the expression of 20,000 genes in 132 structures in brains from six typical people. Usually this sort of study is asking whether there are genetic differences among brains, Lein says. "And we sort of flipped this question on its head and we asked instead, 'What's really common across all individuals and what elements of this seem to be unique to the human brain?' " he says. It turned out the six brains had a lot in common. © 2015