Most Recent Links
Sarah Boseley, Health editor A man who was paralysed from below the neck after crashing his bike into a truck can once again drink a cup of coffee and eat mashed potato with a fork, after a world-first procedure to allow him to control his hand with the power of thought. Bill Kochevar, 53, has had electrical implants in the motor cortex of his brain and sensors inserted in his forearm, which allow the muscles of his arm and hand to be stimulated in response to signals from his brain, decoded by computer. Eight years after his accident, he is able to drink and feed himself without assistance. “I think about what I want to do and the system does it for me,” Kochevar told the Guardian. “It’s not a lot of thinking about it. When I want to do something, my brain does what it does.” The experimental technology, pioneered at Case Western Reserve University in Cleveland, Ohio, is the first in the world to restore brain-controlled reaching and grasping in a person with complete paralysis. For now, the process is relatively slow, but the scientists behind the breakthrough say this is proof of concept and that they hope to streamline the technology until it becomes a routine treatment for people with paralysis. In the future, they say, it will also be wireless and the electrical arrays and sensors will all be implanted under the skin and invisible.
Link ID: 23423 - Posted: 03.29.2017
Rae Ellen Bichell Exposure to lead as a child can affect an adult decades later, according to a study out Tuesday that suggests a link between early childhood lead exposure and a dip in a person's later cognitive ability and socioeconomic status. Lead in the United States can come from lots of sources: old, peeling paint; contaminated soil; or water that's passed through lead pipes. Before policies were enacted to get rid of lead in gasoline, it could even come from particles in the fumes that leave car tailpipes. "It's toxic to many parts of the body, but in particular it can accumulate in the bloodstream and pass through the blood-brain barrier to reach the brain," says the study's first author, Aaron Reuben, a graduate student in clinical psychology at Duke University. Reuben and his colleagues published the results of a long-term study on the lingering effects of lead. Researchers had kept in touch with about 560 people for decades — starting when they were born in Dunedin, New Zealand, in the 1970s, all the way up to the present. As children, the study participants were tested on their cognitive abilities; researchers determined IQ scores based on tests of working memory, pattern recognition, verbal comprehension and ability to solve problems, among other skills. When the kids were 11 years old, researchers tested their blood for lead. (That measurement is thought to be a rough indicator of lead exposure in the few months before the blood draw.) Then, when they turned 38 years old, the cognitive ability of these study participants was tested again. As Reuben and his colleagues write in this week's issue of JAMA, the journal of the American Medical Association, they found a subtle but worrisome pattern in the data. © 2017 npr
By KATIE THOMAS The Food and Drug Administration approved on Tuesday the first drug to treat a severe form of multiple sclerosis, offering hope to patients who previously had no other options to combat a relentless disease that leads to paralysis and cognitive decline. The federal agency also cleared the drug to treat people with the more common, relapsing form of the disease. “I think that this is a very big deal,” said Dr. Stephen Hauser, the chairman of the neurology department at the University of California, San Francisco, and leader of the steering committee that oversaw the late-stage clinical trials of the drug, ocrelizumab. “The magnitude of the benefits that we’ve seen with ocrelizumab in all forms of M.S. are really quite stunning.” The drug, which will be sold under the brand name Ocrevus by Genentech, showed the most notable results in patients with relapsing multiple sclerosis, appearing to halt progression of the disease with few serious side effects. In patients with the more severe form, primary progressive multiple sclerosis, the drug only modestly slowed patients’ decline, but medical experts described it as an important first step. “This sort of opens the door for us,” said Dr. Fred D. Lublin, who was a crucial investigator for the clinical trial and is director of the Corinne Goldsmith Dickinson Center for Multiple Sclerosis at Mount Sinai Hospital in New York. “Once we open that door, then we do better and better and better. It’s a very encouraging result.” Genentech, which is owned by the Swiss pharmaceutical giant Roche, said Tuesday that it would charge a list price of $65,000 a year, which — though expensive — is 25 percent less than an existing drug, Rebif, that was shown to be clinically inferior to Ocrevus in the two clinical trials that led to Ocrevus’s approval. © 2017 The New York Times Company
Amy Maxmen Before his 33-year-old son became bedridden with chronic fatigue syndrome, biochemist Ronald Davis created technologies to analyse genes and proteins faster, better and more cheaply. Now he aims his inventions at a different target: the elusive inner workings of his son’s malady. In his office at the Stanford Genome Technology Center in Palo Alto, California, Davis holds a nanofabricated cube the size of a gaming die. It contains 2,500 electrodes that measure electrical resistance to evaluate the properties of human cells. When Davis exposed immune cells from six people with chronic fatigue syndrome to a stressor — a splash of common salt — the cube revealed that they couldn’t recover as well as cells from healthy people could. Now his team is fabricating 100 more devices to repeat the experiment, and testing a cheaper alternative — a paper-thin nanoparticle circuit that costs less than a penny to make on an inkjet printer. Davis’s findings, although preliminary, are helping to propel research on chronic fatigue syndrome, also called myalgic encephalomyelitis (ME/CFS), into the scientific mainstream. Physicians used to dismiss the disease as psychosomatic, but studies now suggest that it involves problems in the chemical reactions, or pathways, within cells. “We now have a great deal of evidence to support that this is not only real, but a complex set of disorders,” says Ian Lipkin, an epidemiologist at Columbia University in New York City. “We are gathering clues that will lead to controlled clinical trials.” © 2017 Macmillan Publishers Limited,
By Kate Yandell In a 1971 paper published in Science, biologist Roger Payne, then at Rockefeller University, and Scott McVay, then an administrator at Princeton University, described the “surprisingly beautiful sounds” made by humpback whales (Megaptera novaeangliae; Science, 173:585-97). Analyzing underwater recordings made by a Navy engineer, the duo found that these whale sounds were intricately repetitive. “Because one of the characteristics of bird songs is that they are fixed patterns of sounds that are repeated, we call the fixed patterns of humpback sounds ‘songs,’” they wrote. OCEAN SONGS: Humpback whales make diverse, broadband sounds that travel miles through the ocean. Their function, however, remains somewhat murky. (PLOS ONE, dx.doi.org/10.1371/journal.pone.0079422, 2013) It’s now clear that, in addition to simpler calls, several baleen whale species—including blue, fin, and bowhead—make series of sounds known as song. Humpback song is the most complex and by far the best studied. Units of humpback songs form phrases, series of similar phrases form themes, and multiple themes form songs. All the males in a given population sing the same song, which evolves over time. When whale groups come into contact, songs can spread. But why do whales sing? “The short answer is, we don’t know,” says Alison Stimpert, a bioacoustician at Moss Landing Marine Laboratories in California. Humpback songs are only performed by males and are often heard on breeding grounds, so the dominant hypothesis is that these songs are a form of courtship. The quality of a male’s performance could be a sign of his fitness, for example. But female whales do not tend to approach singing males. Alternatively, whale researchers have proposed that the male whales sing to demarcate territory or to form alliances with other males during mating season. © 1986-2017 The Scientist
By Erik Vance The world’s smallest arachnid, the Samoan moss spider, is, at a third of a millimeter, nearly invisible to the human eye. The largest spider in the world is the goliath birdeater tarantula, which weighs 5 ounces and is about the size of a dinner plate. For reference, that is about the same difference in scale as between that same tarantula and a bottlenose dolphin. And yet the bigger spider does not act in more complex ways than its tiny counterpart. “Insects and spiders and the like—in terms of absolute size—have among the tiniest brains we’ve come across,” says William Wcislo, a scientist at the Smithsonian Tropical Research Institute in Panama City. “But their behavior, as far as we can see, is as sophisticated as things that have relatively large brains. So then there’s the question: How do they do that?” No one would argue that a tarantula is as smart as a dolphin, or that having a really big brain is not an excellent way to perform complicated tasks. But a growing number of scientists are asking the question: Is it the only way? Do you need a big brain to hunt elusive prey, design complicated structures or produce complex social dynamics? For generations scientists have wondered how intelligent creatures developed large brains to perform complicated tasks. But Wcislo is part of a small community of scientists less interested in how brains have grown than in how they have shrunk and yet, shockingly, still perform tasks as well as or better than those of similar species that are much larger in size. In other words, it’s what scientists call brain miniaturization, not unlike the scaling down in size of the transistors in a computer chip. This research, in fact, may hold clues to innovative design strategies that engineers might incorporate in future generations of computers. © 2017 Scientific American
Link ID: 23418 - Posted: 03.29.2017
Laurel Hamers SAN FRANCISCO — Millennials, rejoice: A winking-face emoji is worth a slew of ironic words. The brain interprets irony or sarcasm conveyed by an emoji in the same way as it does verbal banter, researchers reported March 26 in San Francisco at the Cognitive Neuroscience Society’s annual meeting. Researchers measured brain electrical activity of college students reading sentences ending in various emojis. For example, the sentence “You are such a jerk” was followed by an emoji that matched the words’ meaning (a frowning face), contradicted the words (a smiling face) or implied sarcasm (a winking face). Then the participants assessed the veracity of the sentence—was the person actually a jerk? Some participants read the sentence literally no matter what, said Benjamin Weissman, a linguist at the University of Illinois at Urbana-Champaign. But people who said emojis influenced their interpretation showed different brain activity in response to sentences with a winking emoji than ones with other emojis. A spike in electrical activity occurred 200 milliseconds after reading winky-face sentences, followed by another spike at 600 milliseconds. A similar electrical pattern has been noted in previous studies in which people listened to sentences where intonation conveyed a sarcastic rather than literal interpretation of the words. That peak at 600 milliseconds has been linked to reassessment. It’s as if the brain reads the sentence one way, sees the emoji and then updates its interpretation to fit the new information, Weissman said. © Society for Science & the Public 2000 - 2017
Link ID: 23417 - Posted: 03.29.2017
By Lizzie Wade Ask any biologist what makes primates special, and they’ll tell you the same thing: big brains. Those impressive noggins make it possible for primates from spider monkeys to humans to use tools, find food, and navigate the complex relationships of group living. But scientists disagree on what drove primates to evolve big brains in the first place. Now, a new study comes to an unexpected conclusion: fruit. “The paper is enormously valuable,” says Richard Wrangham, a biological anthropologist at Harvard University who was not involved in the work. For the last 20 years, many scientists have argued that primates evolved bigger brains to live in bigger groups, an idea known as the “social brain hypothesis.” The new study’s large sample size and robust statistical methods suggest diet and ecology deserve more attention, Wrangham says. But not everyone is convinced. Others say that although a nutrient-rich diet allows for bigger brains, it wouldn’t be enough by itself to serve as a selective evolutionary pressure. When the authors compare diet and social life, “they’re comparing apples and oranges,” says Robin Dunbar, an evolutionary psychologist at the University of Oxford in the United Kingdom and one of the original authors of the social brain hypothesis. Alex DeCasien, the new study’s author, didn’t set out to shake up this decades-long debate. The doctoral student in biological anthropology at New York University in New York City wanted to tease out whether monogamous primates had bigger or smaller brains than more promiscuous species. She collected data about the diets and social lives of more than 140 species across all four primate groups—monkeys, apes, lorises, and lemurs—and calculated which features were more likely to be associated with bigger brains. To her surprise, neither monogamy nor promiscuity predicted anything about a primate’s brain size. Neither did any other measure of social complexity, such as group size. 
The only factor that seemed to predict which species had larger brains was whether their diets were primarily leaves or fruit, DeCasien and her colleagues report today in Nature Ecology & Evolution. © 2017 American Association for the Advancement of Science
By MATT RICHTEL LOS ANGELES — Nine days after Nikolas Michaud’s latest heroin relapse, the skinny 27-year-old sat on a roof deck at a new drug rehabilitation clinic here. He picked up a bong, filled it with a pinch of marijuana, lit the leaves and inhaled. All this took place in plain view of the clinic’s director. “The rules here are a little lax,” Mr. Michaud said. In almost any other rehab setting in the country, smoking pot would be a major infraction and a likely cause for being booted out. But here at High Sobriety — the clinic with a name that sounds like the title of a Cheech and Chong comeback movie — it is not just permitted, but part of the treatment. The new clinic is experimenting with a concept made possible by the growing legalization of marijuana: that pot, rather than being a gateway into drugs, could be a gateway out. A small but growing number of pain doctors and addiction specialists are overseeing the use of marijuana as a substitute for more potent and dangerous drugs. Dr. Mark Wallace, chairman of the division of pain medicine in the department of anesthesia at the University of California, San Diego, said over the last five years he has used marijuana to help several hundred patients transition off opiates. “The majority of patients continue to use it,” he said of marijuana. But he added that they tell him of the opiates: “I feel like I was a slave to that drug. I feel like I have my life back.” Dr. Wallace is quick to note that his evidence is anecdotal and more study is needed. Research in rats, he said, supports the idea that the use of cannabinoids can induce withdrawal from heavier substances. But in humans? © 2017 The New York Times Company
Keyword: Drug Abuse
Link ID: 23415 - Posted: 03.28.2017
By Des Bieler Brain injuries are a danger in many sports, but for none more than football and its most profitable enterprise, the National Football League. The NFL is spending hundreds of millions of dollars on a concussion-lawsuit settlement and has poured tens of millions into research on measuring and preventing head trauma. Now some scientists are using an NFL-backed technology to examine blood samples for proteins that have been shown to correlate with concussion and other injuries. One of the most intriguing of these proteins, which could help create better tests for traumatic brain injury, is called neurofilament light — or, as it’s known for short, NFL. That’s right, a protein called “NFL” may wind up helping the NFL address its most vexing medical problem. “It's just a remarkable coincidence,” said Kevin Hrusovsky, chief executive of Quanterix, a company that has received $800,000 in grant money from the NFL through the league's “Head Health Challenge” partnership with GE. Quanterix's technology allows users to zero in on molecules with such precision that Hrusovsky likened it to “being able to see a grain of sand in 2,000 Olympic-size swimming pools.” That is crucial, because only tiny amounts of the proteins, referred to as “biomarkers,” dribble across the blood-brain barrier from the cerebrospinal fluid around the brain, where they would be found in larger quantities. The ability to spot sub-concussion injuries is important because they often go undetected by conventional methods and yet are increasingly seen as major threats to long-term health. The problem with simply sampling athletes' cerebrospinal fluid, of course, is that it requires a lumbar puncture, or spinal tap, which is a lot to ask in the middle of a football game (or in any other time and place, for that matter).
Pricking an athlete's finger for a blood test and getting the results 15 to 20 minutes later makes for a much more reasonable process, albeit one still a long way from implementation. © 1996-2017 The Washington Post
Keyword: Brain Injury/Concussion
Link ID: 23414 - Posted: 03.28.2017
By RONI CARYN RABIN Television ads for “low T” have sparked a rise in the use of testosterone gels, patches and injections by older men in recent years, according to a new report. But anyone hoping that a dose of testosterone will provide an easy antidote for sagging muscles, flagging energy and a retiring sex drive may find the results of recent government studies of the sex hormone sobering. The latest clinical trials, published over the past year, are the first rigorous ones to assess the potential beneficial effects of testosterone treatment for older men with abnormally low levels of the hormone. Scientists followed 790 men age 65 and older who had blood testosterone levels below 275 nanograms per deciliter of blood, well below the average for healthy young men and lower than would be expected with normal aging. The men also had symptoms reflecting their low hormone levels, like loss of sex drive. Half the participants were treated with testosterone gel, and half were given a placebo gel. The studies reported mixed results, finding that over the yearlong study period, testosterone therapy corrected anemia, or low levels of red blood cells, which can cause fatigue, and increased bone density. But a study to see if testosterone improved memory or cognitive function found no effects. Meanwhile, a red flag warning of possible risks to the heart emerged from the studies: Imaging tests found a greater buildup of noncalcified plaque in the coronary arteries of men treated with testosterone for a year, an indicator of cardiac risk, compared with those who were given a placebo gel. The findings of plaque were not a complete surprise; many reports have tied testosterone use to an increase in heart attacks, and the Food and Drug Administration already requires testosterone products to carry warnings of an increased risk of heart attacks and stroke (men at high risk of cardiovascular disease were not allowed to participate in the latest trials). 
But observational studies, which are weaker, have yielded mixed results over all, with one study published last month finding that men taking testosterone actually had fewer heart problems. © 2017 The New York Times Company
By C. CLAIBORNE RAY Q. When four of us shared memories of our very young lives, not one of us could recall events before the age of 4 or possibly 3. Is this common? A. Yes. For adults, remembering events only after age 3½ or 4 is typical, studies have found. The phenomenon was named childhood amnesia by Freud and identified late in the 19th century by the pioneering French researcher Victor Henri and his wife, Catherine. The Henris published a questionnaire on early memories in 1895, and the results from 123 people were published in 1897. Most of the participants’ earliest memories came from when they were 2 to 4 years old; the average was age 3. Very few participants recalled events from the first year of life. Many subsequent studies found similar results. Several theories have been offered to explain the timing of laying down permanent memories. One widely studied idea relates the formation of children’s earliest memories to when they start talking about past events with their mothers, suggesting a link between memories and the age of language acquisition. More recent studies, in 2010 and 2014, found discrepancies in the accuracy of young children’s estimates of when things had occurred in their lives. Another 2014 study found a progressive loss of recall as a child ages, with 5-, 6- and 7-year-olds remembering 60 percent or more of some early-life events that were discussed at age 3, while 8- and 9-year-olds remembered only 40 percent of these events. © 2017 The New York Times Company
Laurel Hamers SAN FRANCISCO — When faced with simple math problems, people who get jittery about the subject may rely more heavily on certain brain circuitry than math-savvy people do. The different mental approach could help explain why people with math anxiety struggle on more complicated problems, researchers reported March 25 at the Cognitive Neuroscience Society’s annual meeting. While in fMRI machines, adults with and without math anxiety evaluated whether simple arithmetic problems, such as 9+2=11, were correct or incorrect. Both groups had similar response times and accuracy on the problems, but brain scans turned up differences. Specifically, in people who weren’t anxious about math, lower activation of the frontoparietal attention network was linked to better performance. That brain network is involved in working memory and problem solving. Math-anxious people showed no correlation between performance and frontoparietal network activity. People who used the circuit less were probably getting ahead by automating simple arithmetic, said Hyesang Chang, a cognitive neuroscientist at the University of Chicago. Because math-anxious people showed more variable brain activity overall, Chang speculated that they might instead be using a variety of computationally demanding strategies. This scattershot approach works fine for simple math, she said, but might get maxed out when the math is more challenging.
Citation: H. Chang et al. Simple arithmetic: Not so simple for highly math anxious individuals. Cognitive Neuroscience Society Annual Meeting, San Francisco, March 25, 2017. © Society for Science & the Public 2000 - 2017.
By Erin Blakemore It’s a scientific canard so old it’s practically cliché: When people lose their sight, other senses heighten to compensate. But are there really differences between the senses of blind and sighted people? It’s been hard to prove, until now. As George Dvorsky reports for Gizmodo, new research shows that blind people’s brains are structurally different from those of sighted people. In a new study published in the journal PLOS One, researchers reveal that the brains of people who are born blind or went blind in early childhood are wired differently from those of people born with their sight. The study is the first to look at both structural and functional differences between blind and sighted people. Researchers used MRI scanners to peer at the brains of 12 people born with “early profound blindness”—that is, people who were either born without sight or lost it by age three, reports Dvorsky. Then they compared the MRI images to images of the brains of 16 people who were born with sight and who had normal vision (either alone or with corrective help from glasses). The comparisons showed marked differences between the brains of those born with sight and those born without. Essentially, the brains of blind people appeared to be wired differently when it came to things like structure and connectivity. The researchers noticed enhanced connections between some areas of the brain, too—particularly the occipital and frontal cortex areas, which control working memory. There was decreased connectivity between some areas of the brain, as well.
By Scott Barry Kaufman Rarely do I read a scientific paper that overwhelms me with so much excitement, awe, and reverence. Well, a new paper in Psychological Science has really got me revved up, and I am bursting to share their findings with you! Most research on mind-wandering and daydreaming draws on one of two methods: strict laboratory conditions that ask people to complete boring cognitive tasks, or retrospective surveys that ask people to recall how often they daydream in daily life. It has been rather difficult to compare these results to each other; laboratory tasks aren't representative of how we normally go about our day, and surveys are prone to memory distortion. In this new, exciting study, Michael Kane and colleagues directly compared laboratory mind-wandering with real-life mind-wandering within the same person, and used an important methodology called "experience-sampling" that allows the researcher to capture people's ongoing stream of consciousness. For 7 days, 8 times a day, the researchers randomly asked 274 undergraduates at the University of North Carolina at Greensboro whether they were mind-wandering and about the quality of their daydreams. They also asked them to engage in a range of tasks in the laboratory that assessed their rates of mind-wandering, the contents of their off-task thoughts, and their "executive functioning" (a set of skills that helps keep things in memory despite distractions and focus on the relevant details). What did they find? © 2017 Scientific American
Link ID: 23409 - Posted: 03.27.2017
By Melissa Banigan Twenty years ago, I started experiencing what turned into a long list of seemingly unrelated health issues. Headaches, depression, insomnia, peripheral neuropathy, fatigue, joint pain, chest pain, shortness of breath, a lesion on my spine and a variety of uncomfortable gastrointestinal ailments. Over the past five years, things went from bad to worse as I also became lactose-intolerant, developed mild vitiligo (a condition that leads to loss of skin pigmentation) and major vertigo, experienced a series of low-grade fevers and started to have some memory loss that I referred to as brain fogs. Doctors told me that as an overworked single mother of 40, I might just need to figure out ways to get more sleep and relax. Some of what was happening, they said, might be attributed to the normal processes of aging. What was happening, however, didn’t feel normal. Always a voracious reader and a writer by profession, I could no longer focus on work, read even a page of a book or grip a pen long enough to write a grocery list. I often felt too exhausted to keep plans with friends. When I did pull myself off my couch to see them, I couldn’t concentrate on conversations, so I sequestered myself in my apartment and let my friendships fade. I had been a runner, a swimmer and a hiker, but just walking up a flight of stairs made me lose my breath so completely that I succumbed to inactivity. I did everything the doctors asked me to do. I changed my diet and sleep schedule, went to a physical therapist and saw specialists in neurology and rheumatology and even a mental-health therapist. I then also turned to massage therapists, herbalists and an acupuncturist. © 1996-2017 The Washington Post
Austin Frakt By middle age, the lenses in your eyes harden, becoming less flexible. Your eye muscles increasingly struggle to bend them to focus on this print. But a new form of training — brain retraining, really — may delay the inevitable age-related loss of close-range visual focus so that you won’t need reading glasses. Various studies say it works, though no treatment of any kind works for everybody. The increasing difficulty of reading small print that begins in middle age is called presbyopia, from the Greek words for “old man” and “eye.” It’s exceedingly common, and despite the Greek etymology, women experience it, too. Every five years, the average adult over 30 loses the ability to see another line on the eye reading charts used in eye doctors’ offices. By 45, presbyopia affects an estimated 83 percent of adults in North America. Over age 50, it’s nearly universal. It’s why my middle-aged friends are getting fitted for bifocals or graduated lenses. There are holdouts, of course, who view their cellphones and newspapers at arm’s length to make out the words. The decline in vision is inconvenient, but it’s also dangerous, causing falls and auto accidents. Bifocals or graduated lenses can help those with presbyopia read, but they also contribute to falls and accidents because they can impair contrast sensitivity (the ability to distinguish between shades of gray) and depth perception. I’m 45. I don’t need to correct my vision for presbyopia yet, but I can tell it’s coming. I can still read The New York Times print edition with ease, but to read text in somewhat smaller fonts, I have to strain. Any year now, I figured, my eye doctor would tell me it was time to talk about bifocals. Or so I thought. Then I undertook a monthslong, strenuous regimen designed to train my brain to correct for what my eye muscles no longer can manage. © 2017 The New York Times Company
Link ID: 23407 - Posted: 03.27.2017
By STEPH YIN For animals that hibernate, making it to spring is no small feat. Torpor — the state of reduced bodily activity that occurs during hibernation — is not restful. By the time they emerge, hibernating animals are often sleep-deprived: Most expend huge bursts of energy to arouse themselves occasionally in the winter so their body temperatures don’t dip too low. This back-and-forth is exhausting, and hibernators do it with little to no food and water. By winter’s end, some have shed more than half their body weight. But just because it’s spring doesn’t mean it’s time to celebrate. Spring means getting ready for the full speed of summer — and after spending a season in slow motion, that requires some ramping up. Here’s a look at what different animals have on the agenda after coming out of winter’s slumber. Black bears emerge from their dens in April, but stay lethargic for weeks. During this so-called walking hibernation, they sleep plenty and don’t roam very far. Though they have lost up to one-third of their body weight over winter, they don’t have a huge appetite right away — their metabolism is not yet back to normal. They snack mostly on pussy willows and bunches of snow fleas. In January or February, some females give birth, typically to two or three cubs. New mothers continue to hibernate, but they go in and out of torpor, staying alert enough to respond to their cubs’ cries. When they emerge from their dens, mama bears find trees with rough bark that her cubs can easily climb for safety. “Slowly, the bears’ metabolism gears up to normal, active levels,” said Lynn Rogers, a bear expert and principal biologist at the Wildlife Research Institute, a nonprofit in Minnesota. “When plants start sprouting on the forest floor, that’s when they start really moving around.” © 2017 The New York Times Company
Keyword: Biological Rhythms
Link ID: 23406 - Posted: 03.25.2017
By Diana Kwon Astrocytes, star-shaped glial cells in the brain, were once simply considered support cells for neurons. However, neuroscientists have recently realized they have many other functions: studies have shown that astrocytes are involved in metabolism, learning, and more. In the latest study to investigate astrocytes’ roles in the brain, researchers confirmed that these cells play a key role in regulating mouse circadian rhythms. The team’s results were published today (March 23) in Current Biology. “Recent results have indicated that [glia] are actively modulating synaptic transmission, the development of the nervous system, and changes in the nervous system in response to changes in the environment,” said coauthor Erik Herzog, a neuroscientist at Washington University in St. Louis. “So circadian biologists recognized that the system that we study in the brain also had astrocytes and have wondered what role that they might play.” In 2005, Herzog’s team published a seminal study linking glia to mammalian circadian rhythms. By investigating rat and mouse astrocytes in a dish, the researchers discovered that these glial cells showed circadian rhythms in gene expression. Since then, several independent groups have reported evidence to suggest that astrocytes help regulate daily rhythms. However, linking astrocytes to circadian behaviors in mice remained difficult. “I had several folks in the lab over many years [who] were unable to find the tools that would allow us to answer the question definitively: Do astrocytes play a role in scheduling our day?” Herzog recalled. “Then, within the last year or so, some new tools . . . became available for us.” © 1986-2017 The Scientist
Josh Hafner, USA Today Network For college students, new parents and employees dogged by deadlines, the all-nighter is nothing new. But going without sleep leaves you basically drunk, putting you at the equivalent of a 0.1% blood alcohol content as you drive to work, make decisions and interact with others. “The first thing that goes is your ability to think," said Joseph Ojile, M.D., a board member with the National Sleep Foundation. Judgment, memory and concentration all suffer impairment by the body's 17th hour without sleep, he said. “We know at 17 hours, you're at 0.08% blood alcohol level," he said, the legal standard for drunk driving. "At 24 hours, you’re at 0.1%." Coordination deteriorates as well in those intervening hours, said Ojile, a professor at Saint Louis University School of Medicine. Irritability sets in, too. Pain becomes more acute and the immune system suffers, he said, leaving the body more open to infection. "Here’s the worst part about the lack of judgment," Ojile said. "The person is unaware of their impairment. How scary is that? ‘I’m fine, I’ll just drive home. I’ll do my work at the nuclear plant, no problem. Or fly the plane, no problem.’" It's not entirely clear how the effects worsen past 24 hours, Ojile said, other than that they do. The brain starts shutting down in trance-like microsleeps, 15- to 30-second spells that occur without the person noticing. Eventually, not sleeping results in death.
Link ID: 23404 - Posted: 03.25.2017