Most Recent Links
Robin McKie New visions of the brain and body’s detailed operations will be unveiled by a suite of medical scanners being opened this week. The newly refurbished Wolfson Brain Imaging Centre at the University of Cambridge has been equipped with some of the world’s most powerful magnetic resonance imaging (MRI) and positron emission tomography (PET) scanners and will give its researchers unprecedented power to make images of cancers, study the precise makeup of the cortex and analyse how chemicals in the brain – known as neurotransmitters – underpin the development of schizophrenia and depression. “It is a remarkable set of machines,” says Professor Ed Bullmore, head of neuroscience at Cambridge University. “We will be able to address clinical issues such as the detailed progression of Parkinson’s disease. At the same time, we will be able to address basic issues about the mind. How does the brain develop? How does the adult brain perform its functions?” At the heart of the refurbished centre – funded by the Medical Research Council, Wellcome Trust and Cancer Research UK – are three groundbreaking devices. Only a handful of these exist at institutions outside Cambridge and no institution – other than Cambridge – has all three. “The devices we have assembled are primarily for studying humans and will have a strong research focus,” Bullmore says. A key example is provided by the 7T MRI scanner. Current devices have magnetic field strengths of around 3T (tesla) and can see structures 2-3 mm in size. By contrast, the new Cambridge scanner with its 7T field will have a resolution of around 0.5 mm. © 2016 Guardian News and Media Limited
Keyword: Brain imaging
Link ID: 22782 - Posted: 10.24.2016
By KATE MURPHY Eavesdrop on any conversation or pay close attention to your own and you’ll hear laughter. From explosive bursts to muffled snorts, some form of laughter punctuates almost all verbal communication. Electronic communication, too, LOL. You’ll probably also notice that, more often than not, the laughter is in response to something that wasn’t very funny — or wasn’t funny at all. Observational studies suggest this is the case 80 percent to 90 percent of the time. Take Hillary Clinton’s strategic laughter during heated exchanges with Donald J. Trump during the presidential debates. Or Jimmy Fallon’s exaggerated laughter when interviewing guests on “The Tonight Show.” Or employees at Fox News reporting that they tried to “laugh off” unwanted sexual advances by Roger Ailes and others within the organization. How laughter went from a primal signal of safety (the opposite of a menacing growl) to an odd assortment of vocalizations that smooth as much as confuse social interactions is poorly understood. But researchers who study laughter say reflecting on when and why you titter, snicker or guffaw is a worthy exercise, given that laughter can harm as much as help you. “It’s a hall of mirrors of inferences and intentions every time you encounter laughter,” said Sophie Scott, a neuroscientist at University College London who studies how the brain produces and processes laughter. “You think it’s so simple. It’s just jokes and ha-ha but laughter is really sophisticated and complicated.” Laughter at its purest and most spontaneous is affiliative and bonding. To our forebears it meant, “We’re not going to kill each other! What a relief!” But as we’ve developed as humans so has our repertoire of laughter, unleashed to achieve ends quite apart from its original function of telling friend from foe. Some of it is social lubrication — the warm chuckles we give one another to be amiable and polite. 
Darker manifestations include dismissive laughter, which makes light of something someone said sincerely, and derisive laughter, which shames. © 2016 The New York Times Company
Ian Sample Science editor Experiments with a fake body part have revealed how the brain becomes confused during a party trick known as the rubber hand illusion. Researchers in Italy performed the trick on a group of volunteers to explore how the mind combines information from the senses to create a feeling of body ownership. Under the illusion, people feel that a rubber hand placed on the table before them is their own, a bizarre but convincing shift in perception that is accompanied by a sense of disowning their real hand. The scientists launched the study after noticing that some stroke patients in their care experienced similar sensations, at times becoming certain that a paralysed limb was not their own, and even claiming ownership over other people’s appendages. “It is a very strong belief,” said Francesca Garbarini at the University of Turin. “We know that the feeling of body ownership can be dramatically altered after brain damage.” For the study, healthy volunteers sat with their forearms resting on a table and their right hand hidden inside a box. A lifelike rubber hand was then placed in front of them and lined up with their right shoulder. A cloth covered the stump of the hand, but the fingers remained visible. To induce the illusion, one of the researchers stroked the middle finger of the participant’s real hand while simultaneously stroking the same finger on the rubber hand. © 2016 Guardian News and Media Limited
Keyword: Pain & Touch
Link ID: 22780 - Posted: 10.24.2016
by Bethany Brookshire Most of us spend our careers trying to meet — and hopefully exceed — expectations. Scientists do too. But the requirements for success in a job in academic science don’t always line up with the best scientific methods. The net result? Bad science doesn’t just happen — it gets selected for. What does it mean to be successful in science? A scientist gets a job and funding by publishing a lot of high-impact papers with novel findings. Those papers and findings beget awards and funding to do more science — and publish more papers. “The problem that we face is that the incentive system is focused almost entirely on getting research published, rather than on getting research right,” says Brian Nosek, a psychologist at the University of Virginia in Charlottesville. This idea of success has become so ingrained that scientists are even introduced when they give talks by the number of papers they have published or the amount of grant funding they have, says Marc Edwards, a civil engineer at Virginia Polytechnic Institute and State University in Blacksburg. But rewarding researchers for the number of papers they publish results in a “natural selection” of sloppy science, new research shows. The idea of scientific “success” equated as number of publications promotes not just lazy science but also unethical science, another paper argues. Both articles proclaim that it’s time for a culture shift. But with many scientific labs to fund and little money to do it, what does a new, better scientific enterprise look like? © Society for Science & the Public 2000 - 2016
Link ID: 22779 - Posted: 10.24.2016
By Kensy Cooperrider, Rafael Núñez “What is the difference between yesterday and tomorrow?” The Yupno man we were interviewing, Danda, paused to consider his answer. A group of us sat on a hillside in the Yupno Valley, a remote nook high in the mountains of Papua New Guinea. Only days earlier we had arrived on a single-engine plane. After a steep hike from the grass airstrip, we found ourselves in the village of Gua, one of about 20 Yupno villages dotting the rugged terrain. We came all the way here because we are interested in time—in how Yupno people understand concepts such as past, present and future. Are these ideas universal, or are they products of our language, our culture and our environment? As we interviewed Danda and others in the village, we listened to what they said about time, but we paid even closer attention to what they did with their hands as they spoke. Gestures can be revealing. Ask English speakers about the difference between yesterday and tomorrow, and they might thrust a hand over the shoulder when referring to the past and then forward when referring to the future. Such unreflective movements reveal a fundamental way of thinking in which the past is at our backs, something that we “leave behind,” and the future is in front of us, something to “look forward” to. Would a Yupno speaker do the same? Danda was making just the kinds of gestures we were hoping for. As he explained the Yupno word for “yesterday,” his hand swept backward; as he mentioned “tomorrow,” it leaped forward. We all sat looking up a steep slope toward a jagged ridge, but as the light faded, we changed the camera angle, spinning around so that we and Danda faced in the opposite direction, downhill. With our backs now to the ridge, we looked over the Yupno River meandering toward the Bismarck Sea. “Let's go over that one more time,” we suggested. © 2016 Scientific American,
Link ID: 22778 - Posted: 10.22.2016
Bret Stetka Every day in the United States, millions of expectant mothers take a prenatal vitamin on the advice of their doctor. The counsel typically comes with physical health in mind: folic acid to help avoid fetal spinal cord problems; iodine to spur healthy brain development; calcium to be bound like molecular Legos into diminutive baby bones. But what about a child's future mental health? Questions about whether ADHD might arise a few years down the road or whether schizophrenia could crop up in young adulthood tend to be overshadowed by more immediate parental anxieties. As a friend with a newborn daughter recently fretted over lunch, "I'm just trying not to drop her!" Yet much as pediatricians administer childhood vaccines to guard against future infections, some psychiatrists now are thinking about how to shift their treatment-centric discipline toward one that also deals in early prevention. In 2013, University of Colorado psychiatrist Robert Freedman and colleagues recruited 100 healthy, pregnant women from greater Denver to study whether giving the B vitamin choline during pregnancy would enhance brain growth in the developing fetus. The moms-to-be were randomly given either a placebo or a form of choline called phosphatidylcholine. Choline itself is broken down by bacteria in the gut; given in this related form, the supplement is absorbed into the bloodstream more effectively. © 2016 npr
By Laura Wright Researchers have the clearest-ever picture of the receptor that gives humans the 'high' from marijuana, which could lead to a better understanding of how the drug affects humans. Scientists have long known that molecules from THC, the psychoactive component of marijuana, bind to and activate the receptor known as CB1. But now they have determined its three-dimensional crystal structure. The authors of the paper, which was published Thursday in the journal Cell, say this information is crucial to improve our understanding of this receptor as marijuana use becomes widespread and, in many places, legalized. Now that they know the shape of the receptor, they can get a better idea of how different molecules bind to it, which is what causes reactions in humans. "What is important is to understand how different molecules bind to the receptor, how they control the receptor function, and how this can affect different people," said Raymond Stevens, co-author of the study. Dr. Mark Ware, the executive director of the Canadian Consortium for the Investigation of Cannabinoids and the director of clinical research at the Alan Edwards pain management unit at the McGill University Health Centre, called the discovery a "breakthrough." "Suddenly we've been given the design of the building," he explained. "We can work out ways to get in the building, we know where the windows and doors and stairs are, and we know kind of how the building is structured now." They both said that knowing the receptor's design can lead to better drug design. It's also a key step to understanding the differences between natural cannabinoids, found in the marijuana plant, and synthetic cannabinoids, made in labs. ©2016 CBC/Radio-Canada.
Keyword: Drug Abuse
Link ID: 22776 - Posted: 10.22.2016
By Nathaniel P. Morris When meeting new people, I'm often asked what I do for work. Depending on how I phrase my answer, I receive very different reactions. "I'm a doctor specializing in mental health" elicits fascination. People's faces brighten and they say, "Very cool!" But if I instead say, "I'm a psychiatrist," the conversation falls quiet. They get uncomfortable and change the subject. Mental health has made great strides in recent years. Every week, people across the country participate in walks to support mental health causes. The White House now designates May as National Mental Health Awareness Month. In the presidential race, Hillary Clinton released a comprehensive plan to invest in mental health care. Yet psychiatry—the medical specialty focused on mental health—remains looked down upon in nearly every corner of our society. The public often doesn’t regard psychiatrists as medical doctors. Many view psychiatric treatments as pseudoscience at best and harmful at worst. Even among health professionals, it’s one of the least respected medical specialties. The field is in serious decline. Academic papers abound with titles like “Is psychiatry dying?” and “Are psychiatrists an endangered species?” Despite growing mental health needs nationwide, fewer medical students are applying into the field, and the number of psychiatrists in the US is falling. Patients too often refuse treatment because of stigma related to the field. © 2016 Scientific American
By Agata Blaszczak-Boxe Some rodents have a sweet tooth. And sometimes, you need to get crafty to reach your sugar fix. Rats have been filmed for the first time using hooked tools to get chocolate cereal – a manifestation of their critter intelligence. Akane Nagano and Kenjiro Aoyama, of Doshisha University in Kyotanabe, Japan, placed eight brown rats in a transparent box and trained them to pull small hooked tools to obtain the cereal that was otherwise beyond their reach. In one experiment they gave them two similar hooked tools, one of which worked well for the food retrieval task, and the other did not. The rats quickly learned to choose the correct tool for the job, selecting it 95 per cent of the time. The experiments showed that the rats understood the spatial arrangement between the food and the tool. The team’s study is the first to demonstrate that rats are able to use tools, says Nagano. The rats did get a little confused in the final experiment. When the team gave them a rake that looked the part but had a bottom too soft and flimsy to move the cereal, they still tried to use it as much as the working tool that was also available. But, says Nagano, it is possible their eyesight was simply not good enough for them to tell that the flimsy tool wasn’t up to the task. The rodents’ crafty feat places them in the ever-growing club of known tool-using animals such as chimps, bearded capuchin monkeys, New Caledonian crows, alligators and even some fish. © Copyright Reed Business Information Ltd.
Laura Sanders Pain is contagious, at least for mice. After encountering bedding where mice in pain had slept, other mice became more sensitive to pain themselves. The experiment, described online October 19 in Science Advances, shows that pain can move from one animal to another — no injury or illness required. The results “add to a growing body of research showing that animals communicate distress and are affected by the distress of others,” says neuroscientist Inbal Ben-Ami Bartal of the University of California, Berkeley. Neuroscientist Andrey Ryabinin and colleagues didn’t set out to study pain transfer. But the researchers noticed something curious during their experiments on mice who were undergoing alcohol withdrawal. Mice in the throes of withdrawal have a higher sensitivity to pokes on the foot. And surprisingly, so did these mice’s perfectly healthy cagemates. “We realized that there was some transfer of information about pain” from injured mouse to bystander, says Ryabinin, of Oregon Health & Science University in Portland. When mice suffered from alcohol withdrawal, morphine withdrawal or an inflaming injection, they became more sensitive to a poke in the paw with a thin fiber — a touchy reaction that signals a decreased pain tolerance. Mice that had been housed in the same cage with the mice in pain also grew more sensitive to the poke, Ryabinin and colleagues found. These bystander mice showed other signs of heightened pain sensitivity, such as quickly pulling their tails out of hot water and licking a paw after an irritating shot. |© Society for Science & the Public 2000 - 2016
By Catherine Caruso Imagine you are faced with the classic thought experiment dilemma: You can take a pile of money now or wait and get an even bigger stash of cash later on. Which option do you choose? Your level of self-control, researchers have found, may have to do with a region of the brain that lets us take the perspective of others—including that of our future self. A study, published today in Science Advances, found that when scientists used noninvasive brain stimulation to disrupt a brain region called the temporoparietal junction (TPJ), people appeared less able to see things from the point of view of their future selves or of another person, and consequently were less likely to share money with others and more inclined to opt for immediate cash instead of waiting for a larger bounty at a later date. The TPJ, which is located where the temporal and parietal lobes meet, plays an important role in social functioning, particularly in our ability to understand situations from the perspectives of other people. However, according to Alexander Soutschek, an economist at the University of Zurich and lead author on the study, previous research on self-control and delayed gratification has focused instead on the prefrontal brain regions involved in impulse control. “When you have a closer look at the literature, you sometimes find in the neuroimaging data that the TPJ is also active during delay of gratification,” Soutschek says, “but it's never interpreted.” © 2016 Scientific American
Link ID: 22772 - Posted: 10.20.2016
Hannah Devlin Science correspondent Monkeys have been observed producing sharp stone flakes that closely resemble the earliest known tools made by our ancient relatives, proving that this ability is not uniquely human. Previously, modifying stones to create razor-edged fragments was thought to be an activity confined to hominins, the family including early humans and their more primitive cousins. The latest observations rewrite this view, showing that monkeys unintentionally produce almost identical artefacts simply by smashing stones together. The findings put archaeologists on alert that they can no longer assume that stone flakes they discover are linked to the deliberate crafting of tools by early humans as their brains became more sophisticated. Tomos Proffitt, an archaeologist at the University of Oxford and the study’s lead author, said: “At a very fundamental level - if you’re looking at a very simple flake - if you had a capuchin flake and a human flake they would be the same. It raises really important questions about what level of cognitive complexity is required to produce a sophisticated cutting tool.” Unlike early humans, the flakes produced by the capuchins were the unintentional byproduct of hammering stones - an activity that the monkeys pursued decisively, but the purpose of which was not clear. Originally scientists thought the behaviour was a flamboyant display of aggression in response to an intruder, but after more extensive observations the monkeys appeared to be seeking out the quartz dust produced by smashing the rocks, possibly because it has a nutritional benefit. © 2016 Guardian News and Media Limited
Link ID: 22771 - Posted: 10.20.2016
Tina Hesman Saey VANCOUVER — Zika virus’s tricks for interfering with human brain cell development may also be the virus’s undoing. Zika infection interferes with DNA replication and repair machinery and also prevents production of some proteins needed for proper brain growth, geneticist Feiran Zhang of Emory University in Atlanta reported October 19 at the annual meeting of the American Society of Human Genetics. Levels of a protein called p53, which helps control cell growth and death, shot up by 80 percent in human brain cells infected with the Asian Zika virus strain responsible for the Zika epidemic in the Americas, Zhang said. The lab dish results are also reported in the Oct. 14 Nucleic Acids Research. Increased levels of the protein stop developing brain cells from growing and may cause the cells to commit suicide. A drug that inactivates p53 stopped brain cells from dying, Zhang said. Such p53 inhibitors could help protect developing brains in babies infected with Zika. But researchers would need to be careful giving such drugs because too little p53 can lead to cancer. Zika also makes small RNA molecules that interfere with production of proteins needed for DNA replication, cell growth and brain development, Zhang said. In particular, a small viral RNA called vsRNA-21 reduced the amount of microcephalin 1 protein made in human brain cells in lab dishes. The researchers confirmed the results in mouse experiments. That protein is needed for brain growth; not enough leads to the small heads seen in babies with microcephaly. Inhibitors of the viral RNAs might also be used in therapies, Zhang suggested. |© Society for Science & the Public 2000 - 2016
Keyword: Development of the Brain
Link ID: 22770 - Posted: 10.20.2016
Hannah Devlin Science correspondent Migraine sufferers have a different mix of gut bacteria that could make them more sensitive to certain foods, scientists have found. The study offers a potential explanation for why some people are more susceptible to debilitating headaches and why some foods appear to act as triggers for migraines. The research showed that migraine sufferers had higher levels of bacteria that are known to be involved in processing nitrates, which are typically found in processed meats, leafy vegetables and some wines. The latest findings raise the possibility that migraines could be triggered when nitrates in food are broken down more efficiently, causing vessels in the brain and scalp to dilate. Antonio Gonzalez, a programmer analyst at the University of California San Diego and the study’s first author, said: “There is this idea out there that certain foods trigger migraines - chocolate, wine and especially foods containing nitrates. We thought that perhaps there are connections between what people are eating, their microbiomes and their experiences with migraines.” When nitrates in food are broken down by bacteria in the mouth and gut they are eventually converted into nitric oxide in the blood stream, a chemical that dilates blood vessels and can aid cardiovascular health by boosting circulation. © 2016 Guardian News and Media Limited
By Meredith Knight In June, international diabetes organizations endorsed provocative new guidelines suggesting physicians should consider gastric bypass surgery for a greatly expanded number of diabetics—those with a body mass index of 30 and above as opposed to just those with a BMI of 40 or more. Research has shown that the surgery helps people lose more weight, maintain the loss longer and achieve better blood glucose levels than those who slim down by changing diet and exercise habits. Now a study in mice suggests the effectiveness of bariatric surgery may stem in part from changes it causes in the brain. According to the study, published in the International Journal of Obesity, gastric bypass surgery causes the hyperactivation of a neural pathway that leads from stomach-sensing neurons in the brain stem to the lateral parabrachial nucleus, an area in the midbrain that receives sensory information from the body, and then to the amygdala, the brain's emotion- and fear-processing center. The obese mice underwent so-called Roux-en-Y bypass surgery, in which surgeons detach most of the stomach, leaving only a tiny pouch connected to the small intestine. Shortly after the surgery, the mice begin to show increased activation in this neural pathway, along with reduced meal size and a preference for less fatty food. They also begin to secrete higher levels of satiety hormones. Similar behavioral and hormonal patterns are found in humans after bypass surgery, suggesting that the brain changes may also be similar—but the authors say looking at this particular circuit in humans with brain imaging is difficult because the resolution is not up to the task. © 2016 Scientific American,
Link ID: 22768 - Posted: 10.19.2016
Susan Milius For widemouthed, musical midshipman fish, melatonin is not a sleep hormone — it’s a serenade starter. In breeding season, male plainfin midshipman fish (Porichthys notatus) spend their nights singing — if that’s the word for hours of sustained foghorn hums. Males dig trysting nests under rocks along much of North America’s Pacific coast, then await females drawn in by the crooning. New lab tests show that melatonin, familiar to humans as a possible sleep aid, is a serenade “go” signal, says behavioral neurobiologist Ni Feng of Yale University. From fish to folks, nighttime release of melatonin helps coordinate bodily timekeeping and orchestrate after-dark biology. The fish courtship chorus, however, is the first example of the hormone prompting a launch into song, according to Andrew Bass of Cornell University. And what remarkable vocalizing it is. The plainfin midshipman male creates a steady “mmm” by quick-twitching specialized muscles around its air-filled swim bladder up to 100 times per second in chilly water. A fish can extend a single hum for about two hours, Feng and Bass report October 10 in Current Biology. That same kind of super-fast muscle shakes rattlesnake tails and trills vocal structures in songbirds and bats. |© Society for Science & the Public 2000 - 2016
By MARC SANTORA The morning after Christine Grounds gave birth to her son Nicholas, she awoke to find a neurologist examining her baby. It was summer 2006, and Nicholas was her first child. There had been no indication that anything was wrong during her pregnancy, but it was soon clear that there was a problem. “Did you know he has microcephaly?” she remembers the doctor asking matter-of-factly. Confused, she replied, “What is microcephaly?” This was before the Zika virus had spread from Brazil across South and Central America and the Caribbean and reached Florida. It was before doctors had determined that the virus could cause microcephaly, a birth defect in which children have malformed heads and severely stunted brain development. And it was before people had seen the devastating pictures of scores of newborns with the condition in Brazil and elsewhere that shocked the world this year. Ms. Grounds, a 45-year-old psychotherapist, and her husband, Jon Mir, who live in Manhattan, had no idea what microcephaly would mean for them or for their child. “We had a diagnosis but no prognosis,” recalled Mr. Mir, 44, who works in finance. The doctors could offer few answers. “We don’t know if he will walk,” the couple recalled being told. “We don’t know if he will talk. He might be in a vegetative state.” But the truth was, even the doctors did not know. As mosquito season draws to a close in much of the country, taking with it the major risk of new Zika infections, there are still more than 2,600 pregnant women who have tested positive for the virus in the United States and its territories, according to the Centers for Disease Control and Prevention. They, and thousands more around the world, face the prospect of giving birth to a child with microcephaly. © 2016 The New York Times Company
Keyword: Development of the Brain
Link ID: 22766 - Posted: 10.19.2016
By Jessica Hamzelou Is depression caused by an inflamed brain? A review of studies looking at inflammation and depression has found that a class of anti-inflammatory drugs can ease the condition’s symptoms. Golam Khandaker at the University of Cambridge and his colleagues analysed 20 clinical studies assessing the effects of anti-cytokine drugs in people with chronic inflammatory conditions. These drugs block the effects of cytokines – proteins that control the actions of the immune system. Anti-cytokines can dampen down inflammation, and are used to treat rheumatoid arthritis. Together, these trials involved over 5,000 volunteers, and provide significant evidence that anti-cytokine drugs can also improve the symptoms of depression, Khandaker’s team found. These drugs work about as well as commonly used antidepressants, they say. The most commonly used antidepressant drugs, known as SSRIs, act to increase levels of serotonin in the brain, to improve a person’s mood. But depression might not always be linked to a lack of serotonin, and SSRIs don’t work for everyone. Recent research has found that around a third of people with depression appear to have higher levels of cytokines in their brains, while people with “overactive” immune systems seem more likely to develop depression. Khandaker’s team think that inflammation in the brain might be responsible for the fatigue experienced by people with depression. © Copyright Reed Business Information Ltd.
By Meredith Wadman The second century C.E. Greek physician and philosopher Galen advised patients suffering from disorders of the spirit to bathe in and drink hot spring water. Modern day brain scientists have posited that Galen’s prescription delivered more than a placebo effect. Lithium has for decades been recognized as an effective mood stabilizer in bipolar disease, and lithium salts may have been present in the springs Galen knew. Yet exactly how lithium soothes the mind has been less than clear. Now, a team led by Ben Cheyette, a neuroscientist at the University of California in San Francisco (UCSF), has linked its success to influence over dendritic spines, tiny projections where excitatory neurons form connections, or synapses, with other nerve cells. Lithium treatment restored healthy numbers of dendritic spines in mice engineered to carry a genetic mutation that is more common in people with autism, schizophrenia, and bipolar disorder than in unaffected people, they report today in Molecular Psychiatry. The lithium also reversed symptoms in these mutant mice—lack of interest in social interactions, decreased motivation, and increased anxiety—that mimic those in the human diseases. “They showed there’s a correlation between the ability of lithium to reverse not only the behavioral abnormalities in the mice, but also the [dendritic] spine abnormalities,” says Scott Soderling, a neuroscientist at Duke University in Durham, North Carolina, who studies how dysfunctions in signaling at brain synapses lead to psychiatric disorders. Soderling adds that the work also sheds light on the roots of these diseases. “It gives further credence to this idea that these spine abnormalities are functionally linked to the behavioral disorders.” © 2016 American Association for the Advancement of Science.
Link ID: 22764 - Posted: 10.18.2016
Laura Sanders When the body’s internal sense of time doesn’t match up with outside cues, people can suffer, and not just from a lack of sleep. Such ailments are similar in a way to motion sickness — the queasiness caused when body sensations of movement don’t match the external world. So scientists propose calling time-related troubles, which can afflict time-zone hoppers and people who work at night, “circadian-time sickness.” This malady can be described, these scientists say, with a certain type of math. The idea, to be published in Trends in Neurosciences, is “intriguing and thought-provoking,” says neuroscientist Samer Hattar of Johns Hopkins University. “They really came up with an interesting idea of how to explain the mismatch.” Neuroscientist Raymond van Ee of Radboud University in the Netherlands and colleagues knew that many studies had turned up ill effects from an out-of-whack circadian clock. Depression, metabolic syndromes and memory troubles have been found alongside altered daily rhythms. But despite these results, scientists don’t have a good understanding of how body clocks work, van Ee says. Van Ee and colleagues offer a new perspective by using a type of math called Bayesian inference to describe the circadian trouble. Bayesian inference can be used to describe how the brain makes and refines predictions about the world. This guesswork relies on the combination of previous knowledge and incoming sensory information (SN: 5/28/16, p. 18). In the case of circadian-time sickness, these two cues don’t match up, the researchers propose. |© Society for Science & the Public 2000 - 2016
Link ID: 22763 - Posted: 10.18.2016
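The Bayesian-inference idea behind "circadian-time sickness" can be made concrete with a toy calculation. The sketch below is illustrative only, not from van Ee's paper: it fuses two Gaussian estimates of the time of day — a prior from the body clock and a sensory cue such as daylight — by precision-weighted averaging, the standard Bayesian cue-combination rule. All numbers are invented for illustration.

```python
# Toy sketch of Bayesian cue combination (precision-weighted averaging).
# A prior (the internal body clock) and a sensory cue (e.g. daylight)
# each give a Gaussian estimate of the current hour; the brain's fused
# estimate weights each cue by its precision (inverse variance).

def combine(mu_prior, var_prior, mu_obs, var_obs):
    """Fuse two independent Gaussian estimates of the same quantity.

    Returns the posterior mean and variance: means are averaged with
    weights proportional to each cue's precision (1 / variance).
    """
    w_prior = 1.0 / var_prior
    w_obs = 1.0 / var_obs
    mu_post = (w_prior * mu_prior + w_obs * mu_obs) / (w_prior + w_obs)
    var_post = 1.0 / (w_prior + w_obs)
    return mu_post, var_post

# A jet-lagged mismatch: the body clock confidently says hour 23,
# while daylight (less reliable here) says hour 8.
mu, var = combine(23.0, 1.0, 8.0, 4.0)
print(mu, var)  # fused estimate sits between the cues, nearer the precise one
```

On this account, a large, persistent discrepancy between the prior and the sensory cue — exactly what time-zone hopping or night-shift work produces — is the proposed analogue of the sensory conflict behind motion sickness.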