Most Recent Links



Links 3341 - 3360 of 29528

By Michelle Roberts, Health editor, BBC News online

An infectious disease that can harm the brain and is spread to people by tick bites has been identified in ticks in the UK for the first time. Public Health England (PHE) says it has confirmed cases of tick-borne encephalitis virus in ticks from two parts of England - Thetford Forest and an area on the Hampshire-Dorset border. PHE says the risk to people is still "very low". It is monitoring the situation to check how common the infected ticks may be.

What is it?

A tick is a tiny, spider-like creature that lives in undergrowth and on animals, including deer and dogs. People who spend time walking in countryside areas where infected ticks can be found are at risk of being bitten and catching diseases they carry. Tick-borne encephalitis virus is already circulating in mainland Europe and Scandinavia, as well as Asia. Evidence now shows it has reached the UK. How it got here is less clear. Experts say infected ticks may have hitched a ride on migratory birds. Earlier this year, a European visitor, who has since recovered, became ill after being bitten by a tick while in the New Forest area, PHE says. Further investigations revealed infected ticks were present in two locations in England.

Should I worry?

Ticks are becoming more common across many parts of the UK, largely due to increasing deer numbers. Being bitten by one doesn't necessarily mean you will get sick. Dr Nick Phin, from Public Health England, said: "These are early research findings and indicate the need for further work. However, the risk to the general public is currently assessed to be very low." Most people who catch the virus will have no or only mild flu-like symptoms. But the disease can progress to affect the brain and central nervous system and can sometimes be fatal.

© 2019 BBC

Keyword: Miscellaneous
Link ID: 26782 - Posted: 11.02.2019

By Laura Sanders Every 20 seconds, a wave of fresh cerebrospinal fluid rolls into the sleeping brain. These slow, rhythmic blasts, described for the first time in the Nov. 1 Science, may help explain why sleep is so important for brain health. Studies on animals have shown that the fluid, called CSF, can wash harmful proteins, including those implicated in Alzheimer’s disease, out of the brain. The new results give heft to the idea that a similar power wash happens in sleeping people. Researchers studied 13 healthy, young people in an MRI scanner as they fell into non-REM sleep, the type of slumber that takes up most of the night. At the same time, the scientists monitored different sorts of activity in participants’ heads. Electrodes measured the activity of large collections of nerve cells, and functional MRI measured the presence of oxygenated blood that gives energy to those nerve cells. By using a form of rapid fMRI, the team also measured another type of activity — the movements of CSF in the brain. Fast fMRI revealed waves of fresh CSF that flowed rhythmically into the sleeping brains, a pattern that was obvious — and big, says study coauthor Laura Lewis, a neuroscientist and engineer at Boston University. “I’ve never had something jump out at me to this degree,” she says. “It was very striking.” Awake people have small, gentle waves of CSF that are largely linked to breathing patterns. In contrast, the sleep waves were tsunamis. “The waves we saw during sleep were much, much larger, and higher velocity,” Lewis says. © Society for Science & the Public 2000–2019

Keyword: Sleep
Link ID: 26781 - Posted: 11.01.2019

By Christof Koch “And death shall have no dominion”—Dylan Thomas, 1933 You will die, sooner or later. We all will. For everything that has a beginning has an end, an ineluctable consequence of the second law of thermodynamics. Few of us like to think about this troubling fact. But once birthed, the thought of oblivion can’t be completely erased. It lurks in the unconscious shadows, ready to burst forth. In my case, it was only as a mature man that I became fully mortal. I had wasted an entire evening playing an addictive, first-person shooter video game—running through subterranean halls, flooded corridors, nightmarishly turning tunnels, and empty plazas under a foreign sun, firing my weapons at hordes of aliens relentlessly pursuing me. I went to bed and fell asleep easily, but awoke abruptly a few hours later. Abstract knowledge had turned to felt reality—I was going to die! Not right there and then but eventually. Evolution equipped our species with powerful defense mechanisms to deal with this foreknowledge—in particular, psychological suppression and religion. The former prevents us from consciously acknowledging or dwelling on such uncomfortable truths while the latter reassures us by promising never-ending life in a Christian heaven, an eternal cycle of Buddhist reincarnations or an uploading of our mind to the Cloud, the 21st-century equivalent of rapture for nerds. Death has no such dominion over nonhuman animals. Although they can grieve for dead offspring and companions, there is no credible evidence that apes, dogs, crows and bees have minds sufficiently self-aware to be troubled by the insight that one day they will be no more. Thus, these defense mechanisms must have arisen in recent hominin evolution, in less than 10 million years. © 2019 Scientific American

Keyword: Consciousness
Link ID: 26780 - Posted: 11.01.2019

By Kelly Servick CHICAGO, ILLINOIS—In 2014, U.S. regulators approved a futuristic treatment for blindness. The device, called Argus II, sends signals from a glasses-mounted camera to a roughly 3-by-5-millimeter grid of electrodes at the back of the eye. Its job: Replace signals from light-sensing cells lost in the genetic condition retinitis pigmentosa. The implant’s maker, Second Sight, estimates that about 350 people in the world now use it. Argus II offers a relatively crude form of artificial vision; users see diffuse spots of light called phosphenes. “None of the patients gave up their white cane or guide dog,” says Daniel Palanker, a physicist who works on visual prostheses at Stanford University in Palo Alto, California. “It’s a very low bar.” But it was a start. He and others are now aiming to raise the bar with more precise ways of stimulating cells in the eye or brain. At the annual meeting of the Society for Neuroscience here last week, scientists shared progress from several such efforts. Some have already advanced to human trials—“a real, final test,” Palanker says. “It’s exciting times.” Several common disorders steal vision by destroying photoreceptors, the first cells in a relay of information from the eye to the brain. The other players in the relay often remain intact: the so-called bipolar cells, which receive photoreceptors’ signals; the retinal ganglion cells, which form the optic nerve and carry those signals to the brain; and the multilayered visual cortex at the back of the brain, which organizes the information into meaningful sight. Because adjacent points in space project onto adjacent points on the retina, and eventually activate neighboring points in an early processing area of the visual cortex, a visual scene can be mapped onto a spatial pattern of signals. But this spatial mapping gets more complex along the relay, so some researchers aim to activate cells as close to the start as possible. © 2019 American Association for the Advancement of Science

Keyword: Vision; Robotics
Link ID: 26779 - Posted: 11.01.2019

Ricardo F. Muñoz I have been convinced of the importance of prevention in addressing mental-health problems since the early 1970s, when I began my doctorate in clinical psychology. But only now is there sufficient evidence from clinical trials of the effectiveness of preventive interventions, using approaches derived from interpersonal and cognitive behavioural therapy, to justify deploying them. And only now are the tools available to make such interventions available to people worldwide. Two recent reports underline this conclusion. In February, the US Preventive Services Task Force, an independent panel of experts in evidence-based medicine, urged clinicians to “provide or refer pregnant and postpartum persons who are at increased risk of perinatal depression to counseling interventions” [1]. And last month, the US National Academies of Sciences, Engineering, and Medicine (NASEM) released a report [2] calling on various stakeholders, from educators to policymakers, to prevent mental-health disorders and to promote healthy mental, emotional and behavioural development in the under-25s. (I was a member of the committees that prepared this document and two previous NASEM reports in 1994 and 2009 on preventive interventions [3,4].) The latest NASEM call to action [2] is so all-encompassing, it is hard to know where to begin. I propose that initial efforts focus on preventing depression in pregnant women or in women who have recently given birth (perinatal depression). There is substantial evidence for the effectiveness of providing such women with basic skills in mood management [5]. These interventions could have an impact across generations, because better maternal mental health is linked to babies’ healthier development [2]. And if researchers and health-care systems were to monitor and compare the epidemiology of depression in thousands of mothers and their children in areas that have or have not deployed preventive interventions, stakeholders could measure their effect on entire communities. © 2019 Springer Nature Limited

Keyword: Depression
Link ID: 26778 - Posted: 11.01.2019

By Derek Lowe So Amgen has exited the neuroscience area, with a good-sized round of layoffs at their research site in Cambridge. The company has a migraine drug (Aimovig) that they’ll continue to support, and they’ll stick with their existing clinical programs, but it looks like all the early-stage stuff is gone. What does this mean? Not as much as you might think. Neuroscience is indeed hard, and Amgen’s not the only company to rethink its commitment to it (Eli Lilly did something similar last month with their neuro efforts in the UK). But there are still plenty of participants, large and small – it’s not that the field is being totally abandoned by pharma. It’s just being abandoned by Amgen, because they have other areas that look a lot more promising for them. And let’s face it, Amgen is a bit of an oddity, anyway – it’s not for nothing that they get referred to as a law firm with fume hoods. Enbrel is what pays a lot of the bills over there, and Enbrel is (and has long been) a patent-court story, not a research one. Inflammation, cardiovascular disease, and oncology are going to be the focus there, and given the company’s portfolio, that makes a lot of sense. It looks like the only neuro programs going on will be the ones that intersect with the larger inflammation area. One interesting thing that came out of the company’s statements was that management felt that a lot of the neuroscience landscape is focused on what their CFO David Meline called “orphan or niche diseases”, and that the company wants to work on things that will have a broader impact. Now, it’s not like there isn’t a neuroscience disease with a huge health impact, and it’s one that even has some inflammation and cardiovascular connections. So one of the things that Amgen is saying is “No Alzheimer’s research for us, thanks”. © 2017 American Association for the Advancement of Science

Keyword: Alzheimers
Link ID: 26777 - Posted: 11.01.2019

By Vanessa Barbara SÃO PAULO, Brazil — It’s hard to feel normal when you wake up at 4 p.m. every day. No, I’m not a nurse who works the evening shift. No, I’m not the hard-partying heir to a Brazilian agribusiness fortune. And before you think it, I’m not lazy, either — I’ve written seven books so far! I sleep until the late afternoon because I’ve finally learned, after fighting it for years, that it’s better to come across as pathetic than to be always exhausted, depressed or sick. I have a severe case of delayed sleep phase syndrome, a chronic misalignment of the body’s circadian rhythms with the daily light-dark cycle of our environment. The phrase “night owl” doesn’t really do it justice; my natural bedtime is around 6 a.m. While we as a culture are gradually becoming more aware of the many ways that bodies can differ from the norm, much of the world still takes for granted that people sleep at night and are awake during the day. Not me. I miss having lunch. According to conventional wisdom, going to bed early and waking up with the birds is a mere matter of habit and will power. This misconception is widespread, even among doctors. And for a long time, I believed it. I spent years taking melatonin and Ambien in order to fall asleep by 2 a.m.; I used to wake up at 11 a.m. and then spend the rest of the day on stimulants such as Provigil and Ritalin. Yet I was always tired and depressed — the outcome that so often results when we try to force ourselves to be different from what we naturally need to be. The last two decades have seen rapid advances in the field of chronobiology, the study of the biochemical clocks that keep our natural physiological rhythms. The 2017 Nobel Prize in Medicine, for instance, was awarded to three American geneticists for their discoveries of molecular mechanisms controlling the circadian rhythms in fruit flies. © 2019 The New York Times Company

Keyword: Biological Rhythms; Sleep
Link ID: 26776 - Posted: 10.31.2019

Specialized brain activation “replays” the possible routes that rats can take as they navigate a space, helping them keep track of the paths they’ve already taken and choose among the routes that they can take next, according to a National Institutes of Health-funded study published in the journal Neuron. “These findings reveal an internal ‘replay’ process in the brain that allows animals to learn from past experiences to form memories of paths leading toward goals, and subsequently to recall these paths for planning future decisions,” said Shantanu Jadhav, Ph.D., assistant professor at Brandeis University, Waltham, Massachusetts, and senior author of the study. “These results help us better understand how coordinated activation at the level of neurons can contribute to the complex processes involved in learning and decision-making.” The hippocampus, a structure located in the middle of the brain, is critical to learning and memory and contains specialized “place” cells that relay information about location and orientation in space. These place cells show specific patterns of activity during navigation that can be “replayed” later in forward or reverse order, almost as if the brain were fast-forwarding or rewinding through routes the rats have taken. In previous research, Jadhav and colleagues had discovered that these replay events, marked by bursts of neural activity called sharp-wave ripples, lead to coordinated activity in the hippocampus and the prefrontal cortex, an area of the brain just behind the forehead that is involved in decision-making.

Keyword: Learning & Memory
Link ID: 26775 - Posted: 10.31.2019

Terry Gross Ever since childhood, author Kevin Wilson has lived with disturbing images that flash through his mind without warning. "I've always had this kind of agitation and looping thoughts and small tics," he says. "Falling off of tall buildings, getting stabbed, catching on fire — they were these just quick, kind of violent bursts in my head." Not that Wilson would ever harm anyone else — the harm in these quick, intrusive thoughts was strictly internal. The images fed off of his own anxiety, and left him feeling terrified. It wasn't until Wilson was diagnosed with Tourette's syndrome as an adult that he began to understand what he was seeing. At first, he was skeptical of the diagnosis; Tourette's is a neurological disorder often characterized by involuntary vocal or motor tics, and Wilson's version wasn't what he'd seen portrayed on TV or in books. "Mine is so much more internal," he says. "Those images and looping tics are in my head. And so a lot of the work that I'm doing is just keeping it in there." One way that Wilson helps control the images is to channel them into his writing. His new novel, Nothing to See Here, is about a woman who takes over the care of twin children who burst into flames when they're afraid or angry. "Writing is, I think, the thing that saved me — being able to transfer what was in my head onto the page," he says. "There's this freedom that once it ... goes out into the world and you publish it, you're kind of free of it for a little while — at least it's somebody else's problem." © 2019 npr

Keyword: Tourettes
Link ID: 26774 - Posted: 10.31.2019

By Aimee Cunningham At age 37, Hope Hartman developed a painful, burning rash in her right ear, in the part “you would clean with a Q-tip,” the Denver resident says. The pain got so bad she went to a local emergency room, where the staff was flummoxed. Hartman was admitted to the hospital, where she started to lose sensation on the right side of her face. During that 2013 health crisis, Hartman’s husband, Mike, sent a picture of the ear to his mom, a nurse. She said it looked like zoster, better known as shingles, which is caused by the varicella zoster virus. She “diagnosed it from an iPhone photo,” Hartman recalls. Antiviral treatment didn’t fully clear the infection. For about two weeks after her release from the hospital, Hartman coped with severe pain, hearing loss and difficulty eating. Her right eye wouldn’t fully open or close. Following an appointment with neurologist Maria Nagel of the University of Colorado School of Medicine in Aurora, Hartman was admitted to the university’s hospital to get another antiviral drug intravenously. The pain subsided, and Hartman regained her hearing and the feeling in her face. To spare others the same trauma of a delayed diagnosis, Hartman arranged for Nagel to give a talk on the virus at the local hospital where staff missed the signs of the illness, known as Ramsay Hunt syndrome. That’s the name for a shingles infection that strikes the facial nerve important to facial movement. As Hartman experienced, varicella zoster virus can cause a grab bag of symptoms that go beyond the typical torso rash. Hartman’s young age didn’t help with the diagnosis. Shingles is more common in people 50 and older. But no one is risk-free. Varicella zoster virus lives in about 95 percent of the U.S. adult population, thanks to the virus’s first line of attack: chicken pox. The body eventually clears the itchy, red pox from the skin, but the virus remains, dormant in nerve cells. The rash kept scores of U.S. children home from school until about 1995 (when a vaccine became available). © Society for Science & the Public 2000–2019.

Keyword: Pain & Touch
Link ID: 26773 - Posted: 10.31.2019

By Nicholas Bakalar A healthy diet may help relieve the symptoms of depression. There is good evidence from observational studies that diet can affect mood, and now a randomized controlled trial suggests that healthy eating can modestly improve clinical levels of depression. The study, in PLOS One, randomized 76 college students with poor diet and depression symptoms to two groups. One group was put on a Mediterranean-style diet high in fruits, vegetables, fish, olive oil, nuts and seeds, and low in refined carbohydrates, sugar and saturated fat. The other continued their usual eating habits. At the beginning and end of the three-week trial, all participants were assessed with well-validated scales measuring depression, anxiety, current mood, memory and self-efficacy (confidence in one’s ability to exert control over behavior). Symptoms of depression improved, on average, in the diet group, shifting from the moderate severity range to the normal range. Depressive symptoms among the controls, meanwhile, remained stable, staying within the moderate severity range. On tests of anxiety and stress, the diet group had significantly lower scores than the controls, after controlling for levels of anxiety and stress at the start of the study. There were no differences between the two groups in memory or self-efficacy scores. The study controlled for smoking, physical activity, B.M.I. and other factors. © 2019 The New York Times Company

Keyword: Depression; Obesity
Link ID: 26772 - Posted: 10.31.2019

By Gretchen Reynolds Taking more steps during the day may be related to better sleep at night, according to an encouraging new study of lifestyle and sleep patterns. The study, which delved into the links between walking and snoozing, suggests that being active can influence how well we sleep, whether we actually exercise or not. Sleep and exercise scientists have long been intrigued and befuddled by the ties between physical activity and somnolence. To most of us, it might seem as if that relationship should be uncomplicated, advantageous and one-way. You work out, grow tired and sleep better that night. But a variety of past studies indicate that the effects of exercise on sleep are more scrambled than that. In some studies, when people work out strenuously, they sleep relatively poorly, suggesting that intense exercise might disrupt slumber. Other experiments have found that the impacts of exertion and sleep work both ways; after a night of ragged sleep, people often report finding their normal workout extra wearing. Past research also has produced conflicting results about whether and how the timing of exercise matters, and if afternoon workouts aid or impair that night’s sleep. Most of these past studies have focused on planned exercise, though, not more incidental, everyday physical activity, and much of the research has involved people with clinical sleep problems, such as insomnia. Little has been known about whether simply moving around more during the day, absent formal exercise, might influence sleep, particularly in people who already tend to sleep fairly well. © 2019 The New York Times Company

Keyword: Sleep
Link ID: 26771 - Posted: 10.30.2019

By Robert Martone We humans have evolved a rich repertoire of communication, from gesture to sophisticated languages. All of these forms of communication link otherwise separate individuals in such a way that they can share and express their singular experiences and work together collaboratively. In a new study, technology replaces language as a means of communicating by directly linking the activity of human brains. Electrical activity from the brains of a pair of human subjects was transmitted to the brain of a third individual in the form of magnetic signals, which conveyed an instruction to perform a task in a particular manner. This study opens the door to extraordinary new means of human collaboration while, at the same time, blurring fundamental notions about individual identity and autonomy in disconcerting ways. Direct brain-to-brain communication has been a subject of intense interest for many years, driven by motives as diverse as futurist enthusiasm and military exigency. In his book Beyond Boundaries, one of the leaders in the field, Miguel Nicolelis, described the merging of human brain activity as the future of humanity, the next stage in our species’ evolution. (Nicolelis serves on Scientific American’s board of advisers.) He has already conducted a study in which he linked together the brains of several rats using complex implanted electrodes known as brain-to-brain interfaces. Nicolelis and his co-authors described this achievement as the first “organic computer” with living brains tethered together as if they were so many microprocessors. The animals in this network learned to synchronize the electrical activity of their nerve cells to the same extent as those in a single brain. The networked brains were tested for things such as their ability to discriminate between two different patterns of electrical stimuli, and they routinely outperformed individual animals. © 2019 Scientific American

Keyword: Robotics; Language
Link ID: 26770 - Posted: 10.30.2019

Sarah Boseley, Health editor The use of cannabis medicines to treat people with depression, anxiety, psychosis or other mental health issues cannot be justified because there is little evidence that they work or are safe, according to a major new study. A review of evidence from trials conducted over nearly 40 years, published in the journal Lancet Psychiatry, concludes that the risks outweigh the benefits. And yet, say the authors, they are being given to people with mental health problems in Australia, the US and Canada, and demand is likely to grow. Prof Louisa Degenhardt of the National Drug and Alcohol Research Centre at UNSW Sydney, Australia, lead author of the study, said the findings had important implications in countries where medical use was allowed. “There is a notable absence of high-quality evidence to properly assess the effectiveness and safety of medicinal cannabinoids compared with placebo, and until evidence from randomised controlled trials is available, clinical guidelines cannot be drawn up around their use in mental health disorders,” she said. “In countries where medicinal cannabinoids are already legal, doctors and patients must be aware of the limitations of existing evidence and the risks of cannabinoids. These must be weighed when considering use to treat symptoms of common mental health disorders. Those who decide to proceed should be carefully monitored for positive and negative mental health effects of using medicinal cannabinoids.” © 2019 Guardian News & Media Limited

Keyword: Drug Abuse; Schizophrenia
Link ID: 26769 - Posted: 10.30.2019

By Zeynep Tufekci More than a billion people around the world have smartphones, almost all of which come with some kind of navigation app such as Google or Apple Maps or Waze. This raises the age-old question we encounter with any technology: What skills are we losing? But also, crucially: What capabilities are we gaining? Talking with people who are good at finding their way around or adept at using paper maps, I often hear a lot of frustration with digital maps. North/south orientation gets messed up, and you can see only a small section at a time. And unlike with paper maps, one loses a lot of detail after zooming out. I can see all that and sympathize that it may be quite frustrating for the already skilled to be confined to a small phone screen. (Although map apps aren’t really meant to be replacements for paper maps, which appeal to our eyes, but are actually designed to be heard: “Turn left in 200 feet. Your destination will be on the right.”) But consider what digital navigation aids have meant for someone like me. Despite being a frequent traveler, I’m so terrible at finding my way that I still use Google Maps almost every day in the small town where I have lived for many years. What looks like an inferior product to some has been a significant expansion of my own capabilities. I’d even call it life-changing. Part of the problem is that reading paper maps requires a specific skill set. There is nothing natural about them. In many developed nations, including the U.S., one expects street names and house numbers to be meaningful referents, and instructions such as “go north for three blocks and then west” make sense to those familiar with these conventions. In Istanbul, in contrast, where I grew up, none of those hold true. For one thing, the locals rarely use street names. Why bother when a government or a military coup might change them—again. House and apartment numbers often aren’t sequential either because after buildings 1, 2 and 3 were built, someone squeezed in another house between 1 and 2, and now that’s 4. But then 5 will maybe get built after 3, and 6 will be between 2 and 3. Good luck with 1, 4, 2, 6, 5, and so on, sometimes into the hundreds, in jumbled order. Besides, the city is full of winding, ancient alleys that intersect with newer avenues at many angles. © 2019 Scientific American

Keyword: Attention; Learning & Memory
Link ID: 26768 - Posted: 10.30.2019

By Lisa Sanders, M.D. “Please find something wrong with me,” the 28-year-old woman pleaded. For nearly a year, she’d been looking for a reason for the strange symptoms that now dominated her life. Dr. Raphael Sung, a cardiologist specializing in finding and fixing abnormal heart rhythms at National Jewish Health hospital in Denver, was surprised by her reaction to the news that her heart was normal. Most patients are happy to get that report. For this patient, it seemed like just one more dead end. The patient’s symptoms started right after her baby was born 10 months earlier. Out of nowhere, her heart would start beating like crazy. At first, she assumed that these were anxiety attacks, triggered by the stress of bringing her premature daughter home. Her baby spent her first week of life in the newborn intensive care unit. When she was big enough to come home, she still weighed only four pounds, nine ounces. The new mother worried that without the doctors and nurses and equipment that had kept her alive, her tiny baby might die. But she didn’t. She seemed to thrive at home. Despite that, her mother’s heart continued to take off like a spooked horse several times a day. After a couple of weeks, her symptoms worsened. Sometimes her racing heart would set off terrible headaches, the worst she’d ever had. It was as if someone had thrust a sharp stick deep into her brain. The knife of pain quickly turned into a sense of pressure so intense it felt as if the back of her skull would blow off. Minutes later, she would feel the blood drain from her face; she’d be suddenly drenched in sweat. Her hands would curl into tight fists, and vomit would shoot out of her mouth like a geyser. Her husband joked (though only once) that she looked like the girl in “The Exorcist.” © 2019 The New York Times Company

Keyword: Hormones & Behavior
Link ID: 26767 - Posted: 10.30.2019

By Gary Stix Sigmund Freud never uttered the word neuroscience. Neither did Santiago Ramón y Cajal. It was biophysicist Francis Schmitt who grafted “neuro” with “science” in 1962 when he established the Neurosciences Research Program at MIT. The new moniker was intended to encompass a merging of relevant neuro disciplines, ranging as far afield as physiology, psychology, immunology, physics and chemistry. Brains and behaviors have been scrutinized for millennia. But as psychology blogger Vaughan Bell has pointed out, the 1960s marked a shift in perspective. Neuroscience was the formal name given by Schmitt. But the period represented the beginnings of a “neuroculture” that put brain science on a pedestal—even leading to the familiar meme proclaiming “my brain made me do it.” One example was rooted in pharmaceutical companies’ development of psychiatric drugs that resulted in their investing “millions both into divining the neurochemistry of experience and into massive marketing campaigns that linked brain functions to the psyche,” Bell notes. The field received an adrenaline boost precisely 50 years ago with the founding of the Society for Neuroscience, allowing Schmitt’s collaborative vision to be globally shared. SFN’s first annual meeting in 1971 drew 1,395 attendees to Washington, D.C. This year’s wrapped up on October 23, bringing more than 27,500 to Chicago—and the annual numbers have occasionally topped 30,000. SFN now boasts 37,000 members from more than 95 countries. © 2019 Scientific American

Keyword: Miscellaneous
Link ID: 26766 - Posted: 10.30.2019

By Perri Klass, M.D. Sleeping through the night is a hot topic in pediatrics, so it was no surprise that there was a standing-room-only crowd for a lecture on it at the national conference of the American Academy of Pediatrics in New Orleans over the weekend. The speaker, Dr. Adiaha I.A. Spinks-Franklin, a developmental behavioral pediatrician, did her training at Children’s Hospital, Boston, where her teachers included the pediatric sleep expert Dr. Richard Ferber, whose name has become a verb: “we Ferberized our baby.” But Dr. Spinks-Franklin, an associate professor of pediatrics at Baylor College of Medicine, wasn’t talking about the burning question of whether to let babies cry. In her presentation, “Strategies to Help Sleepless Teens,” she started by reviewing the factors that can contribute to inadequate sleep in adolescents: social media and electronic devices in the bedroom. Intensely caffeinated drinks. The pressures of heavily overloaded schedules, including academic demands, extracurricular activities, travel sports teams, jobs and social lives. The biology of adolescent sleep reflects a natural and normal delay in melatonin secretion that leads to a later sleep onset time, which unfortunately coincides with early high school start times, creating a high-stress setup. Pediatricians often see adolescents with insomnia, who have trouble falling asleep or staying asleep, waking up too early or finding sleep not restful or refreshing. Evaluating insomnia in an adolescent means looking at the predisposing factors, she said, including how that adolescent responds to stress, and possible genetic influences, and the precipitating factors — the specific triggers for insomnia — and finally, the perpetuating factors, which can keep the pattern going. All these adolescents should be screened for depression and anxiety, Dr. Spinks-Franklin said; both can affect sleep onset or sleep maintenance. And both are alarmingly common in adolescents. © 2019 The New York Times Company

Keyword: Sleep; Development of the Brain
Link ID: 26765 - Posted: 10.29.2019

Patti Neighmond If you often hit that midafternoon slump and feel drowsy at your desk, you're not alone. The number of working Americans who get less than seven hours of sleep a night is on the rise. And the people hardest hit when it comes to sleep deprivation are those we depend on the most for our health and safety: police and health care workers, along with those in the transportation field, such as truck drivers. In a recent study, researchers from Ball State University in Muncie, Ind., analyzed data from the National Health Interview Survey. They looked at self-reports of sleep duration among 150,000 adults working in different occupations from 2010 to 2018. Researchers found the prevalence of inadequate sleep, defined as seven hours or less, increased from 30.9% in 2010 to 35.6% in 2018. But it was worse for police officers and health care workers. Around half of respondents in these professions reported not getting seven hours a night. For many, the norm was six or even just five hours. The researchers didn't examine why sleep time is dwindling. But Jagdish Khubchandani — professor of health science at Ball State University who headed the study — speculates one of the biggest reasons has to do with stress, which is on the rise among Americans. © 2019 npr

Keyword: Sleep
Link ID: 26764 - Posted: 10.29.2019

By Anahad O’Connor In recent years, hospitals and medical centers across the country have stopped selling sugar-sweetened beverages in an effort to reduce obesity and diabetes. Now a new study carried out at the University of California, San Francisco, has documented the health impact of a soda sales ban on its employees. Ten months after a sales ban went into effect, U.C.S.F. workers who tended to drink a lot of sugary beverages had cut their daily intake by about half. By the end of the study period, the group had, on average, reduced their waist sizes and belly fat, though they did not see any changes in their body mass index. Those who cut back on sugary beverages also tended to see improvements in insulin resistance, a risk factor for Type 2 diabetes. The new research, published on Monday in JAMA Internal Medicine, is the first peer-reviewed study to examine whether a workplace sales ban on sugary drinks could lead to reduced consumption of the beverages and improve employee health. At least nine other University of California campuses have said they are going to adopt similar initiatives to reduce sugary beverage sales and promote water consumption. “This was an intervention that was easy to implement,” said Elissa Epel, an author of the study and director of the Aging, Metabolism, and Emotions Center at U.C.S.F. “It’s promising because it shows that an environmental change can help people over the long run, particularly those who are consuming large-amounts of sugary beverages, and possibly even lead to a reduction in their risk of cardiometabolic disease.” In recent years, the link between sugar and obesity has drawn increasing scientific attention. Health authorities say that Americans have gotten fatter because they are consuming too many calories of all kinds. But some experts have singled out the role of added sugar consumption, which increased more than 30 percent between 1977 and 2010. © 2019 The New York Times Company

Keyword: Obesity
Link ID: 26763 - Posted: 10.29.2019