Most Recent Links

Follow us on Facebook or subscribe to our mailing list to receive news updates.


Links 5541 - 5560 of 29538

LaVonne Moore has Alzheimer's disease, but her doctors hope her dementia symptoms could be kept in check by a new type of treatment. Electric wires implanted deep in her brain stimulate areas involved with decision-making and problem-solving. Unlike many long-term dementia patients, LaVonne, 85, can cook meals, dress herself and organise outings. But it remains unclear whether her deep brain stimulation (DBS) therapy is responsible for her independence. DBS is already helping hundreds of thousands of patients with Parkinson's disease to overcome symptoms of tremor, but its use in Alzheimer's is still very experimental. Only a small number of DBS studies have been done for Alzheimer's, and they have focused on stimulating brain regions governing memory, rather than judgement. But Dr Douglas Scharre and colleagues at the Ohio State University Wexner Medical Center believe their approach, which targets the decision-making frontal lobe of the brain, might help patients keep their independence for longer. LaVonne's brain pacemaker was implanted three and a half years ago. Since then, her husband, Tom, from Delaware, Ohio, says her dementia has worsened - but more slowly than he had expected. "LaVonne has had Alzheimer's disease longer than anybody I know, and that sounds negative, but it's really a positive thing because it shows that we're doing something right." Two other patients have had the same treatment as LaVonne, but only one of them appeared to benefit significantly, according to the Journal of Alzheimer's Disease. Experts say it is too early to say if the treatment will help counteract cognitive decline. © 2018 BBC

Keyword: Alzheimers
Link ID: 24589 - Posted: 01.30.2018

By NIRAJ CHOKSHI Years of scolding from health experts about a good night’s rest may be breaking through. Americans are finally getting more sleep — about 18 minutes more per weeknight compared with 2003. It may not sound like much, but researchers say it’s a positive sign. “If we only got more sleep, we would then see that we actually perform better and would probably be more creative and more productive during the day,” said Dr. Mathias Basner, an associate professor of sleep and chronobiology in psychiatry at the University of Pennsylvania and the lead author of the analysis of federal survey data, published this month in the journal Sleep. The incremental gains took place over 13 years. Dr. Basner and his colleague, Dr. David F. Dinges, found that Americans gained about 1.4 minutes of sleep per weeknight each year between 2003 and 2016. People also slept more on weekends, though the improvement was not as great — an extra 50 seconds of sleep per weekend night per year, a total gain of about 11 minutes. On average, Americans get more than eight hours of sleep on weeknights and more on weekends, according to the data. But sleep length varies widely. According to the Centers for Disease Control and Prevention, more than a third of adults get insufficient sleep, which it defines as less than seven hours. © 2018 The New York Times Company
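The totals reported above follow directly from the per-year rates over the study's 13-year window; a quick arithmetic check (rates taken from the article):

```python
# Verify the reported sleep-gain totals from the per-year rates (2003-2016).
years = 2016 - 2003                      # 13-year study window

weeknight_gain_min = 1.4 * years         # 1.4 minutes gained per weeknight per year
weekend_gain_min = 50 * years / 60       # 50 seconds per weekend night per year, in minutes

print(round(weeknight_gain_min, 1))      # → 18.2 (the "about 18 minutes" figure)
print(round(weekend_gain_min, 1))        # → 10.8 (the "about 11 minutes" figure)
```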

Keyword: Sleep
Link ID: 24588 - Posted: 01.30.2018

Nearly a third of older adults don’t get solid zzz’s, according to a University of Michigan poll of 1,065 people age 65 and older. To help them sleep, 36 percent report taking a prescription drug, over-the-counter aid or a dietary supplement such as melatonin. But research suggests the benefits are modest at best. A Consumer Reports Best Buy Drugs analysis found that people taking prescription sleep medications such as Ambien (zolpidem and generic) or Lunesta (eszopiclone and generic) fell asleep only eight to 20 minutes faster than people taking a placebo. Worse, prescription sedatives and some OTC sleep aids can be risky, especially for older adults, with side effects that can include dry mouth, confusion, dizziness, next-day drowsiness, and impaired balance and coordination. Taking sleep meds may also cause dependency, increase the risk of car accidents, and more than double the risk of falls and fractures — common reasons for hospitalization and death in older adults, according to Consumer Reports’ Choosing Wisely campaign. Because of these dangers, the American Geriatrics Society includes the potent prescription sleep drugs — Ambien, Lunesta and zaleplon (Sonata) — on its list of medications that adults age 65 and older should avoid. Compounding those dangers is the tendency for many to use the medications for longer than recommended. In a 2015 Consumer Reports survey of 4,023 U.S. adults, 41 percent of people who used OTC sleep aids reported taking them for a year or longer. Most of these drugs should be taken for just a few weeks or less. That’s because mounting evidence suggests that long-term use of certain sleep meds that contain diphenhydramine, found in products including Sominex, Tylenol PM and ZzzQuil; antihistamines such as Benadryl; and some cold and cough medicines may increase the chances of dementia. © 1996-2018 The Washington Post

Keyword: Sleep; Development of the Brain
Link ID: 24587 - Posted: 01.30.2018

By Jessica Wright Nearly 20 years ago, a new strain of mice debuted in a California laboratory. The mice were missing a gene called SCN2A that helps neurons transmit electrical currents. And the study announcing their genesis was the last word on the matter for many years. About a decade later, the mouse’s creator, Mauricio Montal, sacrificed the few animals left from the colony. He had sent some of his mice to other researchers, and some ended up with a team in Houston, Texas. But they, too, eventually stopped working with the strain. Perhaps the only one who continued to work with the mice was a postdoctoral researcher named Edward Glasscock, who brought the mice from Houston to Louisiana State University when he launched his own lab. After years of work, Glasscock found that a mutation in SCN2A can muffle the effect of another mutation that triggers sudden death in people with epilepsy. That turned out to be only the beginning of the mice’s comeback. In June 2016, Kevin Bender, an autism researcher, sent Glasscock an urgent request asking for the mice. Requests from two other autism researchers quickly followed. Soon the mice were populating labs in San Francisco, Baltimore, and France. “I was surprised that there would be such a rush to get them,” says Glasscock, assistant professor of cellular biology and anatomy at Louisiana State University. “Back when I first started working with the mice, it would never have been on my radar that they would have been an important gene for autism.” © 1986-2018 The Scientist

Keyword: Autism
Link ID: 24586 - Posted: 01.30.2018

By JANE E. BRODY The media love contrarian man-bites-dog stories that purport to debunk long-established beliefs and advice. Among the most popular on the health front are reports that saturated fats do not cause heart disease and that the vegetable oils we’ve been encouraged to use instead may actually promote it. But the best-established facts on dietary fats say otherwise. How well polyunsaturated vegetable oils hold up health-wise when matched against saturated fats like butter, beef fat, lard and even coconut oil depends on the quality, size and length of the studies and what foods are eaten when fewer saturated fats are consumed. So before you succumb to wishful thinking that you can eat well-marbled steaks, pork ribs and full-fat dairy products with abandon, you’d be wise to consider the findings of what is probably the most comprehensive, commercially untainted review of the dietary fat literature yet published. They are found in a 26-page advisory prepared for the American Heart Association and published last June by a team of experts led by Dr. Frank M. Sacks, professor of cardiovascular disease prevention at the Harvard T.H. Chan School of Public Health. The report helps to explain why the decades-long campaign to curb cardiovascular disease by steering the American diet away from animal fats has been less successful than it might have been and how it inadvertently promoted expanding waistlines and an epidemic of Type 2 diabetes. When people cut back on a particular nutrient, they usually replace it with something else to maintain their needed caloric input. Unfortunately, in too many cases, saturated fats — and fats in general — gave way to refined carbohydrates and sugars, the so-called SnackWell phenomenon that prompted fat-wary eaters to overindulge in high-calorie, low-nutrient foods. © 2018 The New York Times Company

Keyword: Obesity
Link ID: 24585 - Posted: 01.30.2018

Tim McDonnell Last April, Fredua Agyemang, a musician in Kumasi, Ghana, was performing onstage at a funeral, which in this country is often a festive affair with hundreds of guests. Suddenly, he began to feel dizzy, then lost consciousness and collapsed. When he woke up three days later, his bandmates broke the news: He had suffered a stroke. Immediately, he thought of another doctor visit eight years earlier, when, at the age of 34, he had been diagnosed with hypertension and prescribed medication to reduce his blood pressure. The medication had given him problems with erectile dysfunction, a common side effect, and he soon stopped taking it regularly. That decision seemed foolish, he recalls. He was having difficulty moving and speaking and knew that he wouldn't be back onstage anytime soon. "I still have weakness," he says, nine months later. "I'm not able to walk well, I can't use my left arm, I can't sing." Doctors found that Agyemang's stroke was hemorrhagic, meaning that a blood vessel in his brain burst from excessive pressure. In the U.S., this type of stroke is rare; nearly 90 percent of strokes in the U.S. are "ischemic," meaning they're caused by a clot or other blockage of a blood vessel in the brain. But according to a new study, the largest-ever of stroke patients in Africa, up to one-third of strokes in this area of the world are hemorrhagic. And while the survival rate for ischemic strokes is around 80 percent, for hemorrhagic strokes the odds of survival are only 50/50. Agyemang is lucky to be alive. © 2018 npr

Keyword: Stroke
Link ID: 24584 - Posted: 01.29.2018

By David Kohn Each year, millions of Americans suffer a traumatic brain injury. In 2013, about 2.8 million TBI-related emergency department visits, hospitalizations and deaths occurred in the United States, according to the Centers for Disease Control and Prevention. Most of these are what are called mild traumatic brain injuries, or mTBIs — head injuries that don’t cause a coma. People with an mTBI typically get better within a few weeks, but for as many as 20 percent, problems can linger for months or years. Many of these patients find themselves stuck with depression, cognitive problems, headaches, fatigue and other symptoms. Known as post-concussion syndrome, this phenomenon is often difficult to treat. Antidepressants can lift moods, painkillers can ease headaches and physical therapy may ease dizziness, but most researchers agree that these remedies don’t heal the injury within the brain. Could oxygen do the trick? A growing group of scientists and physicians say that hyperbaric treatment, which exposes patients to pure oxygen at higher-than-normal air pressure, may work. “These patients don’t have enough oxygen to heal the injured parts of their brains,” said Shai Efrati, a researcher and physician at Tel Aviv University in Israel and a leading hyperbaric scientist. “Hyperbaric treatment massively increases the amount of oxygen available to the brain.” But other researchers believe that the treatment has no merit and should not be recommended. © 1996-2018 The Washington Post

Keyword: Brain Injury/Concussion; Stroke
Link ID: 24583 - Posted: 01.29.2018

Sarah Varney Two-year-old Maverick Hawkins sits on a red plastic car in his grandmother's living room in the picturesque town of Nevada City, Calif., in the foothills of the Sierra Nevada mountains. His playmate Delilah Smith, a fellow 2-year-old, snacks on hummus and cashews and delights over the sounds of her Princess Peppa stuffie. It's playtime for the kids of the provocatively named Facebook group "Pot Smoking Moms Who Cuss Sometimes." Maverick's mother, Jenna Sauter, started the group after he was born. "I was a new mom, a young mom — I was 22 — and I was just feeling really lonely in the house, taking care of him," she says. She wanted to reach out to other mothers but didn't want to hide her marijuana use. "I wanted friends who I could be open with," Sauter says — "like I enjoy going to the river and I like to maybe smoke a joint at the river." There are nearly 2,600 members now in the Facebook group. Marijuana, which became legal for recreational use in California earlier this month, is seen by many group members as an all-natural and seemingly harmless remedy for everything from morning sickness to post-partum depression. Delilah Smith's mom Andria is 21 and a week away from her due date with her second child. She took umbrage when an emergency room physician recently suggested she take "half a Norco" — a pill akin to Vicodin, an opioid-based painkiller — for her excruciating back pain. © 2018 npr

Keyword: Development of the Brain; Drug Abuse
Link ID: 24582 - Posted: 01.29.2018

Emily Willingham How many times did I say it – to myself, out loud alone or out loud to others, throughout my childhood? ‘I wish I were a boy.’ The words were mine, a fervent and frequent wish. They were not born of a feeling of mismatch between external expectations and internal signals. Except for a lifelong tension with society’s mixed messages about what it means to be a woman, I’m comfortable identifying as the gender assigned to me. But I wished for boyness because the boys did so many things I wanted to do and was excluded from doing because I was a girl. My body and my brain mapped to each other just fine, but my body didn’t map at all to what society told these boys – and me – I was allowed to do. As many a woman can attest, this feeling of belonging in male spaces that lock you out doesn’t end with teenhood, adulthood, careerhood or parenthood. An aficionado of adventure stories, I couldn’t – still can’t – help but notice that the places men can go are often No Women’s Lands for someone like me. Not because I lack the physicality, strength or stamina to traverse them but because the mere presentation of being female is itself dangerous. Realistically, it invites violence, exclusion and violation in too many ways to be considered anything but a liability. And then there are the less wild places, just boys’ clubs and men’s clubs, de facto or tacit, where being a girl or woman means being viewed as an intruder or, as women have always known, being subject to harassment or worse. Every day, I see men circle their masculinity like musk oxen, protective and exclusionary, in my professions of academia and journalism. Even in the virtual world of social media, they reflexively exclude women who are their peers in expertise and competence while readily engaging men who are neither. I am wryly amused when people committed to the idea that men and women are cognitively different throw women the double-edged bone of being ‘better at verbal expression’. 
(Look, we’re good at a thing! That you’ll also use to make fun of us chatty, chatty Cathies!) I read that and think of who receives most of the major book awards and other writing accolades. Hint: it’s men. I’ll wager that the social factors involved in the latter contribute to the assumptions underlying the former. © Aeon Media Group Ltd. 2012-2018.

Keyword: Sexual Behavior
Link ID: 24581 - Posted: 01.29.2018

By Jennifer Hassan The first time sleep paralysis struck me was in the winter of 2012. My grandfather had recently died, and I was spending time at my grandmother’s house. After 60 years of marriage, she wasn’t used to being alone or to the sadness an empty home can bring. Determined to help her in any way I could, I moved into her spare bedroom. As night came, I tucked her into bed and turned out the light — a task she had done for me on countless occasions growing up. The role reversal saddened me but also gave me an overwhelming urge to protect one of the most important women in my life. I lay down in the next bedroom and listened to her muffled sobs. I woke up a few hours later, feeling cold. As I went to pull the blankets up around me, I realized I couldn’t move. I began to panic. What was happening to me? Why was my body paralyzed? I tried to lift my arms: Nothing. My head was cemented to the pillow, my body embedded, frozen. Then the pressure came, pushing against my chest. The more I panicked, the harder it became to breathe. Like something out of a bad horror movie, I tried to scream, but no words came out. Unable to move my eyes, I had no option but to stare upward into the darkness. I couldn’t see anyone else, but for some reason it felt as if I had company. There was a hidden presence and it was tormenting me, refusing to let me go. After what felt like hours but was probably just a few minutes, I was able to move again. Shaking, I switched the bedroom light on and sat upright in bed until morning came. © 1996-2018 The Washington Post

Keyword: Sleep
Link ID: 24580 - Posted: 01.29.2018

Sara Reardon Superconducting computing chips modelled after neurons can process information faster and more efficiently than the human brain. That achievement, described in Science Advances on 26 January, is a key benchmark in the development of advanced computing devices designed to mimic biological systems. And it could open the door to more natural machine-learning software, although many hurdles remain before it could be used commercially. Artificial intelligence software has increasingly begun to imitate the brain. Algorithms such as Google’s automatic image-classification and language-learning programs use networks of artificial neurons to perform complex tasks. But because conventional computer hardware was not designed to run brain-like algorithms, these machine-learning tasks require orders of magnitude more computing power than the human brain does. “There must be a better way to do this, because nature has figured out a better way to do this,” says Michael Schneider, a physicist at the US National Institute of Standards and Technology (NIST) in Boulder, Colorado, and a co-author of the study. NIST is one of a handful of groups trying to develop ‘neuromorphic’ hardware that mimics the human brain in the hope that it will run brain-like software more efficiently. In conventional electronic systems, transistors process information at regular intervals and in precise amounts — either 1 or 0 bits. But neuromorphic devices can accumulate small amounts of information from multiple sources, alter it to produce a different type of signal and fire a burst of electricity only when needed — just as biological neurons do. As a result, neuromorphic devices require less energy to run. © 2018 Macmillan Publishers Limited
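The accumulate-then-fire behaviour described in the article is essentially the classic leaky integrate-and-fire neuron model. A minimal software sketch of that idea, not of the NIST hardware itself (the threshold, leak, and input values here are illustrative assumptions):

```python
def integrate_and_fire(inputs, threshold=1.0, leak=0.9):
    """Accumulate weighted inputs and emit a spike (1) only when the
    running potential crosses the threshold, then reset - the
    behaviour neuromorphic devices mimic from biological neurons."""
    potential = 0.0
    spikes = []
    for x in inputs:
        potential = potential * leak + x   # leaky accumulation of small signals
        if potential >= threshold:
            spikes.append(1)               # fire a burst only when needed
            potential = 0.0                # reset after firing
        else:
            spikes.append(0)
    return spikes

# Individually sub-threshold inputs accumulate until the neuron fires once.
print(integrate_and_fire([0.4, 0.4, 0.4, 0.1, 0.9]))  # → [0, 0, 1, 0, 0]
```

Unlike a transistor clocked at regular intervals, the output here depends on the recent history of many small inputs, which is what makes the event-driven style more energy-efficient in hardware.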

Keyword: Robotics; Learning & Memory
Link ID: 24579 - Posted: 01.27.2018

Carl Zimmer For centuries, people have drawn the line between nature and nurture. In the nineteenth century, the English polymath Francis Galton cast nature-versus-nurture in scientific terms. He envisioned a battle between heredity and experience that shapes each of us. “When nature and nurture compete for supremacy…the former proves the stronger,” Galton wrote in 1874. Today, scientists can do something Galton couldn’t imagine: they can track the genes we inherit from our parents. They are gaining clues to how that genetic legacy influences many aspects of our experience, from our risk of developing cancer to our tendency to take up smoking. But determining exactly how any particular variation in DNA shapes the course of our life is proving far trickier than Galton would have guessed. There is no clean line between nature and nurture: how a particular variant acts, if at all, may depend on your environment. A study published on Thursday offers a striking new demonstration of this complexity. Genes may help determine how long children stay in school, the researchers found, but some of those genes operate at a distance — by influencing parents. The study was published in Science. The authors go on to coin a new phrase for this effect: “genetic nurture.” To scientists accustomed to tracing the links between the genes you carry and the traits they govern, it’s a head-spinning idea. A genetic variant may shape you not because it directly influences you, but because it changes those around you, noted Paige Harden, a psychologist at the University of Texas who co-authored a commentary on the new study: “Something is happening outside your own skin.” © 2018 The New York Times Company

Keyword: Development of the Brain; Genes & Behavior
Link ID: 24578 - Posted: 01.27.2018

By Shawna Williams When the late organic chemist John Daly was on the hunt for poisonous frogs, he employed an unadvisable method: “It involved touching the frog, then sampling it on the tongue. If you got a burning sensation, then you knew this was a frog you ought to collect,” he once told a National Institutes of Health (NIH) newsletter writer. Daly survived to gather frogs from South America, Madagascar, Australia, and Thailand, and he extracted more than 500 compounds from their skin (many of which the frogs in turn had harvested from their insect diets). One of these compounds, the toxin epibatidine, turned out to have an analgesic effect 200 times more potent than morphine in rodents, Daly and his colleagues reported in 1992 (J Am Chem Soc, 114:3475-78, 1992); and rather than working through opioid receptors, epibatidine bound to nicotinic receptors. “To have a drug that works as well [as opioids] but is actually targeting a completely independent receptor system is really one of those holy grails of the drug industry,” says Daniel McGehee, who studies nicotinic receptors at the University of Chicago. But an epibatidine-related compound tested by Abbott Labs as an analgesic in the late 2000s caused uncontrollable vomiting, McGehee says. Although research on nicotinic receptors continues, he’s not aware of any epibatidine analogs currently in the drug development pipeline. But frogs may yet hold clues to killing pain. At least one frog does deploy an opioid: the waxy monkey tree frog (Phyllomedusa sauvagii), whose skin is laced with the peptide dermorphin. Although the compound does not appear to be a toxin that wards off predators, dermorphin has about 40 times the potency of morphine in a guinea-pig ileum assay, but it doesn’t effectively cross the blood-brain barrier, says pharmacologist Tony Yaksh of the University of California, San Diego. Dermorphin also boasts an unusual chemical property: the inclusion of a D-amino acid in its sequence. 
Almost all amino acids found in natural compounds are L-isomers, and dermorphin’s stereochemistry makes it resistant to metabolism and “certainly renders it more potent,” Yaksh writes in an email to The Scientist. © 1986-2018 The Scientist

Keyword: Pain & Touch; Drug Abuse
Link ID: 24577 - Posted: 01.27.2018

By Eli Meixler Friday’s Google Doodle celebrates the birthday of Wilder Penfield, a scientist and physician whose groundbreaking contributions to neuroscience earned him the designation “the greatest living Canadian.” Penfield would have turned 127 today. Later celebrated as a pioneering researcher and a humane clinical practitioner, Penfield pursued medicine at Princeton University, believing it to be “the best way to make the world a better place in which to live.” He was drawn to the field of brain surgery, studying neuropathology as a Rhodes scholar at Oxford University. In 1928, Penfield was recruited by McGill University in Montreal, where he also practiced at Royal Victoria Hospital as the city’s first neurosurgeon. Penfield founded the Montreal Neurological Institute with support from the Rockefeller Foundation in 1934, the same year he became a Canadian citizen. Penfield pioneered a treatment for epilepsy that allowed patients to remain fully conscious while a surgeon used electric probes to pinpoint areas of the brain responsible for setting off seizures. The experimental method became known as the Montreal Procedure and was widely adopted. But Wilder Penfield’s research led him to another discovery: that physical areas of the brain were associated with different duties, such as speech or movement, and stimulating them could generate specific reactions — including, famously, conjuring a memory of the smell of burnt toast. Friday’s animated Google Doodle features an illustrated brain and burning toast. © 2017 Time Inc.

Keyword: Miscellaneous
Link ID: 24576 - Posted: 01.27.2018

By Carly Ledbetter Two years after “the dress” divided people over its color, the internet is back with another puzzling wardrobe question. What color are these shoes? Some people think these Vans sneakers look gray and mint (or teal), while others see pink and white. For some, the color changes the more they stare at the shoes, while others are dead-set on the color they see. Twitter user @dolansmalik explained one theory about why the shoes appear different colors to different people: “THE REAL SHOE IS PINK & WHITE OKAY?!” she wrote on Twitter. “The second pic was with flash & darkened, so it looks teal & gray. (depends on what lighting ur in).” Bevil Conway is an investigator with the National Eye Institute who helped contribute to a study on the differences in color perception for the famous “dress” controversy two years ago. He told HuffPost how and why our eyes play tricks on us in situations like “the dress” and the shoes above. “This is related to the famous dress insofar as both are related to issues of color constancy,” he explained. “Basically your visual system is constantly trying to color correct the images projected on the retina, to remove the color contamination introduced by the spectral bias in the light source.” Conway explained just how and why some people see turquoise in the shoes, while others see pink. “In that manipulated photograph there is a lot of the turquoise cast over the whole image. When you first look at it, after having looked at the pink version, your visual system is still adapted to the lighting conditions of the pink version and so you see the turquoise in the other version, and you attribute this to the shoe itself,” he said. “But after a while, your visual system adapts to the turquoise across the whole of that image and interprets it as part of the light source, eventually discounting it and restoring the shoe to the original pink version (or at least pinker).” ©2018 Oath Inc.
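The "discounting the illuminant" that Conway describes is what automatic white balance does computationally: estimate the colour of the light source and divide it out of each channel, as in a von Kries-style correction. A toy sketch (the pixel and illuminant values below are hypothetical, chosen only to illustrate the idea):

```python
def discount_illuminant(pixel, illuminant):
    """Von Kries-style correction: divide each RGB channel by the
    estimated light-source colour, so a neutral light (1, 1, 1)
    leaves the pixel unchanged and a tinted light is 'removed'."""
    return tuple(round(p / i, 2) for p, i in zip(pixel, illuminant))

# A pinkish shoe pixel captured under a turquoise-biased light looks
# grey-green until the estimated cast is divided out.
observed = (0.45, 0.54, 0.54)        # as captured (hypothetical values)
turquoise_light = (0.5, 0.9, 0.9)    # estimated illuminant (hypothetical)
print(discount_illuminant(observed, turquoise_light))  # → (0.9, 0.6, 0.6): pinker
```

The ambiguity in the photo comes from the illuminant estimate: assume a neutral light and the shoe reads teal and grey; assume a turquoise light and, after the division, it reads pink and white.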

Keyword: Vision
Link ID: 24575 - Posted: 01.26.2018

By Katarina Zimmer Cellular senescence, the process by which cells cease to divide in response to stress, may be a double-edged sword. In addition to being an important anti-cancer mechanism, recent studies show it may also contribute to age-related tissue damage and inflammation. A study published in Cell Reports yesterday (January 23) suggests that cellular senescence could be a factor underlying neurodegeneration in sporadic forms of Parkinson’s disease. “I think the proposition that cellular senescence drives neurodegeneration in Parkinson’s disease and other ageing-related neurodegenerative diseases . . . has a great deal of merit,” writes D. James Surmeier, a physiologist at Northwestern University, to The Scientist in an email. “To my knowledge, [this study] is the first strong piece of evidence for this model.” Cellular senescence may be the basis by which the herbicide and neurotoxin paraquat, which has been previously linked to Parkinson’s disease, can contribute to the disease, the researchers propose. The vast majority of Parkinson’s disease cases are sporadic, rather than inherited, and caused by a combination of environmental and genetic factors. Julie Andersen, a neuroscientist at the Buck Institute for Research on Aging, says her laboratory decided to focus on paraquat based on epidemiological evidence linking it to the condition in humans and on lab work showing that mice treated with the chemical suffer a loss of dopaminergic neurons in the same region that is affected in humans. It is an acutely toxic chemical—capable of causing death—and was banned in the E.U. in 2007 over safety concerns, but is still used extensively by American farmworkers. © 1986-2018 The Scientist

Keyword: Parkinsons; Glia
Link ID: 24574 - Posted: 01.26.2018

by Ariana Eunjung Cha A new class of epilepsy medications based on an ingredient derived from marijuana could be available as soon as the second half of 2018 in the United States, pending Food and Drug Administration approval. Officials from GW Pharmaceuticals, the company that developed the drug, on Wednesday announced promising results from a study on 171 patients randomized into treatment and placebo groups. Members of the group, ages 2 to 55, have a condition called Lennox-Gastaut syndrome and were suffering from seizures that were not being controlled by existing drugs. On average they had tried and discontinued six anti-seizure treatments and were experiencing 74 “drop” seizures per month. Drop seizures involve the entire body, trunk or head and often result in a fall or other type of injury. The results, published in the Lancet, show that over a 14-week treatment period, 44 percent of patients taking the drug, called Epidiolex, saw a significant reduction in seizures, compared with 22 percent of the placebo group. Moreover, more of the patients who got the drug experienced a 50 percent or greater reduction in drop seizures. Elizabeth Thiele, director of pediatric epilepsy at Massachusetts General Hospital and lead author of the study, said the results varied depending on the patient. “For some, it does not do a whole lot. But for the people it does work in, it is priceless,” she said. “One child who comes to mind had multiple seizures a day. She had been on every medication possible,” said Thiele, a professor of neurology at Harvard Medical School. Then the patient tried the cannabis-based treatment and has been seizure-free for almost four years. “She is now talking about college options. She would have never had that conversation before. It has been life-changing.” © 1996-2018 The Washington Post

Keyword: Epilepsy; Drug Abuse
Link ID: 24573 - Posted: 01.26.2018

Emmarie Huetteman Dr. Andrey Ostrovsky's family did not discuss what killed his uncle in 2015. The man was young, not quite two weeks past his 45th birthday, when he died, and had lost touch with loved ones in his final months. At the time, Ostrovsky wondered if his uncle had perhaps killed himself. Almost two years later, Ostrovsky was Medicaid's chief medical officer, grappling professionally with an opioid crisis that kills about 115 Americans each day, when he learned the truth: His uncle had died of a drug overdose. Family members knew the uncle's life had been turbulent for a while before his death; they'd watched as he divorced his wife and became estranged from his 4-year-old daughter and eventually lost his job as a furniture store manager. But Ostrovsky wanted to better understand what had happened to the man — his stepfather's younger brother. So last fall, when he found himself in southeastern Florida, where his uncle had died, Ostrovsky contacted one of the uncle's friends for what he expected would be a quick cup of coffee. Instead the friend "let loose," revealing that he and Ostrovsky's uncle had been experimenting with a variety of drugs the night of the death. It was the tragic culmination of more than a decade of substance abuse — a pattern of behavior much of the family knew nothing about. An autopsy showed there were opiates and cocaine in his uncle's system, Ostrovsky later learned. © 2018 npr

Keyword: Drug Abuse
Link ID: 24572 - Posted: 01.26.2018

By Shawna Williams When the Voyager I spacecraft left Earth in 1977, it carried with it a “Golden Record” containing audio recordings of messages meant for any intelligent life that might cross its path. It bore sounds from around the world, including greetings in 55 languages, Chuck Berry’s “Johnny B. Goode,” and a fussy baby being soothed by its mother. According to Marc Bornstein, a developmental psychologist at the Eunice Kennedy Shriver National Institute of Child Health and Human Development, Carl Sagan and other members of the committee who decided what to include on the record were spot on in picking the latter track. “Infant cry is . . . the very first communication between an infant and a caregiver,” Bornstein says. Crying is infants’ best tool for ensuring they get the care they need, but Bornstein and his research collaborators wondered about the caregivers’ responses: to what extent were those innate versus learned? To investigate, they enrolled 684 new mothers and their babies from 11 countries around the world and put cameras in their homes. Each time a baby began crying, the researchers recorded what the mother did in the next five seconds. Did she pick the baby up? Kiss or stroke it? Talk to it? Try to distract it with a toy? “Within five seconds, the predominant kinds of responses are picking up and holding and talking to the baby,” says Bornstein (PNAS, 114:E9465-73, 2017). The degree of uniformity surprised him. “People in Kenya and Cameroon . . . the mothers are growing up and have been reared in wildly different circumstances than mothers in Brazil and Argentina or the United States, or certainly than Japan or South Korea.” © 1986-2018 The Scientist

Keyword: Sexual Behavior; Language
Link ID: 24571 - Posted: 01.26.2018

Ewen Callaway The oldest human fossils ever found outside Africa suggest that Homo sapiens might have spread to the Arabian Peninsula around 180,000 years ago — much earlier than previously thought. The upper jaw and teeth, found in an Israeli cave and reported in Science on 25 January, pre-date other human fossils from the same region by at least 50,000 years. But scientists say that it is unclear whether the fossils represent a brief incursion or a more-lasting expansion of the species. Researchers originally thought that H. sapiens emerged in East Africa 200,000 years ago then moved out to populate the rest of the world. Until discoveries in the past decade countered that story, scientists thought that a small group left Africa some 60,000 years ago and that signs of earlier travels, including 80,000–120,000 year-old skulls and other remains from Israel discovered in the 1920s and 1930s, were from failed migrations. However, recent discoveries have muddied that simple narrative. Some H. sapiens-like fossils from Morocco that are older than 300,000 years, reported last year, have raised the possibility that humans evolved earlier and perhaps elsewhere in Africa. Teeth from southern China, described in 2015, hint at long-distance migrations some 120,000 years ago. And genome studies have sown more confusion, with some comparisons of global populations pointing to just one human migration from Africa, and others suggesting multiple waves. © 2018 Macmillan Publishers Limited

Keyword: Evolution
Link ID: 24570 - Posted: 01.26.2018