Most Recent Links




Katarina Zimmer As early as the 1990s, researchers proposed that a very common type of herpes virus—then known as human herpesvirus 6 (HHV6)—could be somehow involved in the development of multiple sclerosis, a neurodegenerative disease characterized by autoimmune reactions against the protective myelin coating of the central nervous system. However, the association between HHV6 and the disease soon became fraught with controversy as further studies produced discordant results. Complicating matters further, HHV6 turned out to comprise two related but distinct variants—HHV6A and HHV6B. Because the two viruses are so similar, for a while no method existed to tell whether a patient had been infected with one, the other, or both—making it difficult to draw a definitive association between either virus and the disease. Now, a collaboration of European researchers has developed a technique capable of distinguishing antibodies against one variant from the other. Using that method in a Swedish cohort of more than 8,700 multiple sclerosis patients and more than 7,200 controls, they found that patients were much more likely than healthy people to carry high levels of anti-HHV6A antibodies, and likelier to carry fewer antibodies against HHV6B. The findings, published last November in Frontiers in Immunology, hint that previous contradictory results may be at least partially explained by the fact that researchers couldn’t distinguish between the two viruses. “This article now makes a pretty convincing case that it is HHV6A that correlates with multiple sclerosis, and not HHV6B,” remarks Margot Mayer-Pröschel, a neuroscientist at the University of Rochester Medical Center who wasn’t involved in the study. “Researchers can now focus on one of these viruses rather than looking at [both] of them together.” © 1986–2020 The Scientist.

Keyword: Multiple Sclerosis; Neuroimmunology
Link ID: 26956 - Posted: 01.14.2020

By Matt Richtel The number of women drinking dangerous amounts of alcohol is rising sharply in the United States. That finding was among several troubling conclusions in an analysis of death certificates published Friday by the National Institute on Alcohol Abuse and Alcoholism. The analysis looked at deaths nationwide each year from 1999 through 2017 that were reported as being caused at least partly by alcohol, including acute overdose, its chronic use, or in combination with other drugs. The death rate tied to alcohol rose 51 percent overall in that time period, taking into account population growth. Most noteworthy to researchers was that the rate of deaths among women rose much more sharply, up 85 percent. In sheer numbers, 18,072 women died from alcohol in 2017, according to death certificates, compared with 7,662 in 1999. “More women are drinking and they are drinking more,” said Patricia Powell, deputy director of the alcohol institute, which is a division of the National Institutes of Health. Still, far more men than women die from alcohol-related illnesses, the study showed. In 2017, alcohol played a role in the deaths of 72,558 men, compared to 35,914 in 1999, a 35 percent increase when population growth is factored in. Like much research of its kind, the findings do not alone offer the reasons behind the increase in alcohol deaths. In fact, the data is confounding in some respects, notably because teenage drinking overall has been dropping for years, a shift that researchers have heralded as a sign that alcohol has been successfully demonized as a serious health risk. Experts said that the new findings could partly reflect the fact that baby boomers are aging and the health effects of chronic alcohol use have become more apparent. The increase in deaths might also reflect the increase in opioid-related deaths, which in many cases can involve alcohol as well, and that would be reflected on death certificates. © 2020 The New York Times Company
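The phrase "taking into account population growth" means the comparison is between per-capita death rates, not raw counts. A minimal sketch of that adjustment (the population figures below are made up for illustration; the article does not give them):

```python
def pct_change_in_rate(deaths_then, pop_then, deaths_now, pop_now):
    """Percent change in the per-capita death rate between two years."""
    rate_then = deaths_then / pop_then
    rate_now = deaths_now / pop_now
    return (rate_now - rate_then) / rate_then * 100

# Raw deaths among women rose from 7,662 (1999) to 18,072 (2017),
# a ~136% increase in counts. With hypothetical population figures,
# the per-capita rate increase comes out smaller than the raw increase.
print(round(pct_change_in_rate(7662, 139e6, 18072, 166e6), 1))
```

This is why the article can report an 85 percent rise in the women's death rate even though the raw count of deaths more than doubled.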

Keyword: Drug Abuse; Sexual Behavior
Link ID: 26955 - Posted: 01.13.2020

There are differences in the way English and Italian speakers are affected by dementia-related language problems, a small study suggests. While English speakers had trouble pronouncing words, Italian speakers came out with shorter, simpler sentences. The findings could help ensure accurate diagnoses for people from different cultures, the researchers said. Diagnostic criteria are often based on English-speaking patients. In the University of California study of 20 English-speaking patients and 18 Italian-speaking patients, all had primary progressive aphasia - a neurodegenerative disease which affects areas of the brain linked to language. It is a feature of Alzheimer's disease and other dementia disorders. Brain scans and tests showed similar levels of cognitive function in people in both language groups. But when the researchers asked participants to complete a number of linguistic tests, they picked up obvious differences between the two groups in the challenges they faced. "We think this is specifically because the consonant clusters that are so common in English pose a challenge for a degenerating speech-planning system," said study author Maria Luisa Gorno-Tempini, professor of neurology and psychiatry. "In contrast, Italian is easier to pronounce, but has much more complex grammar, and this is how Italian speakers with [primary progressive aphasia] tend to run into trouble." As a result, the English speakers tended to speak less, while the Italian speakers had fewer pronunciation problems but simplified what they did say. English is a Germanic language, while Italian is a Romance language, derived from Latin along with French, Spanish and Portuguese. The researchers, writing in Neurology, are concerned that many non-native English speakers may not be getting the right diagnosis "because their symptoms don't match what is described in clinical manuals based on studies of native English speakers". 
The San Francisco research team says it now wants to repeat the research in larger groups of patients, and look for differences between speakers of other languages, such as Chinese and Arabic. © 2020 BBC

Keyword: Alzheimers; Language
Link ID: 26954 - Posted: 01.13.2020

By John Horgan Last month I participated in a symposium hosted by the Center for Theory & Research at Esalen, a retreat center in Big Sur, California. Fifteen men and women representing physics, psychology and other fields attempted to make sense of mystical and paranormal experiences, which are generally ignored by conventional, materialist science. The organizers invited me because of my criticism of hard-core materialism and interest in mysticism, but in a recent column I pushed back against ideas advanced at the meeting. Below other attendees push back against me. My fellow speaker Bjorn Ekeberg, whose response is below, took the photos of Esalen, including the one of me beside a stream (I'm the guy on the right). -- John Horgan Jeffrey Kripal, philosopher of religion at Rice University and author, most recently, of The Flip: Epiphanies of Mind and the Future of Knowledge (see our online chats here and here): Thank you, John, for reporting on your week with us all. As one of the moderators of “Physics, Experience and Metaphysics,” let me try to reply, briefly (and too simplistically), to your various points. First, let me begin with something that was left out of your generous summary: the key role of the imagination in so many exceptional or anomalous experiences. As you yourself pointed out with respect to your own psychedelic opening, this is no ordinary or banal “imagination.” This is a kind of “super-imagination” that projects fantastic visionary displays that none of us could possibly come up with in ordinary states: this is a flying caped Superman to our bespectacled Clark Kent. 
None of this, of course, implies that anything seen in these super-imagined states is literally true (like astral travel or ghosts) or non-human, but it does tell us something important about why the near-death or psychedelic experiencers commonly report that these visionary events are “more real” than ordinary reality (which is also, please note, partially imagined, if our contemporary neuroscience of perception is correct). Put in terms of a common metaphor that goes back to Plato, the fictional movies on the screen can ALL be different and, yes, of course, humanly and historically constructed, but the Light projecting them can be quite Real and the Same. Fiction and reality are in no way exclusive of one another in these paradoxical states. © 2020 Scientific American

Keyword: Consciousness
Link ID: 26953 - Posted: 01.13.2020

By Daniel J. Levitin I’m 62 years old as I write this. Like many of my friends, I forget names that I used to be able to conjure up effortlessly. When packing my suitcase for a trip, I walk to the hall closet and by the time I get there, I don’t remember what I came for. And yet my long-term memories are fully intact. I remember the names of my third-grade classmates, the first record album I bought, my wedding day. This is widely understood to be a classic problem of aging. But as a neuroscientist, I know that the problem is not necessarily age-related. Short-term memory contains the contents of your thoughts right now, including what you intend to do in the next few seconds. It’s doing some mental arithmetic, thinking about what you’ll say next in a conversation or walking to the hall closet with the intention of getting a pair of gloves. Short-term memory is easily disturbed or disrupted. It depends on your actively paying attention to the items that are in the “next thing to do” file in your mind. You do this by thinking about them, perhaps repeating them over and over again (“I’m going to the closet to get gloves”). But any distraction — a new thought, someone asking you a question, the telephone ringing — can disrupt short-term memory. Our ability to automatically restore the contents of the short-term memory declines slightly with every decade after 30. But age is not the major factor so commonly assumed. I’ve been teaching undergraduates for my entire career and I can attest that even 20-year-olds make short-term memory errors — loads of them. They walk into the wrong classroom; they show up to exams without the requisite No. 2 pencil; they forget something I just said two minutes before. These are similar to the kinds of things 70-year-olds do. © 2020 The New York Times Company

Keyword: Learning & Memory; Alzheimers
Link ID: 26952 - Posted: 01.13.2020

Lesley McClurg Americans know the dangers of drugs such as morphine and heroin. But what about a supplement that acts in the brain a bit like an opiate and is available in many places to kids — even from vending machines? Kratom, an herb that's abundant, legal in most states and potentially dangerous, is the subject of an ongoing debate over its risks and benefits. Usually, the leaf, which comes from a tropical Southeast Asian tree, is chewed, brewed or crushed into a bitter green powder. The chemicals in the herb interact with different types of receptors in the brain — some that respond to opioids, and others to stimulants. Often sold in the U.S. in a processed form — as pills, capsules or extracts — a small amount of kratom can perk you up, while a large dose has a sedative effect. Some people who have struggled with an opioid addiction and switched to kratom swear the substance salvaged their health, livelihood and relationships. But the federal Food and Drug Administration and the Drug Enforcement Administration worry that kratom carries the risk of physical and psychological dependency and, in some people, addiction. The FDA warns consumers not to use kratom, and the DEA threatened to prohibit kratom's sale and use in the U.S. (outside of research) in 2016; advocates and lawmakers subsequently pushed back, and the stricter scheduling of kratom that would have prompted that sort of ban never occurred. These days, the DEA lists it as a drug of concern. © 2020 npr

Keyword: Drug Abuse
Link ID: 26951 - Posted: 01.13.2020

By Benedict Carey Soldiers with deep wounds sometimes feel no pain at all for hours, while people without any detectable injury live in chronic physical anguish. How to explain that? Over drinks in a Boston-area bar, Ronald Melzack, a psychologist, and Dr. Patrick Wall, a physiologist, sketched out a diagram on a cocktail napkin that might help explain this and other puzzles of pain perception. The result, once their idea was fully formed, was an electrifying theory that would become the founding document for the field of modern pain studies and establish the career of Dr. Melzack, whose subsequent work deepened medicine’s understanding of pain and how it is best measured and treated. Dr. Melzack died on Dec. 22 in a hospital near his home in Montreal, where he lived, his daughter, Lauren Melzack, said. He was 90, and had spent most of his professional life as a professor of psychology at McGill University. When Dr. Melzack and Dr. Wall, then at the Massachusetts Institute of Technology, met that day in 1959 or 1960 (accounts of their encounter vary), pain perception was thought to work something like a voltmeter, in which nerves send signals up to the brain that reflect the severity of the injury. But that model failed to explain not only battlefield experience but also a host of clinical findings and everyday salves. Most notably, rubbing a wound lessens its sting — and accounting for just that common sensation proved central to the new theory. Doctors knew that massaging the skin activated so-called large nerve fibers, which are specialized to detect subtle variations of touch; and that deeper, small fibers sounded the alarm of tissue damage. The two researchers reasoned that all these sensations must pass through a “gate” in the spinal cord, which adds up their combined signals before sending a message to the brain. In effect, activating the large fibers blocks signals from the smaller ones, by closing the gate. © 2020 The New York Times Company
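The gate-control idea described above can be caricatured in a few lines: the pain signal that reaches the brain is the small-fiber input minus an inhibition driven by concurrent large-fiber (touch) activity. This is a toy illustration of the concept only, not a model of real spinal circuitry, and the inhibition weight is invented:

```python
def gate_output(small_fiber, large_fiber, inhibition=0.8):
    """Toy gate-control sketch: large-fiber (touch) activity 'closes the
    gate', attenuating the small-fiber (pain) signal sent to the brain."""
    return max(0.0, small_fiber - inhibition * large_fiber)

# An injury alone transmits a strong signal; rubbing the wound
# (adding large-fiber activity) reduces what gets through the gate.
print(gate_output(small_fiber=1.0, large_fiber=0.0))  # full signal
print(gate_output(small_fiber=1.0, large_fiber=0.5))  # attenuated signal
```

The `max(0, ...)` floor reflects the qualitative claim that enough touch input can block the pain message entirely.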

Keyword: Pain & Touch
Link ID: 26950 - Posted: 01.13.2020

By Kelly Servick The dark, thumping cavern of an MRI scanner can be a lonely place. How can scientists interested in the neural activity underlying social interactions capture an engaged, conversing brain while its owner is so isolated? Two research teams are advancing a curious solution: squeezing two people into one scanner. One such MRI setup is under development with new funding from the U.S. National Science Foundation (NSF), and another has undergone initial testing described in a preprint last month. These designs have yet to prove that their scientific payoff justifies their cost and complexity, plus the requirement that two people endure a constricted almost-hug, in some cases for 1 hour or more. But the two groups hope to open up new ways to study how brains exchange subtle social and emotional cues bound up in facial expressions, eye contact, and physical touch. The tool could “greatly expand the range of investigations possible,” says Winrich Freiwald, a neuroscientist at Rockefeller University. “This is really exciting.” Functional magnetic resonance imaging (fMRI), which measures blood oxygenation to estimate neural activity, is already a common tool for studying social processes. But compared with real social interaction, these experiments are “reduced and artificial,” says Lauri Nummenmaa, a neuroscientist at the University of Turku in Finland. Participants often look at static photos of faces or listen to recordings of speech while lying in a scanner. But photos can’t show the subtle flow of emotions across people’s faces, and recordings don’t allow the give and take of real conversation. © 2019 American Association for the Advancement of Science

Keyword: Brain imaging
Link ID: 26949 - Posted: 01.10.2020

Nell Greenfieldboyce Parrots can perform impressive feats of intelligence, and a new study suggests that some of these "feathered apes" may also practice acts of kindness. African grey parrots voluntarily helped a partner get a food reward by giving the other bird a valuable metal token that could be exchanged for a walnut, according to a newly published report in the journal Current Biology. "This was really surprising that they did this so spontaneously and so readily," says Désirée Brucks, a biologist at ETH Zürich in Switzerland who is interested in the evolution of altruism. Children as young as 1 seem highly motivated to help others, and scientists used to think this kind of prosocial behavior was uniquely human. More recent research has explored "helping" behavior in other species, everything from nonhuman primates to rats and bats. To see whether intelligent birds might help out a feathered pal, Brucks and Auguste von Bayern of the Max Planck Institute for Ornithology in Germany tested African grey parrots. They used parrots that had previously been trained to understand that specific tokens, in the form of small metal rings, could be traded for a food treat through an exchange window. In their experiment, this exchange window was covered up and closed on one bird's cage, making it impossible for that bird to trade. The bird had a pile of tokens in its cage but no way to use them. Meanwhile, its neighbor in an adjacent cage had an open exchange window — but no tokens for food. © 2020 npr

Keyword: Emotions; Evolution
Link ID: 26948 - Posted: 01.10.2020

Kristen S. Morrow Human beings used to be defined as “the tool-maker” species. But the uniqueness of this description was challenged in the 1960s when Dr. Jane Goodall discovered that chimpanzees will pick and modify grass stems to use to collect termites. Her observations called into question Homo sapiens’ very place in the world. Since then scientists’ knowledge of animal tool use has expanded exponentially. We now know that monkeys, crows, parrots, pigs and many other animals can use tools, and research on animal tool use has changed our understanding of how animals think and learn. Studying animal tooling – defined as the process of using an object to achieve a mechanical outcome on a target – can also provide clues to the mysteries of human evolution. Our human ancestors’ shift to making and using tools is linked to evolutionary changes in hand anatomy, a transition to walking on two rather than four feet and increased brain size. But using found stones as pounding tools doesn’t require any of these advanced evolutionary traits; it likely came about before humans began to manufacture tools. By studying this percussive tool use in monkeys, researchers like my colleagues and I can infer how early human ancestors practiced the same skills before modern hands, posture and brains evolved. Understanding wild animals’ memory, thinking and problem-solving abilities is no easy task. In experimental research where animals are asked to perform a behavior or solve a problem, there should be no distractions – like a predator popping up. But wild animals come and go as they please, over large spaces, and researchers cannot control what is happening around them. © 2010–2020, The Conversation US, Inc.

Keyword: Evolution
Link ID: 26947 - Posted: 01.10.2020

By Philippa Roxby Health reporter A sleep disorder that can leave people gasping for breath at night could be linked to the amount of fat on their tongues, a study suggests. When sleep apnoea patients lost weight, it was the reduction in tongue fat that lay behind the resulting improvements, researchers said. Larger and fattier tongues are more common among obese patients. But the Pennsylvania team said other people with fatty tongues may also be at risk of the sleep disorder. The researchers now plan to work out which low-fat diets are particularly good at slimming down the tongue. "You talk, eat and breathe with your tongue - so why is fat deposited there?" said study author Dr Richard Schwab, of Perelman School of Medicine, Philadelphia. "It's not clear why - it could be genetic or environmental - but the less fat there is, the less likely the tongue is to collapse during sleep." Sleep apnoea is a common disorder that can cause loud snoring, noisy breathing and jerky movements when asleep. It can also cause sleepiness during the day, which can affect quality of life. The most common type is obstructive sleep apnoea, in which the upper airway gets partly or completely blocked during sleep. Those who are overweight or who have a large neck or tonsils are more likely to have the condition. Researchers at the Perelman School of Medicine, University of Pennsylvania, scanned 67 obese people with obstructive sleep apnoea who had lost 10% of their body weight, which improved their symptoms by 30%. © 2020 BBC.

Keyword: Sleep; Obesity
Link ID: 26946 - Posted: 01.10.2020

By Veronique Greenwood The cuttlefish hovers in the aquarium, its fins rippling and large, limpid eyes glistening. When a scientist drops a shrimp in, this cousin of the squid and octopus pauses, aims and shoots its tentacles around the prize. There’s just one unusual detail: The diminutive cephalopod is wearing snazzy 3-D glasses. Putting 3-D glasses on a cuttlefish is not the simplest task ever performed in the service of science. “Some individuals will not wear them no matter how much I try,” said Trevor Wardill, a sensory neuroscientist at the University of Minnesota, who with other colleagues managed to gently lift the cephalopods from an aquarium, dab them between the eyes with a bit of glue and some Velcro and fit the creatures with blue-and-red specs. The whimsical eyewear was part of an attempt to tell whether cuttlefish see in 3-D, using the distance between their two eyes to generate depth perception like humans do. It was inspired by research in which praying mantises in 3-D glasses helped answer a similar question. The team’s results, published Wednesday in the journal Science Advances, suggest that, contrary to what scientists believed in the past, cuttlefish really can see in three dimensions. Octopuses and squid, despite being savvy hunters, don’t seem to have 3-D vision like ours. Previous work, more than 50 years ago, had found that one-eyed cuttlefish could still catch prey, suggesting they might be similar. But cuttlefish eyes often focus in concert when they’re hunting, and there is significant overlap in what each eye sees, a promising combination for generating 3-D vision. Dr. Wardill, Rachael Feord, a graduate student at the University of Cambridge, and the team decided to give the glasses a try during visits to the Marine Biological Lab in Woods Hole, Mass. 
The logic went like this: With each eye covered by a different colored lens, two different-colored versions of a scene, just slightly offset from each other, should pop out into a three-dimensional image. By playing a video on the tank wall of a scuttling pair of shrimp silhouettes, each a different color and separated from each other by varying amounts, the researchers could make a shrimp seem closer to the cuttlefish or farther away. If, that is, the cuttlefish experienced 3-D vision like ours. © 2020 The New York Times Company
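The offset-to-depth relationship the researchers exploited follows standard stereo geometry: apparent depth is inversely proportional to the horizontal offset (disparity) between the two eyes' images. A minimal pinhole-stereo sketch of that principle (illustrative only; this is not the authors' analysis code, and the numbers are arbitrary):

```python
def perceived_depth(baseline, focal_length, disparity):
    """Pinhole stereo model: depth = baseline * focal_length / disparity.
    Larger offsets between the two images yield nearer apparent depths."""
    if disparity <= 0:
        raise ValueError("disparity must be positive")
    return baseline * focal_length / disparity

# Doubling the separation between the red and blue shrimp silhouettes
# halves the apparent distance (units are arbitrary here).
near = perceived_depth(baseline=1.0, focal_length=2.0, disparity=0.4)
far = perceived_depth(baseline=1.0, focal_length=2.0, disparity=0.2)
print(near, far)  # near is half of far
```

By varying the on-screen separation, the experimenters could place the virtual shrimp at different apparent distances and check whether the cuttlefish adjusted its strike accordingly.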

Keyword: Vision; Evolution
Link ID: 26945 - Posted: 01.09.2020

Amber Dance The girl tried hard to hold her arms and hands steady, but her fingers wriggled and writhed. If she closed her eyes, the squirming got worse. It wasn’t that she lacked the strength to keep her limbs still — she just didn’t seem to have control over them. Carsten Bönnemann remembers examining the teenager at a hospital in Calgary, Canada, in 2013. As a paediatric neurologist with the US National Institute of Neurological Disorders and Stroke in Bethesda, Maryland, he often travelled to weigh in on puzzling cases. But he had never seen anything like this. If she wasn’t looking at her limbs, the girl didn’t seem to have any clue where they were. She lacked the sense of her body’s position in space, a crucial ability known as proprioception. “This is something that just doesn’t occur,” says Bönnemann. His team sequenced the girl’s genes, and those of another girl with similar symptoms [1], and found mutations in a gene called PIEZO2. Their timing was fortunate: just a few years earlier, researchers looking for the mechanisms that cells use to sense touch had found that the gene encoded a pressure-sensitive protein [2]. The discovery of Piezo2 and a related protein, Piezo1, was a high point in a decades-long search for the mechanisms that control the sense of touch. The Piezos are ion channels — gates in the cell membrane that allow ions to pass through — that are sensitive to tension. “We’ve learned a lot about how cells communicate, and it’s almost always been about chemical signalling,” says Ardem Patapoutian, a molecular neurobiologist at Scripps Research in La Jolla, California, whose group identified the Piezos. “What we’re realizing now is that mechanical sensation, this physical force, is also a signalling mechanism, and very little is known about it.” © 2020 Springer Nature Limited

Keyword: Pain & Touch
Link ID: 26944 - Posted: 01.09.2020

By Elizabeth Brico The statistics are heartbreaking. Each year in the U.S., about 32,000 newborns are diagnosed with neonatal abstinence syndrome, a form of withdrawal that can result from in utero exposure to a number of drugs taken by the mother during pregnancy. Opioids — both prescribed and illegal — are among the most common culprits. These medications can be necessary, even life-saving, but that doesn’t make the resultant NAS any easier to watch: Newborns who suffer from the syndrome may exhibit tremors, irritability, hyperactive reflexes, high-pitched crying, and other symptoms. But drugs are not solely to blame for the prolonged suffering many of these infants experience. The way NAS cases are handled also has a profound impact on their severity, and often leads to negative outcomes. Health care providers and law enforcement authorities have historically separated these fragile babies from their mothers, doling out severe punishments to the latter. Although there is a growing awareness that change is needed, many hospitals still use outdated approaches — and child welfare agencies are particularly behind the times in this arena. Recent studies suggest that policies that place blame on mothers only heighten a newborn’s suffering by preventing infants from accessing potent care for reducing withdrawal symptoms: contact with mom. Misperceptions about opioid addiction, dependency, and NAS are woven into the very fabric of U.S. and state law. In order to receive federal funding for child abuse prevention, health care workers are required to report substance-affected newborns to Child Protective Services. Additionally, states can require health care providers to report or test for drug exposure during pregnancy. In many cases, mothers are reported even if the exposure is the result of prescribed methadone or buprenorphine — opioid-based drugs commonly used to treat addiction.

Keyword: Drug Abuse; Development of the Brain
Link ID: 26943 - Posted: 01.09.2020

Emily Makowski Bruce McEwen, a neuroendocrinologist at Rockefeller University, died January 2 after a brief illness. He was 81 years old. McEwen is best known for his research on how stress hormones can reconfigure neural connections in the brain, according to a university statement. In 1968, McEwen and his colleagues discovered that the rat hippocampus is affected by the hormone cortisol, sparking further research into how hormones can enter the brain and affect mental functioning and mood. At the time, most scientists believed that the brain was not malleable after becoming fully developed, a line of thinking that McEwen’s research findings contradicted. In 1993, he coined the term allostatic load, which describes the physiological effects of chronic stress. With his wife, Karen Bulloch, a Rockefeller professor, he studied how immune cells in the brain increase during a person’s lifespan and can contribute to neurodegenerative disease. He also researched how sex hormones affect the central nervous system. Over the course of his career, which spanned six decades, McEwen received many accolades including the Pasarow Foundation award in neuropsychiatry, the Fondation Ipsen Neuronal Plasticity and Endocrine Regulation prizes, the Scolnick Prize in Neuroscience, and the William James Lifetime Achievement Award for Basic Research. He was a member of the National Academy of Sciences, the National Academy of Medicine, and the American Academy of Arts and Sciences. “Bruce was a giant in the field of neuroendocrinology,” McEwen’s colleague Leslie Vosshall, a neuroscientist at Rockefeller, says in the statement. “He was a world leader in studying the impact of stress hormones on the brain, and led by example to show that great scientists can also be humble, gentle, and generous human beings.” © 1986–2020 The Scientist

Keyword: Stress; Hormones & Behavior
Link ID: 26942 - Posted: 01.09.2020

By Rodrigo Pérez Ortega Nearly 2600 years ago, a man was beheaded near modern-day York, U.K.—for what reasons, we still don’t know—and his head was quickly buried in the clay-rich mud. When researchers found his skull in 2008, they were startled to find that his brain tissue, which normally rots rapidly after death, had survived for millennia—even maintaining features such as folds and grooves (above). Now, researchers think they know why. Using several molecular techniques to examine the remaining tissue, the researchers figured out that two structural proteins—which act as the “skeletons” of neurons and astrocytes—were more tightly packed in the ancient brain. In a yearlong experiment, they found that these aggregated proteins were also more stable than those in modern-day brains. In fact, the ancient protein clumps may have helped preserve the structure of the soft tissue for ages, the researchers report today in the Journal of the Royal Society Interface. Aggregated proteins are a hallmark of aging and brain diseases like Alzheimer’s. But the team didn’t find any protein clumps typical of those conditions in the ancient brain. The scientists still aren’t sure what made the proteins aggregate, but they suspect it could have something to do with the burial conditions, which appeared to take place as part of a ritual. In the meantime, the new findings could help researchers gather information from proteins of other ancient tissues from which DNA cannot be easily recovered. © 2019 American Association for the Advancement of Science

Keyword: Brain imaging; Glia
Link ID: 26941 - Posted: 01.09.2020

By Perri Klass, M.D. In December, the American Academy of Pediatrics put out a new clinical report on autism, an extensive document with an enormous list of references, summarizing 12 years of intense research and clinical activity. During this time, the diagnostic categories changed — Asperger’s syndrome and pervasive developmental disorder, diagnostic categories that once included many children, are no longer used, and we now consider all these children (and adults) to have autism spectrum disorder, or A.S.D. The salient diagnostic characteristics of A.S.D. are persistent problems with social communication, including problems with conversation, with nonverbal communication and social cues, and with relationships, together with restricted repetitive behavior patterns, including repetitive movements, rigid routines, fixated interests and sensory differences. Dr. Susan Hyman, the lead author on the new report, who is the division chief of developmental and behavioral pediatrics at Golisano Children’s Hospital at the University of Rochester, said in an email that much has changed over the past 12 years. She pointed in particular to increased medical awareness and understanding of conditions that often occur together with A.S.D., and to a greater emphasis on planning — together with families — how to support children as they grow. Dr. Susan E. Levy, a co-author of the statement who is a developmental behavioral pediatrician at Children’s Hospital of Philadelphia, said that one key message of the report is the emphasis on early identification and referral for treatment, even if a diagnosis of autism is suspected but not yet confirmed. The outcomes are better when treatment starts as early as possible, she said. The average age of diagnosis is now around 4 years, but the goal is to get it well under 2, she said. And children who are at higher risk — for example, those whose siblings have A.S.D. — should receive especially close screening and attention. 
© 2020 The New York Times Company

Keyword: Autism
Link ID: 26940 - Posted: 01.07.2020

By Brooke Siem The prescriptions began in the wake of my father’s sudden death when I was 15: Wellbutrin XL and Effexor XR for anxiety and depression, two separate doses of Synthroid to right a low-functioning thyroid, a morning and nighttime dose of tetracycline for acne, birth control to regulate the unpleasant side effects of womanhood, and four doses of Sucralfate to be taken at each meal and before bedtime — all given to me by the time I was old enough to vote.

My general practitioner asked what Sucralfate was after I’d finished rattling off my prescriptive party mix during our first appointment. I was 22 and a recent Manhattan transplant. I had an apartment in Murray Hill and a job waiting tables at a local Italian restaurant.

“It’s for something called bile reflux disease,” I said. “I used to randomly puke up bile all the time.”

“Huh. Never heard of it.” He ripped off a completed prescription slip and scribbled across the new blank page. “You should really get the prescription for antidepressants from a psychiatrist, but I’ll give it to you along with all the rest since you’ve been on it for so long. And whenever you come back, maybe we should do a physical.”

At the time, it never occurred to me that my medication needed monitoring or that perhaps my doctor should do a physical before sending me to the pharmacy. Not only was this five-minute exchange routine, but at no point during my years in the American mental health system did a psychiatrist, psychologist, doctor or pharmacist suggest that I consider reevaluating the decision to take antidepressants. Therefore, I believed that my only choices were to cope with depression or cope with antidepressants, and that depression would always thump inside me with the regularity of my own pulse.

Keyword: Depression; Drug Abuse
Link ID: 26939 - Posted: 01.07.2020

By Matthew Hutson When you are stuck on a problem, sometimes it is best to stop thinking about it—consciously, anyway. Research has shown that taking a break or a nap can help the brain create pathways to a solution. Now a new study expands on the effect of this so-called incubation by using sound cues to focus the sleeping mind on a targeted problem.

When humans sleep, parts of the brain replay certain memories, strengthening and transforming them. About a decade ago researchers developed a technique, called targeted memory reactivation (TMR), aimed at further reinforcing selected memories: when a sound becomes associated with a memory and is later played during sleep, that memory gets reactivated.

In a study published last November in Psychological Science, scientists tested whether revisiting the memory of a puzzle during sleep might also improve problem-solving. About 60 participants visited the laboratory before and after a night of sleep. In an evening session, they attempted spatial, verbal and conceptual puzzles, with a distinct music clip repeating in the background for each, until they had worked on six puzzles they could not solve. Overnight they wore electrodes to detect slow-wave sleep—slumber's deepest phase, which may be important for memory consolidation—and a device played the sounds assigned to three of the six unsolved puzzles.

The next day, back at the lab, the participants attempted the six puzzles again. (Each repeated the experiment with a different set of puzzles the following night.) All told, the subjects solved 32 percent of the sound-prompted puzzles versus 21 percent of the untargeted puzzles—a boost of more than 50 percent. © 2020 Scientific American
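The "boost of more than 50 percent" refers to the relative improvement between the two solve rates, which a quick calculation confirms (the 32 and 21 percent figures are from the article; the script is just an illustrative check):

```python
# Relative improvement of sound-cued over uncued puzzle solve rates,
# using the percentages reported in the Psychological Science study.
cued, uncued = 0.32, 0.21

# (difference between rates) / (baseline rate) = relative boost
relative_boost = (cued - uncued) / uncued

print(f"Relative boost: {relative_boost:.0%}")  # ~52%, i.e. "more than 50 percent"
```

That is, an 11-percentage-point absolute gain over a 21 percent baseline works out to roughly a 52 percent relative increase.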

Keyword: Sleep; Learning & Memory
Link ID: 26938 - Posted: 01.07.2020

Natalie C Tronson Ph.D. We all have a strong intuitive sense of what memory is: it’s the conscious recollection of events, people, and places from our past. And it’s something we often wish we were better at, so that we didn’t continuously lose our keys or forget where the car was parked, and could remember more facts for exams, people’s birthdays, or what we came all the way upstairs to grab.

But memory is so much more. Memory is also how I can find my way around the town I live in now—and how I can still find my way around the town I grew up in, despite the many changes over the 25 years since I left. It’s how I know how to drive the car, and how I can sing four verses of Mary Had a Little Lamb to my child sitting in the back seat demanding that I sing. It’s why I know to stop at the red light, go at the green, and avoid the stretch of road that has been under construction for the past six months. It’s also one reason why I feel anxious when pedestrians run across the street randomly, and why our cats come running home when they hear the front door of our house open.

That’s a lot of different types of memory just for a quick drive home: spatial learning, verbal memory for songs, motor learning for driving, and episodic memory, among others, are in there too. Not only are there a lot of different types of memory, but there is also a lot of real estate and energy in our brains (and in the brains of many other species) taken up by learning and memory processes. © 2020 Sussex Publishers, LLC

Keyword: Learning & Memory
Link ID: 26937 - Posted: 01.07.2020