Most Recent Links



Links 3181 - 3200 of 29522

By Jane E. Brody If you live with or work with someone who suffers from migraine, there’s something very important you should know: A migraine is not “just a headache,” as many seem to think. Nor is it something most sufferers can simply ignore and get on with their lives. And if you are a migraine sufferer, there’s something potentially life-changing that you should know: There are now a number of medications available that can either prevent or alleviate many attacks, as well as a newly marketed wearable nerve-stimulating device that can be activated by a smartphone to relieve the pain of migraine. Migraine is a neurological disorder characterized by recurrent attacks of severe, often incapacitating headache and dysfunction of the autonomic nervous system, which controls the body’s myriad automatic activities like digestion and breathing. The throbbing or pulsating pain of migraine is often accompanied by nausea and vomiting. Translation: Migraine is a headache, all right, but with body-wide effects because the brain converses with the rest of the body. It is often severe enough to exact a devastating toll on someone’s ability to work, interact with others, perform the tasks of daily life, or even be in a normal living environment. When in the throes of a migraine attack, sufferers may be unable to tolerate light, noise, smells or even touch. “There are 47 million people in this country with migraine, and for six million, the condition is chronic, which means they have more than 15 headache days a month,” said Dr. Stephen Silberstein, a neurologist at Thomas Jefferson University and director of the Jefferson Headache Center. “It’s time to destigmatize migraine and provide sufferers with effective treatment,” said Dr. David W. Dodick, a neurologist at the Mayo Clinic in Scottsdale. “They’re not fakers, weak individuals who are trying to get out of work.” © 2020 The New York Times Company

Keyword: Pain & Touch
Link ID: 26936 - Posted: 01.07.2020

By James Gallagher Health and science correspondent An early life full of neglect, deprivation and adversity leads to people growing up with smaller brains, a study suggests. The researchers at King's College London were following adopted children who spent time in "hellhole" Romanian orphanages. They grew up with brains 8.6% smaller than those of other adoptees. The researchers said it was the "most compelling" evidence of the impact on the adult brain. The appalling care at the orphanages came to light after the fall of Romania's communist dictator Nicolae Ceausescu in 1989. "I remember TV pictures of those institutions, they were shocking," Prof Edmund Sonuga-Barke, who now leads the study following those children, told the BBC. He described the institutions as "hellholes" where children were "chained into their cots, rocking, filthy and emaciated". The children were physically and psychologically deprived with little social contact, no toys and often ravaged by disease. The children studied had spent between two weeks and nearly four years in such institutions. Previous studies on children who were later adopted by loving families in the UK showed they were still experiencing mental health problems in adulthood. Higher levels of traits including autism, attention deficit hyperactivity disorder (ADHD) and a lack of fear of strangers (disinhibited social engagement disorder) have all been documented. The latest study, published in Proceedings of the National Academy of Sciences, is the first to scan the brains for answers. There were 67 Romanian adoptees in the study, and their brains were compared with those of 21 adoptees who did not suffer early life deprivation. "What we found is really quite striking," Prof Sonuga-Barke told the BBC. First, the total brain volume - the size of the brain - was 8.6% smaller in the Romanian adoptees on average. And the longer they spent in the Romanian orphanages, the greater the reduction in brain size. © 2020 BBC.

Keyword: Development of the Brain; Stress
Link ID: 26935 - Posted: 01.07.2020

Ryan F. Mandelbaum Scientists have uncovered a new kind of electrical process in the human brain that could play a key role in the unique way our brains compute. Our brains are computers that work using a system of connected brain cells, called neurons, that exchange information using chemical and electric signals called action potentials. Researchers have discovered that certain cells in the human cortex, the outer layer of the brain, transmit signals in a way not seen in corresponding rodent cells. This process might be important to better understanding our unique brains and to improving programs that are based on a model of the human brain. “Human neurons may be more powerful computational devices than previously thought,” study corresponding author Matthew Larkum at Humboldt University of Berlin told Gizmodo in an email. Human brains have a thick cortex, especially the second and third layers (L2/3) from the surface. These layers contain brain cells with lots of branches, called dendrites, that connect them to and exchange information with other brain cells. The researchers acquired and analyzed slices of L2/3 tissue from patients with epilepsy and tumors, focusing specifically on these dendrites. Larkum explained via email that epilepsy surgeries provided a sufficient amount of available cortex tissue, while the tumor patient tissue was used to ensure that the observations weren’t unique to people with epilepsy. The team hooked the tissues to a patch clamp—essentially a system that constructs an electrical circuit from the cells and a measurement instrument—and used a fluorescence microscope to observe the action of these L2/3 cells. The team noticed that inputted electrical currents ignited more action potentials than they would in rodent cells and that a chemical that should have blocked the dendrites’ activity did not completely do so. © 2020 G/O Media Inc.

Keyword: Learning & Memory
Link ID: 26934 - Posted: 01.04.2020

By Simon Makin For many people battling addictions, seeing drug paraphernalia—or even places associated with past use—can ignite cravings that make relapse more likely. Associating environmental cues with pleasurable experiences is a basic form of learning, but some researchers think such associations can “hijack” behavior, contributing to problems such as addiction and eating disorders. Researchers led by neuroscientist Shelly Flagel of the University of Michigan have found a brain circuit that may control this hijacking; rats that exhibit a type of compulsive behavior show different brain connectivity and activity than those that do not, and manipulation of the circuit altered their behavior. These findings may help researchers understand why some individuals are more susceptible to impulse-control disorders. “This is technically a really excellent study,” says neuroscientist Jeff Dalley of the University of Cambridge, who was not involved in the work. In the study, published last September in eLife, researchers showed rats an inert lever shortly before delivering a tasty treat via a chute, then sorted them into groups based on their responses. All rats learned to associate the lever with the treat, but some—dubbed “goal trackers”—began to approach the food chute directly after seeing the lever, whereas others—dubbed “sign trackers”—kept compulsively returning to the lever itself. The team suspected that two brain regions were involved: the paraventricular nucleus of the thalamus (PVT), which drives behavior, and the prelimbic cortex, which is involved in reward learning. The researchers used a technique called chemogenetics to alter neurons in the circuit connecting these regions, which let them turn on or inhibit signals from the prelimbic cortex using drugs. Activating the circuit reduced sign trackers' tendency to approach the lever but did not affect goal trackers. Deactivating it drew goal trackers to the lever (sign-tracking behavior), without affecting preexisting sign trackers. The team also found increased dopamine, a chemical messenger involved in reward processing, in the newly sign-tracking brains. © 2020 Scientific American

Keyword: Drug Abuse
Link ID: 26933 - Posted: 01.04.2020

By Sharon Jayson AUSTIN, Texas — Retired state employees Vickey Benford, 63, and Joan Caldwell, 61, are Golden Rollers, a group of the over-50 set that gets out on assorted bikes — including trikes for adults they call “three wheels of awesome” — for an hour of trail riding and camaraderie. “I love to exercise, and I like to stay fit,” said Caldwell, who tried out a recumbent bike, a low-impact option that can be easier on the back. “It keeps me young.” Benford encouraged Caldwell to join the organized rides, which have attracted more than 225 riders at city rec centers and senior activity centers. The cyclists can choose from a small, donated fleet of recumbent bikes, tandem recumbents and tricycles. “With seniors, it’s less about transportation and more about access to the outdoors, social engagement and quality of life,” said Christopher Stanton, whose idea for Golden Rollers grew out of the Ghisallo Cycling Initiative, a youth biking nonprofit he founded in 2011. But that’s not all, according to brain scientists. They point to another important benefit: Exercising both body and brain can help people stay healthier longer. The new thinking about aging considers not just how long one lives, but how vibrant one stays later in life. “If you’re living, you want to be living well,” said Tim Peterson, an assistant professor of internal medicine at the Washington University School of Medicine in St. Louis. “Most people who were interested in life span and were studying genes — which control life span — switched to ‘healthspan.’” “Healthspan,” a coinage now gaining traction, refers to the years that a person can expect to live in generally good health — free of chronic illnesses and cognitive decline that can emerge near life’s end. Although there’s only so much a person can do to delay the onset of disease, there’s plenty that scientists are learning to improve your chances of a better healthspan. © 2020 Kaiser Family Foundation

Keyword: Development of the Brain
Link ID: 26932 - Posted: 01.04.2020

Jerold Chun, M.D., Ph.D. Alzheimer’s disease (AD) is the most common cause of dementia, currently affecting an estimated 5.8 million Americans. It has been over a century since AD was first described, but it is still not sufficiently well understood to enable development of drugs to treat it. As lifespan continues to rise and for myriad other reasons, the number of AD cases per state in the US is predicted to increase 12 to 43 percent over the next five years. The lack of disease-modifying treatments may reflect a feature of AD pathology that was first noted in its initial description: the vast heterogeneity of the hallmark “senile plaques” that are found in all AD brains. When Alois Alzheimer and Oskar Fischer described the first cases of AD, they noted plaque accumulations of a protein called amyloid that builds up in between brain cells and interrupts cell-to-cell communication; amyloid plaques vary in size, shape, abundance, and location within the brain. “Among the plaques in the cerebral cortex many were of an extraordinary size, such as I have never seen,” Alois Alzheimer stated. “Some evidently arose from the fusion of smaller ones since they contained several central cores, but others had one exceptionally big central core and uncommonly large halo.” Disease heterogeneity extends to behavior and includes varying age of onset, symptoms, and disease progression. Some variability may be explained by genetic heterogeneity, since more than 33 AD risk factor genes have been identified via a technique called “genome wide association studies” (GWAS), which broadly samples DNA from cells outside of the brain to identify mutations that are present in every cell of the body. None of these genes, however, are considered to cause AD. © 2020 The Dana Foundation

Keyword: Alzheimers; Genes & Behavior
Link ID: 26931 - Posted: 01.04.2020

By Knvul Sheikh When researchers began tinkering with a class of tranquilizer drugs called benzodiazepines in the 1950s, they felt they had uncovered a solution to modern anxiety and insomnia. Benzodiazepines worked quickly and effectively to quell racing heartbeats and dismiss spinning thoughts. The dozen or so different types — including Xanax, Valium, Ativan and Klonopin — became the most frequently prescribed drugs around the world, even as concerns arose about their potential side effects and addictive properties. “Patients themselves, and not the medical profession, were the first to realize that long-term use of benzodiazepines can cause problems,” wrote Dr. Heather Ashton, a British psychopharmacologist. She said that patients who had been on the medications for months or years would come to her with fears that the drugs were making them more ill. Some continued to have symptoms of depression or anxiety. Others had developed muscle weakness, memory lapses, or heart or digestive issues. Dr. Ashton would dedicate much of her career to listening to hundreds of patients’ experiences and rigorously collecting data. The result of her work, in 1999, was “Benzodiazepines: How They Work And How To Withdraw.” Now known simply as “The Ashton Manual,” it has become a cornerstone for those looking to quit the drugs safely. Addiction researchers worldwide still cite it in studies on benzodiazepines. And patient support groups have translated and distributed it in about a dozen languages. Dr. Ashton died on Sept. 15, 2019, at her home in Newcastle upon Tyne, England. She was 90. Her death, which had not been widely reported, was confirmed by her son John. © 2020 The New York Times Company

Keyword: Drug Abuse
Link ID: 26930 - Posted: 01.04.2020

A cousin of the starfish that resides in the coral reefs of the Caribbean and Gulf of Mexico lacks eyes but can still see, according to scientists who studied the creature. Researchers said on Thursday that the red brittle star, called Ophiocoma wendtii, joins a species of sea urchin as the only creatures known to be able to see without having eyes — known as extraocular vision. The red brittle star possesses this exotic capability thanks to light-sensing cells, called photoreceptors, covering its body and pigment cells, called chromatophores, that move during the day to facilitate the animal's dramatic colour change from a deep reddish-brown in daytime to a striped beige at night. Brittle stars, with five radiating arms extending from a central disk, are related to starfish (also called sea stars), sea cucumbers, sea urchins and others in a group of marine invertebrates called echinoderms. They have a nervous system but no brain. Looking for a safe hiding place: The red brittle star — which measures up to about 35 centimetres (14 inches) from arm tip to arm tip — lives in bright and complex habitats, with high predation threats from reef fish. It stays hidden during daytime — making the ability to spot a safe place to hide critical — and comes out at night to feed on detritus. Its photoreceptors are surrounded during daytime by chromatophores that narrow the field of the light being detected, making each photoreceptor like the pixel of a computer image that, when combined with other pixels, makes a whole image. The visual system does not work at night, when the chromatophores contract. "If our conclusions about the chromatophores are correct, this is a beautiful example of innovation in evolution," said Lauren Sumner-Rooney, a research fellow at Oxford University Museum of Natural History, who led the study published in the journal Current Biology. ©2020 CBC/Radio-Canada.

Keyword: Vision; Evolution
Link ID: 26929 - Posted: 01.04.2020

Nicola Davis and Hannah Devlin Tangles of a protein found inside the brain cells of people with Alzheimer’s disease can be used to predict future brain shrinkage, research suggests. In healthy people, a protein called tau is important in supporting the internal structure of brain cells. However, in those with Alzheimer’s, chemical changes take place that cause the protein to form tangles that disrupt the cells. Such tangles have previously been linked to a loss of brain cells. Now scientists have used imaging techniques to track the extent of tau tangles in the brains of those with early signs of Alzheimer’s, revealing that levels of the protein predict not only how much brain shrinkage will subsequently occur, but where. “Our study supports the notion that tau pathology accumulates upstream of brain tissue loss and clinical symptoms,” said Prof Gil Rabinovici, a co-author of the research from the University of California, San Francisco. A number of drugs targeting tau tangles are currently in clinical trials, including some that aim to interfere with the production of tau in the brain or its spread between cells. Dr Renaud La Joie, another author of the research, said the findings suggested the imaging technique could prove valuable both in choosing which patients to enrol to test such drugs and in monitoring whether the drugs work. Dr Laura Phipps, of Alzheimer’s Research UK, said: “The ability to track tau in the brain will be critical for testing treatments designed to prevent the protein causing damage, and the scans used in this study could be an important tool for future clinical trials.” Writing in the journal Science Translational Medicine, La Joie and colleagues report how they used an imaging technique called PET to study the brains of 32 people aged between 49 and 83 who were in the early stages of showing Alzheimer’s symptoms. © 2020 Guardian News & Media Limited

Keyword: Alzheimers; Brain imaging
Link ID: 26928 - Posted: 01.02.2020

By Lisa Sanders, M.D. The 67-year-old woman had just flown back to her old hometown, Eugene, Ore., to pick up one more load of boxes to move them to her new hometown, Homer, Alaska. As usual, the shuttle to long-term parking was nowhere in sight, so she pulled out the handles of her bags and wheeled them down the now-familiar airport road. It was a long walk — maybe half a mile — but it was a beautiful afternoon for it. A lone woman walking down this rarely used road in the airport caught the attention of Diana Chappell, an off-duty emergency medical technician, on her way to catch her own flight. She watched as the woman approached a building where some airport E.M.T.s were stationed. Suddenly the woman stopped. She rose to her toes and turned gracefully, then toppled over like a felled tree and just lay there. Chappell jumped out of the car and ran to the woman. She was awake but couldn’t sit up. Chappell helped her move to the side of the road and took a quick visual survey. The woman had a scrape over her left eye where her glasses had smashed into her face. Her left knee was bleeding, and her left wrist was swelling. She’d dropped the handle of one of her rolling bags, the woman explained. When she tried to pick it up, she fell. But she felt fine now. As she spoke, Chappell noticed that her speech was slightly slurred and that the left side of her mouth wasn’t moving normally. “I don’t know you, but your speech sounds a little slurred,” she said. “Have you been drinking?” Not at all, the woman answered — surprised by the question. Chappell introduced herself, then asked the woman if she could do a few quick tests to make sure she was O.K. Chappell asked her to smile, but the left side of the patient’s mouth did not cooperate; she asked her to shrug her shoulders, and the left side wouldn’t stay up. You need to go to the hospital, she told the woman. The woman protested; she felt fine. At least let me call my E.M.T. pals to check your blood pressure, Chappell insisted. After a fall like that, it could be high. The woman reluctantly agreed, and Chappell called her colleagues. The woman on the ground was embarrassed by the flashing lights of the emergency vehicle but allowed her blood pressure to be taken. It was sky-high. She really did need to go to the hospital. © 2020 The New York Times Company

Keyword: Stroke
Link ID: 26927 - Posted: 01.02.2020

Judith Grisel I used to think addiction was caused by screwy molecules in the brain, and would be cured by neuroscience. I began learning about how the brain works after I ended up in treatment for drug addiction in the mid-1980s, when hopes for neuroscientific cures were as overblown as the hairstyles. My own journey away from the destructive cycle of addiction has been sourced much more by factors outside my brain. Like many at the time, I envisioned the brain as executive director of an epic drama – solely responsible for the total picture of what I did, felt and thought. My specific purpose in getting a doctorate in behavioural neuroscience was to discover the neural explanation for my irrational choices around mind-altering chemicals. What was the faulty neural switch that swept away heartfelt promises or strongly held convictions in response to practically every opportunity to twist reality? I made increasingly risky and harebrained decisions, as the possibility of transient bliss in a shot of cocaine, a belly full of booze or a head in the (cannabis) clouds came to outweigh my obligations or common sense. Final exams, “last chances” at work, or loved ones’ funerals, for example, didn’t stand a chance compared to hitching myself to whatever intoxicating ride I could catch. By the time I hit bottom, the choice between facing stark reality or using drugs to escape was no choice at all: cortical regulation had completely given way to subcortical impulses and habits. Globally 35 million people are estimated to suffer from drug use disorders. The causes of this public health disaster are complicated, but it is widely accepted that about half of the contribution comes from inherited risk, and the rest an unfortunate confluence of environmental factors interacting with that biologic vulnerability.

Keyword: Drug Abuse
Link ID: 26926 - Posted: 01.02.2020

By Gretchen Reynolds What’s good for your muscles can also be good for your mind. A Single Workout Can Alter the Brain: A single, moderate workout may immediately change how our brains function and how well we recognize common names and similar information, according to a promising new study of exercise, memory and aging. The study adds to growing evidence that exercise can have rapid effects on brain function and also that these effects could accumulate and lead to long-term improvements in how our brains operate and how well we remember. Until recently, scientists thought that by adulthood, human brains were relatively fixed in their structure and function, especially compared to malleable tissues, like muscle, that continually grow and shrivel in direct response to how we live our lives. But multiple, newer experiments have shown that adult brains, in fact, can be quite plastic, rewiring and reshaping themselves in various ways, depending on our lifestyles. A hormone that is released during exercise may improve brain health and lessen the damage and memory loss that occur during dementia, a new study finds. The study, which was published this month in Nature Medicine, involved mice, but its findings could help to explain how, at a molecular level, exercise protects our brains and possibly preserves memory and thinking skills, even in people whose pasts are fading. Considerable scientific evidence already demonstrates that exercise remodels brains and affects thinking. Researchers have shown in rats and mice that running ramps up the creation of new brain cells in the hippocampus, a portion of the brain devoted to memory formation and storage. Exercise also can improve the health and function of the synapses between neurons there, allowing brain cells to better communicate. © 2019 The New York Times Company

Keyword: Alzheimers
Link ID: 26925 - Posted: 12.30.2019

By John Horgan I just spent a week at a symposium on the mind-body problem, the deepest of all mysteries. The mind-body problem--which encompasses consciousness, free will and the meaning of life--concerns who we really are. Are we matter, which just happens to give rise to mind? Or could mind be the basis of reality, as many sages have insisted? The week-long powwow, called “Physics, Experience and Metaphysics,” took place at Esalen Institute, the legendary retreat center in Big Sur, California. Fifteen men and women representing physics, psychology, philosophy, religious studies and other fields sat in a room overlooking the Pacific and swapped mind-body ideas. What made the conference unusual, at least for me, was the emphasis on what were called “exceptional experiences,” involving telepathy, telekinesis, astral projection, past-life recall and mysticism. I’ve been obsessed with mysticism since I was a kid. As defined by William James in The Varieties of Religious Experience, mystical experiences are breaches in your ordinary life, during which you encounter absolute reality--or, if you prefer, God. You believe, you know, you are seeing things the way they really are. These experiences are usually brief, lasting only minutes or hours. They can be triggered by trauma, prayer, meditation or drugs, or they may strike you out of the blue. I’ve had mild mystical intuitions while sober, for example, during a Buddhist retreat last year. But my most intense experience, by far, happened in 1981 while I was under the influence of a potent hallucinogen. I tried to come to terms with my experiences in my book Rational Mysticism, but my obsession endures. © 2019 Scientific American

Keyword: Consciousness
Link ID: 26924 - Posted: 12.30.2019

By Jane E. Brody Every now and then I write a column as much to push myself to act as to inform and motivate my readers. What follows is a prime example. Last year in a column entitled “Hearing Loss Threatens Mind, Life and Limb,” I summarized the current state of knowledge about the myriad health-damaging effects linked to untreated hearing loss, a problem that afflicts nearly 38 million Americans and, according to two huge recent studies, increases the risk of dementia, depression, falls and even cardiovascular diseases. Knowing that my own hearing leaves something to be desired, the research I did for that column motivated me to get a proper audiology exam. The results indicated that a well-fitted hearing aid could help me hear significantly better in the movies, theater, restaurants, social gatherings, lecture halls, even in the locker room where the noise of hair dryers, hand dryers and swimsuit wringers often challenges my ability to converse with my soft-spoken friends. That was six months ago, and I’ve yet to go back to get that recommended hearing aid. Now, though, I have a new source of motivation. A large study has documented that even among people with so-called normal hearing, those with only slightly poorer hearing than perfect can experience cognitive deficits. That means a diminished ability to get top scores on standardized tests of brain function, like matching numbers with symbols within a specified time period. But while you may never need or want to do that, you most likely do want to maximize and maintain cognitive function: your ability to think clearly, plan rationally and remember accurately, especially as you get older. While under normal circumstances, cognitive losses occur gradually as people age, the wisest course may well be to minimize and delay them as long as possible and in doing so, reduce the risk of dementia. Hearing loss is now known to be the largest modifiable risk factor for developing dementia, exceeding that of smoking, high blood pressure, lack of exercise and social isolation, according to an international analysis published in The Lancet in 2017. © 2019 The New York Times Company

Keyword: Hearing
Link ID: 26923 - Posted: 12.30.2019

By Debbie Jackson BBC Scotland "Fluffing your son's hair, really hugging him, holding his hand." For someone who has been through what she has in the space of a year, Corinne Hutton doesn't need much to make her happy. Last January she got the double hand transplant she had been waiting more than five years for, and feared would never happen. This January, she will celebrate her "handiversary", a year since a surgeon handed her back her independence. Being able to do the simplest things for 11-year-old son Rory means the world to Finding Your Feet charity founder Cor. "From an emotional point of view to be able to do things for him - make the packed lunches or the washing, or do the ironing is great," she said. "But on top of that, being able to hold his hand, fluff his hair, little things that might not be hugely exciting to him - but they matter a lot to me. "People don't appreciate what it means to have lost them," she said. Cor became the first Scot to undergo a double hand transplant when, in a 12-hour procedure, Prof Simon Kay attached two donor hands to her arms at Leeds General Infirmary. The 48-year-old lost her hands and feet in 2013 after suffering acute pneumonia and sepsis, which almost killed her. After more than a dozen false alarms over the years, a match for her own blood group, skin tone and hand size had been found. Much celebration and wonder was made of the news that the transplant had finally happened, but the aftermath was far from easy. © 2019 BBC.

Keyword: Pain & Touch
Link ID: 26922 - Posted: 12.30.2019

By Sarah Bate Alice is six years old. She struggles to make friends at school and often sits alone in the playground. She loses her parents in the supermarket and approaches strangers at pickup. Once she became separated from her family on a trip to the zoo, and she now has an intense fear of crowded places. Alice has a condition called face blindness, also known as prosopagnosia. This difficulty in recognising facial identity affects 2 percent of the population. Like Alice, most of these people are born with the condition, although a small number acquire face-recognition difficulties after brain injury or illness. Unfortunately, face blindness seems largely resilient to improvement. Yet a very recent study offers more promising findings: children’s face-recognition skills substantially improved after they played a modified version of the game Guess Who? over a two-week period. In the traditional version of Guess Who?, two players see an array of 24 cartoon faces, and each selects a target. Both then take turns asking yes/no questions about the appearance of their opponent’s chosen face, typically inquiring about eye color, hairstyle and accessories such as hats or spectacles. The players use the answers to eliminate faces in the array; when only one remains, they can guess the identity of their opponent’s character. The experimental version of the game preserved this basic setup but used lifelike faces that differed only in the size or spacing of the eyes, nose or mouth. That is, the hairstyle and outer face shape were identical, and children had to read the faces solely on the basis of small differences between the inner features. This manipulation is thought to reflect a key processing strategy that underlies human face recognition: the ability to account not only for the size and shape of features but also the spacing between them. Evidence suggests this ability to process faces “holistically” is impaired in face blindness. The Guess Who? training program aimed to capitalize on this link. Children progressed through 10 levels of the game, with differences between the inner features becoming progressively less obvious. Children played for half an hour per day on any 10 days over a two-week period, advancing to the next level when they won the game on two consecutive rounds. © 2019 Scientific American

Keyword: Attention
Link ID: 26921 - Posted: 12.27.2019

By Christie Aschwanden When she was 24, Susannah Cahalan developed a sudden psychosis. She grew paranoid — convinced her apartment was infested with bedbugs, that people were spying on her, that her boyfriend was cheating. She started to believe she could age people with her mind. As she recounted in her 2013 bestseller, “Brain on Fire: My Month of Madness,” she received several misdiagnoses (bipolar disorder, schizoaffective disorder) before an alert doctor discovered the true culprit: autoimmune encephalitis. The moment her illness was deemed neurological, “as in physical, in the body, real,” rather than psychiatric, “in the mind and therefore somehow less real,” the quality of her care drastically improved, Cahalan writes in her new book, “The Great Pretender.” Sympathy and understanding replaced the detached attitude that had defined her treatment as a mental patient, “as if a mental illness were my fault, whereas a physical illness was something unearned, something ‘real,’” she writes. Cahalan, a journalist, recovered from her brief psychosis, but the distinction between physical and mental illness continued to perplex her. “What does mental illness mean, anyway, and why would one affliction be more ‘real’ than another?” she asks. These questions form the backbone of “The Great Pretender.” The book centers on the work of David Rosenhan, a Stanford psychologist whose paper, “On Being Sane in Insane Places,” was an instant sensation when it was published in the journal Science in 1973. The paper begins with a question: “If sanity and insanity exist, how shall we know them?”

Keyword: Schizophrenia
Link ID: 26920 - Posted: 12.27.2019

“Getting a Good Night’s Sleep Without Drugs,” by Jane E. Brody: As many as 20 percent to 30 percent of people in the general population sleep poorly. They may have difficulty falling asleep or staying asleep, some awaken much too early, while others do not feel rested despite spending a full night seemingly asleep in bed. For one person in 10, insomnia is a chronic problem that repeats itself night after night. Little wonder that so many resort to sleeping pills to cope with it. But experts report that there are better, safer and more long-lasting alternatives than prescription drugs to treat this common problem. The alternatives are especially valuable for older people, who metabolize drugs more slowly, are more likely to have treatable underlying causes of their insomnia and are more susceptible to adverse side effects of medications. “Is Your Sleep Cycle Out of Sync? It May Be Genetic,” by Jane E. Brody: Early to bed, early to rise — a fine plan for a dairy farmer who has to get up long before dawn to milk the cows. But if you’re someone who works all day with stocks and clients and may want to enjoy an evening out now and then, it would be better not to be getting up at 2 a.m. and have to struggle to stay awake through dinner or a show. Such is the challenge faced by a friend who has what sleep specialists call an advanced sleep phase. Her biological sleep-wake cycle, or circadian rhythm, is out of sync with the demands of the modern world. By Perri Klass, M.D.: The biology of adolescent sleep reflects a natural and normal delay in melatonin secretion that leads to a later sleep onset time, which unfortunately coincides with early high school start times, creating a high-stress setup. Pediatricians often see adolescents with insomnia, who have trouble falling asleep or staying asleep, waking up too early or finding sleep not restful or refreshing. © 2019 The New York Times Company

Keyword: Sleep
Link ID: 26919 - Posted: 12.27.2019

Alejandra Manjarrez When he was a postdoc at KU Leuven in Belgium, Daniel Vigo helped analyze results from an experiment that simulated a spaceflight to Mars. Six crew members were secluded in an artificially lit, spacecraft-like facility for 520 days starting in June 2010. Part of an international project known as the Mars500 mission, the experiment aimed to assess the psychological, social, and biological effects of prolonged confinement and isolation, along with the absence of normal day and night rhythms. That isolation, of course, was just an illusion, manufactured by the Institute for Biomedical Problems of the Russian Academy of Sciences and the European Space Agency. The simulation took place in central Moscow, where any sudden medical problems could have received immediate attention—as Vigo, now a researcher at the Catholic University of Argentina and a member of the National Scientific and Technical Research Council (CONICET), tells The Scientist in Spanish. He began wondering what would happen in a less artificial scenario. One of the key findings from the study, for example, was that confinement—in this case in an artificially lit building—disrupted normal sleep patterns: the crew members in the Mars500 experiment had suffered from sleep problems and rapidly fell into sleep-wake routines that were out of sync with one another. But what would the story be like for people experiencing a similarly extreme living environment, Vigo wondered, without the safety net provided by a carefully controlled simulation? © 1986–2019 The Scientist

Keyword: Sleep; Biological Rhythms
Link ID: 26918 - Posted: 12.27.2019

By Nicholas Bakalar The right diet might help you sleep better. In a study of 77,860 postmenopausal women, researchers found that consuming foods with a low glycemic index was associated with a reduced risk for insomnia. Foods with low glycemic indexes — for example, vegetables, nuts and whole grain breads — have carbohydrates that are slowly absorbed and cause lower, and slower, rises in blood glucose and insulin levels after being consumed. For this study, in the American Journal of Clinical Nutrition, participants completed lengthy questionnaires about what foods they ate and how often. They also reported their degree of insomnia at the start of the study and after three years of follow-up. Compared with the one-fifth of participants whose diet had the lowest glycemic index, those with the highest were 11 percent more likely to have insomnia. Some low-glycemic index foods — whole grains and dairy foods, for example — were not associated with reduced insomnia. But people who ate the most fruits and vegetables were about 14 percent less likely to have insomnia, and the largest consumers of fiber were 13 percent less likely. In contrast, women who ate the most refined grains had a 16 percent higher risk of insomnia than those who ate the least. Although the study controlled for many health and behavioral characteristics, it showed only an association and could not prove cause and effect. “Randomized controlled trials examining dietary patterns in relation to insomnia are needed to clarify these findings,” the authors write. © 2019 The New York Times Company

Keyword: Sleep; Obesity
Link ID: 26917 - Posted: 12.27.2019