Most Recent Links
By Frances Stead Sellers | October 8 at 9:00 AM

One late-summer day last year, my surroundings started playing tricks on me. The letters I typed on my computer screen looked fuzzy. Objects on my desk seemed to slip sideways, escaping their own outlines. My colleagues, viewed across the room, appeared to have shifted slightly so that now they stood or sat as ghostly silhouettes beside themselves. I put it down to fatigue or lack of food.

The next morning, neither sleep nor sustenance had cured me. I squinted my way out of my apartment and reached for the handrail that runs alongside the front steps. As my left arm extended, my forearm divided somewhere between my elbow and my wrist, so that now I had two left hands and 10 fingers, groping for two railings that ran not parallel to one another but diverged into the distance. Below my four feet, the neat brick geometry of a Capitol Hill sidewalk had become a muddled mosaic. I looked up to see two identically dressed men, swinging their arms in unison as they marched, lockstep, toward me.

I closed one eye and then the other. Both worked well. In fact, each restored reassuring order to the world: One man. One left hand. One railing. But when I tried to walk with only my right eye open, I keeled over to the side. I lost my balance — and a little confidence.

“Not good,” the ophthalmologist murmured later that morning as he tracked the movement of my eyes from left to right and back again. Diplopia was his diagnosis. Greek for double vision. How did I feel? he asked. Had I had a virus? I needn’t go to the ER, he said, unless I developed a splitting headache or started vomiting. But I should see a neuro-ophthalmologist. Soon. © 1996-2017 The Washington Post
Keyword: Vision; Movement Disorders
Link ID: 24166 - Posted: 10.09.2017
By Leslie Kaufman

It is 7 p.m. on a spring Friday, and the Highland Hospital emergency room in Oakland, one of the busiest trauma centers in Northern California, is expecting. When the patient—a young bicyclist hit by a car—arrives, blood is streaming down his temples. From a warren of care rooms, a team of nearly a dozen doctors and nurses materializes and buzzes around the patient. Amelia Breyre, a first-year resident who looks not much older than a college sophomore, immediately takes charge. As soon as the team finishes immobilizing the victim, Breyre must begin making split-second decisions: X-ray? Intubate? Transfusion? She quickly determines there is no internal bleeding or need for surgery and orders up neck X-rays after bandaging the patient’s head. Breyre will make a half-dozen similar critical choices tonight.

Highland, a teaching hospital, is perhaps the most selective emergency-medical residency in the nation. To be here, she must be outstanding. To succeed, though, she must stay sharp. That quality of focus—amid the chaos and battered humanity that comes through Highland’s doors—is itself in need of urgent care.

Andrew Herring, an emergency-room doctor who supervises Breyre and 40 other residents, is worried about the team. ER doctors are shift workers, and their hours are spread over a dizzying, ever-changing schedule of mornings, afternoons, and nights that total 20 different shifts a month. That’s meant to distribute the burden of nocturnal work equally across an entire team of physicians. But despite those good intentions, Herring says, the result is that every single one of them is exhausted and sleep deprived. That’s dangerous for doctor and patient alike.
Keyword: Biological Rhythms
Link ID: 24165 - Posted: 10.09.2017
Children with attention deficit hyperactivity disorder may fidget, tap and swivel around in a chair much more than normally developing children because it helps them to learn complex material, psychologists have found. ADHD is often perceived as a behavioural problem because it can result in symptoms such as inattention, impulsivity, and hyperactivity that can affect social interaction and learning. Scientists increasingly recognize ADHD as a brain disorder that affects about five per cent of the school-age population. Now brain tests show children with ADHD tend to learn less when sitting still than when they're moving. It is not for lack of motivation, says Prof. Mark Rapport, a child psychopathology researcher who focuses on ADHD at the University of Central Florida in Orlando. Rapport and his colleagues set out to test an observation made by many parents — that children with ADHD can pay attention if they are doing an activity they enjoy. They put 32 boys aged eight to 12 with ADHD and 30 of their peers who are not affected by the disorder through a battery of memory and other tests. Participants watched two videos on separate days: an instructional math lesson without performing the calculations, and a scene from Star Wars Episode 1 — The Phantom Menace. During the Star Wars movie, the boys with ADHD did not squirm more than other children, but when asked to concentrate on the math lesson, there was a difference between the two groups. "All children, and all people in general, moved more when they were engaged in a working memory task. Kids with ADHD move about twice as much under the same conditions," Rapport said. ©2017 CBC/Radio-Canada.
Keyword: ADHD; Learning & Memory
Link ID: 24164 - Posted: 10.09.2017
By Frank Swain Just what you need in the age of ubiquitous surveillance: the latest cochlear implants will allow users to stream audio directly from their iPhone into their cochlear nerve. Apple and implant manufacturer Cochlear have made “Made for iPhone” connectivity available for any hearing implants that use the next-generation Nucleus 7 sound processor. The advance means that these implants can also stream music and Netflix shows. The technology was first unveiled in 2014 when it was added to hearing aids such as the Starkey Halo and ReSound LiNX. But this is the first time it’s been linked into the central nervous system. While some cochlear implants already offer Bluetooth connectivity, these often require users to wear extra dongles or other intermediary devices to pick up digital signals, and then rebroadcast them to the hearing aid as radio. This technology simply beams the signal right into the brain. It’s also a better way to use Bluetooth. Bluetooth headsets have been commonplace since the early 2000s, but the energy-sapping technology has meant they are typically clunky devices with poor battery life. In 2014, Apple technicians developed a way to stream audio over the low-energy Bluetooth format used by wearables such as Fitbits. Now, tiny devices like hearing aids – and Apple’s AirPods – can stream audio signals for up to a week on a battery the size of an aspirin. © Copyright New Scientist Ltd.
Keyword: Hearing
Link ID: 24163 - Posted: 10.09.2017
By Neuroskeptic

A curious flurry of headlines in praise of beer appeared this week:

“Beer really DOES make you happier! Key molecule boosts brain’s reward centre”
“Drinking Beer Makes You Really Happy, Confirms Awesome New Study”
“Drinking beer can make you happy, researchers claim”

It was reported that scientists from Germany have discovered that a molecule in beer called hordenine activates dopamine receptors in the brain, and thus produces a positive mood. The research in question was published back in March of this year, so I’m not sure why it only made the headlines this week – maybe Oktoberfest had something to do with it. Either way, the study did indeed find that hordenine is a dopamine D2 receptor agonist, but it’s not clear this has any relevance to beer drinkers. The German researchers, Sommer et al., are chemists, not neuroscientists. They used computational simulations to model whether 13,000 known ‘food-derived’ molecules would bind to the D2 receptor. The hordenine molecule was predicted to fit the receptor, and follow-up experiments showed that it does indeed bind to it, suggesting possible psychoactive properties.
Keyword: Drug Abuse
Link ID: 24162 - Posted: 10.09.2017
By HEATHER MURPHY Most people will quickly spot a toothbrush at the front of a bathroom counter, but take longer to find — or even miss entirely — a much bigger one behind it. The oversight has to do with scale. People have a tendency to miss objects when their size is inconsistent with their surroundings, according to a recent study in Current Biology. This is just the latest in a robust body of research that reveals how expectations dramatically affect our ability to notice what’s around us. The researchers were interested not only in what people saw, but also in how their performance compared with that of computers. Flesh-and-blood participants and a deep neural network, a computer system with advanced machine vision, were given one second to select an object in a computer-rendered scene. The object was either absent, presented at scale or featured at four times scale. Humans missed giant objects about 13 percent more often than normal-sized objects, the researchers found. Scale had no impact on machine performance. “We were surprised about how compelling of an effect it is,” said Miguel Eckstein, a psychologist at the University of California, Santa Barbara’s Vision and Image Understanding Laboratory and one of the authors. In particular, the first time a person examined a photo with a giant object, the object often seemed to be invisible. But it’s not a deficiency, he said: “This is a useful trick the brain does to rapidly process scenes and find what we are looking for.” © 2017 The New York Times Company
Keyword: Attention
Link ID: 24161 - Posted: 10.07.2017
Victoria Lorrimar Michael Burdett The idea of dangerous, inhumane artificial intelligence taking over the world is familiar to many of us, thanks to cautionary tales such as the Matrix and Terminator franchises. But what about the more sympathetic portrayals of robots? The benevolence of Arnold Schwarzenegger’s Terminator character in the later movies of the franchise may have been the exception in older portrayals of AI, but human-like machines are often represented more positively in contemporary films. Think of Ex Machina, Chappie or A.I. Artificial Intelligence. This shift is very likely representative of a wider shift in how we think about these technologies in reality. Blade Runner 2049, long-anticipated sequel to the original 1982 Blade Runner film, is a part of this shift. The ability of science fiction to inspire technological innovation is well-known. A lot of science fiction writers are scientists and technologists (Arthur C Clarke and Geoffrey Landis are two examples), and ideas from science fiction have sparked more serious scientific research (touch screens and tablet computers are common examples). But science fiction serves other purposes too. It can be a tool for exploring the social and ethical implications of technologies being developed now – a fictional laboratory for testing possible futures. It can also prepare us to deal with certain technologies as they arise in the real world. © 2010–2017, The Conversation US, Inc.
Keyword: Consciousness; Robotics
Link ID: 24160 - Posted: 10.07.2017
By Shawna Williams

THE PAPER: P. Réu et al., “The lifespan and turnover of microglia in the human brain,” Cell Rep, 20:779-84, 2017.

A RENEWABLE RESOURCE? Evidence has emerged that some of the brain’s cells can be renewed in adulthood, but it is difficult to study the turnover of cells in the human brain. When it comes to microglia, immune cells that ward off infection in the central nervous system, it’s been unclear how “the maintenance of their numbers is controlled and to what extent they are exchanged,” says stem cell researcher Jonas Frisén of the Karolinska Institute in Sweden.

NUCLEAR SIGNATURE: Frisén and colleagues used brain tissue from autopsies, together with the known changes in concentrations of carbon-14 in the atmosphere over time, to estimate how frequently microglia are renewed. They also analyzed microglia from the donated brains of two patients who had received a labeled nucleoside as part of a cancer treatment trial in the 1990s.

SLOW CHURN: Microglia, which populate the brain as blood cell progenitors during fetal development, were replaced at a median rate of 28 percent per year; on average, the cells were 4.2 years old. For Marie-Ève Tremblay, a neuroscientist at the Université Laval in Québec City who was not involved in the study, what stands out is the range of microglia ages found—from brand-new to more than 20 years old. “That’s quite striking!” she writes in an email to The Scientist. © 1986-2017 The Scientist
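A quick back-of-the-envelope check (my own illustration, not part of the paper) shows the two figures above are broadly consistent: under the simplifying assumption that each cell faces a constant yearly replacement hazard r, the steady-state age distribution is exponential with mean age 1/r.

```python
# Consistency check under an assumed constant per-cell replacement hazard.
# With hazard r per year, steady-state cell ages are exponentially
# distributed with mean 1/r.
r = 0.28  # reported median replacement rate: 28 percent per year
predicted_mean_age = 1 / r  # about 3.6 years
print(f"Predicted mean microglial age: {predicted_mean_age:.1f} years")
```

The prediction (~3.6 years) lands near the reported 4.2-year average; the gap is unsurprising, since real turnover rates need not be constant across cells or over a lifetime.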
Keyword: Glia; Development of the Brain
Link ID: 24159 - Posted: 10.07.2017
By Giorgia Guglielmi This mantis shrimp (Gonodactylus smithii) might have a much more elaborate brain than previously thought. That’s the conclusion of the first study to peer into the head of more than 200 crustaceans, including crabs, shrimp, and lobsters. Researchers discovered that the brain of mantis shrimp contains memory and learning centers, called mushroom bodies, which so far have been seen only in insects. The team also found similar structures in close relatives of these sea creatures: cleaner shrimp, pistol shrimp, and hermit crabs. This may not be a coincidence, the researchers say, because mantis shrimp and their brethren are the only crustaceans that hunt over long distances and might have to remember where to get food. But the finding, reported in eLife, is likely to stir debate: Scientists agree that mushroom bodies evolved after the insect lineage split off from the crustacean lineage about 480 million years ago; finding these learning centers in mantis shrimp means that either mushroom bodies are much more ancient than scientists realized and were lost in all crustaceans but mantis shrimp, or that these structures are similar to their counterparts in insects but have evolved independently. © 2017 American Association for the Advancement of Science.
Keyword: Learning & Memory; Evolution
Link ID: 24158 - Posted: 10.07.2017
By GINA KOLATA For the first time, doctors have used gene therapy to stave off a fatal degenerative brain disease, an achievement that some experts had thought impossible. The key to making the therapy work? One of medicine’s greatest villains: HIV. The patients were children who had inherited a mutated gene causing a rare disorder, adrenoleukodystrophy, or ALD. Nerve cells in the brain die, and in a few short years, children lose the ability to walk or talk. They become unable to eat without a feeding tube, to see, hear or think. They usually die within five years of diagnosis. The disease strikes about one in 20,000 boys; symptoms first occur at an average age of 7. The only treatment is a bone-marrow transplant — if a compatible donor can be found — or a transplant with cord blood, if it was saved at birth. But such transplants are an onerous and dangerous therapy, with a mortality rate as high as 20 percent. Some who survive are left with lifelong disabilities. Now a new study, published online in the New England Journal of Medicine, indicates that gene therapy can hold off ALD without side effects, but only if it is begun when the only signs of deterioration are changes in brain scans. The study involved 17 boys (the disease strikes males almost exclusively), ages 4 to 13. All got gene therapy. Two years later, 15 were functioning normally without obvious symptoms. “To me, it seems to be working,” said Dr. Jim Wilson, director of the gene therapy program at the University of Pennsylvania’s Perelman School of Medicine, who was not involved in the new study. © 2017 The New York Times Company
Keyword: Genes & Behavior; Development of the Brain
Link ID: 24157 - Posted: 10.06.2017
By Ann Gibbons The insult "You're a Neandertal!" has taken on dramatic new meaning in the past few years, as researchers have begun to identify the genes many of us inherited from our long-extinct relatives. By sequencing a remarkably complete genome from a 50,000-year-old bone fragment of a female Neandertal found in Vindija Cave in Croatia, researchers report online today in Science a new trove of gene variants that living people outside of Africa obtained from Neandertals. Some of this DNA could influence cholesterol levels, the accumulation of belly fat, and the risk of schizophrenia and other diseases. The genome is only the second from a Neandertal sequenced to such high quality that it can reliably reveal when, where, and what DNA was passed from Neandertals to modern humans—and which diseases it may be causing or preventing today. "It's really exciting because it's more than two times better to have two Neandertal genomes," says evolutionary genomicist Tony Capra of Vanderbilt University in Nashville. The first Neandertal genome was a composite drawn from three individuals from Vindija Cave. Then, over the past few years, ancient DNA researchers sequenced two more Neandertal genomes, including another high-quality sequence from an individual that lived 122,000 years ago in the Altai Mountains of Siberia. Together, the genomes showed that living Europeans and Asians carry traces of DNA from Neandertals who mated with members of Homo sapiens soon after our species left Africa. (Most Africans lack Neandertal DNA as a result.) © 2017 American Association for the Advancement of Science.
Keyword: Obesity; Evolution
Link ID: 24156 - Posted: 10.06.2017
Hannah Devlin French scientists have been criticised for concealing the death of the patient at the centre of a breakthrough in which consciousness was restored to a man in a persistent vegetative state. The treatment was hailed as a major advance in the field and suggested that the outlook for these patients and their families might be less bleak than was previously thought. However, it has emerged that the scientists behind the research withheld the fact that the man, who remains anonymous, died a few months after receiving the therapy. The team justified the decision, citing the family’s wish to keep the death private and a concern that people might have wrongly linked the therapy, which involved nerve stimulation, to the 35-year-old’s death from a lung infection. However, others said the decision had created an over-optimistic narrative of a patient on an upward trajectory. Damian Cruse, a cognitive neuroscientist at the University of Birmingham, said: “I do worry that the media coverage of the study gave a more hopeful message to other families in this situation than the message that perhaps would have been delivered with all of the facts … If we protect patient anonymity, then there’s no reason not to be able to tell the full story.” When the paper came out last month, Angela Sirigu, who led the work at the Institut des Sciences Cognitives Marc Jeannerod in Lyon, France, told the Guardian: “He is still paralysed, he cannot talk, but he can respond. Now he is more aware.” © 2017 Guardian News and Media Limited
Keyword: Consciousness
Link ID: 24155 - Posted: 10.06.2017
By Michael Price Expensive medications tend to make us feel better, even when they’re no different than cheap generics. But they can also make us feel worse, according to a new study. Researchers have found that we’re more likely to experience negative side effects when we take a drug we think is pricier—a flip side of the placebo effect known as the “nocebo” effect. The work could help doctors decide whether to recommend brand-name or generic drugs depending on each patient’s expectations. In the study, researchers asked 49 people to test out a purported anti-itch cream that, in reality, contained no active ingredient. Some got “Solestan® Creme,” a fake brand name in a sleek blue box designed to look like other expensive brands on the market. Others received “Imotadil-LeniPharma Creme”—another fake, this time housed in a chintzier orange box resembling those typically used for generic drugs. “I put a lot of effort into making the designs convincing,” says study leader Alexandra Tinnermann, a neuroscientist at University Medical Center Hamburg-Eppendorf in Germany. The researchers rubbed one of the two creams on the volunteers’ forearms and waited a few minutes for it to soak in. They told the participants that the cream could cause increased sensitivity to pain—a known side effect of real medications called hyperalgesia. Then the scientists affixed a small device to the volunteers’ arms that delivered a brief flash of heat up to about 45°C (or 113°F). © 2017 American Association for the Advancement of Science.
Keyword: Pain & Touch
Link ID: 24154 - Posted: 10.06.2017
By Clare Wilson OUR braininess may have evolved thanks to gene changes that made our brain cells less sticky. The cortex is the thin, highly folded outer layer of our brains and it is home to some of our most sophisticated mental abilities, such as planning, language and complex thoughts. Around three millimetres thick, this layer is folded into an intricate pattern of ridges and valleys, which allows the cortex to be large, but still fit into a relatively small space. Many larger mammals, such as primates, dolphins and horses, have various patterns of folds in their cortex, but folds are rarer in smaller animals like mice. So far, we have only identified a few genetic mutations that contributed to the evolution of the human brain, including ones that boosted the number of cells in the cortex. One theory about how the cortex came to be folded is that it buckled as the layer of cells expanded. Daniel del Toro at the Max Planck Institute of Neurobiology in Munich, Germany, and colleagues wondered if some of the genetic changes in our brain’s evolution might have been about more than just an increasing number of cells. They investigated the genes for two molecules – FLRT1 and FLRT3 – which make developing brain cells stick to each other more. Human brain cells produce only a small amount of these compounds, while mouse brain cells make lots. Del Toro’s team created mouse embryos that lacked functioning FLRT1 and FLRT3 genes, which meant their cortex cells were only loosely attached to each other, like those of humans. © Copyright New Scientist Ltd.
Keyword: Development of the Brain; Learning & Memory
Link ID: 24153 - Posted: 10.05.2017
By Emma Yasinski Scientists and physicians have tried countless methods to treat the nightmares, anxiety, and flashbacks of posttraumatic stress disorder (PTSD) in soldiers, from talk therapy to drugs designed to press the “delete” button on specific memories. Now, one group of researchers proposes another solution: Prevent the condition in the first place by predicting who is most likely to get it. In a new study, they say a 105-question survey already given to all U.S. soldiers may be able to do just that. “It’s a very important study,” says Sharon Dekel, who studies PTSD at Harvard Medical School in Boston, but was not involved in the new work. Only a minority of people exposed to trauma develop the disorder, and the new work may lead to better screening methods for this “vulnerable population,” she adds. U.S. Army soldiers have taken the Global Assessment Tool (GAT), a survey about their mental health, every 2 years since 2009. The confidential questionnaire asks soldiers to rate their agreement with statements like “My leaders respect and value me,” and “I believe there is a purpose to my life.” It’s meant to help soldiers understand their own strengths and weaknesses. But Yu-Chu Shen, a health economics researcher at the Naval Postgraduate School in Monterey, California, wondered whether the survey could also predict the likelihood of someone developing PTSD or depression. So she and colleagues designed a study to see how soldiers’ GAT scores aligned with later illnesses. They looked at 63,186 recruits who enlisted in the Army between 2009 and 2012 and had not yet been exposed to combat. The team then compared the scores with how the same soldiers fared on a postduty comprehensive health assessment that also looked for signs of PTSD and depression. © 2017 American Association for the Advancement of Science
Keyword: Stress
Link ID: 24152 - Posted: 10.05.2017
By Jessica Hamzelou AT LAST, we’ve seen how the brain files away memories while we sleep. By scanning slumbering people, researchers have watched how the “trace” of a memory moves from one region of the brain to another. “The initial memory trace kind of disappears, and at the same time, another emerges,” says Shahab Vahdat at Stanford University in California. It is the first time memories have been observed being filed away in humans during sleep, he says. Vahdat and his colleagues did this by finding people who were able to fall asleep in the confined, noisy space of an fMRI scanner, which is no easy undertaking. “We screened more than 50 people in a mock scanner, and only 13 made it through to the study,” says Vahdat. The team then taught this group of volunteers to press a set of keys in a specific sequence – in the same way that a pianist might learn to play a tune. It took each person between about 10 and 20 minutes to master a sequence involving five presses. “They had to learn to play it as quickly and as accurately as possible,” says Vahdat. Once they had learned the sequence, each volunteer put on a cap of EEG electrodes to monitor the electrical activity of their brain, and entered an fMRI scanner – which detects which regions of the brain are active. The team saw a specific pattern of brain activity while the volunteers performed the key-pressing task. Once they had stopped, this pattern kept replaying, as if each person was subconsciously revising what they had learned. © Copyright New Scientist Ltd.
Keyword: Sleep; Learning & Memory
Link ID: 24151 - Posted: 10.05.2017
By Simon Makin About 10 years ago David Adam scratched his finger on a barbed wire fence. The cut was shallow, but drew blood. As a science journalist and author of The Man Who Couldn't Stop: OCD and the True Story of a Life Lost in Thought, a book about his own struggles with obsessive-compulsive disorder, Adam had a good idea of what was in store. His OCD involved an obsessive fear of contracting HIV and produced a set of compulsive behaviors revolving around blood. In this instance he hurried home to get some tissue and returned to check there was not already any blood on the barbed wire. “I looked and saw there was no blood on the tissue, looked underneath the fence, saw there was no blood, turned to walk away, and had to do it all again, and again and again,” he says. “You get stuck in this horrific cycle, where all the evidence you use to form judgments in everyday life tells you there’s no blood. And if anyone asked, you’d say ‘no.’ Yet, when you ask yourself, you say ‘maybe.’” Such compulsive behaviors, and the obsessions to which they are typically linked, are what define OCD. Far from merely excessive tidiness, the mental disorder can have a devastating impact on a person’s life. Adam's story illustrates a curious feature of the condition. Sufferers are usually well aware their behavior is irrational but cannot stop themselves from doing whatever it is they feel compelled to do. A new study published September 28 in Neuron uses mathematical modeling of decision-making during a simple game to provide insight into what might be going on. The game looked at a critical aspect of the way we perceive the world. Normally, a person's confidence about their knowledge of the surrounding environment guides their actions. “If I think it’s going to rain, I'm going to take an umbrella,” says lead author Matilde Vaghi. The study shows this link between belief and action is broken to some extent in people with OCD.
As a consequence, what they do conflicts with what they know. This insight suggests compulsive behaviors are a core feature rather than merely a consequence of obsessions or a result of inaccurate beliefs. © 2017 Scientific American
Keyword: OCD - Obsessive Compulsive Disorder
Link ID: 24150 - Posted: 10.05.2017
By Caroline Williams We are used to hearing that meditation is good for the brain, but now it seems that not just any kind of meditation will do. Just like physical exercise, the kind of improvement you get depends on exactly how you train – and most of us are doing it all wrong. That the brain changes physically when we learn a new skill, like juggling or playing a musical instrument, has been known for over a decade. Previous studies had suggested that meditation does something similar for parts of the brain involved in focused attention. Two new studies published in Science Advances suggest that certain kinds of meditation can change social and emotional circuitry, too. The research comes out of the ReSource Project at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, Germany, and looked at the effects of three different meditation techniques on the brains and bodies of more than 300 volunteers over 9 months. One technique was based on mindfulness meditation, and taught people to direct attention to the breath or body. A second type concentrated on compassion and emotional connection via loving-kindness meditations and non-judgmental problem-sharing sessions with a partner. A final method encouraged people to think about issues from different points of view, also via a mix of partnered sessions and solo meditation. In one study, MRI scans taken after each three-month course showed that parts of the cortex involved in the specific skill that was trained grew thicker in comparison with scans from a control group. © Copyright New Scientist Ltd.
Keyword: Stress
Link ID: 24149 - Posted: 10.05.2017
David Dobbs By the time Nev Jones entered DePaul University's esteemed doctoral program in philosophy, she had aced virtually every course she ever took, studied five languages and become proficient in three, and seemed to have read and memorized pretty much everything. Small and slightly built, with a commanding presence that emerged when she talked, she was the sort of student that sharp teachers quickly notice and long remember: intellectually voracious, relentlessly curious, endlessly capable, and, as one of her high school teachers put it, "magnificently intense." Her mind drew on a well-stocked, seemingly flawless memory with a probing, synthesizing intelligence. With astounding frequency she produced what one doctoral classmate called "genius-level reflections." So Jones grew alarmed when, soon after starting at DePaul in the fall of 2007, at age 27, she began having trouble retaining things she had just read. She also struggled to memorize the new characters she was learning in her advanced Chinese class. She had experienced milder versions of these cognitive and memory blips a couple times before, most recently as she’d finished her undergraduate studies earlier that year. These new mental glitches were worse. She would study and draw the new logograms one night, then come up short when she tried to draw them again the next morning. These failures felt vaguely neurological. As if her synapses had clogged. She initially blamed them on the sleepless, near-manic excitement of finally being where she wanted to be. She had wished for exactly this, serious philosophy and nothing but, for half her life. Now her mind seemed to be failing. Words started to look strange. She began experiencing "inarticulable atmospheric changes," as she put it—not hallucinations, really, but alterations of temporality, spatiality, depth perception, kinesthetics. Shimmerings in reality's fabric. Sidewalks would feel soft and porous. 
Audio and visual input would fall out of sync, creating a lag between the movement of a speaker's lips and the words' arrival at Jones' ears. Something was off. © 2017 The Social Justice Foundation
Keyword: Schizophrenia
Link ID: 24148 - Posted: 10.05.2017
Anna Gorman Kerri De Nies received the news this spring from her son's pediatrician: Her chubby-cheeked toddler has a rare brain disorder. She'd never heard of the disease — adrenoleukodystrophy, or ALD — but soon felt devastated and overwhelmed. "I probably read everything you could possibly read online — every single website," De Nies says as she cradles her son, Gregory Mac Phee. "It's definitely hard to think about what could potentially happen. You think about the worst-case scenario." ALD is a genetic brain disorder depicted in the 1992 movie Lorenzo's Oil, which portrayed a couple whose son became debilitated by the disease. The most serious form of the illness typically strikes boys between the ages of 4 and 10. Most are diagnosed too late for treatment to be successful, and they often die before their 10th birthday. The more De Nies learned about ALD, the more she realized how fortunate the family was to have discovered Gregory's condition so early. Her son's blood was tested when he was about 10 months old. Dr. Florian Eichler, a neurologist at Massachusetts General Hospital, says newborn screening is a game changer for children with ALD, because it allows doctors to keep a close eye on kids who test positive for an ALD mutation from the beginning. © 2017 npr
Keyword: Development of the Brain
Link ID: 24147 - Posted: 10.05.2017