Most Recent Links
by Clare Wilson Call them the neuron whisperers. Researchers are eavesdropping on conversations going on between brain cells in a dish. Rather than hearing the chatter, they watch neurons that have been genetically modified so that the electrical impulses moving along their branched tendrils cause sparkles of red light. Filming these cells at up to 100,000 frames a second is allowing researchers to analyse their firing in unprecedented detail. Until recently, a neuron's electrical activity could only be measured with tiny electrodes. As well as being technically difficult, such "patch clamping" only reveals the voltage at those specific points. The new approach makes the neuron's entire surface fluoresce as the impulse passes by. "Now we see the whole thing sweep through," says Adam Cohen of Harvard University. "We get much more information - like how fast and where does it start and what happens at a branch." The idea is a reverse form of optogenetics – where neurons are given a bacterial gene that makes a light-sensitive protein, so the cells fire when illuminated. The new approach uses genes that make the neurons do the opposite - glow when they fire. "It's pretty cool," says Dimitri Kullmann of University College London. "It's amazing that you can dispense with electrodes." Cohen's team is using the technique to compare cells from typical brains with those from people with disorders such as motor neuron disease (amyotrophic lateral sclerosis). Rather than taking a brain sample, they remove some of the person's skin cells and grow them alongside chemicals that rewind the cells into an embryonic-like state. Another set of chemicals is used to turn these stem cells into neurons. "You can recreate something reminiscent of the person's brain in the dish," says Cohen. © Copyright Reed Business Information Ltd.
Keyword: Brain imaging
Link ID: 20241 - Posted: 10.25.2014
By Michael Hedrick I have a hard time making friends. Getting to trust people well enough to call them a friend takes a lot of work. It’s especially hard when you are living with schizophrenia and think everyone is making fun of you. Schizophrenia is the devil on your shoulder that keeps whispering in your ear and, no matter what you try, the little demon won’t stop. He hasn’t stopped in the almost nine years I’ve lived with the illness, and he’s not about to stop now. He’s just quieted down a bit. I’d call him my companion but that would imply a degree of friendship, and there’s no way in hell I’m the little devil’s friend. I have plenty of acquaintances, and a couple hundred “friends” on Facebook. But real friends, mostly family, I can count on one hand. For me, making friends is like climbing a vertical rock wall with no ropes, requiring a degree of thrill-seeking, and a good deal of risk. For someone to be my friend, they have to accept that I’m crazy, and even getting to the point of telling them that is daunting when all you hear is the devil’s whispering that they’re making snap judgments about you or will be going back to their real friends and laughing about you. But interestingly, in my efforts to make friends, coffee shops have helped. The simple routine of going to get your fix of liquid energy every day provides a sort of breeding ground for community. You see these people every day, whether you like it or not and, over time, friendships form. I used to live in a small town called Niwot, about five miles down the highway from Boulder, where I now live. Every morning around 6 I would go to Winot Coffee, the small independent coffee shop, and every morning, without fail, there was a guy my age sitting outside with his computer smoking clove cigarettes. Given the regularity of seeing him every morning, and given that we were some of the only 20-somethings in town, we got to talking. © 2014 The New York Times Company
Link ID: 20240 - Posted: 10.25.2014
by Neurobonkers A paper published in Nature Reviews Neuroscience last week addressed the prevalence of neuromyths among educators. The paper has been widely reported, but the lion's share of the coverage glossed over the impact that neuromyths have had in the real world. Your first thought after reading the neuromyths in the table below — which were widely believed by teachers — may well be, "so what?" It is true that some of the false beliefs are relatively harmless. For example, encouraging children to drink a little more water might perhaps result in the consumption of less sugary drinks. This may do little if anything to reduce hyperactivity but could encourage a more nutritious diet, which might have impacts on problems such as Type II diabetes. So, what's the harm? The paper addressed a number of areas where neuromyths have had real-world impacts on educators and policymakers, which may have negatively affected the provision of education. One graph, reprinted in the Nature Reviews Neuroscience paper, has been included as empirical data in educational policy documents to provide evidence for an "allegedly scientific argument for withdrawing public funding of university education." The problem? The data is made up. The graph is in fact a model that is based on the false assumption that investment before the age of three will have many times the benefit of investment made in education later in life. The myth of three — the belief that there is a critical window to educate children before the age of three, after which point the trajectory is fixed — is one of the most persistent neuromyths. Viewed on another level, while some might say investment in early education can never be a bad thing, consider the implication that a child's potential is fixed at such an early point in life, when in reality the journey has just begun. © Copyright 2014, The Big Think, Inc
By CLIVE THOMPSON “You just crashed a little bit,” Adam Gazzaley said. It was true: I’d slammed my rocket-powered surfboard into an icy riverbank. This was at Gazzaley’s San Francisco lab, in a nook cluttered with multicolored skullcaps and wires that hooked up to an E.E.G. machine. The video game I was playing wasn’t the sort typically pitched at kids or even middle-aged, Gen X gamers. Indeed, its intended users include people over 60 — because the game might just help fend off the mental decline that accompanies aging. It was awfully hard to play, even for my Call of Duty-toughened brain. Project: Evo, as the game is called, was designed to tax several mental abilities at once. As I maneuvered the surfboard down winding river pathways, I was supposed to avoid hitting the sides, which required what Gazzaley said was “visual-motor tracking.” But I also had to watch out for targets: I was tasked with tapping the screen whenever a red fish jumped out of the water. The game increased in difficulty as I improved, making the river twistier and obliging me to remember turns I’d taken. (These were “working-memory challenges.”) Soon the targets became more confusing — I was trying to tap blue birds and green fish, but the game faked me out by mixing in green birds and blue fish. This was testing my “selective attention,” or how quickly I could assess a situation and react to it. The company behind Project: Evo is now seeking approval from the Food and Drug Administration for the game. If it gets that government stamp, it might become a sort of cognitive Lipitor or Viagra, a game that your doctor can prescribe for your aging mind. After only two minutes of play, I was making all manner of mistakes, stabbing frantically at the wrong fish as the game sped up. “It’s hard,” Gazzaley said, smiling broadly as he took back the iPad I was playing on. 
“It’s meant to really push it.” “Brain training” games like Project: Evo have become big business, with Americans spending an estimated $1.3 billion a year on them. They are also a source of controversy. © 2014 The New York Times Company
By Emily Underwood Aging baby boomers and seniors would be better off going for a hike than sitting down in front of one of the many video games designed to aid the brain, a group of nearly 70 researchers asserted this week in a critique of some of the claims made by the brain-training industry. With yearly subscriptions running as much as $120, an expanding panoply of commercial brain games promises to improve memory, processing speed, and problem-solving, and even, in some cases, to stave off Alzheimer’s disease. Many companies, such as Lumosity and Cogmed, describe their games as backed by solid scientific evidence and prominently note that neuroscientists at top universities and research centers helped design the programs. But the cited research is often “only tangentially related to the scientific claims of the company, and to the games they sell,” according to the statement released Monday by the Stanford Center on Longevity in Palo Alto, California, and the Max Planck Institute for Human Development in Berlin. Although the letter, whose signatories include many researchers outside those two organizations, doesn’t point to specific bad actors, it concludes that there is “little evidence that playing brain games improves underlying broad cognitive abilities, or that it enables one to better navigate a complex realm of everyday life.” A similar statement of concern was published in 2008 with a smaller number of signatories, says Ulman Lindenberger of the Max Planck Institute for Human Development, who helped organize both letters. Although Lindenberger says there was no particular trigger for the current statement, he calls it the “expression of a growing collective concern among a large number of cognitive psychologists and neuroscientists who study human cognitive aging.” © 2014 American Association for the Advancement of Science
By J. PEDER ZANE Striking it rich is the American dream, a magnetic myth that has drawn millions to this nation. And yet, a countervailing message has always percolated through the culture: Money can’t buy happiness. From Jay Gatsby and Charles Foster Kane to Tony Soprano and Walter White, the woefully wealthy are among the seminal figures of literature, film and television. A thriving industry of gossipy, star-studded magazines and websites combines these two ideas, extolling the lifestyles of the rich and famous while exposing the sadness of celebrity. All of which raises the question: Is the golden road paved with misery? Yes, in a lot of cases, according to a growing body of research exploring the connection between wealth and happiness. Studies in behavioral economics, cognitive psychology and neuroscience are providing new insights into how a changing American economy and the wiring of the human brain can make life on easy street feel like a slog. Make no mistake, it is better to be rich than poor — psychologically as well as materially. Levels of depression, anxiety and stress diminish as incomes rise. What has puzzled researchers is that the psychological benefits of wealth seem to stop accruing once people reach an income of about $75,000 a year. “The question is, What are the factors that dampen the rewards of income?” said Scott Schieman, a professor of sociology at the University of Toronto. “Why doesn’t earning even more money — beyond a certain level — make us feel even happier and more satisfied?” The main culprit, he said, is the growing demands of work. For millenniums, leisure was wealth’s bedfellow. The rich were different because they worked less. The tables began to turn in America during the 1960s, when inherited privilege gave way to educational credentials and advancement became more closely tied to merit. © 2014 The New York Times Company
Link ID: 20236 - Posted: 10.23.2014
by Helen Thomson For the first time, doctors have opened and closed the brain's protector – the blood-brain barrier – on demand. The breakthrough will allow drugs to reach diseased areas of the brain that are otherwise out of bounds. Ultimately, it could make it easier to treat conditions such as Alzheimer's and brain cancer. The blood-brain barrier (BBB) is a sheath of cells that wraps around blood vessels throughout the brain. It protects precious brain tissue from toxins in the bloodstream, but it is a major obstacle for treating brain disorders because it also blocks the passage of drugs. Several teams have opened the barrier in animals to sneak drugs through. Now Michael Canney at Paris-based medical start-up CarThera, and his colleagues have managed it in people using an ultrasound brain implant and an injection of microbubbles. When ultrasound waves meet microbubbles in the blood, they make the bubbles vibrate. This pushes apart the cells of the BBB. With surgeon Alexandre Carpentier at Pitié-Salpêtrière Hospital in Paris, Canney tested the approach in people with a recurrence of glioblastoma, the most aggressive type of brain tumour. People with this cancer have surgery to remove the tumours and then chemotherapy drugs, such as Carboplatin, are used to try to kill any remaining tumour cells. Tumours make the BBB leaky, allowing in a tiny amount of chemo drugs: if more could get through, their impact would be greater, says Canney. © Copyright Reed Business Information Ltd.
Keyword: Brain imaging
Link ID: 20235 - Posted: 10.23.2014
James Hamblin People whose faces are perceived to look more "competent" are more likely to be CEOs of large, successful companies. Having a face that people deem "dominant" is a predictor of rank advancement in the military. People are more likely to invest money with people who look "trustworthy." These sorts of findings go on and on in recent studies that claim people can accurately guess a variety of personality traits and behavioral tendencies from portraits alone. The findings seem to elucidate either canny human intuition or absurd, misguided bias. There has been a recent boom in research on how people attribute social characteristics to others based on the appearance of faces—independent of cues about age, gender, race, or ethnicity. (At least, as independent as possible.) The results seem to offer some intriguing insight, claiming that people are generally pretty good at predicting who is, for example, trustworthy, competent, introverted or extroverted, based entirely on facial structure. There is strong agreement across studies as to which facial attributes convey which traits. But it's, predictably, not at all so simple. Christopher Olivola, an assistant professor at Carnegie Mellon University, makes the case against face-ism today, in the journal Trends in Cognitive Sciences. In light of many recent articles touting people's judgmental abilities, Olivola and Princeton University's Friederike Funk and Alexander Todorov say that a careful look at the data really doesn't support these claims. And "instead of applauding our ability to make inferences about social characteristics from facial appearances," Olivola said, "the focus should be on the dangers."
By PAUL VITELLO Most adults do not remember anything before the age of 3 or 4, a gap that researchers had chalked up to the vagaries of the still-developing infant brain. By some accounts, the infant brain was just not equipped to remember much. Textbooks referred to the deficiency as infant amnesia. Carolyn Rovee-Collier, a developmental psychologist at Rutgers University who died on Oct. 2 at 72, challenged the theory, showing in a series of papers in the early 1980s that babies remember plenty. A 3-month-old can recall what he or she learned yesterday, she found, and a 9-month-old can remember a game for as long as a month and a half. She cited experiments suggesting that memory processes in adults and infants are virtually the same, and argued that infant memories were never lost. They just become increasingly harder to retrieve as the child grows, learns language and loses touch with the visual triggers that had kept those memories sharp — a view from between the bars of a crib, say, or the view of the floor as a crawler, not a toddler, sees it. Not all of Dr. Rovee-Collier’s theories won over the psychology establishment, which still uses the infant amnesia concept to explain why people do not remember life as a baby. But her insights about an infant’s short-term memory and ability to learn have been widely accepted, and have helped recast scientific thinking about the infant mind over the past 30 years. Since the first of her 200 papers was published, the field of infant cognitive studies has undergone a boom in university programs around the country. It was a field that had been largely unexplored in any systematic way by the giants of psychological theory. Freud and Jean Piaget never directly addressed the subject of infant memory. William James, considered the father of American psychology, once hazarded a guess that the human baby’s mind was a place of “blooming, buzzing confusion.” © 2014 The New York Times Company
David DiSalvo @neuronarrative One of the lively debates spawned from the neuroscience revolution has to do with whether humans possess free will, or merely feel as if we do. If we truly possess free will, then we each consciously control our decisions and actions. If we feel as if we possess free will, then our sense of control is a useful illusion—one that neuroscience will increasingly dispel as it gets better at predicting how brain processes yield decisions. For those in the free-will-as-illusion camp, the subjective experience of decision ownership is not unimportant, but it is predicated on neural dynamics that are scientifically knowable, traceable and—in time—predictable. One piece of evidence supporting this position has come from neuroscience research showing that brain activity underlying a given decision occurs before a person consciously apprehends the decision. In other words, thought patterns leading to conscious awareness of what we’re going to do are already in motion before we know we’ll do it. Without conscious knowledge of why we’re choosing as we’re choosing, the argument follows, we cannot claim to be exercising “free” will. Those supporting a purer view of free will argue that whether or not neuroscience can trace brain activity underlying decisions, making the decision still resides within the domain of an individual’s mind. In this view, parsing unconscious and conscious awareness is less important than the ultimate outcome – a decision, and subsequent action, emerging from a single mind. If free will is drained of its power by scientific determinism, free-will supporters argue, then we’re moving down a dangerous path where people can’t be held accountable for their decisions, since those decisions are triggered by neural activity occurring outside of conscious awareness. Consider how this might play out in a courtroom in which neuroscience evidence is marshalled to defend a murderer on grounds that he couldn’t know why he acted as he did.
Link ID: 20232 - Posted: 10.23.2014
Carl Zimmer Scientists have reconstructed the genome of a man who lived 45,000 years ago, by far the oldest genetic record ever obtained from modern humans. The research, published on Wednesday in the journal Nature, provided new clues to the expansion of modern humans from Africa about 60,000 years ago, when they moved into Europe and Asia. And the genome, extracted from a fossil thighbone found in Siberia, added strong support to a provocative hypothesis: Early humans interbred with Neanderthals. “It’s irreplaceable evidence of what once existed that we can’t reconstruct from what people are now,” said John Hawks, a paleoanthropologist at the University of Wisconsin who was not involved in the study. “It speaks to us with information about a time that’s lost to us.” The discoveries were made by a team of scientists led by Svante Paabo, a geneticist at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. Over the past three decades, Dr. Paabo and his colleagues have developed tools for plucking out fragments of DNA from fossils and reading their sequences. Early on, the scientists were able only to retrieve tiny snippets of ancient genes. But gradually, they have invented better methods for joining the overlapping fragments together, assembling larger pieces of ancient genomes that have helped shed light on the evolution of humans and their relatives. In December, they published the entirety of a Neanderthal genome extracted from a single toe bone. Comparing Neanderthal to human genomes, Dr. Paabo and his colleagues found that we share a common ancestor, which they estimated lived about 600,000 years ago. © 2014 The New York Times Company
Link ID: 20231 - Posted: 10.23.2014
By BENEDICT CAREY A Polish man who was paralyzed from the chest down after a knife attack several years ago is now able to get around using a walker and has recovered some sensation in his legs after receiving a novel nerve-regeneration treatment, according to a new report that has generated both hope and controversy. The case, first reported widely by the BBC and other British news outlets, has stirred as much excitement on the Internet as it has extreme caution among many experts. “It is premature at best, and at worst inappropriate, to draw any conclusions from a single patient,” said Dr. Mark H. Tuszynski, director of the translational neuroscience unit at the medical school of the University of California, San Diego. That patient — identified as Darek Fidyka, 40 — is the first to recover feeling and mobility after getting the novel therapy, which involves injections of cultured cells at the site of the injury and tissue grafts, the report said. The techniques have shown some promise in animal studies. But the medical team, led by Polish and English doctors, also emphasized that the results would “have to be confirmed in a larger group of patients sustaining similar types of spinal injury” before the treatment could be considered truly effective. The case report was published in the journal Cell Transplantation. The history of spinal injury treatment is studded with false hope and miracle recoveries that could never be replicated, experts said. In previous studies, scientists experimented with some of the same methods used on Mr. Fidyka, with disappointing results. © 2014 The New York Times Company
By Fergus Walsh Medical correspondent A paralysed man has been able to walk again after a pioneering therapy that involved transplanting cells from his nasal cavity into his spinal cord. Darek Fidyka, who was paralysed from the chest down in a knife attack in 2010, can now walk using a frame. The treatment, a world first, was carried out by surgeons in Poland in collaboration with scientists in London. Details of the research are published in the journal Cell Transplantation. BBC One's Panorama programme had unique access to the project and spent a year charting the patient's rehabilitation. Darek Fidyka, 40, from Poland, was paralysed after being stabbed repeatedly in the back in the 2010 attack. He said walking again - with the support of a frame - was "an incredible feeling", adding: "When you can't feel almost half your body, you are helpless, but when it starts coming back it's like you were born again." He said what had been achieved was "more impressive than man walking on the moon". The treatment used olfactory ensheathing cells (OECs) - specialist cells that form part of the sense of smell. OECs act as pathway cells that enable nerve fibres in the olfactory system to be continually renewed. In the first of two operations, surgeons removed one of the patient's olfactory bulbs and grew the cells in culture. Two weeks later they transplanted the OECs into the spinal cord, which had been cut through in the knife attack apart from a thin strip of scar tissue on the right. They had just a drop of material to work with - about 500,000 cells. About 100 micro-injections of OECs were made above and below the injury. BBC © 2014
By Scott Barry Kaufman “Just because a diagnosis [of ADHD] can be made does not take away from the great traits we love about Calvin and his imaginary tiger friend, Hobbes. In fact, we actually love Calvin BECAUSE of his ADHD traits. Calvin’s imagination, creativity, energy, lack of attention, and view of the world are the gifts that Mr. Watterson gave to this character.” — The Dragonfly Forest In his 2004 book “Creativity is Forever”, Gary Davis reviewed the creativity literature from 1961 to 2003 and identified 22 recurring personality traits of creative people. This included 16 “positive” traits (e.g., independent, risk-taking, high energy, curiosity, humor, artistic, emotional) and 6 “negative” traits (e.g., impulsive, hyperactive, argumentative). In her own review of the creativity literature, Bonnie Cramond found that many of these same traits overlap to a substantial degree with behavioral descriptions of Attention Deficit Hyperactive Disorder (ADHD) – including higher levels of spontaneous idea generation, mind wandering, daydreaming, sensation seeking, energy, and impulsivity. Research since then has supported the notion that people with ADHD are more likely to reach higher levels of creative thought and achievement than those without ADHD. What’s more, recent research by Darya Zabelina and colleagues has found that real-life creative achievement is associated with the ability to broaden attention and have a “leaky” mental filter – something in which people with ADHD excel. Recent work in cognitive neuroscience also suggests a connection between ADHD and creativity. Both creative thinkers and people with ADHD show difficulty suppressing brain activity coming from the “Imagination Network”. © 2014 Scientific American
By Paula Span Maybe it’s something else. That’s what you tell yourself, isn’t it, when an older person begins to lose her memory, repeat herself, see things that aren’t there, lose her way on streets she’s traveled for decades? Maybe it’s not dementia. And sometimes, thankfully, it is indeed some other problem, something that mimics the cognitive destruction of Alzheimer’s disease or another dementia — but, unlike them, is fixable. “It probably happens more often than people realize,” said Dr. P. Murali Doraiswamy, a neuroscientist at Duke University Medical Center. But, he added, it doesn’t happen nearly as often as family members hope. Several confounding cases have appeared at Duke: A woman who appeared to have Alzheimer’s actually was suffering the effects of alcoholism. Another patient’s symptoms resulted not from dementia but from chronic depression. Dr. Doraiswamy estimates that when doctors suspect Alzheimer’s, they’re right 50 to 60 percent of the time. (The accuracy of Alzheimer’s diagnoses, even in specialized medical centers, is more haphazard than you would hope.) Perhaps another 25 percent of patients actually have other types of dementia, like Lewy body or frontotemporal — scarcely happy news, but because these diseases have different trajectories and can be exacerbated by the wrong drugs, the distinction matters. The remaining 15 to 25 percent “usually have conditions that can be reversed or at least improved,” Dr. Doraiswamy said. © 2014 The New York Times Company
Link ID: 20227 - Posted: 10.22.2014
BY Tina Hesman Saey SAN DIEGO — A golden retriever that inherited a genetic defect that causes muscular dystrophy doesn’t have the disease, giving scientists clues to new therapies for treating muscle-wasting diseases. The dog, Ringo, was bred to have a mutation that causes Duchenne muscular dystrophy in both animals and people. His weak littermates that inherited the same mutation could barely suckle at birth. But Ringo was healthy, with muscles that function normally. One of Ringo’s sons also has the mutation but doesn’t have the disease, said geneticist Natassia Vieira of Boston Children’s Hospital and Harvard University October 19 at the annual meeting of the American Society of Human Genetics. The dogs without the disease had a second genetic variant that caused their muscles to make more of a protein called Jagged 1, Vieira and her colleagues discovered. That protein allows muscles to repair themselves. Making more of Jagged 1 appears to compensate for the wasting effect of the muscular dystrophy mutation, although the researchers don’t yet know the exact mechanism. The finding suggests that researchers may one day be able to devise treatments for people with muscular dystrophies by boosting production of Jagged 1 or other muscle repair proteins. N. M. Vieira. The muscular dystrophies: Revealing the genetic and phenotypic variability. American Society of Human Genetics Annual Meeting, San Diego, October 19, 2014. © Society for Science & the Public 2000 - 2014
Scientists say they have identified the underlying reason why some people are prone to the winter blues, or seasonal affective disorder (SAD). People with SAD have an unhelpful way of controlling the "happy" brain signalling compound serotonin during winter months, brain scans reveal. As the nights draw in, production of a transporter protein ramps up in SAD, lowering available serotonin. The work will be presented this week at a neuropsychopharmacology conference. The University of Copenhagen researchers who carried out the trial say their findings confirm what others have suspected - although they only studied 11 people with SAD and 23 healthy volunteers for comparison. Using positron emission tomography (PET) brain scans, they were able to show significant summer-to-winter differences in the levels of the serotonin transporter (SERT) protein in SAD patients. The SAD volunteers had higher levels of SERT in the winter months, corresponding to a greater removal of serotonin in winter, while the healthy volunteers did not. Lead researcher, Dr Brenda Mc Mahon, said: "We believe that we have found the dial the brain turns when it has to adjust serotonin to the changing seasons." BBC © 2014
By Jane E. Brody Within a week of my grandsons’ first year in high school, getting enough sleep had already become an issue. Their concerned mother questioned whether lights out at midnight or 1 a.m. and awakening at 7 or 7:30 a.m. to get to school on time provided enough sleep for 14-year-olds to navigate a demanding school day. The boys, of course, said “yes,” especially since they could “catch up” by sleeping late on weekends. But the professional literature on the sleep needs of adolescents says otherwise. Few Americans these days get the hours of sleep optimal for their age, but experts agree that teenagers are more likely to fall short than anyone else. Researchers report that the average adolescent needs eight and a half to nine and a half hours of sleep each night. But in a poll taken in 2006 by the National Sleep Foundation, less than 20 percent reported getting that much rest on school nights. With the profusion of personal electronics, the current percentage is believed to be even worse. A study in Fairfax, Va., found that only 6 percent of children in the 10th grade and only 3 percent in the 12th grade get the recommended amount of sleep. Two in three teens were found to be severely sleep-deprived, losing two or more hours of sleep every night. The causes can be biological, behavioral or environmental. And the effect on the well-being of adolescents — on their health and academic potential — can be profound, according to a policy statement issued in August by the American Academy of Pediatrics. “Sleep is not optional. It’s a health imperative, like eating, breathing and physical activity,” Dr. Judith A. Owens, the statement’s lead author, said in an interview. “This is a huge issue for adolescents.” © 2014 The New York Times Company
by Amy Standen The important thing is that Meghan knew something was wrong. When I met her, she was 23, a smart, wry young woman living with her mother and stepdad in Simi Valley, about an hour north of Los Angeles. Meghan had just started a training program to become a respiratory therapist. Concerned about future job prospects, she asked NPR not to use her full name. Five years ago, Meghan's prospects weren't nearly so bright. At 19, she had been severely depressed, on and off, for years. During the bad times, she'd hide out in her room making thin, neat cuts with a razor on her upper arm. "I didn't do much of anything," Meghan recalls. "It required too much brain power." "Her depression just sucked the life out of you," Kathy, Meghan's mother, recalls. "I had no idea what to do or where to go with it." One night in 2010, Meghan's mental state took an ominous turn. Driving home from her job at McDonald's, she found herself fascinated by the headlights of an oncoming car. "I had the weird thought of, you know, I've never noticed this, but their headlights really look like eyes." To Meghan, the car seemed malicious. It wanted to hurt her. Kathy tried to reason with her. "Honey, you know it's a car, right? You know those are headlights," she recalls pressing her daughter. "You understand that this makes no sense, right?" © 2014 NPR
Link ID: 20223 - Posted: 10.21.2014
By Catherine Saint Louis KATY, Tex. — Like many parents of children with autism, Nicole Brown feared she might never find a dentist willing and able to care for her daughter, Camryn Cunningham, now a lanky 13-year-old who uses words sparingly. Finishing a basic cleaning was a colossal challenge, because Camryn was bewildered by the lights in her face and the odd noises from instruments like the saliva suctioner — not to mention how utterly unfamiliar everything was to a girl accustomed to routine. Sometimes she’d panic and bolt from the office. Then in May, Ms. Brown, 45, a juvenile supervision officer, found Dr. Amy Luedemann-Lazar, a pediatric dentist in this suburb of Houston. Unlike previous dentists, Dr. Luedemann-Lazar didn’t suggest that Camryn would need to be sedated or immobilized. Instead, she suggested weekly visits to help her learn to be cooperative, step by step, with lots of breaks so she wouldn’t be overwhelmed. Bribery helped. If she sat calmly for 10 seconds, her reward was listening to a snippet of a Beyoncé song on her sister’s iPod. This month, Camryn sat still in the chair, hands crossed on her lap, for no less than 25 minutes through an entire cleaning — her second ever — even as purple-gloved hands hovered near her face, holding a noisy tooth polisher. At the end, Dr. Luedemann-Lazar examined Camryn’s teeth and declared her cavity-free and ready to see an orthodontist. “It was like a breakthrough,” Ms. Brown said, adding, “Dr. Amy didn’t just turn her away.” Parents of children with special needs have long struggled to find dentists who will treat them. In a 2005 study, nearly three-fifths of 208 randomly chosen general dentists in Michigan said they would not provide care for children on the autism spectrum; two-thirds said the same for adults. 
But as more and more children receive diagnoses of autism spectrum disorder, more dentists and dental hygienists are recognizing that with accommodations, many of them can become cooperative patients. © 2014 The New York Times Company
Link ID: 20222 - Posted: 10.21.2014