Most Recent Links



A single variation in the gene for brain-derived neurotrophic factor (BDNF) may influence obesity in children and adults, according to a new study funded by the National Institutes of Health. The study suggests that a less common version of the BDNF gene may predispose people to obesity by producing lower levels of BDNF protein, a regulator of appetite, in the brain. The authors propose that boosting BDNF protein levels may offer a therapeutic strategy for people with the genetic variation, which tends to occur more frequently in African Americans and Hispanics than in non-Hispanic Caucasians. The study is published in the journal Cell Reports. Obesity in children and adults is a serious issue in the United States, contributing to health conditions such as heart disease, stroke and type 2 diabetes. Importantly, genetic factors can predispose a person to obesity, as well as influence the effectiveness of weight-loss strategies. The body relies on cells to process and store energy, and changes in genes that regulate these functions can cause an imbalance that leads to excessive energy storage and weight gain. “The BDNF gene has previously been linked to obesity, and scientists have been working for several years to understand how changes in this particular gene may predispose people to obesity,” said Jack A. Yanovski, M.D., Ph.D., one of the study authors and an investigator at NIH’s Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD). “This study explains how a single genetic change in BDNF influences obesity and may affect BDNF protein levels. Finding people with specific causes of obesity may allow us to evaluate effective, more-personalized treatments.”

Keyword: Obesity; Genes & Behavior
Link ID: 21585 - Posted: 10.31.2015

Heidi Ledford An analysis of 53 weight-loss studies that included more than 68,000 people has concluded that, despite their popularity, low-fat diets are no more effective than higher-fat diets for long-term weight loss. And overall, neither type of diet works particularly well. A year after their diets started, participants in the 53 studies were, on average, only about 5 kilograms (11 pounds) lighter. “That’s not that impressive,” says Kevin Hall, a physiologist at the US National Institute of Diabetes and Digestive and Kidney Diseases in Bethesda, Maryland. “All of these prescriptions for dieting seem to be relatively ineffective in the long term.” The study, published in The Lancet Diabetes & Endocrinology, runs counter to decades' worth of medical advice and adds to a growing consensus that the widespread push for low-fat diets was misguided. Nature looks at why low-fat diets were so popular and what diet doctors might prescribe next. Are the new findings a surprise? The advantages of low-fat diets have long been in question. “For decades we’ve been touting low-fat diets as the way to lose weight, but obesity has gone up,” says Deirdre Tobias, lead author of the study and an epidemiologist at Brigham and Women’s Hospital in Boston, Massachusetts. “It seemed evident that low-fat diets may not be the way to go.” © 2015 Nature Publishing Group

Keyword: Obesity
Link ID: 21584 - Posted: 10.31.2015

By KATHARINE Q. SEELYE NEWTON, N.H. — When Courtney Griffin was using heroin, she lied, disappeared, and stole from her parents to support her $400-a-day habit. Her family paid her debts, never filed a police report and kept her addiction secret — until she was found dead last year of an overdose. At Courtney’s funeral, they decided to acknowledge the reality that redefined their lives: Their bright, beautiful daughter, just 20, who played the French horn in high school and dreamed of living in Hawaii, had been kicked out of the Marines for drugs. Eventually, she overdosed at her boyfriend’s grandmother’s house, where she died alone. “When I was a kid, junkies were the worst,” Doug Griffin, 63, Courtney’s father, recalled in their comfortable home here in southeastern New Hampshire. “I used to have an office in New York City. I saw them.” When the nation’s long-running war against drugs was defined by the crack epidemic and based in poor, predominantly black urban areas, the public response was defined by zero tolerance and stiff prison sentences. But today’s heroin crisis is different. While heroin use has climbed among all demographic groups, it has skyrocketed among whites; nearly 90 percent of those who tried heroin for the first time in the last decade were white. And the growing army of families of those lost to heroin — many of them in the suburbs and small towns — are now using their influence, anger and grief to cushion the country’s approach to drugs, from altering the language around addiction to prodding government to treat it not as a crime, but as a disease. © 2015 The New York Times Company

Keyword: Drug Abuse
Link ID: 21583 - Posted: 10.31.2015

By Diana Kwon Six years before her husband was diagnosed with Parkinson’s disease, a progressive neurodegenerative disorder marked by tremors and movement difficulties, Joy Milne detected a change in his scent. She later linked the subtle, musky odor to the disease when she joined the charity Parkinson’s UK and met others with the same, distinct smell. One of the most common age-related disorders, Parkinson’s affects an estimated seven million to 10 million people worldwide. Although there is currently no definitive diagnostic test, researchers hope that this newly found olfactory signature will help create one. Milne, a super-smeller from Perth, Scotland, wanted to share her ability with researchers. So when Tilo Kunath, a neuroscientist at the University of Edinburgh, gave a talk during a Parkinson’s UK event in 2012, she raised her hand during the Q&A session and claimed she was able to smell the disease. “I didn’t take her seriously at first,” Kunath says. “I said, ‘No, I never heard of that, next question please.’” But months later Kunath shared this anecdote with a colleague and received a surprising response. “She told me that that lady wasn’t wrong and that I should find her,” Kunath says. Once the researchers found Milne, they tested her claim by having her sniff 12 T-shirts: six that belonged to people with Parkinson’s and six from healthy individuals. Milne correctly identified 11 out of 12, miscategorizing only one of the non-Parkinson’s T-shirts as belonging to the disease group. It turned out, however, that she was not wrong at all—that person would be diagnosed with Parkinson’s less than a year later. © 2015 Scientific American

Keyword: Parkinsons; Chemical Senses (Smell & Taste)
Link ID: 21582 - Posted: 10.31.2015

Susan Milius Electric eels are even more shocking than biologists thought. When prey fights back, eels just — curl their tails. Muscle has evolved “into a battery” independently in two groups of fishes, explains Kenneth Catania of Vanderbilt University in Nashville. Smaller species send out slight tingles of electric current that detect the fish’s surroundings in murky nighttime water. People can handle these small fishes and not feel even a tickle. But touching the bigger Electrophorus electricus (a member of a South American group of battery-included fishes) “is reminiscent of walking into an electric fence on a farm,” Catania says. (He knows, unintentionally, from experience.) The modified muscle that works as an electricity-generating organ in the eel has just on/off power. But eels have a unique way of intensifying the effect, Catania reports October 28 in Current Biology. Catania has tussled with eels using what he calls his electric eel chew toy — a dead fish on a stick with electrodes inside the carcass to measure current. When fighting difficult prey like the recalcitrant toy, eels curl their tails toward the fish struggling in their jaws. This bend puts the electrically negative tail-end of the long battery organ closer to the electrically positive front end, effectively concentrating the electric field on the prey. An eel’s tail curl can double the strength of the electric field convulsing the prey. © Society for Science & the Public 2000 - 2015.

Keyword: Animal Communication; Aggression
Link ID: 21581 - Posted: 10.29.2015

By Diana Kwon In the human form of mad cow disease, called Creutzfeldt-Jakob, a person's brain deteriorates—literally developing holes that cause rapidly progressing dementia. The condition is fatal within one year in 90 percent of cases. The culprits behind the disease are prions—misfolded proteins that can induce normal proteins around them to also misfold and accumulate. Scientists have known that these self-propagating, pathological proteins cause some rare brain disorders, such as kuru in Papua New Guinea. But growing evidence suggests that prions are at play in many, if not all, neurodegenerative disorders, including Alzheimer's, Huntington's and Parkinson's, also marked by aggregations of malformed proteins. Until recently, there was no evidence that the abnormal proteins found in people who suffer from these well-known diseases could be transmitted directly from person to person. The tenor of that discussion suddenly changed this September when newly published research in the journal Nature provided the first hint such human-to-human transmission may be possible. (Scientific American is part of Springer Nature.) For the study, John Collinge, a neurologist at University College London, and his colleagues conducted autopsies on eight patients who died between the ages of 36 and 51 from Creutzfeldt-Jakob. All the subjects had acquired the disease after treatment with growth hormone later found to be contaminated with prions. The surprise came when the researchers discovered that six of the brains also bore telltale signs of Alzheimer's—in the form of clumps of beta-amyloid proteins, diagnostic for the disease—even though the patients should have been too young to exhibit such symptoms. © 2015 Scientific American

Keyword: Prions; Alzheimers
Link ID: 21580 - Posted: 10.29.2015

By Christian Jarrett Neuroscientists, for obvious reasons, are really interested in finding out what’s different about the brains of people with unpleasant personalities, such as narcissists, or unsavory habits, like porn addiction. Their hope is that by studying these people’s brains we might learn more about the causes of bad character, and ways to helpfully intervene. Now to the list of character flaws that've received the brain-scanner treatment we can apparently add sexism — a new Japanese study published in Scientific Reports claims to have found its neurological imprint. The researchers wanted to know whether there is something different about certain individuals’ brains that potentially predisposes them to sexist beliefs and attitudes (of course, as with so much neuroscience research like this, it’s very hard to disentangle whether any observed brain differences are the cause or consequence of the trait or behavior that’s being studied, a point I’ll come back to). More specifically, they were looking to see if people who publicly endorse gender inequality have brains that are anatomically different from people who believe in gender equality. In short, it seems the answer is yes. Neuroscientist Hikaru Takeuchi at Tohoku University and his colleagues have identified two brain areas where people who hold sexist attitudes have different levels of gray-matter density (basically, a measure of how many brain cells are packed into a given area), as compared with people who profess a belief in gender equality (their study doesn’t speak to any subconsciously held sexist beliefs). What’s more, these neural differences were correlated with psychological characteristics that could help explain some people’s sexist beliefs. © 2015, New York Media LLC.

Keyword: Attention; Emotions
Link ID: 21579 - Posted: 10.29.2015

By Nicholas Bakalar Certain personality traits are often attributed to oldest, middle and youngest children. But a new study found that birth order itself had no effect on character, though it may slightly affect intelligence. Researchers analyzed three large ongoing collections of data including more than 20,000 people: a British study that follows the lives of people who were born in one particular week in 1958, a German study of private households started in 1984 and a continuing study of Americans born between 1980 and 1984. They searched for differences in extroversion, emotional stability, agreeableness, conscientiousness, self-reported intellect, IQ, imagination and openness to experience. They analyzed families with sisters and brothers, large and small age gaps and different numbers of siblings. They even looked to see if being a middle child correlated with any particular trait. But no matter how they sliced the data, they could find no association of birth order with any personality characteristic. The study, in Proceedings of the National Academy of Sciences, did find evidence that older children have a slight advantage in IQ scores, but the difference was apparent only in a large sample, with little significance for any individual. The lead author, Julia M. Rohrer, a graduate student at the University of Leipzig, said that birth order can have an effect — if your older brother bullied you, for example. “But these effects are highly idiosyncratic,” she said. “There is no such thing as a typical older, middle or younger sibling. It’s important to stop believing that you are the way you are because of birth order.” © 2015 The New York Times Company

Keyword: Development of the Brain; Intelligence
Link ID: 21578 - Posted: 10.29.2015

A drug for Alzheimer’s seems to delay the point at which a person with the condition needs to be moved into a nursing home. Donepezil is usually given to people with moderate forms of the disease, but continuing to take the drug once the disease becomes more severe seems to prolong the period of time a person can remain in their own home. Previously, the drug was not thought to benefit people once they had developed more severe forms of Alzheimer’s. But a study that followed 295 people with moderate to severe Alzheimer’s disease found that those who continued to take donepezil were nearly half as likely to end up in a care home within the next year. “It could mean thousands of patients per year not going into care homes,” says Robert Howard of University College London, who led the study. His team found that those who continued to take donepezil had a 20 per cent chance of being moved into a care home within the first year of the trial, compared to 37 per cent in those who stopped taking the drug. However, the effect didn’t last. The trial lasted for three years, and after the first year, those taking donepezil were just as likely to be moved into a home as those who weren’t, suggesting that the drug does not have a longer-term effect on the care needs of those with Alzheimer’s. “For every six patients treated with donepezil for 12 months, you would prevent one moving into a nursing home,” says Howard. “It’s a modest effect, but it’s important if it’s your mother or your wife.” © Copyright Reed Business Information Ltd.

Keyword: Alzheimers
Link ID: 21577 - Posted: 10.28.2015

Looking at brain tissue from mice, monkeys and humans, scientists have found that a molecule known as growth and differentiation factor 10 (GDF10) is a key player in repair mechanisms following stroke. The findings suggest that GDF10 may be a potential therapy for recovery after stroke. The study, published in Nature Neuroscience, was supported by the National Institute of Neurological Disorders and Stroke (NINDS), part of the National Institutes of Health. “These findings help to elucidate the mechanisms of repair following stroke. Identifying this key protein further advances our knowledge of how the brain heals itself from the devastating effects of stroke, and may help to develop new therapeutic strategies to promote recovery,” said Francesca Bosetti, Ph.D., stroke program director at NINDS. Stroke can occur when a brain blood vessel becomes blocked, preventing nearby tissue from getting essential nutrients. When brain tissue is deprived of oxygen and nutrients, it begins to die. Once this occurs, repair mechanisms, such as axonal sprouting, are activated as the brain attempts to overcome the damage. During axonal sprouting, healthy neurons send out new projections (“sprouts”) that re-establish some of the connections lost or damaged during the stroke and form new ones, resulting in partial recovery. Before this study, it was unknown what triggered axonal sprouting. Previous studies suggested that GDF10 was involved in the early stages of axonal sprouting, but its exact role in the process was unclear. S. Thomas Carmichael, M.D., Ph.D., and his colleagues at the David Geffen School of Medicine at the University of California Los Angeles took a closer look at GDF10 to identify how it may contribute to axonal sprouting.

Keyword: Stroke; Regeneration
Link ID: 21576 - Posted: 10.28.2015

by Helen Thompson Five, six, seven, eight! All together now, let's spread those jazz hands and get moving, because synchronized dancing improves our tolerance of pain and helps us bond as humans, researchers suggest October 28 in Biology Letters. A team of psychologists at the University of Oxford taught high school students varied dance routines — each requiring different levels of exertion and synchronized movement — and then tested their pain tolerance with the sharp squeeze of a blood pressure cuff. Statistically, routines with more coordinated choreography and full body movement produced higher pain thresholds and sunny attitudes toward others in the group. Coordinated dancing with a group and exerting more energy may independently promote the release of pain-blocking endorphins as well as increase social bonding, the team writes. © Society for Science & the Public 2000 - 2015

Keyword: Pain & Touch
Link ID: 21575 - Posted: 10.28.2015

When we hear speech, electrical waves in our brain synchronise to the rhythm of the syllables, helping us to understand what’s being said. This happens when we listen to music too, and now we know some brains are better at syncing to the beat than others. Keith Doelling at New York University and his team recorded the brain waves of musicians and non-musicians while they listened to music, and found that both groups synchronised two types of low-frequency brain waves, known as delta and theta, to the rhythm of the music. Synchronising our brain waves to music helps us decode it, says Doelling. The electrical waves collect the information from continuous music and break it into smaller chunks that we can process. But for particularly slow music, the non-musicians were less able to synchronise, with some volunteers saying they couldn’t keep track of these slower rhythms. Rather than natural talent, Doelling thinks musicians are more comfortable with slower tempos because of their musical training. As part of his own musical education, he remembers being taught to break down tempo into smaller subdivisions. He suggests that grouping shorter beats together in this way is what helps musicians to process slow music better. One theory is that musicians have heard and played much more music, allowing them to acquire “meta-knowledge”, such as a better understanding of how composers structure pieces. This could help them detect a broader range of tempos, says Usha Goswami of the University of Cambridge. © Copyright Reed Business Information Ltd.

Keyword: Hearing; Learning & Memory
Link ID: 21574 - Posted: 10.27.2015

By Jan Hoffman As the first semester of the school year reaches the halfway mark, countless college freshmen are becoming aware that their clothes are feeling rather snug. While the so-called freshman 15 may be hyperbole, studies confirm that many students do put on five to 10 pounds during that first year away from home. Now new research suggests that an underlying cause for the weight gain may be the students’ widely vacillating patterns of sleep. A study in the journal Behavioral Sleep Medicine looked at the sleep habits of first-semester freshmen. Researchers followed 132 first-year students at Brown University who kept daily sleep diaries. After nine weeks, more than half of them had gained nearly six pounds. There are many poor sleep habits that might have exacerbated their weight gains, a growing body of research indicates. Was it abbreviated sleep? Optimally, experts say, teenagers need about nine hours and 15 minutes a night. These freshmen averaged about seven hours and 15 minutes. In a study earlier this year, in the journal PLOS One, researchers found that when teenagers are sleep-deprived, they more readily reach for candy and desserts. Or were the Brown students’ late bedtimes the scale-tipping factor? On average, they went to bed around 1:30 a.m. A study this month in the journal Sleep that followed teenagers into adulthood found that each hour that bedtime was pushed later during the school or workweek was associated with about a two-point increase in body mass index. While both the amount of sleep and the lateness of bedtime may have played a role, the researchers in the Brown study identified a new sleep factor for predicting weight gain: variability, or the extent to which a student’s bedtime and waking time changed daily. © 2015 The New York Times Company

Keyword: Sleep; Obesity
Link ID: 21573 - Posted: 10.27.2015

By Jessica Schmerler Young brains are plastic, meaning their circuitry can be easily rewired to promote learning. By adulthood, however, the brain has lost much of its plasticity and can no longer readily recover lost function after, say, a stroke. Now scientists have successfully restored full youthful plasticity in adult mice by transplanting young neurons into their brain—curing their severe visual impairments in the process. In a groundbreaking study published in May in Neuron, a team of neuroscientists led by Sunil Gandhi of the University of California, Irvine, transplanted embryonic mouse stem cells into the brains of other mice. The cells were primed to become inhibitory neurons, which tamp down brain activity. Prior to this study, “it was widely doubted that the adult brain would allow these cells to disperse, integrate and reactivate plasticity,” says Melissa Davis, first author of the study. Scientists have been attempting such a feat for years, refining their methods along the way, and the Irvine team finally saw success: the cells were integrated in the brain and caused large-scale rewiring, restoring the high-level plasticity of early development. In visually impaired mice, the transplant allowed for the restoration of normal vision, as demonstrated by tests of visual nerve signals and a swimming maze test. The scientists have not yet tested the transplanting technique for other neurological disorders, but they believe the technique has potential for many conditions and injuries depending on how, exactly, the new neurons restore plasticity. It is not yet known whether the proliferation of the transplanted cells accounts for the restored plasticity or if the new cells trigger plasticity in existing neurons. If the latter, the treatment could spur the rewiring and healing of the brain following traumatic brain injury or stroke. © 2015 Scientific American

Keyword: Vision; Stem Cells
Link ID: 21572 - Posted: 10.27.2015

By GINA KOLATA Three diseases, leading killers of Americans, often involve long periods of decline before death. Two of them — heart disease and cancer — usually require expensive drugs, surgeries and hospitalizations. The third, dementia, has no effective treatments to slow its course. So when a group of researchers asked which of these diseases involved the greatest health care costs in the last five years of life, the answer they found might seem surprising. The most expensive, by far, was dementia. The study looked at patients on Medicare. The average total cost of care for a person with dementia over those five years was $287,038. For a patient who died of heart disease it was $175,136. For a cancer patient it was $173,383. Medicare paid almost the same amount for patients with each of those diseases — close to $100,000 — but dementia patients had many more expenses that were not covered. On average, the out-of-pocket cost for a patient with dementia was $61,522 — more than 80 percent higher than the cost for someone with heart disease or cancer. The reason is that dementia patients need caregivers to watch them, help with basic activities like eating, dressing and bathing, and provide constant supervision to make sure they do not wander off or harm themselves. None of those costs were covered by Medicare. For many families, the cost of caring for a dementia patient often “consumed almost their entire household wealth,” said Dr. Amy S. Kelley, a geriatrician at Icahn School of Medicine at Mt. Sinai in New York and the lead author of the paper published on Monday in the Annals of Internal Medicine. © 2015 The New York Times Company

Keyword: Alzheimers
Link ID: 21571 - Posted: 10.27.2015

By Dina Fine Maron When powerful street drugs collectively known as synthetic pot are smoked, the resulting high mimics the effects of marijuana. Yet these man-made cannabinoids are not marijuana at all. The drugs, more commonly called spice, fake weed or K2, are made up of any number of dried, shredded plants sprayed with chemicals that live in a murky legality zone. They are highly dangerous—and their use is on the rise. Synthetic pot, which first hit the market in the early 2000s, has especially caught the attention of public health officials in the past couple of years, stemming from a surge in hospitalizations and violent episodes. Although the drugs act on the same brain pathway as weed's active ingredient, they can trigger harsher reactions, including heart attacks, strokes, kidney damage and delusions. Between June and early August usage of these drugs led to roughly 2,300 emergency room visits in New York State alone. Nationwide more than 6,000 incidents involving spice have been reported to U.S. poison-control centers this year—about double the number of calls in 2013. Ever changing recipes make it possible for spice sellers to elude the authorities. Each time an ingredient is banned, producers swap in another compound. The drugs are then sold on the Internet or at gas stations and convenience stores at prices lower than genuine marijuana. The changing formulations also pose a challenge for researchers trying to match the chemicals with their side effects or to develop tests to identify them in a user's system. “The drugs are present in blood for only a short period, so it's very difficult to detect them,” says Marilyn Huestis, chief of the Chemistry and Drug Metabolism Section at the National Institute on Drug Abuse. © 2015 Scientific American

Keyword: Drug Abuse
Link ID: 21570 - Posted: 10.27.2015

Bret Stetka Sometime around 1907, well before the modern randomized clinical trial was routine, American psychiatrist Henry Cotton began removing decaying teeth from his patients in hopes of curing their mental disorders. If that didn't work he moved on to more invasive excisions: tonsils, testicles, ovaries and, in some cases, colons. Cotton was the newly appointed director of the New Jersey State Hospital for the Insane and was acting on a theory proposed by influential Johns Hopkins psychiatrist Adolph Meyer, under whom Cotton had studied, that psychiatric illness is the result of chronic infection. Meyer's idea was based on observations that patients with high fevers sometimes experience delusions and hallucinations. In 1921 Cotton published a well-received book on the theory called The Defective Delinquent and Insane: the Relation of Focal Infections to Their Causation, Treatment and Prevention. A few years later The New York Times wrote, "eminent physicians and surgeons testified that the New Jersey State Hospital for the Insane was the most progressive institution in the world for the care of the insane, and that the newer method of treating the insane by the removal of focal infection placed the institution in a unique position with respect to hospitals for the mentally ill." Eventually Cotton opened a hugely successful private practice, catering to the infected molars of Trenton, N.J., high society. © 2015 npr

Keyword: Depression; Neuroimmunology
Link ID: 21569 - Posted: 10.26.2015

By Dina Fine Maron Early-life exposure to anesthesia does not appear to lead to long-term cognitive problems, researchers announced today. New evidence from the first randomized anesthesia trial in kids provides the strongest indication yet that exposing young children to anesthesia—at least for a brief time—will not saddle them with developmental deficits. The news comes just a couple of weeks after a medical advisory group reiterated its concerns about such exposures among children younger than four years. Previously, multiple animal and human studies have linked such exposure with cognitive impairment, but none of the information on humans came from a gold-standard, randomized study design that could help eliminate other reasons to explain such a connection. This is a “reassuring finding, but it is not the final answer,” says Dean Andropoulos, anesthesiologist in chief at Texas Children’s Hospital and an expert who was not involved in the work. The new study assesses only what happens to youngsters after a relatively brief bout with anesthetics, so it is possible that longer or repeated exposures to such chemicals may still cause neurodevelopmental issues. There may also be deficits in anesthesia-exposed children that are not measurable until later in life. The study followed more than 500 infants undergoing hernia repair across the U.S., Australia, the U.K., Canada, the Netherlands, New Zealand and Italy. The surgeries lasted an average of roughly an hour. About half of the children were randomly selected to be put under with general anesthesia, and the other half stayed awake during the surgery and received a targeted anesthetic in a specific body region. The kids in the study were all younger than 60 weeks and were matched by where they had the surgery and whether they were born prematurely. © 2015 Scientific American

Keyword: Development of the Brain; Intelligence
Link ID: 21568 - Posted: 10.26.2015

Richard A. Friedman YOU can increase the size of your muscles by pumping iron and improve your stamina with aerobic training. Can you get smarter by exercising — or altering — your brain? This is hardly an idle question considering that cognitive decline is a nearly universal feature of aging. Starting at age 55, our hippocampus, a brain region critical to memory, shrinks 1 to 2 percent every year, to say nothing of the fact that one in nine people age 65 and older has Alzheimer’s disease. The number afflicted is expected to grow rapidly as the baby boom generation ages. Given these grim statistics, it’s no wonder that Americans are a captive market for anything, from supposed smart drugs and supplements to brain training, that promises to boost normal mental functioning or to stem its all-too-common decline. The very notion of cognitive enhancement is seductive and plausible. After all, the brain is capable of change and learning at all ages. Our brain has remarkable neuroplasticity; that is, it can remodel and change itself in response to various experiences and injuries. So can it be trained to enhance its own cognitive prowess? The multibillion-dollar brain training industry certainly thinks so and claims that you can increase your memory, attention and reasoning just by playing various mental games. In other words, use your brain in the right way and you’ll get smarter. A few years back, a joint study by BBC and Cambridge University neuroscientists put brain training to the test. Their question was this: Do brain gymnastics actually make you smarter, or do they just make you better at doing a specific task? For example, playing the math puzzle KenKen will obviously make you better at KenKen. But does the effect transfer to another task you haven’t practiced, like a crossword puzzle? © 2015 The New York Times Company

Keyword: Learning & Memory; Intelligence
Link ID: 21567 - Posted: 10.26.2015

By Diana Kwon Microglia, the immune cells of the brain, have long been the underdogs of the glia world, passed over for other, flashier cousins, such as astrocytes. Although microglia are best known for being the brain’s primary defenders, scientists now realize that they play a role in the developing brain and may also be implicated in developmental and neurodegenerative disorders. The change in attitude is clear, as evidenced by the buzz around this topic at this year’s Society for Neuroscience (SfN) conference, which took place from October 17 to 21 in Chicago, where scientists discussed the cells’ role in both health and disease. Activated in the diseased brain, microglia find injured neurons and strip away the synapses, the connections between them. These cells make up around 10 percent of all the cells in the brain and appear during early development. For decades scientists focused on them as immune cells and thought that they were quiet and passive in the absence of an outside invader. That all changed in 2005, when experimenters found that microglia were actually the fastest-moving structures in a healthy adult brain. Later discoveries revealed that their branches were reaching out to surrounding neurons and contacting synapses. These findings suggested that these cellular scavengers were involved in functions beyond disease. The discovery that microglia were active in the healthy brain jump-started the exploration into their underlying mechanisms: Why do these cells hang around synapses? And what are they doing? © 2015 Scientific American

Keyword: Glia; Neuroimmunology
Link ID: 21566 - Posted: 10.26.2015