Most Recent Links

Links 41 - 60 of 20245

By Nicholas Bakalar A new study suggests that early to bed and early to rise makes a man healthy — although not necessarily wealthy or wise. Korean researchers recruited 1,620 men and women, ages 47 to 59, and administered a questionnaire to establish whether they were morning people or night owls. They found 480 morning types, 95 night owls, and 1,045 who fit into neither group. The scientists measured all for glucose tolerance, body composition and waist size, and gathered information on other health and behavioral characteristics. The study is online in The Journal of Clinical Endocrinology & Metabolism. After controlling for an array of variables, they found that compared with morning people, men who were night owls were significantly more likely to have diabetes, and women night owls were more than twice as likely to have metabolic syndrome — high blood sugar levels, excess body fat around the waist, and abnormal lipid readings. The reasons for the effect are unclear, but the scientists suggest that consuming more calories after 8 p.m. and exposure to artificial light at night can both affect metabolic regulation. Can a night owl become a morning person? “Yes,” said the lead author, Dr. Nan Hee Kim, an endocrinologist at the Korea University College of Medicine. “It can be modified by external cues such as light, activity and eating behavior. But it isn’t known if this would improve the metabolic outcomes.” © 2015 The New York Times Company
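
The "more than twice as likely" figure is an odds ratio, the standard effect measure in studies of this design. Below is a minimal sketch of how an unadjusted odds ratio falls out of a two-by-two table; the counts are hypothetical, not the study's data, and the published estimates were additionally adjusted for covariates.

```python
# Hedged sketch: computing an unadjusted odds ratio from a 2x2 table.
# Counts are invented for illustration; the actual study also adjusted
# for an array of covariates with regression models.

def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Odds of the outcome among the exposed divided by odds among the unexposed."""
    odds_exposed = exposed_cases / exposed_noncases
    odds_unexposed = unexposed_cases / unexposed_noncases
    return odds_exposed / odds_unexposed

# Hypothetical counts: night-owl women vs. morning-type women,
# metabolic syndrome present vs. absent.
print(odds_ratio(25, 75, 130, 870))  # ~2.2, i.e. "more than twice as likely"
```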

Keyword: Sleep
Link ID: 20781 - Posted: 04.10.2015

Jon Hamilton Researchers have discovered the exact structure of the receptor that makes our sensory nerves tingle when we eat sushi garnished with wasabi. And because the "wasabi receptor" is also involved in pain perception, knowing its shape should help pharmaceutical companies develop new drugs to fight pain. The receptor, which scientists call TRPA1, is "an important molecule in the pain pathway," says David Julius, a professor of physiology at the University of California, San Francisco and an author of a paper published in this week's Nature. "A dream of mine is that some of the work we do will translate into medicines people can take for chronic pain." Julius led a team that discovered the receptor about a decade ago. Since then, researchers have shown that TRPA1 receptors begin sending distress signals to the brain whenever they encounter pungent chemical irritants, including not only wasabi but tear gas and air pollution from cars or wood fires. The receptors also become activated in response to chemicals released by the body itself when tissue becomes inflamed from an injury or a disease like rheumatoid arthritis. © 2015 NPR

Keyword: Pain & Touch
Link ID: 20780 - Posted: 04.10.2015

By Emily Underwood A splashy headline appeared on the websites of many U.K. newspapers this morning, claiming that men whose brothers or fathers have been convicted of a sex offense are “five times more likely to commit sex crimes than the average male” and that this increased risk of committing rape or molesting a child “may run in a family’s male genes.” The study, published online today in the International Journal of Epidemiology, analyzed data from 21,566 male sex offenders convicted in Sweden between 1973 and 2009 and concluded that genetics may account for at least 40% of the likelihood of committing a sex crime. (Women, who commit less than 1% of Sweden’s sexual offenses, were omitted from the analysis.) The scientists have suggested that the new research could be used to help identify potential offenders and target high-risk families for early intervention efforts. But independent experts—and even the researchers who led the work, to a certain degree—warn that the study has some serious limitations. Here are a few reasons to take its conclusions, and the headlines, with a generous dash of salt. Alternate explanations: Most studies point to early life experiences, such as childhood abuse, as the most important risk factor for becoming a perpetrator of abuse in adulthood. The new study, however, did not include any detail about the convicted sex criminals’ early life exposure to abuse. Instead, by comparing fathers with sons, and full brothers and half-brothers reared together or apart, the scientists attempted to tease out the relative contributions of shared environment and shared genes to the risk of sexual offending. Based on their analyses, the researchers concluded that shared environment accounted for just 2% of the risk of sexual offense, while genetics accounted for roughly 40%. Although there is likely some genetic contribution to sexual offending—perhaps related to impulsivity or sex drive—the group “may be overestimating the role of genes” because their assumptions were inaccurate, says Fred Berlin, a psychiatrist and sexologist at Johns Hopkins University in Baltimore, Maryland. © 2015 American Association for the Advancement of Science.
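
The 40% and 2% figures come from a variance decomposition of the kind used throughout behavior genetics. Here is a rough sketch of the simplest version of that logic, assuming the standard additive (ACE) model; the sibling correlations below are invented for illustration, and the Swedish team's actual models were more elaborate.

```python
# Sketch of the ACE logic behind "genetics ~40%, shared environment ~2%".
# Under an additive model, full brothers share half their segregating
# genes and half-brothers a quarter, so the expected correlations are
#   r_full = A/2 + C    and    r_half = A/4 + C,
# which can be solved for A (genes) and C (shared environment).
# The correlations used here are hypothetical.

def ace_from_sibling_correlations(r_full, r_half):
    A = 4 * (r_full - r_half)   # additive genetic share of variance
    C = 2 * r_half - r_full     # shared-environment share
    E = 1 - A - C               # nonshared environment plus error
    return A, C, E

print(ace_from_sibling_correlations(r_full=0.21, r_half=0.11))
# -> roughly (0.40, 0.01, 0.59), in the ballpark of the reported split
```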

Keyword: Aggression; Genes & Behavior
Link ID: 20779 - Posted: 04.10.2015

By Rachel Feltman If you give a mouse an eating disorder, you might just figure out how to treat the disease in humans. In a new study published Thursday by Cell Press, researchers created mice that lacked a gene associated with disordered eating in humans. Without it, the mice showed behaviors not unlike those seen in humans with eating disorders: They tended to be obsessive-compulsive and have trouble socializing, and they were less interested in eating high-fat food than the control mice. The findings could lead to novel drug treatments for some of the 24 million Americans estimated to suffer from eating disorders. In a 2013 study, the same researchers went looking for genes that might contribute to the risk of an eating disorder. Anorexia nervosa and bulimia nervosa aren't straightforwardly inherited -- there's definitely more to an eating disorder than your genes -- but it does seem like some families might have higher risks than others. Sure enough, the study of two large families, each with several members who had eating disorders, yielded mutations in two interacting genes. In one family, the estrogen-related receptor α (ESRRA) gene was mutated. The other family had a mutation on another gene that seemed to affect how well ESRRA could do its job. So in the latest study, they created mice that didn't have ESRRA in the parts of the brain associated with eating disorders. "You can't go testing this kind of gene expression in a human," lead author and University of Iowa neuroscientist Michael Lutter said. "But in mice, you can manipulate the expression of the gene and then look at how it changes their behavior."

Keyword: Anorexia & Bulimia; Genes & Behavior
Link ID: 20778 - Posted: 04.10.2015

by Anil Ananthaswamy HOLD that thought. When it comes to consciousness, the brain may be doing just that. It now seems that conscious perception requires brain activity to hold steady for hundreds of milliseconds. This signature in the pattern of brainwaves can be used to distinguish between levels of impaired consciousness in people with brain injury. The new study by Aaron Schurger at the Swiss Federal Institute of Technology in Lausanne doesn't explain the so-called "hard problem of consciousness" – how roughly a kilogram of nerve cells is responsible for the miasma of sensations, thoughts and emotions that make up our mental experience. However, it does chip away at it, and support the idea that it may one day be explained in terms of how the brain processes information. Neuroscientists think that consciousness requires neurons to fire in such a way that they produce a stable pattern of brain activity. The exact pattern will depend on what the sensory information is, but once information has been processed, the idea is that the brain should hold a pattern steady for a short period of time – almost as if it needs a moment to read out the information. In 2009, Schurger tested this theory by scanning 12 people's brains with fMRI machines. The volunteers were shown two images simultaneously, one for each eye. One eye saw a red-on-green line drawing and the other eye saw green-on-red. This confusion caused the volunteers to sometimes consciously perceive the drawing and sometimes not. © Copyright Reed Business Information Ltd.

Keyword: Consciousness; Brain imaging
Link ID: 20777 - Posted: 04.10.2015

By Ariana Eunjung Cha Autism has always been a tricky disorder to diagnose. There's no such thing as a blood test, cheek swab or other accepted biological marker, so specialists must depend on parent and teacher reports, observations and play assessments. Figuring out a child's trajectory once he or she is diagnosed is just as challenging. The spectrum is wide: some are destined to be on the mild end and be very talkative, sometimes almost indistinguishable from those without the disorder in some settings, while others will suffer from a more severe form and have trouble speaking even basic words. Now scientists believe that they have a way to distinguish between those paths, at least in terms of language ability, in the toddler years using brain imaging. In an article published Thursday in the journal Neuron, scientists at the University of California-San Diego have found that children with autism spectrum disorder, or ASD, with good language outcomes have strikingly distinct patterns of brain activation as compared to those with poor language outcomes and typically developing toddlers. "Why some toddlers with ASD get better and develop good language and others do not has been a mystery that is of the utmost importance to solve," Eric Courchesne, one of the study's authors and co-director of the University of California-San Diego's Autism Center, said in a statement. The images of the children in the study -- MRIs of the brain -- were taken at 12 to 29 months, while their language was assessed one to two years later, at 30 to 48 months.

Keyword: Autism; Language
Link ID: 20776 - Posted: 04.10.2015

Mo Costandi In 2009, researchers at the University of California, Santa Barbara performed a curious experiment. In many ways, it was routine — they placed a subject in the brain scanner, displayed some images, and monitored how the subject's brain responded. The measured brain activity showed up on the scans as red hot spots, as in many other neuroimaging studies. Except that this time, the subject was an Atlantic salmon, and it was dead. Dead fish do not normally exhibit any kind of brain activity, of course. The study was a tongue-in-cheek reminder of the problems with brain scanning studies. Those colorful images of the human brain found in virtually all news media may have captivated the imagination of the public, but they have also been the subject of controversy among scientists over the past decade or so. In fact, neuro-imagers are now debating how reliable brain scanning studies actually are, and are still mostly in the dark about exactly what it means when they see some part of the brain "light up." Functional magnetic resonance imaging (fMRI) uses powerful magnets to measure brain activity indirectly, by detecting changes in the flow of oxygen-rich blood, known as the blood oxygen-level dependent (BOLD) signal. The assumption is that areas receiving an extra supply of blood during a task have become more active. Typically, researchers would home in on one or a few "regions of interest," using 'voxels,' tiny cube-shaped chunks of brain tissue containing several million neurons, as their units of measurement.
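
The dead-salmon result is usually read as a lesson about mass significance testing: an fMRI analysis runs a statistical test in each of many thousands of voxels, so some voxels will pass p < 0.05 by chance alone unless the threshold is corrected. A minimal simulation of that effect on pure noise (this is illustrative, not the original study's analysis):

```python
# Why a dead salmon can "light up": with enough voxels, pure noise
# clears an uncorrected p < 0.05 threshold hundreds of times.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_voxels, n_scans = 10_000, 30

# Pure noise: no voxel carries any real task-related signal.
data = rng.normal(size=(n_voxels, n_scans))
t, p = stats.ttest_1samp(data, popmean=0.0, axis=1)

print("uncorrected 'active' voxels:", np.sum(p < 0.05))             # ~500
print("Bonferroni-corrected voxels:", np.sum(p < 0.05 / n_voxels))  # ~0
```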

Keyword: Brain imaging
Link ID: 20775 - Posted: 04.10.2015

By Megan Griffith-Greene The idea of playing a game to make you sharper seems like a no-brainer. That's the thinking behind a billion-dollar industry selling brain training games and programs designed to boost cognitive ability. But an investigation by CBC's Marketplace reveals that brain training games such as Lumosity may not make your brain perform better in everyday life. Almost 15 per cent of Canadians over the age of 65 are affected by some kind of dementia. And many people of all ages are worried about maintaining their brain health and possibly preventing a decline in their mental abilities. "I don't think there's anything to say that you can train your brain to be cognitively better in the way that we know that we can train our bodies to be physically better," neuroscientist Adrian Owen told Marketplace co-host Tom Harrington. To test how effective the games are at improving cognitive function, Marketplace partnered with Owen, who holds the Canada Excellence Research Chair in Cognitive Neuroscience and Imaging at the Brain and Mind Institute at Western University. A group of 54 adults, including Harrington, did the brain training at least three times per week, for 15 minutes or more, over a period of two and a half to four weeks. The group underwent a complete cognitive assessment at the beginning and end of the training to see if there had been any change as a result of the training program. ©2015 CBC/Radio-Canada.
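
The design described here is a pre/post comparison: the same 54 people are assessed before and after the training. A sketch of how such paired scores are typically tested, using simulated data rather than Marketplace's actual results:

```python
# Paired pre/post comparison, the logic behind "a complete cognitive
# assessment at the beginning and end of the training". Scores are
# simulated under the assumption of no true training effect.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 54                                  # participants, as in the test
pre = rng.normal(100, 15, size=n)       # baseline assessment scores
post = pre + rng.normal(0, 5, size=n)   # retest with noise, no real gain

t, p = stats.ttest_rel(post, pre)       # paired t-test, same people twice
print(f"t = {t:.2f}, p = {p:.3f}")      # with no effect, p is usually > 0.05
```

A full evaluation would also compare trainees against an untrained control group, since retest practice alone can raise scores on the second assessment.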

Keyword: Learning & Memory; Alzheimers
Link ID: 20774 - Posted: 04.10.2015

Jordan Gaines Lewis Hodor hodor hodor. Hodor hodor? Hodor. Hodor-hodor. Hodor! Oh, um, excuse me. Did you catch what I said? Fans of the hit HBO show Game of Thrones, the fifth season of which premieres this Sunday, know what I’m referencing, anyway. Hodor is the brawny, simple-minded stableboy of the Stark family in Winterfell. His defining characteristic, of course, is that he only speaks a single word: “Hodor.” But those who read the A Song of Ice and Fire book series by George R R Martin may know something that the TV fans don’t: his name isn’t actually Hodor. According to his great-grandmother Old Nan, his real name is Walder. “No one knew where ‘Hodor’ had come from,” she says, “but when he started saying it, they started calling him by it. It was the only word he had.” Whether he intended it or not, Martin created a character who is a textbook example of someone with a neurological condition called expressive aphasia. In 1861, French physician Paul Broca was introduced to a man named Louis-Victor Leborgne. While his comprehension and mental functioning remained relatively normal, Leborgne progressively lost the ability to produce meaningful speech over a period of 20 years. Like Hodor, the man was nicknamed Tan because he only spoke a single word: “Tan.”

Keyword: Language
Link ID: 20773 - Posted: 04.10.2015

By Gareth Cook The wait has been long, but the discipline of neuroscience has finally delivered a full-length treatment of the zombie phenomenon. In their book, Do Zombies Dream of Undead Sheep?, scientists Timothy Verstynen and Bradley Voytek cover just about everything you might want to know about the brains of the undead. It's all good fun, and if you learn some serious neuroscience along the way, well, that's fine with them, too. Voytek answered questions from contributing editor Gareth Cook. How is it that you and your co-author came to write a book about zombies? Clearly, it is an urgent public health threat, but I would not have expected a book from neuroscientists on the topic. Indeed! You think you're prepared for the zombie apocalypse and then—BAM!—it happens, and only then do you realize how poorly prepared you really were. Truly the global concern of our time. Anyway, this whole silly thing started when Tim and I would get together to watch zombie movies with our wives and friends. Turns out when you get some neuroscientists together to watch zombie movies, after a few beers they start to diagnose them and mentally dissect their brains. Back in the summer of 2010 zombie enthusiast and author—and head of the Zombie Research Society—Matt Mogk got in touch with me to see if we were interested in doing something at the intersection of zombies and neuroscience. © 2015 Scientific American

Keyword: Miscellaneous
Link ID: 20772 - Posted: 04.10.2015

Cari Romm “As humans, we can identify galaxies light-years away. We can study particles smaller than an atom,” President Barack Obama said in April 2013, “But we still haven’t unlocked the mystery of the three pounds of matter that sits between our ears.” The observation was part of the president’s announcement of the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative, an effort to fast-track the development of new technology that will help scientists understand the workings of the human brain and its diseases. With progress, though, comes a whole new set of ethical questions. Can drugs used to treat conditions like ADHD, for example, also be used to make healthy people into sharper, more focused versions of themselves—and should they? Can a person with Alzheimer’s truly consent to testing that may help scientists better understand their disease? Can brain scans submitted as courtroom evidence reveal anything about a defendant’s intent? To address these questions, the Presidential Commission for the Study of Bioethical Issues, an independent advisory group, recently released the second volume of a report examining the issues that may arise as neuroscience advances. The commission outlined three areas it deemed particularly fraught: cognitive enhancement, consent, and the use of neuroscience in the legal system. © 2015 by The Atlantic Monthly Group

Keyword: Drug Abuse; Learning & Memory
Link ID: 20771 - Posted: 04.08.2015

By Simon Makin People can control prosthetic limbs, computer programs and even remote-controlled helicopters with their mind, all by using brain-computer interfaces. What if we could harness this technology to control things happening inside our own body? A team of bioengineers in Switzerland has taken the first step toward this cyborglike setup by combining a brain-computer interface with a synthetic biological implant, allowing a genetic switch to be operated by brain activity. It is the world's first brain-gene interface. The group started with a typical brain-computer interface, an electrode cap that can register subjects' brain activity and transmit signals to another electronic device. In this case, the device is an electromagnetic field generator; different types of brain activity cause the field to vary in strength. The next step, however, is totally new—the experimenters used the electromagnetic field to trigger protein production within human cells in an implant in mice. The implant uses a cutting-edge technology known as optogenetics. The researchers inserted bacterial genes into human kidney cells, causing them to produce light-sensitive proteins. Then they bioengineered the cells so that stimulating them with light triggers a string of molecular reactions that ultimately produces a protein called secreted alkaline phosphatase (SEAP), which is easily detectable. They then placed the human cells plus an LED light into small plastic pouches and inserted them under the skin of several mice. © 2015 Scientific American
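
In engineering terms, the chain described (EEG cap, electromagnetic field generator, light-driven implant) is a thresholded control loop. Below is a purely hypothetical sketch of that loop; every function is a stand-in invented for illustration, not the researchers' actual hardware or software:

```python
# Hypothetical sketch of the brain-gene interface signal chain:
# brain activity -> field generator -> implant LED -> SEAP production.
import random
import time

def read_eeg_band_power() -> float:
    # Stand-in for the brain-computer interface; the real system reads
    # brain activity from an electrode cap. Random noise here.
    return random.random()

def set_field_strength(level: float) -> None:
    # Stand-in for the electromagnetic field generator; stronger fields
    # deliver more power to the implant's LED.
    print(f"field strength -> {level:.2f}")

def control_loop(gain: float = 1.0, threshold: float = 0.5, steps: int = 5) -> None:
    for _ in range(steps):
        activity = read_eeg_band_power()
        # Activity above threshold switches the light on, triggering the
        # optogenetic cascade that makes the cells secrete SEAP.
        set_field_strength(gain * activity if activity > threshold else 0.0)
        time.sleep(0.1)

control_loop()
```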

Keyword: Robotics; Genes & Behavior
Link ID: 20770 - Posted: 04.08.2015

By Jonathan Webb Science reporter, BBC News A study of blind crustaceans living in deep, dark caves has revealed that evolution is rapidly withering the visual parts of their brain. The findings catch evolution in the act of making this adjustment - as none of the critters have eyes, but some of them still have stumpy eye-stalks. Three different species were studied, each representing a different subgroup within the same class of crustaceans. The research is published in the journal BMC Neuroscience. The class of "malacostracans" also includes much better-known animals like lobsters, shrimps and wood lice, but this study focussed on three tiny and obscure examples that were only discovered in the 20th Century. It is the first investigation of these mysterious animals' brains. "We studied three species. All of them live in caves, and all of them are very rare or hardly accessible," said lead author Dr Martin Stegner, from the University of Rostock in Germany. Specifically, his colleagues retrieved the specimens from the coast of Bermuda, from Table Mountain in South Africa, and from Monte Argentario in Italy. The animals were preserved rather than living, so the team could not observe their tiny brains in action. But by looking at the physical shape of the brain, and making comparisons with what we know about how the brain works in their evolutionary relatives, the researchers were able to assign jobs to the various lobes, lumps and spindly structures they could see under the microscope. © 2015 BBC.

Keyword: Evolution; Vision
Link ID: 20769 - Posted: 04.08.2015

Tom Bawden Scientists have deciphered the secrets of gibbon “speech” – discovering that the apes are sophisticated communicators employing a range of more than 450 different calls to talk to their companions. The research is so significant that it could provide clues on the evolution of human speech and also suggests that other animal species could speak a more precise language than has been previously thought, according to lead author Dr Esther Clarke of Durham University. Her study found that gibbons produce different categories of “hoo” calls – relatively quiet sounds that are distinct from their more melodic “song” calls. These categories of call allow the animals to distinguish when their fellow gibbons are foraging for food, alerting them to distant noises or warning others about the presence of predators. In addition, Dr Clarke found that each category of “hoo” call can be broken down further, allowing gibbons to be even more specific in their communication. A warning about lurking raptor birds, for example, sounds different to one about pythons or clouded leopards – being pitched at a particularly low frequency to ensure it is too deep for the birds of prey to hear. The warning call denoting the presence of tigers and leopards is the same because they belong to the same class of big cats, the research found. © independent.co.uk

Keyword: Language; Evolution
Link ID: 20768 - Posted: 04.08.2015

Do Alcoholics Anonymous participants do better at abstinence than nonparticipants because they are more motivated? Or is it because of something inherent in the A.A. program? How researchers answered these questions in a recent study offers insight into challenges of evidence-based medicine and evidence-informed policy. The study, published in the journal Alcoholism: Clinical and Experimental Research, teased apart a treatment effect (improvement due to A.A. itself) and a selection effect (driven by the type of people who seek help). The investigators found that there is a genuine A.A. treatment effect. Going to an additional two A.A. meetings per week produced at least three more days of alcohol abstinence per month. Separating treatment from selection effects is a longstanding problem in social and medical science. Their entanglement is one of the fundamental ways in which evidence of correlation fails to be a sign of causation. For many years, researchers and clinicians have debated whether the association of A.A. with greater abstinence was caused by treatment or a correlation that arises from the type of people who seek it. Such confounding is often addressed with an experiment in which individuals are randomly assigned to either a treatment or a nontreatment (or control) group in order to remove the possibility of self-selection. The treatment effect is calculated by comparing outcomes obtained by participants in each group. Several studies of A.A. have applied this approach. For instance, Kimberly Walitzer, Kurt Dermen and Christopher Barrick randomized alcoholics either to a treatment that strongly encouraged and supported A.A. participation or to a control group. The former exhibited a greater degree of abstinence. In an ideal randomized controlled trial (R.C.T.), everyone selected for treatment receives it and no one in the control group does. The difference in outcomes is the treatment effect, free of bias from selection. That’s the ideal. However, in practice, randomized controlled trials can still suffer selection problems. © 2015 The New York Times Company
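
The difference between the two effects can be made concrete with a simulation in which motivation drives both A.A. attendance and abstinence alongside a genuine treatment effect. All parameters below are invented; the point is only that the naive comparison overstates the true effect:

```python
# Treatment effect vs. selection effect: motivated people both attend
# A.A. more and abstain more, so naively comparing attenders with
# non-attenders overstates the true +3-day treatment effect.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
motivation = rng.normal(size=n)

# Selection: the probability of attending rises with motivation.
attends = rng.random(n) < 1 / (1 + np.exp(-motivation))

# Outcome: abstinent days per month depend on motivation AND a true
# +3-day effect of attending.
days_abstinent = 10 + 4 * motivation + 3 * attends + rng.normal(0, 2, size=n)

naive = days_abstinent[attends].mean() - days_abstinent[~attends].mean()
print(f"naive difference: {naive:.1f} days vs. true effect: 3.0 days")
# Prints roughly 6-7 days: selection more than doubles the apparent benefit.
```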

Keyword: Drug Abuse
Link ID: 20767 - Posted: 04.08.2015

By ERICA GOODE He was described, in the immediate aftermath of the Germanwings crash, as a cheerful and careful pilot, a young man who had dreamed of flying since boyhood. But in the days since, it has seemed increasingly clear that Andreas Lubitz, 27, the plane’s co-pilot, was something far more sinister: the perpetrator of one of the worst mass murder-suicides in history. If what researchers have learned about such crimes is any indication, this notoriety may have been just what Mr. Lubitz wanted. The actions now attributed to Mr. Lubitz — taking 149 unsuspecting people with him to a horrifying death — seem in some ways unfathomable, and his motives may never be fully understood. But studies over the last decades have begun to piece together characteristics that many who carry out such violence seem to share, among them a towering narcissism, a strong sense of grievance and a desire for infamy. Adam Lankford, an associate professor of criminal justice at the University of Alabama, said that in his research on mass killers who also took their own lives, he has found “a significant number of cases where they mention a desire for fame, glory or attention as a motive.” Before Adam Lanza, 20, the Sandy Hook Elementary School shooter, killed 20 children, six adults and himself in 2012, he wrote in an online forum, “Just look at how many fans you can find for all different types of mass murderers.” Robert Hawkins, 19, who committed suicide after killing eight people at a shopping mall in Omaha in 2007, left a note saying “I’m gonna be famous,” punctuating the sentence with an expletive. And Dylan Klebold, 17, of Columbine High School fame, bragged that the goal was to cause “the most deaths in U.S. history…we’re hoping. We’re hoping.” “Directors will be fighting over this story,” Mr. Klebold said in a video made before the massacre. © 2015 The New York Times Company

Keyword: Aggression; Depression
Link ID: 20766 - Posted: 04.07.2015

Arran Frood A psychedelic drink used for centuries in healing ceremonies is now attracting the attention of biomedical scientists as a possible treatment for depression. Researchers from Brazil last month published results from the first clinical test of a potential therapeutic benefit for ayahuasca, a South American plant-based brew [1]. Although the study included just six volunteers and no placebo group, the scientists say that the drink began to reduce depression in patients within hours, and the effect was still present after three weeks. They are now conducting larger studies that they hope will shore up their findings. The work forms part of a renaissance in studying the potential therapeutic benefits of psychedelic or recreational drugs — research that was largely banned or restricted worldwide half a century ago. Ketamine, which is used medically as an anaesthetic, has shown promise as a fast-acting antidepressant; psilocybin, a hallucinogen found in ‘magic mushrooms’, can help to alleviate anxiety in patients with advanced-stage cancer [2]; MDMA (ecstasy) can alleviate post-traumatic stress disorder; and patients who experience debilitating cluster headaches have reported that LSD eases their symptoms. Ayahuasca, a sacramental drink traditionally brewed from the bark of a jungle vine (Banisteriopsis caapi) and the leaves of a shrub (Psychotria viridis), contains ingredients that are illegal in most countries. But a booming ayahuasca industry has developed in South America, where its religious use is allowed, and where thousands of people each year head to rainforest retreats to sample its intense psychedelic insights. © 2015 Nature Publishing Group,

Keyword: Depression; Drug Abuse
Link ID: 20765 - Posted: 04.07.2015

By Kate Galbraith Most evenings, before watching late-night comedy or reading emails on his phone, Matt Nicoletti puts on a pair of orange-colored glasses that he bought for $8 off the Internet. “My girlfriend thinks I look ridiculous in them,” he said. But Mr. Nicoletti, a 30-year-old hospitality consultant in Denver, insists that the glasses, which can block certain wavelengths of light emitted by electronic screens, make it easier to sleep. Studies have shown that such light, especially from the blue part of the spectrum, inhibits the body’s production of melatonin, a hormone that helps people fall asleep. Options are growing for blocking blue light, though experts caution that few have been adequately tested for effectiveness and the best solution remains avoiding brightly lit electronics at night. A Swiss study of 13 teenage boys, published in August in The Journal of Adolescent Health, showed that when the boys donned orange-tinted glasses, also known as blue blockers and shown to prevent melatonin suppression, in the evening for a week, they felt “significantly more sleepy” than when they wore clear glasses. The boys looked at their screens, as teenagers tend to do, for at least a few hours on average before going to bed, and were monitored in the lab. Older adults may be less affected by blue light, experts say, since the yellowing of the lens and other changes in the aging eye filter out increasing amounts of blue light. But blue light remains a problem for most people, and an earlier study of 20 adults ages 18 to 68 found that those who wore amber-tinted glasses for three hours before bed improved their sleep quality considerably relative to a control group that wore yellow-tinted lenses, which blocked only ultraviolet light. Devices such as smartphones and tablets are often illuminated by light-emitting diodes, or LEDs, that tend to emit more blue light than incandescent products. Televisions with LED backlighting are another source of blue light, though because they are typically viewed from much farther away than small screens like phones, they may have less of an effect, said Debra Skene, a professor of neuroendocrinology at the University of Surrey in England. © 2015 The New York Times Company

Keyword: Sleep; Vision
Link ID: 20764 - Posted: 04.07.2015

By Jan Hoffman As adults age, vision deteriorates. One common type of decline is in contrast sensitivity, the ability to distinguish gradations of light to dark, making it possible to discern where one object ends and another begins. When an older adult descends a flight of stairs, for example, she may not be able to tell the edge of one step from the next, so she stumbles. At night, an older driver may squint to see the edge of white road stripes on blacktop. Caught in the glare of headlights, he swerves. But new research suggests that contrast sensitivity can be improved with brain-training exercises. In a study published last month in Psychological Science, researchers at the University of California, Riverside, and Brown University showed that after just five sessions of behavioral exercises, the vision of 16 people in their 60s and 70s significantly improved. After the training, the adults could make out edges far better. And when given a standard eye chart, a task that differed from the one they were trained on, they could correctly identify more letters. “There’s an idea out there that everything falls apart as we get older, but even older brains are growing new cells,” said Allison B. Sekuler, a professor of psychology, neuroscience and behavior at McMaster University in Ontario, who was not involved in the new study. “You can teach an older brain new tricks.” The training improved contrast sensitivity in 16 young adults in the study as well, although the older subjects showed greater gains. That is partly because the younger ones, college students, already had reasonably healthy vision and there was not as much room for improvement. Before the training, the vision of each adult, young and older, was assessed. The exercises were fine-tuned at the beginning for each individual so researchers could measure improvements, said Dr. G. John Andersen, the project’s senior adviser and a psychology professor at the University of California, Riverside. © 2015 The New York Times Company
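
Contrast sensitivity is conventionally quantified as the reciprocal of the smallest contrast a viewer can reliably detect, with contrast usually measured by the Michelson formula. A small sketch with arbitrary luminance values:

```python
# Michelson contrast: (Lmax - Lmin) / (Lmax + Lmin), ranging 0 to 1.
# Contrast sensitivity is 1 / (threshold contrast), so a viewer who can
# just detect very faint edges has a high sensitivity score.

def michelson_contrast(l_max: float, l_min: float) -> float:
    return (l_max - l_min) / (l_max + l_min)

# Arbitrary example luminances (e.g., in cd/m^2):
print(michelson_contrast(100.0, 50.0))  # 0.33: an easy-to-see step edge
print(michelson_contrast(100.0, 98.0))  # ~0.01: needs sensitivity ~100
```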

Keyword: Vision; Development of the Brain
Link ID: 20763 - Posted: 04.07.2015

By KEN BELSON One of the limitations of studying chronic traumatic encephalopathy, or C.T.E., the degenerative brain disease linked to repeated head trauma, has been that researchers have been able to detect it only in tissue obtained posthumously. A study published Monday in Proceedings of the National Academy of Sciences, though, suggests that researchers trying to develop a test that will detect the disease in living patients have taken a small step forward. The study, conducted at U.C.L.A., included 14 retired N.F.L. players who suffered from mood swings, depression and cognitive problems associated with C.T.E. The players were given PET, or positron emission tomography, scans that revealed tau protein deposits in their brains, a signature of C.T.E. Although the results were not conclusive, the distribution of tau in their brains was consistent with those found in the autopsies of players who had C.T.E. The 14 players were compared with 24 patients with Alzheimer’s disease and 28 patients in a control group with no significant cognitive problems. The scans showed that the tau deposits in the 14 players were “distinctly different” from those in the patients with Alzheimer’s disease. “There seems to be an emerging new pattern we haven’t seen in any known forms of dementia, and it is definitely not normal,” said Dr. Julian Bailes, a co-author of the study and the chairman of neurosurgery at NorthShore Neurological Institute in Evanston, Ill. © 2015 The New York Times Company

Keyword: Brain Injury/Concussion; Brain imaging
Link ID: 20762 - Posted: 04.07.2015