Most Recent Links
Laurel Hamers Mistakes can be learning opportunities, but the brain needs time for lessons to sink in. When facing a fast and furious stream of decisions, even the momentary distraction of noting an error can decrease accuracy on the next choice, researchers report in the March 15 Journal of Neuroscience. “We have a brain region that monitors and says ‘you messed up’ so that we can correct our behavior,” says psychologist George Buzzell, now at the University of Maryland in College Park. But sometimes, that monitoring system can backfire, distracting us from the task at hand and causing us to make another error. “There does seem to be a little bit of time for people, after mistakes, where you're sort of offline,” says Jason Moser, a psychologist at Michigan State University in East Lansing, who wasn’t part of the study. To test people’s response to making mistakes, Buzzell and colleagues at George Mason University in Fairfax, Va., monitored 23 participants’ brain activity while they worked through a challenging task. Concentric circles flashed briefly on a screen, and participants had to respond with one hand if the two circles were the same color and the other hand if the circles were subtly different shades. After making a mistake, participants generally answered the next question correctly if they had a second or so to recover. But when the next challenge came very quickly after an error, as little as 0.2 seconds, accuracy dropped by about 10 percent. Electrical activity recorded from the visual cortex showed that participants paid less attention to the next trial if they had just made a mistake than if they had responded correctly. © Society for Science & the Public 2000 - 2017
By MATT RICHTEL Amid an opioid epidemic, the rise of deadly synthetic drugs and the widening legalization of marijuana, a curious bright spot has emerged in the youth drug culture: American teenagers are growing less likely to try or regularly use drugs, including alcohol. With minor fits and starts, the trend has been building for a decade, with no clear understanding as to why. Some experts theorize that falling cigarette-smoking rates are cutting into a key gateway to drugs, or that antidrug education campaigns, long a largely failed enterprise, have finally taken hold. But researchers are starting to ponder an intriguing question: Are teenagers using drugs less in part because they are constantly stimulated and entertained by their computers and phones? The possibility is worth exploring, they say, because use of smartphones and tablets has exploded over the same period that drug use has declined. This correlation does not mean that one phenomenon is causing the other, but scientists say interactive media appears to play to similar impulses as drug experimentation, including sensation-seeking and the desire for independence. Or it might be that gadgets simply absorb a lot of time that could be used for other pursuits, including partying. Nora Volkow, director of the National Institute on Drug Abuse, says she plans to begin research on the topic in the next few months, and will convene a group of scholars in April to discuss it. The possibility that smartphones were contributing to a decline in drug use by teenagers, Dr. Volkow said, was the first question she asked when she saw the agency’s most recent survey results. The survey, “Monitoring the Future,” an annual government-funded report measuring drug use by teenagers, found that past-year use of illicit drugs other than marijuana was at the lowest level in the 40-year history of the project for eighth, 10th and 12th graders. © 2017 The New York Times Company
Keyword: Drug Abuse
Link ID: 23357 - Posted: 03.15.2017
Heidi Ledford Like a zombie that keeps on kicking, legal battles over mutant mice used for Alzheimer’s research are haunting the field once again — four years after the last round of lawsuits. In the latest case, the University of South Florida (USF) in Tampa has sued the US National Institutes of Health (NIH) for authorizing the distribution of a particular type of mouse used in the field. The first pre-trial hearing in the case is set to begin in a federal court on 21 March. The university holds a patent on the mouse, but the NIH has contracted the Jackson Laboratory, a non-profit organization in Bar Harbor, Maine, to supply the animals to researchers. The USF is now claiming that it deserves some of the money that went to the contractor. If the suit, filed in December 2015, is successful, it could set a precedent for other universities, cautions Robert Cook-Deegan, an intellectual-property scholar at the Washington DC centre of Arizona State University in Tempe. And that would threaten the affordability of, and access to, the lab animals that underpin such research. “It feels greedy to me,” Cook-Deegan says. “If other universities start doing this, all it does is push up the cost of research tools.” The mice, on which the USF filed a patent in 1997, express mutated forms of two genes1. These modifications help researchers to study how amyloid plaques develop in the brain, and enable them to investigate behavioural changes that manifest before those plaques appear. © 2017 Macmillan Publishers Limited
Link ID: 23356 - Posted: 03.15.2017
By Warren Cornwall The number of years someone spends behind bars can hinge on whether they were clearly aware that they were committing a crime. But how is a judge or jury to know for sure? A new study suggests brain scans can distinguish between hardcore criminal intent and simple reckless behavior, but the approach is far from being ready for the courtroom. The study is unusual because it looks directly at the brains of people while they are engaged in illicit activity, says Liane Young, a Boston College psychologist who was not involved in the work. Earlier research, including work by her, has instead generally looked at the brains of people only observing immoral activity. Researchers led by Read Montague, a neuroscientist at Virginia Tech Carilion Research Institute in Roanoke and at University College London, used functional magnetic resonance imaging (fMRI), which can measure brain activity based on blood flow. They analyzed the brains of 40 people—a mix of men and women mostly in their 20s and 30s—as they went through scenarios that simulated trying to smuggle something through a security checkpoint. In some cases, the people knew for certain they had contraband in a suitcase. In other cases, they chose from between two and five suitcases, with only one containing contraband (and thus they weren’t sure they were carrying contraband). The risk of getting caught also varied based on how many of the 10 security checkpoints had a guard stationed there. The results showed distinctive patterns of brain activity for when the person knew for certain the suitcase had contraband and when they only knew there was a chance of it, the team reports today in the Proceedings of the National Academy of Sciences. But there was an unexpected twist. Those differing brain patterns only showed up when people were first shown how many security checkpoints were guarded, and then offered the suitcases. 
In that case, a computer analysis of the fMRI images correctly classified people as knowing or reckless between 71% and 80% of the time. © 2017 American Association for the Advancement of Science
Jon Hamilton An orangutan named Rocky is helping scientists figure out when early humans might have uttered the first word. Rocky, who is 12 and lives at the Indianapolis Zoo, has shown that he can control his vocal cords much the way people do. He can learn new vocal sounds and even match the pitch of sounds made by a person. "Rocky, and probably other great apes, can do things with their vocal apparatus that, for decades, people have asserted was impossible," says Rob Shumaker, the zoo's director, who has studied orangutans for more than 30 years. Rocky's abilities suggest that our human ancestors could have begun speaking 10 million years ago, about the time humans and great apes diverged, Shumaker says. Until now, many scientists thought that speech required changes in the brain and vocal apparatus that evolved more recently, during the past 2 million years. The vocal abilities of orangutans might have gone undetected had it not been for Rocky, an ape with an unusual past and a rare relationship with people. Rocky was separated from his mother soon after he was born, and spent his early years raised largely by people, and working in show business. "He was certainly the most visible orangutan in entertainment at the time," says Shumaker. "TV commercials, things like that."
By Catherine Offord A few years ago, UK composer and technology reporter LJ Rich participated in a music technology competition as part of a project with the BBC. The 24-hour event brought together various musicians, and entailed staying awake into the wee hours trying to solve technical problems related to music. Late into the night, during a break from work, Rich thought of a way to keep people’s spirits up. “At about four in the morning, I remember playing different tastes to people on a piano in the room we were working in,” she says. For instance, “to great amusement, during breakfast I played people the taste of eggs.” It didn’t take long before Rich learned, for the first time, that food’s association with music was not as universally appreciated as she had assumed. “You realize everybody else doesn’t perceive the world that way,” she says. “For me, it was quite a surprise to find that people didn’t realize that certain foods had different keys.” Rich had long known she had absolute pitch—the ability to identify a musical note, such as B flat, without any reference. But that night, she learned she also has what’s known as synesthesia, a little-understood mode of perception that links senses such as taste and hearing in unusual ways, and is thought to be present in around 4 percent of the general population. It’s a difficult phenomenon to get to the bottom of. Like Rich, many synesthetes are unaware their perception is atypical; what’s more, detecting synesthesia usually relies on self-reported experiences—an obstacle for standardized testing. But a growing body of evidence suggests that Rich is far from being alone in possessing both absolute pitch and synesthesia. © 1986-2017 The Scientist
Link ID: 23353 - Posted: 03.14.2017
There is widespread interest among teachers in the use of neuroscientific research findings in educational practice. However, misconceptions and myths supposedly grounded in sound neuroscience are also prevalent in our schools. We wish to draw attention to this problem by focusing on an educational practice supposedly based on neuroscience that lacks sufficient evidence and so we believe should not be promoted or supported. Generally known as “learning styles”, it is the belief that individuals can benefit from receiving information in their preferred format, based on a self-report questionnaire. This belief has much intuitive appeal because individuals are better at some things than others and ultimately there may be a brain basis for these differences. Learning styles promises to optimise education by tailoring materials to match the individual’s preferred mode of sensory information processing. There are, however, a number of problems with the learning styles approach. First, there is no coherent framework of preferred learning styles. Usually, individuals are categorised into one of three preferred styles of auditory, visual or kinesthetic learners based on self-reports. One study found that there were more than 70 different models of learning styles including among others, “left v right brain,” “holistic v serialists,” “verbalisers v visualisers” and so on. The second problem is that categorising individuals can lead to the assumption of a fixed or rigid learning style, which can impair motivation to apply oneself or adapt. Finally, and most damning, systematic studies of the effectiveness of learning styles have consistently found either no evidence or very weak evidence to support the hypothesis that matching or “meshing” material in the appropriate format to an individual’s learning style is selectively more effective for educational attainment. 
Students will improve if they think about how they learn but not because material is matched to their supposed learning style.
Keyword: Learning & Memory
Link ID: 23352 - Posted: 03.14.2017
An international team of researchers has conducted the first study of its kind to look at the genomic underpinnings of obesity in continental Africans and African-Americans. They discovered that approximately 1 percent of West Africans, African-Americans and others of African ancestry carry a genomic variant that increases their risk of obesity, a finding that provides insight into why obesity clusters in families. Researchers at the National Human Genome Research Institute (NHGRI), part of the National Institutes of Health, and their African collaborators published their findings March 13, 2017, in the journal Obesity. People with genomic differences in the semaphorin-4D (SEMA4D) gene were about six pounds heavier than those without the genomic variant, according to the study. Most of the genomic studies conducted on obesity to date have been in people of European ancestry, despite an increased risk of obesity in people of African ancestry. Obesity is a global health problem, contributing to premature death and morbidity by increasing a person’s risk of developing diabetes, hypertension, heart disease and some cancers. While obesity mostly results from lifestyle and cultural factors, including excess calorie intake and inadequate levels of physical activity, it has a strong genomic component. The burden of obesity is, however, not the same across U.S. ethnic groups, with African-Americans having the highest age-adjusted rates of obesity, said Charles N. Rotimi, Ph.D., chief of NHGRI’s Metabolic, Cardiovascular and Inflammatory Disease Genomics Branch and director of the Center for Research on Genomics and Global Health (CRGGH) at NIH. CRGGH examines the socio-cultural and genomic factors at work in health disparities — the negative health outcomes that impact certain groups of people — so they can be translated into policies that reduce or eliminate healthcare inequalities in the United States and globally.
Richard A. Friedman Jet lag makes everyone miserable. But it makes some people mentally ill. There’s a psychiatric hospital not far from Heathrow Airport that is known for treating bipolar and schizophrenic travelers, some of whom are occasionally found wandering aimlessly through the terminals. A study from the 1980s of 186 of those patients found that those who’d traveled from the west had a higher incidence of mania, while those who’d traveled from the east had a higher incidence of depression. I saw the same thing in one of my patients who suffered from manic depression. When he got depressed after a vacation to Europe, we assumed he was just disappointed about returning to work. But then he had a fun trip out West and returned home in what’s called a hypomanic state: He was expansive, a fount of creative ideas. It was clear that his changes in mood weren’t caused by the vacation blues, but by something else. The problem turned out to be a disruption in his circadian rhythm. He didn’t need drugs; he needed the right doses of sleep and sunlight at the right time. It turns out that that prescription could treat much of what ails us. Clinicians have long known that there is a strong link between sleep, sunlight and mood. Problems sleeping are often a warning sign or a cause of impending depression, and can make people with bipolar disorder manic. Some 15 years ago, Dr. Francesco Benedetti, a psychiatrist in Milan, and colleagues noticed that hospitalized bipolar patients who were assigned to rooms with views of the east were discharged earlier than those with rooms facing the west — presumably because the early morning light had an antidepressant effect. The notion that we can manipulate sleep to treat mental illness has also been around for many years. Back in the late 1960s, a German psychiatrist heard about a woman in Tübingen who was hospitalized for depression and claimed that she normally kept her symptoms in check by taking all-night bike rides. 
He subsequently demonstrated in a group of depressed patients that a night of complete sleep deprivation produced an immediate, significant improvement in mood in about 60 percent of the group. © 2017 The New York Times Company
By Knvul Sheikh As we get older, we start to think a little bit more slowly, we are less able to multitask and our ability to remember things gets a little wobblier. This cognitive transformation is linked to a steady, widespread thinning of the cortex, the brain's outermost layer. Yet the change is not inevitable. So-called super agers retain their good memory and thicker cortex as they age, a recent study suggests. Researchers believe that studying what makes super agers different could help unlock the secrets to healthy brain aging and improve our understanding of what happens when that process goes awry. “Looking at successful aging could provide us with biomarkers for predicting resilience and for things that might go wrong in people with age-related diseases like Alzheimer's and dementia,” says study co-author Alexandra Touroutoglou, a neuroscientist at Harvard Medical School. Touroutoglou and her team gave standard recall tests to a group of 40 participants between the ages of 60 and 80 and 41 participants aged 18 to 35. Among the older participants, 17 performed as well as or better than adults four to five decades younger. When the researchers looked at MRI scans of the super agers' brains, they found that their brains not only functioned more like young brains, they also looked very similar. Two brain networks in particular seemed to be protected from shrinking: the default mode network, which helps to store and recall new information, and the salience network, which is associated with directing attention and identifying important details. In fact, the thicker these regions were, the better the super agers' memory was. © 2017 Scientific American,
Is there life after death for our brains? It depends. Loretta Norton, a doctoral student at Western University in Canada, was curious, so she and her collaborators asked critically ill patients and their families if they could record brain activity in the half hours before and after life support was removed. They ended up recording four patients with electroencephalography, better known as EEG, which uses small electrodes attached to a person’s head to measure electrical activity in the brain. In three patients, the EEG showed brain activity stopping up to 10 minutes before the person’s heart stopped beating. But in a fourth, the EEG picked up so-called delta wave bursts up to 10 minutes after the person’s heart stopped. Delta waves are associated with deep sleep, also known as slow-wave sleep. In living people, neuroscientists consider slow-wave sleep to be a key process in consolidating memories. The study also raises questions about the exact moment when death occurs. Here’s Neuroskeptic: Another interesting finding was that the actual moment at which the heart stopped was not associated with any abrupt change in the EEG. The authors found no evidence of the large “delta blip” (the so-called “death wave“), an electrical phenomenon that has been observed in rats following decapitation. With only four patients, it’s difficult to draw any sort of broad conclusion from this study. But it does suggest that death may be a gradual process as opposed to a distinct moment in time. © 1996-2017 WGBH Educational Foundation
Link ID: 23348 - Posted: 03.13.2017
By Michael Price The objects and people children play with as early as toddlerhood may provide clues to their eventual sexual orientation, reveals the largest study of its kind. The investigation, which tracked more than 4500 kids over the first 15 years of their lives, seeks to answer one of the most controversial questions in the social sciences, but experts are mixed on the findings. “Within its paradigm, it’s one of the better studies I’ve seen,” says Anne Fausto-Sterling, professor emerita of biology and gender studies at Brown University. The fact that it looks at development over time and relies on parents’ observations is a big improvement over previous studies that attempted to answer similar questions based on respondents’ own, often unreliable, memories, she says. “That being said … they’re still not answering questions of how these preferences for toys or different kinds of behaviors develop in the first place.” The new study builds largely on research done in the 1970s by American sex and gender researcher Richard Green, who spent decades investigating sexuality. He was influential in the development of the term “gender identity disorder” to describe stress and confusion over one’s sex and gender, though the term—and Green’s work more broadly—has come under fire from many psychologists and social scientists today who say it’s wrong to label someone’s gender and sexuality “disordered.” In the decades since, other studies have reported that whether a child plays along traditional gender lines can predict their later sexual orientation. But these have largely been criticized for their small sample sizes, for drawing from children who exhibit what the authors call “extreme” gender nonconformity, and for various other methodological shortcomings. © 2017 American Association for the Advancement of Science
By STEPH YIN Despite being just the size of a rice grain, robber flies, which live all over the world, are champion predators. In field experiments, they can detect targets the size of sand grains from nearly two feet away — 100 times the fly’s body length — and intercept them in under half a second. What’s more, they never miss their mark. A team led by scientists at the University of Cambridge has started to unveil the secrets to the robber fly’s prowess. In a study published Thursday in Current Biology, the team outlined the mechanics of the fly’s pursuit, from its impressive eye anatomy to how it makes a successful catch every time. Notably, the researchers observed a behavior never before described in a flying animal: About 30 centimeters from its prey, the insect slows, turns slightly and brings itself in for a close catch. “This ‘lock-on’ phase and change in behavior during a flight is quite remarkable,” said Sam Fabian, a graduate student at Cambridge and an author of the study. “We would actually expect them to do something very simple — just accelerate and hit the target.” The scientists surveyed robber flies in the field using a “fly teaser,” which consisted of beads on a rapidly moving fishing line controlled by a motor. As the flies charged at the bait, the researchers captured their movements using high-speed cameras. At the start of the robber fly’s conquest, it sits on a perch and scans the sky for passing prey. When it glimpses a potential meal, it takes flight, maintaining a steady angle between itself and its target. This proactive strategy, using a “constant bearing angle,” is also employed by fish, bats and sailors, Mr. Fabian said. © 2017 The New York Times Company
Link ID: 23346 - Posted: 03.11.2017
By Diana Kwon Deep in the Amazon rainforests of Bolivia live the Tsimane’, a tribe that has remained relatively untouched by Western civilization. Tsimane’ people possess a unique characteristic: they do not cringe at musical tones that sound discordant to Western ears. The vast majority of Westerners prefer consonant chords to dissonant ones, based on the intervals between the musical notes that compose the chords. One particularly notable example of this is the Devil’s Interval, or flatted fifth, which received its name in the Middle Ages because the sound it produced was deemed so unpleasant that people associated it with sinister forces. The flatted fifth later became a staple of numerous jazz, blues, and rock-and-roll songs. Over the years, scientists have gathered compelling evidence to suggest that an aversion to dissonance is innate. In 1996, in a letter to Nature, Harvard psychologists Marcel Zentner and Jerome Kagan reported on a study suggesting that four-month-old infants preferred consonant intervals to dissonant ones. Researchers subsequently replicated these results: one lab discovered the same effect in two-month-olds and another in two-day-old infants of both deaf and hearing parents. Some scientists even found these preferences in certain animals, such as young chimpanzees and baby chickens. “Of course the ambiguity is [that] even young infants have quite a bit of exposure to typical Western music,” says Josh McDermott, a researcher who studies auditory cognition at MIT. “So the counter-argument is that they get early exposure, and that shapes their preference.” © 1986-2017 The Scientist
By Aylin Woodward Noise is everywhere, but that’s OK. Your brain can still keep track of a conversation in the face of revving motorcycles, noisy cocktail parties or screaming children – in part by predicting what’s coming next and filling in any blanks. New data suggests that these insertions are processed as if the brain had really heard the parts of the word that are missing. “The brain has evolved a way to overcome interruptions that happen in the real world,” says Matthew Leonard at the University of California, San Francisco. We’ve known since the 1970s that the brain can “fill in” inaudible sections of speech, but understanding how it achieves this phenomenon – termed perceptual restoration – has been difficult. To investigate, Leonard’s team played volunteers words that were partially obscured or inaudible to see how their brains responded. The experiment involved people who already had hundreds of electrodes implanted into their brain to monitor their epilepsy. These electrodes detect seizures, but can also be used to record other types of brain activity. The team played the volunteers recordings of a word that could either be “faster” or “factor”, with the middle sound replaced by noise. Data from the electrodes showed that their brains responded as if they had actually heard the missing “s” or “c” sound. © Copyright Reed Business Information Ltd.
By JESS BIDGOOD SALEM, Mass. — A few years ago, Bevil Conway, then a neuroscientist at Wellesley College, got an interesting request: Could he give a lecture to the curators and other staff at the Peabody Essex Museum, the art and culture museum here? So Mr. Conway gathered his slides and started from the beginning, teaching the basics of neuroscience — “How neurons work, how neurons talk to each other, issues of evolutionary biology,” Mr. Conway said — to people who run an institution best known for its venerable collections of maritime and Asian art. It was an early step in what has become a galvanizing mission for the museum’s director, Dan L. Monroe: harnessing the lessons of brain science to make the museum more engaging as attendance is falling around the country. “If one’s committed to creating more meaningful and impactful art experiences, it seems a good idea to have a better idea about how our brains work,” he said. “That was the original line of thinking that started us down this path.” The museum, known as P.E.M., has been looking at neuroscience to incorporate its lessons into exhibitions ever since. In an effort to build shows that engage the brain, it has tried breaking up exhibition spaces into smaller pieces; posting questions and quotes on the wall, instead of relying only on explanatory wall text; and experimenting with elements like smell and sound in visual exhibitions. And those efforts are about to increase. The museum recently received a $130,000 grant from the Barr Foundation, a Boston-based philanthropic organization, to bring a neuroscience researcher on staff, add three neuroscientists to the museum as advisers and publish a guide that will help other museums incorporate neuroscience into their exhibition planning. “A lot of what we’re seeing in museums right now is the interpretation of pieces, or artwork,” said E. San San Wong, a senior program officer with the foundation. 
“What this is looking at is: How do we more actively engage people with art, in multiple senses?” © 2017 The New York Times Company
Many epilepsy patients in Australia are turning to medicinal cannabis to manage their seizures, a survey has shown. The nationwide survey found 14% of people with epilepsy had used cannabis products to manage the condition. Of those, 90% of adults and 71% of children with epilepsy, according to their parents, reported success in managing seizures. Published in the journal Epilepsy & Behavior, the Epilepsy Action Australia study, in partnership with the Lambert Initiative at the University of Sydney, surveyed 976 respondents to examine cannabis use in people with epilepsy, reasons for use and any perceived benefits self-reported by consumers. The main reason given for trying cannabis products was to seek a treatment with “more favourable” side-effects compared with standard antiepileptic drugs. The lead author of the study, Anastasia Suraev from the Lambert Initiative, said researchers had gained further insight into the reasons that influence use. “Despite the limitations of a retrospective online survey, we cannot ignore that a significant proportion of adults and children with epilepsy are using cannabis-based products in Australia, and many are self-reporting considerable benefits to their condition,” Suraev said. “More systematic clinical studies are urgently needed to help us better understand the role of cannabinoids in epilepsy,” she said. © 2017 Guardian News and Media Limited
Susan Milius Catch sight of someone scratching and out of nowhere comes an itch, too. Now, it turns out mice suffer the same strange phenomenon. Tests with mice that watched itchy neighbors, or even just videos of scratching mice, provide the first clear evidence of contagious scratching spreading mouse-to-mouse, says neuroscientist Zhou-Feng Chen of Washington University School of Medicine in St. Louis. The quirk opens new possibilities for exploring the neuroscience behind the spread of contagious behaviors. For the ghostly itch, experiments trace scratching to a peptide nicknamed GRP and areas of the mouse brain better known for keeping the beat of circadian rhythms, Chen and colleagues found. They report the results in the March 10 Science. In discovering this, “there were lots of surprises,” Chen says. One was that mice, nocturnal animals that mostly sniff and whisker-brush their way through the dark, would be sensitive to the sight of another mouse scratching. Yet Chen had his own irresistible itch to test the “crazy idea,” he says. Researchers housed mice that didn’t scratch any more than normal within sight of mice that flicked and thumped their paws frequently at itchy skin. Videos recorded instances of normal mice looking at an itch-prone mouse mid-scratch and, shortly after, scratching themselves. In comparison, mice with not-very-itchy neighbors looked at those neighbors at about the same frequency but rarely scratched immediately afterward. © Society for Science & the Public 2000 - 2017.
By Abby Olena Researchers have shown that a hormone secreted by bone, called lipocalin 2 (LCN2), suppresses appetite in mice. The results, published today (March 8) in Nature, suggest that LCN2 crosses the rodents’ blood-brain barrier and binds a receptor in the hypothalamus. The team also found a link between body weight and LCN2 levels in people with type 2 diabetes. The authors “have identified a protein that’s secreted from bone that has a pretty significant impact on feeding behavior,” Lora Heisler of the University of Aberdeen in Scotland, who did not participate in the work, told The Scientist. “And the fact that they found that some supporting evidence in humans is really exciting.” “We have found a new role for bone as an endocrine organ, and that is its ability to regulate appetite,” said study coauthor Stavroula Kousteni of Columbia University in New York City. Scientists had previously identified LCN2 as a protein expressed in fat cells, but Kousteni and colleagues showed that it is enriched 10-fold in osteoblasts. When they generated mice without LCN2 in their osteoblasts, levels of the circulating hormone dropped 67 percent. These mice ate more than control animals and showed increases in fat mass and body weight. When the authors injected LCN2 into wild-type or obese mice, the rodents ate less food. The treated animals showed decreases in body weight, fat mass, and weight gain. LCN2 injections also led to increases in insulin levels and glucose tolerance, the scientists showed. © 1986-2017 The Scientist
By Andy Coghlan Tiny particles secreted in response to head injury in the brains of mice could help explain how inflammation spreads and ultimately boosts the risk of developing dementia. Head injuries are increasingly being linked to cognitive problems and degenerative brain disease in later life. Mysterious particles a micrometre in diameter have previously been found in the spinal fluid of people with traumatic brain injury, but their function has remained unknown. Now Alan Faden at the University of Maryland School of Medicine in Baltimore and his colleagues have discovered that activated immune cells called microglia secrete such microparticles in response to brain injury, and they seem to spread inflammation well beyond the injury site itself. They can even cause brain inflammation when injected into uninjured animals. The particles have receptors that latch onto cells, and are packed with chemicals such as interleukins, which trigger inflammation, and fragments of RNA capable of switching whole suites of genes on or off. When Faden injured the brains of sedated mice, the microparticles spread well beyond the site of damage. Further experiments on cultured microglial cells revealed that the microparticles activate resting microglia, making them capable of triggering further inflammation themselves. © Copyright Reed Business Information Ltd.