Most Recent Links
By Carl Zimmer As much as we may try to deny it, Earth’s cycle of day and night rules our lives. When the sun sets, the encroaching darkness sets off a chain of molecular events spreading from our eyes to our pineal gland, which oozes a hormone called melatonin into the brain. When the melatonin latches onto neurons, it alters their electrical rhythm, nudging the brain into the realm of sleep. At dawn, sunlight snuffs out the melatonin, forcing the brain back to its wakeful pattern again. We fight these cycles each time we stay up late reading our smartphones, suppressing our nightly dose of melatonin and waking up grumpy the next day. We fly across continents as if we could instantly reset our inner clocks. But our melatonin-driven sleep cycle lags behind, leaving us drowsy in the middle of the day. Scientists have long wondered how this powerful cycle got its start. A new study on melatonin hints that it evolved some 700 million years ago. The authors of the study propose that our nightly slumbers evolved from the rise and fall of our tiny oceangoing ancestors, as they swam up to the surface of the sea at twilight and then sank in a sleepy fall through the night. To explore the evolution of sleep, scientists at the European Molecular Biology Laboratory in Germany study the activity of genes involved in making melatonin and other sleep-related molecules. Over the past few years, they’ve compared the activity of these genes in vertebrates like us with their activity in a distantly related invertebrate — a marine worm called Platynereis dumerilii. The scientists studied the worms at an early stage, when they were ball-shaped 2-day-old larvae. The ocean swarms with juvenile animals like these. Many of them spend their nights near the ocean surface, feeding on algae and other bits of food. Then they spend the day at lower depths, where they can hide from predators and the sun’s ultraviolet rays. © 2014 The New York Times Company
By Bethany Brookshire In this sweet, sweet world we live in, losing weight can be a dull and flavorless experience. Lovely stove-popped popcorn drenched in butter gives way to dry microwaved half-burnt kernels covered in dusty yellow powder. The cookies and candy that help us get through the long afternoons are replaced with virtuous but boring apples and nuts. Even the sugar that livens up our coffee gets a skeptical eye: That’s an extra 23 calories per packet you shouldn’t be eating. What makes life sweet for those of us who are counting calories is artificial sweeteners. Diet soda gives a sweet carbonated fix. A packet of artificial sweetener in your coffee or tea makes it a delicious morning dose. But a new study, published September 17 in Nature, found that the artificial sweetener saccharin has an unintended side effect: It alters the bacterial composition of the gut in mice and humans. The new bacterial neighborhood brings with it higher blood glucose levels, putting both humans and their murine counterparts at risk for diabetes. Many people wondered if the study’s effects were real. We all knew that sugar was bad, but now the scientists are coming for our Splenda! It seems more than a little unfair. But this study was a long time coming: the scientific community has been probing artificial sweeteners and their potential hazards for decades. And while the new study adds to the literature, other studies, ongoing and planned, will determine the extent and necessity of our artificially sweetened future. © Society for Science & the Public 2000 - 2014.
By James Hamblin Mental exercises to build (or rebuild) attention span have shown promise recently as adjuncts or alternatives to amphetamines in addressing symptoms common to Attention Deficit Hyperactivity Disorder (ADHD). Building cognitive control, to be better able to focus on just one thing, or single-task, might involve regular practice with a specialized video game that reinforces "top-down" cognitive modulation, as was the case in a popular paper in Nature last year. Cool but still notional. More insipid but also more clearly critical to addressing what's being called the ADHD epidemic is plain old physical activity. This morning the medical journal Pediatrics published research that found kids who took part in a regular physical activity program showed important enhancement of cognitive performance and brain function. The findings, according to University of Illinois professor Charles Hillman and colleagues, "demonstrate a causal effect of a physical activity program on executive control, and provide support for physical activity for improving childhood cognition and brain health." If it seems odd that this is something that still needs support, that's because it is odd, yes. Physical activity is clearly a high, high-yield investment for all kids, but especially those who are inattentive or hyperactive. This brand of research is still published and written about as though it were a novel finding, in part because exercise programs for kids remain underfunded and underprioritized in many school curricula, even though exercise is clearly integral to maximizing the utility of time spent in class. The improvements in this case came in executive control, which consists of inhibition (resisting distraction, maintaining focus), working memory, and cognitive flexibility (switching between tasks). The images above show the brain activity in the group of kids who did the program as opposed to the group that didn't. It's the kind of difference that's so dramatic it's a little unsettling.
The study only lasted nine months, but when you're only seven years old, nine months is a long time to be sitting in class with a blue head. © 2014 by The Atlantic Monthly Group.
Link ID: 20152 - Posted: 10.02.2014
By Nathan Collins Step aside, huge magnets and radioactive tracers—soon some brain activity will be revealed by simply training dozens of red lights on the scalp. A new study in Nature Photonics finds this optical technique can replicate functional MRI experiments, and it is more comfortable, more portable and less expensive. The method is an enhancement of diffuse optical tomography (DOT), in which a device shines tiny points of red light at a subject's scalp and analyzes the light that bounces back. The red light reflects off red hemoglobin in the blood but does not interact as much with tissues of other colors, which allows researchers to recover an fMRI-like image of changing blood flow in the brain at work. For years researchers attempting to use DOT have been limited by the difficulty of packing many heavy light sources and detectors into the small area around the head. They also needed better techniques for analyzing the flood of data that the detectors collected. Now researchers at Washington University in St. Louis and the University of Birmingham in England report they have solved those problems and made the first high-density DOT (HD-DOT) brain scans. The team first engineered a “double halo” structure to support the weight of 96 lights and 92 detectors, more than double the number in earlier arrays. The investigators also dealt with the computing challenges associated with that many lights—for example, they figured out how to filter out interference from blood flow in the scalp and other tissues. The team then used HD-DOT to successfully replicate fMRI studies of vision and language processing—a task impossible for other fMRI alternatives, such as functional near-infrared spectroscopy or electroencephalography, which do not cover a large enough swath of the brain or have sufficient resolution to pinpoint active brain areas.
Finally, the team scanned the brains of people who have implanted electrodes for Parkinson's disease—something fMRI can never do because the machine generates electromagnetic waves that can destroy electronic devices such as pacemakers. © 2014 Scientific American
Keyword: Brain imaging
Link ID: 20151 - Posted: 10.02.2014
By CATHERINE SAINT LOUIS Driven by a handful of reports of poliolike symptoms in children, federal health officials have asked the nation’s physicians to report cases of children with limb weakness or paralysis along with specific spinal-cord abnormalities on a magnetic resonance imaging test. As a respiratory illness known as enterovirus 68 is sickening thousands of children from coast to coast, officials are trying to figure out if the weakness could be linked to the virus. The emergence of several cases of limb weakness among children in Colorado put doctors on alert in recent months. The Centers for Disease Control and Prevention issued an advisory on Friday, and this week, other cases of unexplained muscle weakness or paralysis came to light in Michigan, Missouri and Massachusetts. The C.D.C. is investigating the cases of 10 children hospitalized at Children’s Hospital Colorado with unexplained arm or leg weakness since Aug. 9. Some of the children, who range in age from 1 to 18, also developed symptoms like facial drooping, double vision, or difficulty swallowing or talking. Four of them tested positive for enterovirus 68, also known as enterovirus D68, which has recently caused severe respiratory illness in children in 41 states and the District of Columbia. One tested positive for rhinovirus, which can cause the common cold. Two tested negative. Two patients’ specimens are still being processed; another was never tested. It is unclear whether the muscle weakness is connected to the viral outbreak. “It’s one possibility we are looking at, but certainly not the only possibility,” said Mark Pallansch, director of the C.D.C.’s division of viral diseases. © 2014 The New York Times Company
Keyword: Movement Disorders
Link ID: 20150 - Posted: 10.02.2014
By Smitha Mundasad Health reporter, BBC News Measuring people's sense of smell in later life could help doctors predict how likely they are to be alive in five years' time, a PLOS One study suggests. A survey of 3,000 adults found that 39% of those with the poorest sense of smell were dead within five years, compared with just 10% of those who identified odours correctly. Scientists say the loss of the sense of smell does not cause death directly, but may be an early warning sign. They say anyone with long-lasting changes should seek medical advice. Researchers from the University of Chicago asked a representative sample of adults aged 57 to 85 to take part in a quick smell test. The assessment involved identifying distinct odours encased on the tips of felt-tip pens. The smells included peppermint, fish, orange, rose and leather. Five years later, some 39% of adults who had the lowest scores (4-5 errors) had passed away, compared with 19% with moderate smell loss and just 10% with a healthy sense of smell (0-1 errors). And despite taking issues such as age, nutrition, smoking habits, poverty and overall health into account, researchers found those with the poorest sense of smell were still at greatest risk. Lead scientist Prof Jayant Pinto said: "We think loss of the sense of smell is like the canary in the coal mine." BBC © 2014
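The gap between those mortality figures is easiest to read as a relative risk. A minimal sketch in Python (the percentages are the ones reported above; the function name and group labels are illustrative, not from the study):

```python
# Five-year mortality by performance on the five-odour smell test,
# using the percentages reported in the article (group labels are ours).
mortality = {
    "poorest (4-5 errors)": 0.39,
    "moderate loss": 0.19,
    "healthy (0-1 errors)": 0.10,
}

def relative_risk(risk_group: float, risk_reference: float) -> float:
    """Ratio of the risk in one group to the risk in a reference group."""
    return risk_group / risk_reference

rr = relative_risk(mortality["poorest (4-5 errors)"],
                   mortality["healthy (0-1 errors)"])
print(f"Relative risk of death within five years: {rr:.1f}x")  # → 3.9x
```

Note that this is an unadjusted ratio; the study's own analysis controlled for age, nutrition, smoking, poverty and overall health before concluding the association held.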
By Fredrick Kunkle Here’s something to worry about: A recent study suggests that middle-age women whose personalities tend toward the neurotic run a higher risk of developing Alzheimer’s disease later in life. The study by researchers at the University of Gothenburg in Sweden followed a group of women in their 40s, whose disposition made them prone to anxiety, moodiness and psychological distress, to see how many developed dementia over the next 38 years. In line with other research, the study suggested that women who were the most easily upset by stress — as determined by a commonly used personality test — were two times more likely to develop Alzheimer’s disease than women who were least prone to neuroticism. In other words, personality really is — in some ways — destiny. “Most Alzheimer’s research has been devoted to factors such as education, heart and blood risk factors, head trauma, family history and genetics,” study author Lena Johansson said in a written statement. “Personality may influence the individual’s risk for dementia through its effect on behavior, lifestyle or reactions to stress.” The researchers cautioned that the results cannot be extrapolated to men because they were not included in the study and that further work is needed to determine possible causes for the link. The study, which appeared Wednesday in the American Academy of Neurology’s journal, Neurology, examined 800 women whose average age in 1968 was 46 years to see whether neuroticism — which involves being easily distressed and subject to excessive worry, jealousy or moodiness — might have a bearing on the risk of dementia.
Have you ever wrongly suspected that other people are out to harm you? Have you been convinced that you’re far more talented and special than you really are? Do you sometimes hear things that aren’t actually there? These experiences – paranoia, grandiosity and hallucinations in the technical jargon – are more common among the general population than is usually assumed. But are people who are susceptible simply “made that way”? Are they genetically predisposed, in other words, or have their life experiences made them more vulnerable to these things? It’s an old debate: which is more important, nature or nurture? Scientists nowadays tend to agree that human psychology is a product of a complex interaction between genes and experience – which is all very well, but where does the balance lie? Scientists (including one of the authors of this blog) recently conducted the first ever study among the general population of the relative contributions of genes and environment to the experience of paranoia, grandiosity and hallucinations. How did we go about the research? First, it is important to be clear about the kinds of experience we measured. By paranoia, we mean the unfounded or excessive fear that other people are out to harm us. Grandiosity denotes an unrealistic conviction of one’s abilities and talents. Hallucinations are sensory experiences (hearing voices, for instance) that aren’t caused by external events. Led by Dr Angelica Ronald at Birkbeck, University of London, the team analysed data on almost 5,000 pairs of 16-year-old twins. This is the classical twin design, a standard method for gauging the relative influence of genes and environment. Looking simply at family traits isn’t sufficient: although family members share many genes, they also tend to share many of the same experiences. This is why studies involving twins are so useful. © 2014 Guardian News and Media Limited
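The classical twin design described above works by comparing how similar identical (MZ) twins are, who share essentially all their genes, with how similar fraternal (DZ) twins are, who share about half. A common first-pass estimate is Falconer's formula, h² = 2(r_MZ − r_DZ). A minimal sketch (the correlation values below are invented for illustration and are not results from the study):

```python
def falconer(r_mz: float, r_dz: float) -> dict:
    """First-pass variance decomposition from twin correlations (Falconer's formula).

    h2: additive genetic variance (heritability)
    c2: shared-environment variance
    e2: non-shared environment (plus measurement error)
    """
    h2 = 2 * (r_mz - r_dz)  # MZ pairs share ~twice the additive genetic variance of DZ pairs
    c2 = r_mz - h2          # MZ similarity not explained by additive genes
    e2 = 1.0 - r_mz         # whatever makes even identical twins differ
    return {"h2": h2, "c2": c2, "e2": e2}

# Hypothetical correlations for, say, a paranoia score in MZ vs DZ pairs:
print(falconer(r_mz=0.50, r_dz=0.25))  # → {'h2': 0.5, 'c2': 0.0, 'e2': 0.5}
```

Modern twin studies like the one above fit these components with maximum-likelihood structural models rather than this simple arithmetic, but the logic is the same.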
Link ID: 20147 - Posted: 10.02.2014
by Jason M. Breslow As the NFL nears an end to its long-running legal battle over concussions, new data from the nation’s largest brain bank focused on traumatic brain injury has found evidence of a degenerative brain disease in 76 of the 79 former players it’s examined. The findings represent a more than twofold increase in the number of cases of chronic traumatic encephalopathy, or CTE, that have been reported by the Department of Veterans Affairs’ brain repository in Bedford, Mass. Researchers there have now examined the brain tissue of 128 football players who, before their deaths, played the game professionally, semi-professionally, in college or in high school. Of that sample, 101 players, or just under 80 percent, tested positive for CTE. To be sure, players represented in the data represent a skewed population. CTE can only be definitively identified posthumously, and many of the players who have donated their brains for research suspected that they may have had the disease while still alive. For example, former Chicago Bears star Dave Duerson committed suicide in 2011 by shooting himself in the chest, reportedly to preserve his brain for examination. Nonetheless, Dr. Ann McKee, the director of the brain bank, believes the findings suggest a clear link between football and traumatic brain injury. “Obviously this high percentage of living individuals is not suffering from CTE,” said McKee, a neuropathologist who directs the brain bank as part of a collaboration between the VA and Boston University’s CTE Center. But “playing football, and the higher the level you play football and the longer you play football, the higher your risk.” ©1995-2014 WGBH Educational Foundation
Keyword: Brain Injury/Concussion
Link ID: 20146 - Posted: 10.01.2014
By Gretchen Reynolds Exercise may help to safeguard the mind against depression through previously unknown effects on working muscles, according to a new study involving mice. The findings may have broad implications for anyone whose stress levels threaten to become emotionally overwhelming. Mental health experts have long been aware that even mild, repeated stress can contribute to the development of depression and other mood disorders in animals and people. Scientists have also known that exercise seems to cushion against depression. Working out somehow makes people and animals emotionally resilient, studies have shown. But precisely how exercise, a physical activity, can lessen someone’s risk for depression, a mood state, has been mysterious. So for the new study, which was published last week in Cell, researchers at the Karolinska Institute in Stockholm delved into the brains and behavior of mice in an intricate and novel fashion. Mouse emotions are, of course, opaque to us. We can’t ask mice if they are feeling cheerful or full of woe. Instead, researchers have delineated certain behaviors that indicate depression in mice. If animals lose weight, stop seeking out a sugar solution when it’s available — because, presumably, they no longer experience normal pleasures — or give up trying to escape from a cold-water maze and just freeze in place, they are categorized as depressed. And in the new experiment, after five weeks of frequent but intermittent, low-level stress, such as being restrained or lightly shocked, mice displayed exactly those behaviors. They became depressed. The scientists could then have tested whether exercise blunts the risk of developing depression after stress by having mice run first. But, frankly, from earlier research, they knew it would. They wanted to parse how. So they bred pre-exercised mice. © 2014 The New York Times Company
Link ID: 20145 - Posted: 10.01.2014
By Sarah C. P. Williams A wind turbine, a roaring crowd at a football game, a jet engine running full throttle: Each of these things produces sound waves that are well below the frequencies humans can hear. But just because you can’t hear the low-frequency components of these sounds doesn’t mean they have no effect on your ears. Listening to just 90 seconds of low-frequency sound can change the way your inner ear works for minutes after the noise ends, a new study shows. “Low-frequency sound exposure has long been thought to be innocuous, and this study suggests that it’s not,” says audiology researcher Jeffery Lichtenhan of the Washington University School of Medicine in St. Louis, who was not involved in the new work. Humans can generally sense sounds at frequencies between 20 and 20,000 cycles per second, or hertz (Hz)—although this range shrinks as a person ages. Prolonged exposure to loud noises within the audible range has long been known to cause hearing loss over time. But establishing the effect of sounds with frequencies under about 250 Hz has been harder. Even though they’re above the lower limit of 20 Hz, these low-frequency sounds tend to be either inaudible or barely audible, and people don’t always know when they’re exposed to them. For the new study, neurobiologist Markus Drexl and colleagues at the Ludwig Maximilian University in Munich, Germany, asked 21 volunteers with normal hearing to sit inside soundproof booths and then played a 30-Hz sound for 90 seconds. The deep, vibrating noise, Drexl says, is about what you might hear “if you open your car windows while you’re driving fast down a highway.” Then, they used probes to record the natural activity of the ear after the noise ended, taking advantage of a phenomenon dubbed spontaneous otoacoustic emissions (SOAEs) in which the healthy human ear itself emits faint whistling sounds. © 2014 American Association for the Advancement of Science
Link ID: 20144 - Posted: 10.01.2014
By Brian Bienkowski and Environmental Health News Babies born to mothers with high levels of perchlorate during their first trimester are more likely to have lower IQs later in life, according to a new study. The research is the first to link pregnant women's perchlorate levels to their babies’ brain development. It adds to evidence that the drinking water contaminant may disrupt thyroid hormones that are crucial for proper brain development. Perchlorate, which is both naturally occurring and manmade, is used in rocket fuel, fireworks and fertilizers. It has been found in 4 percent of U.S. public water systems serving an estimated 5 to 17 million people, largely near military bases and defense contractors in the U.S. West, particularly around Las Vegas and in Southern California. “We would not recommend action on perchlorate levels from this study alone, although our report highlights a pressing need for larger studies of perchlorate levels from the general pregnant population and those with undetected hypothyroidism,” the authors from the United Kingdom, Italy and Boston wrote in the study published in The Journal of Clinical Endocrinology & Metabolism. The Environmental Protection Agency for decades has debated setting a national drinking water standard for perchlorate. The agency in 2011 announced it would start developing a standard, reversing an earlier decision. In the meantime, two states, California and Massachusetts, have set their own standards. © 2014 Scientific American
By Michael Häusser Use light to read out and control neural activity! This idea, so easily expressed and understood, has fired the imagination of neuroscientists for decades. The advantages of using light as an effector are obvious [1]: it is noninvasive, can be targeted with exquisite spatial and temporal precision, can be used simultaneously at multiple wavelengths and locations, and can report the presence or activity of specific molecules. However, despite early progress [2] and encouragement [3], it is only recently that widely usable approaches for optical readout and manipulation of specific neurons have become available. These new approaches rely on genetically encoded proteins that can be targeted to specific neuronal subtypes, giving birth to the term 'optogenetics' to signal the combination of genetic targeting and optical interrogation [4]. On the readout side, highly sensitive probes have been developed for imaging synaptic release, intracellular calcium (a proxy for neural activity) and membrane voltage. On the manipulation side, a palette of proteins for both activation and inactivation of neurons with millisecond precision using different wavelengths of light has been identified and optimized. The extraordinary versatility and power of these new optogenetic tools are spurring a revolution in neuroscience research, and they have rapidly become part of the standard toolkit of thousands of research labs around the world. Although optogenetics may not yet be a household word (though try it on your mother; she may surprise you), there can be no better proof that optogenetics has become part of the scientific mainstream than the 2013 Brain Prize being awarded to the sextet that pioneered optogenetic manipulation (http://www.thebrainprize.org/flx/prize_winners/prize_winners_2013/) and the incorporation of optogenetics as a central plank in the US National Institutes of Health BRAIN Initiative [5].
Moreover, there is growing optimism about the prospect of using optogenetic probes not only to understand mechanisms of disease in animal models but also to treat disease in humans, particularly in more accessible parts of the brain such as the retina [6]. © 2014 Macmillan Publishers Limited
Keyword: Brain imaging
Link ID: 20142 - Posted: 10.01.2014
It's not just humans who want the latest gadget. Wild chimpanzees that see a friend making and using a nifty new kind of tool are likely to make one for themselves, scientists report. "Our study adds new evidence supporting the hypothesis that some of the behavioural diversity seen in wild chimpanzees is the result of social transmission and can therefore be interpreted as cultural," an international research team writes today in the journal PLOS ONE. The findings suggest that the ability of individuals to learn from one another originated long ago in a common ancestor of chimpanzees and humans, the researchers add. "This study tells us that chimpanzee culture changes over time, little by little, by building on previous knowledge found within the community," said Thibaud Gruber, a co-author of the study, in a statement. "This is probably how our early ancestors' cultures also changed over time." Scientists already knew that chimpanzees in different groups have certain behaviours unique to their group, such as using a particular kind of tool. They suspected that wild chimpanzees learn those behaviours from other chimpanzees within their group, as scientists have observed in captive chimps. But they could never be sure. The new study documents the spread of two new behaviours among chimpanzees living in Uganda's Budongo Forest. It shows that chimps learned one of them — the making and use of a new tool called a moss sponge — by observing other chimps who had already adopted the behaviour. Chimps dip the tool in water and then put it in their mouth to drink. © CBC 2014
Link ID: 20141 - Posted: 10.01.2014
By Tanya Lewis and LiveScience Dolphins can now add magnetic sense to their already impressive resume of abilities, new research suggests. When researchers presented the brainy cetaceans with magnetized or unmagnetized objects, the dolphins swam more quickly toward the magnets, the new study found. The animals may use their magnetic sense to navigate based on the Earth's magnetic field, the researchers said. A number of different animals are thought to possess this magnetic sense, called "magnetoreception," including turtles, pigeons, rodents, insects, bats and even deer (which are related to dolphins), said Dorothee Kremers, an animal behavior expert at the University of Rennes, in France, and co-author of the study published today (Sept. 29) in the journal Naturwissenschaften. "Inside the ocean, the magnetic field would be a very good cue to navigate," Kremers told Live Science. "It seems quite plausible for dolphins to have a magnetic sense." Some evidence suggests both dolphin and whale migration routes and offshore live strandings may be related to the Earth's magnetic field, but very little research has investigated whether these animals have a magnetic sense. Kremers and her colleagues found just one study that looked at how dolphins reacted to magnetic fields in a pool; that study found dolphins didn't show any response to the magnetic field. But the animals in that study weren't free to move around, and were trained to give certain responses. © 2014 Scientific American
Keyword: Animal Migration
Link ID: 20140 - Posted: 10.01.2014
By Jia You Fish larvae emit sound—much to the surprise of biologists. A common coral reef fish in Florida, the gray snapper—Lutjanus griseus (pictured above)—hatches in the open ocean and spends its juvenile years in food-rich seagrass beds hiding from predators before settling in the reefs as an adult. To study how larval snappers orient themselves in the dark, marine biologists deployed transparent acrylic chambers equipped with light and sound sensors under the water to capture the swimming schools as they travel to the seagrass beds on new-moon nights. The larval snappers make a short “knock” sound that adults also make, as well as a long “growl” sound, the team reports online today in Biology Letters. The researchers suspect that the larvae use the acoustic signals to communicate with one another and stay together in schools. If so, human noise pollution could be interrupting their communications—even adult fish have been found to “yell” to be heard above boat noises. © 2014 American Association for the Advancement of Science.
Wild marmosets in the Brazilian forest can learn quite successfully from video demonstrations featuring other marmosets, Austrian scientists have reported, showing not only that marmosets are even better learners than previously known, but that video can be used successfully in experiments in the wild. Tina Gunhold, a cognitive biologist at the University of Vienna, had worked with a population of marmoset monkeys in a bit of Brazilian forest before this particular experiment. The forest is not wilderness. It lies near some apartment complexes, and the marmosets are somewhat used to human beings. But the monkeys are wild, and each extended family group has its own foraging territory. Dr. Gunhold and her colleagues reported in the journal Biology Letters this month that they had tested 12 family groups, setting up a series of video monitors, each with a kind of complicated box that they called an “artificial fruit.” All the boxes contained food. Six of the monitors showed just an unchanging image of a marmoset near a similar box. Three of them showed a marmoset opening the box by pulling a drawer, and three others a marmoset lifting a lid to get at the food. Marmosets are very territorial and would not tolerate a strange individual on their turf, but the image of a strange marmoset on video didn’t seem to bother them. Individual marmosets “differed in their reactions to the video,” Dr. Gunhold said. “Some were more shy, some more bold. The younger ones were more attracted to the video, perhaps because of greater curiosity.” © 2014 The New York Times Company
By Larry Greenemeier Former Grateful Dead percussionist Mickey Hart takes pride in his brain. Large, anatomically realistic 3-D animations representing the inner workings of his gray and white matter have graced video screens at several science and technology conferences. These “Glass Brain” visualizations use imaging and advanced computing systems to depict in colorful detail the fiber pathways that make Hart’s brain tick. The researchers behind the project hope it will also form the basis of a new type of tool for the diagnosis and treatment of neurological disorders. Each Glass Brain animation overlays electroencephalography (EEG) data collected in real time atop a magnetic resonance imaging (MRI) scan—in this case Hart’s—to illustrate how different brain areas communicate with each other. Special algorithms coded into software digitally reconstruct this activity within the brain. The result is a tour of the brain that captures both the timing and location of brain signals. Hart demonstrated the Glass Brain at a computer conference in San Jose, Calif., this past March by playing a video game called NeuroDrummer on stage. The drummer is working with the Studio Bee digital animation house in San Francisco as well as the Glass Brain’s creators to develop NeuroDrummer into a tool that can determine whether teaching someone to keep a drumbeat might help improve the neural signals responsible for cognition, memory and other functions. The Glass Brain’s brain trust includes the University of California, San Francisco’s Neuroscape Lab as well as the University of California, San Diego’s Swartz Center for Computational Neuroscience, EEG maker Cognionics, Inc. and NVIDIA, a maker of extremely fast graphics processing unit (GPU) computer chips and host of the conference where Hart performed. © 2014 Scientific American,
Keyword: Brain imaging
Link ID: 20137 - Posted: 09.30.2014
By David Z. Hambrick, Fernanda Ferreira, and John M. Henderson A decade ago, Magnus Carlsen, who at the time was only 13 years old, created a sensation in the chess world when he defeated former world champion Anatoly Karpov at a chess tournament in Reykjavik, Iceland, and the next day played then-top-rated Garry Kasparov—who is widely regarded as the best chess player of all time—to a draw. Carlsen’s subsequent rise to chess stardom was meteoric: grandmaster status later in 2004; a share of first place in the Norwegian Chess Championship in 2006; youngest player ever to reach World No. 1 in 2010; and highest-rated player in history in 2012. What explains this sort of spectacular success? What makes someone rise to the top in music, games, sports, business, or science? This question is the subject of one of psychology’s oldest debates. In the late 1800s, Francis Galton—founder of the scientific study of intelligence and a cousin of Charles Darwin—analyzed the genealogical records of hundreds of scholars, artists, musicians, and other professionals and found that greatness tends to run in families. For example, he counted more than 20 eminent musicians in the Bach family. (Johann Sebastian was just the most famous.) Galton concluded that experts are “born.” Nearly half a century later, the behaviorist John Watson countered that experts are “made” when he famously guaranteed that he could take any infant at random and “train him to become any type of specialist [he] might select—doctor, lawyer, artist, merchant-chief and, yes, even beggar-man and thief, regardless of his talents.” Practice alone does not settle the question, though: in one study of chess players, one player needed 22 times more deliberate practice than another to become a master. © 2014 The Slate Group LLC.
Keyword: Learning & Memory
Link ID: 20136 - Posted: 09.30.2014
By Gary Stix If it’s good for the heart, it could also be good for the neurons, astrocytes and oligodendrocytes, cells that make up the main items on the brain’s parts list. The heart-brain adage comes from epidemiological studies showing that people with cardiovascular risk factors such as high blood pressure and elevated cholesterol levels may be more at risk for Alzheimer’s and other dementias. This connection between heart and brain has also led to some disappointments: clinical trials of lipid-lowering statins have not helped patients diagnosed with Alzheimer’s, although epidemiological studies suggest that long-term use of the drugs may help prevent Alzheimer’s and other dementias. The link between head and heart is still being pursued because new Alzheimer’s drugs have failed time and again. One approach that is now drawing some interest looks at the set of proteins that carry around fats in the brain. These lipoproteins could potentially act as molecular sponges that mop up the amyloid-beta peptide that clogs up connections among brain cells in Alzheimer’s. One of these proteins—Apolipoprotein J, also known as clusterin—intrigues researchers because of the way it interacts with amyloid-beta and the status of its gene as a risk factor for Alzheimer’s. A researcher from the University of Minnesota, Ling Li, recently presented preliminary work at the Alzheimer’s Disease Drug Discovery Foundation annual meeting showing that, at least in a lab dish, a molecule made up of a group of amino acids from APOJ is capable of protecting against the toxicity of the amyloid-beta peptide. It also quelled inflammation and promoted the health of synapses—the junctions where one brain cell encounters another. Earlier work by another group showed that the peptide prevented the development of lesions in the blood vessels of animals.
Link ID: 20135 - Posted: 09.30.2014