Most Recent Links
by John Bohannon WASHINGTON, D.C.—People may grow wiser with age, but they don't grow smarter. Many of our mental abilities decline after midlife, and now researchers say that they've fingered a culprit. A study presented here last week at the annual meeting of the Association for Psychological Science points to microbleeding in the brain caused by stiffening arteries. The finding may lead to new therapies to combat senior moments. This isn't the first time that microbleeds have been suspected as a cause of cognitive decline. "We have known [about them] for some time thanks to neuroimaging studies," says Matthew Pase, a psychology Ph.D. student at Swinburne University of Technology in Melbourne, Australia. The brains of older people are sometimes peppered with dark splotches where blood vessels have burst and created tiny dead zones of tissue. How important these microbleeds are to cognitive decline, and what causes them, have remained open questions, however. Pase wondered if high blood pressure might be behind the microbleeds. The brain is a very blood-hungry organ, he notes. "It accounts for only 2% of the body weight yet receives 15% of the cardiac output and consumes 20% of the body's oxygen expenditure." Rather than getting the oxygen in pulses, the brain needs a smooth, continuous supply. So the aorta, the largest blood vessel branching off the heart, smooths out blood pressure before it reaches the brain by absorbing the pressure with its flexible walls. But as people age, the aorta stiffens. That translates to higher pressure on the brain, especially during stress. The pulse of blood can be strong enough to burst vessels in the brain, resulting in microbleeds. © 2010 American Association for the Advancement of Science
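The smoothing role the article ascribes to the aorta is often captured with the classic two-element Windkessel model, in which arterial compliance buffers pulsatile inflow. A minimal sketch, with entirely hypothetical parameter values rather than anything from Pase's study, shows how a stiffer (lower-compliance) aorta passes larger pressure pulses downstream:

```python
# Minimal two-element Windkessel sketch: C * dP/dt = Q(t) - P/R, where C is
# arterial compliance and R peripheral resistance. All values are hypothetical
# and in arbitrary units; this is an illustration, not the study's model.
import math

def pulse_pressure(C, R=1.0, rate_hz=1.2, dt=1e-3, t_end=20.0):
    """Integrate the ODE with a half-rectified sinusoidal inflow and return
    the steady-state pulse pressure (max - min over the final beat)."""
    P, t, history = 80.0, 0.0, []
    while t < t_end:
        Q = 100.0 * max(0.0, math.sin(2 * math.pi * rate_hz * t))  # pulsatile inflow
        P += dt * (Q - P / R) / C   # Euler step of the Windkessel equation
        t += dt
        history.append(P)
    beat = history[-int(1 / (rate_hz * dt)):]  # samples in one cardiac cycle
    return max(beat) - min(beat)

print(pulse_pressure(C=1.5))  # compliant ("young") aorta: small pressure swings
print(pulse_pressure(C=0.5))  # stiff ("aged") aorta: markedly larger pulses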
Keyword: Stroke; Learning & Memory
Link ID: 18202 - Posted: 05.30.2013
By Suzanne Corkin My friend’s father was a neurosurgeon. As a child, I had no idea what a neurosurgeon did. Years later, when I was a graduate student in the Department of Psychology at McGill University, this man reentered my life. While reading articles on memory in medical journals, I came across a report by a doctor who had performed a brain operation to cure a young man’s intractable epilepsy. The operation caused the patient to lose his capacity to establish new memories. The doctor who coauthored the article was my friend’s father, William Beecher Scoville. The patient was Henry. This childhood connection to Henry’s neurosurgeon made reading about the “amnesic patient, H.M.” more compelling. Later, when I joined Brenda Milner’s laboratory at the Montreal Neurological Institute, Henry’s case fell into my lap. For my PhD thesis, I was able to test him in 1962 when he came to Milner’s lab for scientific study. She had been the first psychologist to test Henry after his operation, and her 1957 paper with Scoville, describing Henry’s operation and its awful consequences, revolutionized the science of memory. I was trying to expand the scientific understanding of Henry’s amnesia by examining his memory through his sense of touch, his somatosensory system. My initial investigation with him was focused and brief, lasting one week. After I moved to MIT, however, Henry’s extraordinary value as a research participant became clear to me, and I went on to study him for the rest of his life, forty-six years. Since his death, I have dedicated my work to linking fifty-five years of rich behavioral data to what we will learn from his autopsied brain. © 2012 Popular Science
Keyword: Learning & Memory
Link ID: 18201 - Posted: 05.30.2013
By Gary Stix The Obama administration’s Big Brain project—$100 million for a map of some sort of what lies beneath the skull—has captured the attention of the entire field of neuroscience. The magnitude of the cash infusion can’t help but draw notice, eliciting huzzahs mixed with gripes that the whole effort might sap support for other, perhaps equally worthy, neuro-related endeavors. The Brain Activity Map Project—or the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative—is intended to give researchers tools to elicit the real-time functioning of neural circuits, providing a better picture of what happens in the brain when it is immersed in thought or when brain cells are beset by a degenerative condition like Parkinson’s or Alzheimer’s. Current technologies are either too slow or lack the resolution to achieve these goals. One strength of the organizers—perhaps a portent of good things to come—is that they don’t seem to mind opening themselves to public critiques. At a planning meeting earlier this month, George Whitesides, the eminent Harvard chemist and veteran of big government ventures in support of nanotechnology, weighed in on how the project appeared to an informed outsider. Edited excerpts of some of his comments follow. This posting is a bit long, but Whitesides is eloquent and it’s worth reading what he has to say because his views apply to any large-scale sci-tech foray. Whitesides began his talk after listening to a steady cavalcade of big-name neuroscientists furnish their personal wish lists for the program: ultrasound to induce focal lesions, more fruit fly studies to find computational nervous system primitives, more studies on zebra fish, studies on wholly new types of model organisms, avoiding too much emphasis on practical applications and so on. © 2013 Scientific American
Keyword: Brain imaging
Link ID: 18200 - Posted: 05.30.2013
Brain cells have been grown from skin cells of adults with Down's syndrome in research that could shed new light on the condition. US scientists found a reduction in connections among the brain cells and possible faults in genes that protect the body from ageing. The research in the Proceedings of the National Academy of Sciences gives an insight into early brain development. Down's syndrome results from an extra copy of one chromosome, chromosome 21. This generally causes some level of learning disability and a range of distinctive physical features. A team led by Anita Bhattacharyya, a neuroscientist at the Waisman Center at the University of Wisconsin-Madison, grew brain cells from skin cells of two individuals with Down's syndrome. This involved reprogramming skin cells to transform them into a type of stem cell that could be turned into any cell in the body. Brain cells were then grown in the lab, providing a way to look at early brain development in Down's syndrome. One significant finding was a reduction in connections among the neurons, said Dr Bhattacharyya. "They communicate less, are quieter. This is new, but it fits with what little we know about the Down syndrome brain." Brain cells communicate through connections known as synapses. The brain cells in Down's syndrome individuals had only about 60% of the usual number of synapses and about 60% of the usual synaptic activity. BBC © 2013
Keyword: Development of the Brain; Genes & Behavior
Link ID: 18199 - Posted: 05.28.2013
How long a toddler sleeps at night depends in part on genes, but environmental factors seem to make more of a difference for naps, a study of nearly 1,000 Canadian twins suggests. Researchers asked parents of 405 identical and 586 fraternal infants born in the Montreal area to answer questions about daytime and nighttime sleep habits at ages six months, 18 months, 30 months and 48 months. "This study is the first to show that daytime sleep duration in early childhood is strongly influenced by environmental factors," Dr. Jacques Montplaisir from the University of Montreal and his colleagues concluded in Monday's issue of the journal Pediatrics. At most ages, genetics accounted for between 47 and 58 per cent of nighttime sleep duration. The majority of children slept 10 or 11 continuous hours at night. The exception was nighttime sleep at 18 months, which the researchers called "a critical environmental time-window" for establishing sleep patterns. On the other hand, genes never explained more than about one-third of daytime nap time. Environmental factors like family routines accounted for between 33 and 79 per cent of the variation in whether or not twins napped and for how long. © CBC 2013
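Percentages like these come from the standard twin-design logic of comparing identical and fraternal twin correlations. A quick sketch using Falconer's formulas (the correlations below are hypothetical placeholders, not the study's figures):

```python
# Rough sketch of how twin designs partition variance (Falconer's formulas).
# The twin correlations used here are invented for illustration.
def ace_estimates(r_mz, r_dz):
    """Return (a2, c2, e2): additive-genetic, shared-environment,
    and unique-environment proportions of variance."""
    a2 = 2 * (r_mz - r_dz)   # heritability
    c2 = r_mz - a2           # shared environment = 2*r_dz - r_mz
    e2 = 1 - r_mz            # unique environment plus measurement error
    return a2, c2, e2

# e.g. identical twins correlating 0.70 on nighttime sleep, fraternal twins 0.45:
print(ace_estimates(0.70, 0.45))  # -> (0.50, 0.20, 0.30): ~50% genetic
```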
Keyword: Sleep; Genes & Behavior
Link ID: 18198 - Posted: 05.28.2013
By Scicurious My eye was caught last week by a piece in Scientific American proper asking “is ketamine the next big depression drug?” It’s a good piece, and I appreciate the balance in the article, but I was also kind of surprised that…it took so long. There have been previous media rumblings (and blog posts) about ketamine through the years, so I’m rather curious as to why the article came out now (maybe there’s another new paper out? I didn’t see any referenced and couldn’t find anything). To be honest, while yes, ketamine has a lot of interesting potential, it’s not really quite as “new” as you might think. The first major clinical reports of ketamine as an effective antidepressant actually date back to 2000. Since then, scientists have been spending a lot of time trying to figure out WHY a drug usually used to knock out horses, or abused for its perception-changing qualities, acts as an antidepressant. And not just any antidepressant, but an almost miracle drug (maybe), helping people who respond to no other treatment, with effects of a single dose occurring within hours (current antidepressants take weeks) and lasting for weeks. And for all that…they don’t know how it works. So I saw the article, and I wanted to write a bit of follow-up. Because yes, while we don’t know QUITE how ketamine works…we have some ideas. And here is one of them. Ketamine isn’t like the current drugs used as antidepressants. Current drugs, like Prozac, affect chemical neurotransmitters like serotonin. © 2013 Scientific American
Keyword: Depression
Link ID: 18197 - Posted: 05.28.2013
Linda Carroll TODAY contributor Each day brings Jenn McNary another dose of hope and heartache as she watches one son get healthier while the other becomes sicker. Both of McNary's sons were born with Duchenne muscular dystrophy. Max, 11, is receiving an experimental therapy that appears to be making him better, while 14-year-old Austin is slowly dying. Austin was too sick to be included in the clinical trials for a promising new drug called Eteplirsen. “He can’t get into a chair, out of his wheelchair, into his bed and onto the toilet,” McNary told NBC’s Janet Shamlian. Max, however, was exactly what researchers were looking for. He was put on Eteplirsen, and now he's back to running around, climbing stairs and even playing soccer. “It’s a miracle,” McNary said. “It really is a miracle drug. This is something that nobody ever expected and he looks like an almost normal 11-year-old.” Eteplirsen is designed to partially repair one of the common genetic mutations that causes DMD. Even a partial repair may be enough to improve life for boys struck by the condition, which results from a defect in the dystrophin gene. That gene resides on the X chromosome, which is why only boys end up with DMD. Boys get one X and one Y chromosome. Girls get two copies of the X chromosome — one from their mother and one from their father — so even if they inherit a defective copy from their mom, they get a healthy one from their dad. Although they won’t suffer symptoms, girls wind up with a 50 percent chance of being carriers for DMD.
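The 50 percent figure is plain X-linked arithmetic. A toy enumeration, assuming a carrier mother and an unaffected father (the labels are illustrative, not from the article):

```python
# Toy enumeration of X-linked inheritance for DMD (illustrative labels only):
# 'x' marks the X chromosome carrying the defective dystrophin gene.
from itertools import product

mother_alleles = ["X", "x"]  # carrier mother
father_alleles = ["X", "Y"]  # unaffected father

for m, f in product(mother_alleles, father_alleles):
    sex = "daughter" if f == "X" else "son"
    status = {"XX": "unaffected", "xX": "carrier",
              "XY": "unaffected", "xY": "affected"}[m + f]
    print(f"{sex}: {status}")
# The four outcomes are equally likely, so daughters have a 1/2 chance of
# being carriers and sons a 1/2 chance of being affected.
```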
Keyword: Muscles; Genes & Behavior
Link ID: 18196 - Posted: 05.28.2013
Obese mothers tend to have kids who become obese. Now provocative research suggests weight-loss surgery may help break that unhealthy cycle in an unexpected way — by affecting how their children's genes behave. In a first-of-a-kind study, Canadian researchers tested children born to obese women, plus their brothers and sisters who were conceived after the mother had obesity surgery. Youngsters born after mom lost lots of weight were slimmer than their siblings. They also had fewer risk factors for diabetes or heart disease later in life. More intriguing, the researchers discovered that numerous genes linked to obesity-related health problems worked differently in the younger siblings than in their older brothers and sisters. Clearly diet and exercise play a huge role in how fit the younger siblings will continue to be, and it's a small study. But the findings suggest the children born after mom's surgery might have an advantage. "The impact on the genes, you will see the impact for the rest of your life," predicted Marie-Claude Vohl of Laval University in Quebec City. She helped lead the work reported Monday in the journal Proceedings of the National Academy of Sciences. Why would there be a difference? It's not that mom passed on different genes, but how those genes operate in her child's body. The idea: Factors inside the womb seem to affect the dimmer switches that develop on a fetus' genes — chemical changes that make genes speed up or slow down or switch on and off. That in turn can greatly influence health. © CBC 2013
Keyword: Obesity; Development of the Brain
Link ID: 18195 - Posted: 05.28.2013
Chris Palmer Once thought to be a low-level form of pain, itch is instead a distinct sensation with a dedicated neural circuit linking cells in the periphery of the body to the brain, a study in mice suggests. Neuroscientists Mark Hoon and Santosh Mishra of the National Institute of Dental and Craniofacial Research in Bethesda, Maryland, searched for the molecule that encodes the sensation of itch by screening genes in sensory neurons that are activated by touch, heat, pain and itch. They found that one particular protein, called natriuretic polypeptide b, or Nppb, was expressed in only a subset of these neurons. Mutant mice lacking Nppb did not respond to itch-inducing compounds, but did respond normally to heat and pain. The researchers also found that when they injected Nppb in the mice's necks, it put them into a self-scratching frenzy. This occurred both in the mutants and in control mice. “Our research reveals the primary transmitter used by itch sensory neurons and confirms that itch is detected by specialized sensory neurons,” says Hoon. Hoon and Mishra went on to find neurons bearing receptors for Nppb in the spinal cord. Injection of a toxin made from soapwort seeds that targeted these spinal-cord neurons blocked itch responses, but not other sensory responses, suggesting that information about the itch sensation is transmitted along a distinct pathway. The researchers' results are published today in Science [1]. The result “explains problems in the literature and provides a very testable hypothesis for how itch works”, says Glenn Giesler, a neuroscientist at the University of Minnesota in Minneapolis. © 2013 Nature Publishing Group
Keyword: Pain & Touch
Link ID: 18194 - Posted: 05.25.2013
By Susan Milius Cockroaches that don’t fall for traps’ sweet poisons have evolved taste cells that register sugar as bitter. In certain groups of the widespread German cockroach (Blattella germanica), nerve cells that normally detect bitter, potentially toxic compounds now also respond to glucose, says entomologist Coby Schal of North Carolina State University in Raleigh. The “bitter” reaction suppresses the “sweet” response from other nerve cells, and the roach stops eating, Schal and his colleagues report in the May 24 Science. Normally roaches love sugar. But with these populations, a dab of jelly with glucose in it makes them “jump back,” Schal says. “The response is: ‘Yuck! Terrible!’” This quirk of roach taste explains why glucose-baited poison traps stopped working among certain roaches, Schal says. Such bait traps combining a pesticide with something delicious became popular during the mid-1980s. But in 1993, Jules Silverman, also a coauthor on the new paper, reported roaches avoiding these once-appealing baits. “This is a fascinating piece of work because it shows how quickly, and how simply, the sense of taste can evolve,” says neurobiologist Richard Benton of the University of Lausanne in Switzerland. What pest-control manufacturers put in their roach baits now, and whether some still use glucose, isn’t public, Schal says. But humankind’s arms race with cockroaches could have started long ago, “in the caves,” he says. In this back-and-forth struggle, it’s important “to understand what the cockroach is doing from a molecular basis.” © Society for Science & the Public 2000 - 2013
Keyword: Chemical Senses (Smell & Taste); Evolution
Link ID: 18193 - Posted: 05.25.2013
The reason we struggle to recall memories from our early childhood is down to high levels of neuron production during the first years of life, say Canadian researchers. The formation of new brain cells increases the capacity for learning but also clears the mind of old memories. The findings were presented to the Canadian Association of Neuroscience. An expert at City University in London said the mouse study called into question some psychological theories. Neurogenesis - the formation of new neurons in the hippocampus, a region of the brain known to be important for learning and remembering - reaches its peak before and after birth. It then declines steadily during childhood and adulthood. Dr Paul Frankland and Dr Sheena Josselyn, from the Hospital for Sick Children in Toronto and the University of Toronto, wanted to find out how the process of new neuron generation impacted on memory storage. They carried out their research on younger and older mice in the lab. In adult mice, they found that increasing neurogenesis after memory formation was enough to bring about forgetting. In infant mice, they discovered that decreasing neurogenesis after memory formation meant that the normal forgetting observed at this age did not occur. Their research suggests a direct link between a reduction in neuron growth and increased memory recall. They found the opposite to be true also - a decreased ability to remember when neurogenesis is increased (as happens during infancy). BBC © 2013
Keyword: Learning & Memory; Neurogenesis
Link ID: 18192 - Posted: 05.25.2013
By DANIEL BERGNER Linneah sat at a desk at the Center for Sexual Medicine at Sheppard Pratt in the suburbs of Baltimore and filled out a questionnaire. She read briskly, making swift checks beside her selected answers, and when she was finished, she handed the pages across the desk to Martina Miller, who gave her a round of pills. The pills were either a placebo or a new drug called Lybrido, created to stoke sexual desire in women. Checking her computer, Miller pointed out gently that Linneah hadn’t been doing her duty as a study participant. Over the past eight weeks, she took the tablets before she planned to have sex, and for every time she put a pill on her tongue, she was supposed to make an entry in her online diary about her level of lust. “I know, I know,” Linneah said. She is a 44-year-old part-time elementary-school teacher, and that day she wore red pants and a canary yellow scarf. (She asked that only a nickname be used to protect her privacy.) “It’s a mess. I keep forgetting.” Miller, a study coordinator, began a short interview, typing Linneah’s replies into a database that the medication’s Dutch inventor, Adriaan Tuiten, will present to the Food and Drug Administration this summer or fall as part of his campaign to win the agency’s approval and begin marketing what might become the first female-desire drug in America. “Thinking about your desire now,” Miller said, “would you say it is absent, very low, low, reasonable or present?” “Low.” This was no different from Linneah’s reply at the trial’s outset two months before. © 2013 The New York Times Company
Keyword: Sexual Behavior; Emotions
Link ID: 18191 - Posted: 05.25.2013
Helen Shen Bexarotene, a cancer drug touted as a potential treatment for Alzheimer’s disease, may not be the blockbuster remedy scientists were hoping for, according to several analyses published in Science on 24 May [1–4]. Four independent research groups report that they failed to fully replicate striking results published in the journal last year [5] by Gary Landreth, a neuroscientist at Case Western Reserve University School of Medicine in Cleveland, Ohio, and his colleagues. Landreth's team reported that the drug bexarotene could lower brain concentrations of the β-amyloid protein that has long been suspected as a key contributor to Alzheimer’s disease, and could even reverse cognitive impairments in diseased mice. But the study garnered particular attention for its claim that the drug could clear 50% of amyloid plaques — sticky clumps of the protein thought to interfere with brain function — in as little as 72 hours. “That attracted a lot of folks to try to replicate these studies,” says Philip Wong, a neuroscientist at Johns Hopkins University in Baltimore, Maryland. “No drug at the present moment can do things like that.” None of the follow-up studies published this week replicated the effects of bexarotene on plaques. Two groups did, however, confirm Landreth’s finding that the drug reduced levels of a soluble, free-floating form of β-amyloid, which can aggregate in plaques [4]. Not all of the papers examined memory in mice, but one group led by Radosveta Koldamova, a neuroscientist at the University of Pittsburgh in Pennsylvania, found that bexarotene treatment led to cognitive improvements [1]. © 2013 Nature Publishing Group
Keyword: Alzheimers
Link ID: 18190 - Posted: 05.25.2013
People with higher IQs are slow to detect large background movements because their brains filter out non-essential information, say US researchers. Instead, they are good at detecting small moving objects. The findings come in a study of 53 people given a simple visual test, published in Current Biology. The results could help scientists understand what makes a brain more efficient and more intelligent. In the study, individuals watched short video clips of black and white bars moving across a computer screen. Some clips were small and filled only the centre of the screen, while others filled the whole screen. The participants' sole task was to identify in which direction the bars were drifting - to the right or to the left. Participants also took a standardised intelligence test. The results showed that people with higher IQ scores were faster at noticing the movement of the bars when observing the smallest image - but they were slower at detecting movement in the larger images. Michael Melnick of the University of Rochester, who was part of the research team, said the results were very clear: "From previous research, we expected that all participants would be worse at detecting the movement of large images, but high IQ individuals were much, much worse." The authors explain that in most scenarios, background movement is less important than small moving objects in the foreground, for example when driving a car, walking down a hall or moving your eyes across the room. BBC © 2013
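Studies of this kind typically summarize the small-versus-large comparison as a spatial suppression index: the log ratio of how long the large stimulus must be shown, relative to the small one, before direction can be judged reliably. A sketch with invented thresholds (not the paper's actual numbers):

```python
# Sketch of a spatial-suppression index (hypothetical thresholds, not study data):
# the longer a viewer needs to see a LARGE moving pattern relative to a small one,
# the more strongly the visual system suppresses background-like motion.
import math

def suppression_index(threshold_large_ms: float, threshold_small_ms: float) -> float:
    return math.log10(threshold_large_ms / threshold_small_ms)

print(suppression_index(120.0, 40.0))  # ~0.48 -> strong suppression
print(suppression_index(60.0, 40.0))   # ~0.18 -> weaker suppression
```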
Keyword: Intelligence; Attention
Link ID: 18189 - Posted: 05.25.2013
By Neuroskeptic Newly discovered papers have shed light on a fascinating episode in the history of neuroscience: Weighing brain activity with the balance The story of the early Italian neuroscientist Dr Angelo Mosso and his ‘human circulation balance’ is an old one – I remember reading about it as a student, in the introductory bit of a textbook on fMRI – but until now, the exact details were murky. In the new paper, Italian neuroscientists Sandrone and colleagues report that they’ve unearthed Mosso’s original manuscripts from an archive in Milan. Mosso worked in the late 19th century, an era that was – in retrospect – right at the dawn of modern neuroscience. A major question at that time was the relationship between brain function and blood flow. His early work included studies of the blood pressure in the brains of individuals with skull defects. His most ambitious project, however, was his balance – or as he sometimes called it, according to his daughter, his ‘metal cradle’ or ‘machine to weigh the soul’. It was in essence just a large balance. A volunteer lay on a table, their head on one side of the scale’s pivot and their feet on the other. It was carefully adjusted so that the two sides were perfectly balanced. The theory was that if mental activity caused increased brain blood flow, it ought to increase the weight of the head relative to the rest of the body, so that side of the balance would fall.
Keyword: Brain imaging
Link ID: 18188 - Posted: 05.23.2013
By JOHN NOBLE WILFORD Modern mothers love to debate how long to breast-feed, a topic that stirs both guilt and pride. Now — in a very preliminary finding — the Neanderthals are weighing in. By looking at barium levels in the fossilized molar of a Neanderthal child, researchers concluded that the child had been breast-fed exclusively for the first seven months, followed by seven months of mother’s milk supplemented by other food. Then the barium pattern in the tooth enamel “returned to baseline prenatal levels, indicating an abrupt cessation of breast-feeding at 1.2 years of age,” the scientists reported on Wednesday in the journal Nature. While that timetable conforms with the current recommendations of the American Academy of Pediatrics — which suggests that mothers exclusively breast-feed babies for six months and continue for 12 months if possible — it represents a much shorter span of breast-feeding than practiced by apes or a vast majority of modern humans. The average age of weaning in nonindustrial populations is about 2.5 years; in chimpanzees in the wild, it is about 5.3 years. Of course, living conditions were much different for our evolutionary cousins, the Neanderthals, extinct for the last 30,000 years. The findings, which drew strong skepticism from some scientists, were meant to highlight a method of linking barium levels in teeth to dietary changes. In the Nature report, researchers from the United States and Australia described tests among human infants and captive macaques showing that traces of the element barium in tooth enamel appeared to accurately reflect transitions from mother’s milk through weaning. The barium levels rose during breast-feeding and fell off sharply on weaning. © 2013 The New York Times Company
Keyword: Development of the Brain; Evolution
Link ID: 18187 - Posted: 05.23.2013
Scientists have reversed behavioral and brain abnormalities in adult mice that resemble some features of schizophrenia by restoring normal expression to a suspect gene that is over-expressed in humans with the illness. Targeting expression of the gene Neuregulin 1, which makes a protein important for brain development, may hold promise for treating at least some patients with the brain disorder, say researchers funded by the National Institutes of Health. Like patients with schizophrenia, adult mice genetically engineered to have higher Neuregulin 1 levels showed reduced activity of the brain messenger chemicals glutamate and GABA. The mice also showed behaviors related to aspects of the human illness. For example, they interacted less with other animals and faltered on thinking tasks. “The deficits reversed when we normalized Neuregulin 1 expression in animals that had been symptomatic, suggesting that damage which occurred during development is recoverable in adulthood,” explained Lin Mei, M.D., Ph.D., of the Medical College of Georgia at Georgia Regents University, a grantee of NIH’s National Institute of Mental Health (NIMH). “While mouse models can’t really do full justice to a complex brain disorder that impairs our most uniquely human characteristics, this study demonstrates the potential of dissecting the workings of intermediate components of disorders in animals to discover underlying mechanisms and new treatment targets,” said NIMH Director Thomas R. Insel, M.D. “Hopeful news about how an illness process that originates early in development might be reversible in adulthood illustrates the promise of such translational research.” Schizophrenia is thought to stem from early damage to the developing fetal brain, traceable to a complex mix of genetic and environmental causes. Although genes identified to date account for only a small fraction of cases, evidence has implicated variation in the Neuregulin 1 gene. For example, postmortem studies have found that it is overexpressed in the brain's thinking hub, or prefrontal cortex, of some people who had schizophrenia. It codes for a chemical messenger that plays a pivotal role in communication between brain cells, as well as in brain development.
Keyword: Schizophrenia; Genes & Behavior
Link ID: 18186 - Posted: 05.23.2013
Virginia Hughes Late in the morning on 20 February, more than 200 people packed an auditorium at the Harvard School of Public Health in Boston, Massachusetts. The purpose of the event, according to its organizers, was to explain why a new study about weight and death was absolutely wrong. The report, a meta-analysis of 97 studies including 2.88 million people, had been released on 2 January in the Journal of the American Medical Association (JAMA) [1]. A team led by Katherine Flegal, an epidemiologist at the National Center for Health Statistics in Hyattsville, Maryland, reported that people deemed 'overweight' by international standards were 6% less likely to die than were those of 'normal' weight over the same time period. The result seemed to counter decades of advice to avoid even modest weight gain, provoking coverage in most major news outlets — and a hostile backlash from some public-health experts. “This study is really a pile of rubbish, and no one should waste their time reading it,” said Walter Willett, a leading nutrition and epidemiology researcher at the Harvard school, in a radio interview. Willett later organized the Harvard symposium — where speakers lined up to critique Flegal's study — to counteract that coverage and highlight what he and his colleagues saw as problems with the paper. “The Flegal paper was so flawed, so misleading and so confusing to so many people, we thought it really would be important to dig down more deeply,” Willett says. © 2013 Nature Publishing Group
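For reference, "overweight by international standards" means the standard WHO body-mass-index bands. A minimal helper using those cutoffs (the example person is invented):

```python
# Standard WHO BMI bands, the 'international standards' the study relies on.
def bmi_category(weight_kg: float, height_m: float) -> str:
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        return "underweight"
    if bmi < 25.0:
        return "normal"
    if bmi < 30.0:
        return "overweight"  # the group with 6% lower mortality in Flegal et al.
    return "obese"

print(bmi_category(78.0, 1.75))  # BMI ~25.5 -> 'overweight'
```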
Keyword: Obesity
Link ID: 18185 - Posted: 05.23.2013
By Bruce Bower Chaser isn’t just a 9-year-old border collie with her breed’s boundless energy, intense focus and love of herding virtually anything. She’s a grammar hound. In experiments directed by her owner, psychologist John Pilley of Wofford College in Spartanburg, S.C., Chaser demonstrated her grasp of the basic elements of grammar by responding correctly to commands such as “to ball take Frisbee” and its reverse, “to Frisbee take ball.” The dog had previous, extensive training to recognize classes of words including nouns, verbs and prepositions. “Chaser intuitively discovered how to comprehend sentences based on lots of background learning about different types of words,” Pilley says. He reports the results May 13 in Learning and Motivation. Throughout the first three years of Chaser’s life, Pilley and a colleague trained the dog to recognize and fetch more than 1,000 objects by name. Using praise and play as reinforcements, the researchers also taught Chaser the meaning of different types of words, such as verbs and prepositions. As a result, Chaser learned that phrases such as “to Frisbee” meant that she should take whatever was in her mouth to the named object. Exactly how the dog gained her command of grammar is unclear, however. Pilley suspects that Chaser first mentally linked each of two nouns she heard in a sentence to objects in her memory. Then the canine held that information in mind while deciding which of two objects to bring to which of two other objects. Pilley’s work follows controversial studies of grammar understanding in dolphins and a pygmy chimp. © Society for Science & the Public 2000 - 2013
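The reversal test works because word order alone assigns the roles. A toy parser for the two-object commands makes that structure explicit (purely illustrative; the study itself involved no code):

```python
# Toy parser for 'to X take Y' commands (illustrative only): the word after
# 'to' is the destination, the word after 'take' is the object to fetch.
def parse_command(cmd: str):
    words = cmd.lower().split()
    destination = words[words.index("to") + 1]
    obj = words[words.index("take") + 1]
    return obj, destination

print(parse_command("to ball take Frisbee"))  # ('frisbee', 'ball')
print(parse_command("to Frisbee take ball"))  # ('ball', 'frisbee')
```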
Keyword: Language; Evolution
Link ID: 18184 - Posted: 05.22.2013
Jeremy Laurance Iodine deficiency is widespread amongst pregnant women in the UK and may be harming the cognitive development of their children, scientists have found. The first large study of the problem in the UK has revealed that two-thirds of expectant mothers had a mild to moderate deficiency in the mineral, which was associated with significantly lower IQ and reading ability in their children at the ages of eight and nine. Iodine is essential for growth and development of the brain, and pregnant women need 50 per cent more. Researchers said women should ensure they are getting enough from their diet – milk, yogurt and fish are the best sources – and that any pregnancy supplement they take contains iodine. But they warned that kelp and seaweed supplements should be avoided as they contain variable levels of iodine and could lead to overdose. Severe iodine deficiency is known to cause brain damage and is the biggest cause of mental deficiency in the developing world. But mild to moderate iodine deficiency has been little studied – until now. Researchers from the Universities of Surrey and Bristol examined records of 1,000 mothers who were part of the Children of the 90s study which has followed the development of children born to 14,000 mothers in Avon since 1990-91. They found that 67 per cent of the mothers had levels of iodine below that recommended by the World Health Organisation. Their children were divided into groups according to how well they performed on IQ and reading tests at eight and nine. The results showed those whose mothers had low iodine levels were 60 per cent more likely to be in the bottom group. © independent.co.uk
Keyword: Intelligence; Development of the Brain
Link ID: 18183 - Posted: 05.22.2013