Chapter 13. Memory, Learning, and Development



By Elizabeth Kolbert
C57BL/6J mice are black, with pink ears and long pink tails. Inbred for the purposes of experimentation, they exhibit a number of infelicitous traits, including a susceptibility to obesity, a taste for morphine, and a tendency to nibble off their cage mates’ hair. They’re also tipplers. Given access to ethanol, C57BL/6J mice routinely suck away to the point that, were they to get behind the wheel of a Stuart Little-size roadster, they’d get pulled over for D.U.I. Not long ago, a team of researchers at Temple University decided to take advantage of C57BL/6Js’ bad habits to test a hunch. They gathered eighty-six mice and placed them in Plexiglas cages, either singly or in groups of three. Then they spiked the water with ethanol and videotaped the results. Half of the test mice were four weeks old, which, in murine terms, qualifies them as adolescents. The other half were twelve-week-old adults. When the researchers watched the videos, they found that the youngsters had, on average, outdrunk their elders. More striking still was the pattern of consumption. Young male C57BL/6Js who were alone drank roughly the same amount as adult males. But adolescent males with cage mates went on a bender; they spent, on average, twice as much time drinking as solo boy mice and about thirty per cent more time than solo girls. The researchers published the results in the journal Developmental Science. In their paper, they noted that it was “not possible” to conduct a similar study on human adolescents, owing to the obvious ethical concerns. But, of course, similar experiments are performed all the time, under far less controlled circumstances. Just ask any college dean. Or ask a teen-ager.

Keyword: Development of the Brain; Attention
Link ID: 21345 - Posted: 08.27.2015

While some research suggests that a diet high in omega-3 fatty acids can protect brain health, a large clinical trial by researchers at the National Institutes of Health found that omega-3 supplements did not slow cognitive decline in older persons. With 4,000 patients followed over a five-year period, the study is one of the largest and longest of its kind. It was published today in the Journal of the American Medical Association. “Contrary to popular belief, we didn’t see any benefit of omega-3 supplements for stopping cognitive decline,” said Emily Chew, M.D. Dr. Chew leads the Age-Related Eye Disease Study (AREDS), which was designed to investigate a combination of nutritional supplements for slowing age-related macular degeneration (AMD), a major cause of vision loss among older Americans. That study established that daily high doses of certain antioxidants and minerals — called the AREDS formulation — can help slow the progression to advanced AMD. A later study, called AREDS2, tested the addition of omega-3 fatty acids to the AREDS formula. But the omega-3s made no difference. Omega-3 fatty acids are made by marine algae and are concentrated in fish oils; they are believed to be responsible for the health benefits associated with regularly eating fish, such as salmon, tuna, and halibut. Where studies have surveyed people on their dietary habits and health, they’ve found that regular consumption of fish is associated with lower rates of AMD, cardiovascular disease, and possibly dementia. “We’ve seen data that eating foods with omega-3 may have a benefit for eye, brain, and heart health,” Dr. Chew explained.

Keyword: Alzheimers
Link ID: 21340 - Posted: 08.26.2015

By Laura Sanders
By tweaking a single gene, scientists have turned average mice into supersmart daredevils. The findings are preliminary but hint at therapies that may one day ease the symptoms of such disorders as Alzheimer’s disease and schizophrenia, scientists report August 14 in Neuropsychopharmacology. The altered gene provides instructions for a protein called phosphodiesterase-4B, or PDE4B, which has been implicated in schizophrenia. It’s too early to say whether PDE4B will turn out to be a useful target for drugs that treat these disorders, cautions pharmacologist Ernesto Fedele of the University of Genoa in Italy. Nonetheless, the protein certainly deserves further investigation, he says. The genetic change interfered with PDE4B’s ability to do its job breaking down a molecular messenger called cAMP. Mice designed to have this disabled form of PDE4B showed a suite of curious behaviors, including signs of smarts, says study coauthor Alexander McGirr of the University of British Columbia. Compared with normal mice, these mice more quickly learned which objects in a cage had been moved to a new location, for instance, and could better recognize a familiar mouse after 24 hours. “The system is primed and ready to learn, and it doesn’t require the same kind of input as a normal mouse,” McGirr says. These mice also spent more time than usual exploring brightly lit spaces, spots that normal mice avoid. But this devil-may-care attitude sometimes made the “smart” mice blind to risky situations. The mice were happy to spend time poking around an area that had been sprinkled with bobcat urine. “Not being afraid of cat urine is not a good thing for a mouse,” McGirr says. © Society for Science & the Public 2000 - 2015

Keyword: Learning & Memory; Schizophrenia
Link ID: 21338 - Posted: 08.26.2015

Helen Thomson
Genetic changes stemming from the trauma suffered by Holocaust survivors are capable of being passed on to their children, the clearest sign yet that one person’s life experience can affect subsequent generations. The conclusion from a research team at New York’s Mount Sinai hospital led by Rachel Yehuda stems from the genetic study of 32 Jewish men and women who had either been interned in a Nazi concentration camp, witnessed or experienced torture, or had had to hide during the second world war. They also analysed the genes of their children, who are known to have an increased likelihood of stress disorders, and compared the results with Jewish families who were living outside of Europe during the war. “The gene changes in the children could only be attributed to Holocaust exposure in the parents,” said Yehuda. Her team’s work is the clearest example in humans of the transmission of trauma to a child via what is called “epigenetic inheritance” - the idea that environmental influences such as smoking, diet and stress can affect the genes of your children and possibly even grandchildren. The idea is controversial, as scientific convention states that genes contained in DNA are the only way to transmit biological information between generations. However, our genes are modified by the environment all the time, through chemical tags that attach themselves to our DNA, switching genes on and off. Recent studies suggest that some of these tags might somehow be passed through generations, meaning our environment could have an impact on our children’s health. © 2015 Guardian News and Media Limited

Keyword: Epigenetics; Stress
Link ID: 21325 - Posted: 08.22.2015

“Almost fully-formed brain grown in a lab.” “Woah: Scientists grow first nearly fully-formed human brain.” “Boffins raise five-week-old fetal human brain in the lab for experimentation.” On Tuesday, all the above appeared as headlines for one particular story. What was it all about? Mini-brains 3 to 4 millimetres across have been grown in the lab before, but if a larger brain had been created – and the press release publicising the claim said it was the size of a pencil eraser – that would be a major breakthrough. New Scientist investigated the claims. The announcement was made by Rene Anand, a neuroscientist at Ohio State University in Columbus, at a military health research meeting in Florida. Anand says he has grown a brain – complete with a cortex, midbrain and brainstem – in a dish, comparable in maturity to that of a fetus aged 5 weeks. Anand and his colleague Susan McKay started with human skin cells, which they turned into induced pluripotent stem cells (iPSCs) using a tried-and-tested method. By applying an undisclosed technique, for which a patent has been applied, the pair say they were able to encourage these stem cells to form a brain. “We are replicating normal development,” says Anand. He says they hope to be able to create miniature models of brains experiencing a range of diseases, such as Parkinson’s and Alzheimer’s. But not everyone is convinced, especially as Anand hasn’t published his results. Scientists we sent Anand’s poster presentation to said that although the team has indeed grown some kind of miniature collection of cells, or “organoid”, in a dish, the structure isn’t much like a fetal brain. © Copyright Reed Business Information Ltd.

Keyword: Development of the Brain
Link ID: 21322 - Posted: 08.22.2015

Tina Hesman Saey
Researchers have discovered a “genetic switch” that determines whether people will burn extra calories or save them as fat. A genetic variant tightly linked to obesity causes fat-producing cells to become energy-storing white fat cells instead of energy-burning beige fat, researchers report online August 19 in the New England Journal of Medicine. Previously scientists thought that the variant, in a gene known as FTO (originally called fatso), worked in the brain to increase appetite. The new work shows that the FTO gene itself has nothing to do with obesity, says coauthor Manolis Kellis, a computational biologist at MIT and the Broad Institute. But the work may point to a new way to control body fat. In humans and many other organisms, genes are interrupted by stretches of DNA known as introns. Kellis and Melina Claussnitzer of Harvard Medical School and colleagues discovered that a genetic variant linked to increased risk of obesity affects one of the introns in the FTO gene. It does not change the protein produced from the FTO gene or change the gene’s activity. Instead, the variant doubles the activity of two genes, IRX3 and IRX5, which are involved in determining which kind of fat cells will be produced. FTO’s intron is an enhancer, a stretch of DNA needed to control activity of far-away genes, the researchers discovered. Normally, a protein called ARID5B squats on the enhancer and prevents it from dialing up activity of the fat-determining genes. In fat cells of people who have the obesity-risk variant, ARID5B can’t do its job and the IRX genes crank up production of energy-storing white fat. © Society for Science & the Public 2000 - 2015.

Keyword: Obesity; Genes & Behavior
Link ID: 21321 - Posted: 08.20.2015

Helen Thomson
An almost fully-formed human brain has been grown in a lab for the first time, claim scientists from Ohio State University. The team behind the feat hope the brain could transform our understanding of neurological disease. Though not conscious, the miniature brain, which resembles that of a five-week-old foetus, could potentially be useful for scientists who want to study the progression of developmental diseases. It could also be used to test drugs for conditions such as Alzheimer’s and Parkinson’s, since the regions they affect are in place during an early stage of brain development. The brain, which is about the size of a pencil eraser, is engineered from adult human skin cells and is the most complete human brain model yet developed, claimed Rene Anand of Ohio State University, Columbus, who presented the work today at the Military Health System Research Symposium in Fort Lauderdale, Florida. Previous attempts at growing whole brains have at best achieved mini-organs that resemble those of nine-week-old foetuses, although these “cerebral organoids” were not complete and only contained certain aspects of the brain. “We have grown the entire brain from the get-go,” said Anand. Anand and his colleagues claim to have reproduced 99% of the brain’s diverse cell types and genes. They say their brain also contains a spinal cord, signalling circuitry and even a retina. The ethical concerns were non-existent, said Anand. “We don’t have any sensory stimuli entering the brain. This brain is not thinking in any way.” © 2015 Guardian News and Media Limited

Keyword: Development of the Brain
Link ID: 21316 - Posted: 08.19.2015

By Chris Mooney
It is still considered highly uncool to ascribe a person's political beliefs, even in part, to that person's biology: hormones, physiological responses, even brain structures and genes. And no wonder: Doing so raises all kinds of thorny, non-PC issues involving free will, determinism, toleration, and much else. There's just one problem: Published scientific research keeps going there, with ever increasing audacity (not to mention growing stacks of data). The past two weeks have seen not one but two studies published in scientific journals on the biological underpinnings of political ideology. And these studies go straight at the role of genes and the brain in shaping our views, and even our votes. First, in the American Journal of Political Science, a team of researchers including Peter Hatemi of Penn State University and Rose McDermott of Brown University studied the relationship between our deep-seated tendencies to experience fear—tendencies that vary from person to person, partly for reasons that seem rooted in our genes—and our political beliefs. What they found is that people who have a more fearful disposition also tend to be more politically conservative, and less tolerant of immigrants and people of races different from their own. As McDermott carefully emphasizes, that does not mean that every conservative has a high fear disposition. "It's not that conservative people are more fearful, it's that fearful people are more conservative," as she puts it. I interviewed the paper's lead author, Peter Hatemi, about his research for my 2012 book The Republican Brain. Hatemi is both a political scientist and a microbiologist, and as he stressed to me, "nothing is all genes, or all environment." These forces combine to make us who we are, in incredibly intricate ways. ©2015 Mother Jones

Keyword: Emotions; Genes & Behavior
Link ID: 21313 - Posted: 08.19.2015

Helen Thomson
Serious mood disorders such as bipolar may be the price humans have had to pay for our intelligence and creativity. That’s according to new research which links high childhood IQ to an increased risk of experiencing manic bipolar traits in later life. Researchers examined data from a large birth cohort to identify the IQ of 1,881 individuals at age eight. These same individuals were then assessed for manic traits at the age of 22 or 23. The statements they provided were part of a checklist widely used to diagnose bipolar disorder. Each person was given a score out of 100 related to how many manic traits they had previously experienced. Individuals who scored in the top 10% of manic features had a childhood IQ almost 10 points higher than those who scored in the lowest 10%. This correlation appeared strongest for those with high verbal IQ. “Our study offers a possible explanation for how bipolar disorder may have been selected through generations,” said Daniel Smith of the University of Glasgow, who led the study. “There is something about the genetics underlying the disorder that are advantageous. One possibility is that serious disorders of mood - such as bipolar disorder - are the price that human beings have had to pay for more adaptive traits such as intelligence, creativity and verbal proficiency.” Smith emphasises that as things stand, having a high IQ is only an advantage: “A high IQ is not a clear-cut risk factor for bipolar, but perhaps the genes that confer intelligence can get expressed as illness in the context of other risk factors, such as exposure to maternal influenza in the womb or childhood sexual abuse.” © 2015 Guardian News and Media Limited

Keyword: Schizophrenia; Genes & Behavior
Link ID: 21312 - Posted: 08.19.2015

By Zoe Kleinman, Technology reporter, BBC News
More than 200 academics have signed an open letter criticising controversial new research suggesting a link between violent video games and aggression. The findings were released by the American Psychological Association. It set up a taskforce that reviewed hundreds of studies and papers published between 2005 and 2013. The American Psychological Association concluded that while there was "no single risk factor" to blame for aggression, violent video games did contribute. "The research demonstrates a consistent relation between violent video game use and increases in aggressive behaviour, aggressive cognitions and aggressive affect, and decreases in pro-social behaviour, empathy and sensitivity to aggression," said the report. "It is the accumulation of risk factors that tends to lead to aggressive or violent behaviour. The research reviewed here demonstrates that violent video game use is one such risk factor." However, a large group of academics said they felt the methodology of the research was deeply flawed, as a significant part of the material included in the study had not been subjected to peer review. "I fully acknowledge that exposure to repeated violence may have short-term effects - you would be a fool to deny that - but the long-term consequences of crime and actual violent behaviour, there is just no evidence linking violent video games with that," Dr Mark Coulson, associate professor of psychology at Middlesex University and one of the signatories of the letter, told the BBC. "If you play three hours of Call of Duty you might feel a little bit pumped, but you are not going to go out and mug someone." © 2015 BBC

Keyword: Aggression
Link ID: 21310 - Posted: 08.19.2015

By Perri Klass
A little more than a year ago, the American Academy of Pediatrics issued a policy statement saying that all pediatric primary care should include literacy promotion, starting at birth. That means pediatricians taking care of infants and toddlers should routinely be advising parents about how important it is to read to even very young children. The policy statement, which I wrote with Dr. Pamela C. High, included a review of the extensive research on the links between growing up with books and reading aloud, and later language development and school success. But while we know that reading to a young child is associated with good outcomes, there is only limited understanding of what the mechanism might be. Two new studies examine the unexpectedly complex interactions that happen when you put a small child on your lap and open a picture book. This month, the journal Pediatrics published a study that used functional magnetic resonance imaging to study brain activity in 3- to 5-year-old children as they listened to age-appropriate stories. The researchers found differences in brain activation according to how much the children had been read to at home. Children whose parents reported more reading at home and more books in the home showed significantly greater activation of brain areas in a region of the left hemisphere called the parietal-temporal-occipital association cortex. This brain area is “a watershed region, all about multisensory integration, integrating sound and then visual stimulation,” said the lead author, Dr. John S. Hutton, a clinical research fellow at Cincinnati Children’s Hospital Medical Center. This region of the brain is known to be very active when older children read to themselves, but Dr. Hutton notes that it also lights up when younger children are hearing stories. What was especially novel was that children who were exposed to more books and home reading showed significantly more activity in the areas of the brain that process visual association, even though the child was in the scanner just listening to a story and could not see any pictures. © 2015 The New York Times Company

Keyword: Language; Development of the Brain
Link ID: 21308 - Posted: 08.18.2015

Every brain cell has a nucleus, or a central command station. Scientists have shown that the passage of molecules through the nucleus of a star-shaped brain cell, called an astrocyte, may play a critical role in health and disease. The study, published in the journal Nature Neuroscience, was partially funded by the National Institutes of Health (NIH). “Unexpectedly we may have discovered a hidden pathway to understanding how astrocytes respond to injury and control brain processes. The pathway may be common to many brain diseases and we’re just starting to follow it,” said Katerina Akassoglou, Ph.D., a senior investigator at the Gladstone Institute for Neurological Disease, a professor of neurology at the University of California, San Francisco, and a senior author of the study. Some neurological disorders, including Alzheimer's disease and brain injury, are associated with higher than normal brain levels of the growth factor TGF-beta. Previous studies found that after brain injury, astrocytes produce greater amounts of p75 neurotrophin receptor (p75NTR), a protein that helps cells detect growth factors. The cells also react to TGF-beta by changing their shapes and secreting proteins that alter neuronal activity. Dr. Akassoglou’s lab showed that eliminating the p75NTR gene prevented hydrocephalus in mice genetically engineered to have astrocytes that produce higher levels of TGF-beta. Hydrocephalus is a disorder that fills the brain with excess cerebrospinal fluid. Eliminating the p75NTR gene also prevented astrocytes in the brains of the mice from forming scars after injuries and restored gamma oscillations, which are patterns of neuronal activity associated with learning and memory.

Keyword: Brain Injury/Concussion; Glia
Link ID: 21307 - Posted: 08.18.2015

By Kate Kelland
LONDON (Reuters) - Scientists have genetically modified mice to be super-intelligent and found they are also less anxious, a discovery that may help the search for treatments for disorders such as Alzheimer's, schizophrenia and post-traumatic stress disorder (PTSD). Researchers from Britain and Canada found that altering a single gene to block the phosphodiesterase-4B (PDE4B) enzyme, which is found in many organs including the brain, made mice cleverer and at the same time less fearful. "Our work using mice has identified phosphodiesterase-4B as a promising target for potential new treatments," said Steve Clapcote, a lecturer in pharmacology at Britain's Leeds University, who led the study. He said his team is now working on developing drugs that will specifically inhibit PDE4B. The drugs will be tested first in animals to see whether any of them might be suitable to go forward into clinical trials in humans. In the experiments, published on Friday in the journal Neuropsychopharmacology, the scientists ran a series of behavioral tests on the PDE4B-inhibited mice and found they tended to learn faster, remember events longer and solve complex problems better than normal mice. The "brainy" mice were better at recognizing a mouse they had seen the previous day, the researchers said, and were also quicker at learning the location of a hidden escape platform.

Keyword: Learning & Memory; Genes & Behavior
Link ID: 21306 - Posted: 08.18.2015

Alison Abbott
The octopus genome offers clues to how cephalopods evolved intelligence to rival the craftiest vertebrates. With its eight prehensile arms lined with suckers, camera-like eyes, elaborate repertoire of camouflage tricks and spooky intelligence, the octopus is like no other creature on Earth. Added to those distinctions is an unusually large genome, described in Nature on 12 August, that helps to explain how a mere mollusc evolved into an otherworldly being. “It’s the first sequenced genome from something like an alien,” jokes neurobiologist Clifton Ragsdale of the University of Chicago in Illinois, who co-led the genetic analysis of the California two-spot octopus (Octopus bimaculoides). The work was carried out by researchers from the University of Chicago, the University of California, Berkeley, the University of Heidelberg in Germany and the Okinawa Institute of Science and Technology in Japan. The scientists also investigated gene expression in twelve different types of octopus tissue. “It’s important for us to know the genome, because it gives us insights into how the sophisticated cognitive skills of octopuses evolved,” says neurobiologist Benny Hochner at the Hebrew University of Jerusalem in Israel, who has studied octopus neurophysiology for 20 years. Researchers want to understand how the cephalopods, a class of free-floating molluscs, produced a creature that is clever enough to navigate highly complex mazes and open jars filled with tasty crabs. © 2015 Nature Publishing Group

Keyword: Intelligence; Genes & Behavior
Link ID: 21295 - Posted: 08.13.2015

Ashley Yeager
A mouse scurries across a round table rimmed with Dixie cup–sized holes. Without much hesitation, the rodent heads straight for the hole that drops it into a box lined with cage litter. Any other hole would have led to a quick fall to the floor. But this mouse was more than lucky. It had an advantage — human glial cells were growing in its brain. Glia are thought of as the support staff for the brain’s nerve cells, or neurons, which transmit and receive the brain’s electrical and chemical signals. Named for the Greek term for “glue,” glia have been known for nearly 170 years as the cells that hold the brain’s bits together. Some glial cells help feed neurons. Other glia insulate nerve cell branches with myelin. Still others attack brain invaders responsible for infection or injury. Glial cells perform many of the brain’s most important maintenance jobs. But recent studies suggest they do a lot more. Glia can shape the conversation between neurons, speeding or slowing the electrical signals and strengthening neuron-to-neuron connections. When scientists coaxed human glia to grow in the brains of baby mice, the mice grew up to be supersmart, navigating tabletops full of holes and mastering other tasks much faster than normal mice. This experiment and others suggest that glia may actually orchestrate learning and memory, says neuroscientist R. Douglas Fields. “Glia aren’t doing vibrato. That’s for the neurons,” says Fields, of the National Institute of Child Health and Human Development in Bethesda, Md. “Glia are the conductors.” © Society for Science & the Public 2000 - 2015

Keyword: Learning & Memory; Glia
Link ID: 21289 - Posted: 08.12.2015

Could taking iodine pills in pregnancy help to raise children’s IQ? Some researchers suggest women in the UK should take such supplements, but others say the evidence is unclear, and that it could even harm development. Iodine is found in dairy foods and fish, and is used in the body to make thyroid hormone, which is vital for brain development in the womb. In some parts of the world, such as inland areas where little fish is consumed or the soil is low in iodine, severe deficiencies can markedly lower intelligence in some people. In most affected areas, iodine is now added to salt. The UK was not thought to need this step, but in 2013 a large study of urine samples from pregnant women found that about two-thirds had mild iodine deficiency, and that the children of those with the lowest levels had the lowest IQs. Now another team has combined data from this study with other data to calculate that if all women in the UK were given iodine supplements from three months before pregnancy until they finished breastfeeding, average IQ would increase by 1.2 points per child. And the children of mothers who were most iodine deficient would probably benefit more, says Kate Jolly of the University of Birmingham, who was involved in the study. “We are talking about very small differences but on a population basis it could mean quite a lot,” she says. The team calculated that providing these iodine supplements would be worth the cost to the UK’s National Health Service because it would boost the country’s productivity. © Copyright Reed Business Information Ltd.

Keyword: Development of the Brain; Intelligence
Link ID: 21286 - Posted: 08.12.2015

April Dembosky
Developers of a new video game for your brain say theirs is more than just another get-smarter-quick scheme. Akili, a Northern California startup, insists on taking the game through a full battery of clinical trials so it can get approval from the Food and Drug Administration — a process that will take lots of money and several years. So why would a game designer go to all that trouble when there's already a robust market of consumers ready to buy games that claim to make you smarter and improve your memory? Think about all the ads you've heard for brain games. Maybe you've even passed a store selling them. There's one at the mall in downtown San Francisco — just past the cream puff stand and across from Jamba Juice — staffed on my visit by a guy named Dominic Firpo. "I'm a brain coach here at Marbles: The Brain Store," he says. Brain coach? "Sounds better than sales person," Firpo explains. "We have to learn all 200 games in here and become great sales people so we can help enrich peoples' minds." He heads to the "Word and Memory" section of the store and points to one product that says it will improve your focus and reduce stress in just three minutes a day. "We sold out of it within the first month of when we got it," Firpo says. The market for these "brain fitness" games is worth about $1 billion and is expected to grow to $6 billion in the next five years. Game makers appeal to both the young and the older with the common claim that if you exercise your memory, you'll be able to think faster and be less forgetful. Maybe bump up your IQ a few points. "That's absurd," says psychology professor Randall Engle from the Georgia Institute of Technology. © 2015 NPR

Keyword: ADHD; Intelligence
Link ID: 21278 - Posted: 08.10.2015

Tina Hesman Saey
Memory Transfer Seen — Experiments with rats, showing how chemicals from one rat brain influence the memory of an untrained animal, indicate that tinkering with the brain of humans is also possible. In the rat tests, brain material from an animal trained to go for food either at a light flash or at a sound signal was injected into an untrained rat. The injected animals then "remembered" whether light or sound meant food. — Science News Letter, August 21, 1965

Update: After this report, scientists from eight labs attempted to repeat the memory transplants. They failed, as they reported in Science in 1966. Science fiction authors and futurists often predict that a person's memories might be transferred to another person or a computer, but the idea is likely to remain speculation, says neuroscientist Eric Kandel, who won a Nobel Prize in 2000 for his work on memory. Brain wiring is too intricate and complicated to be exactly replicated, and scientists are still learning about how memories are made, stored and retrieved.

W. L. Byrne et al. Technical comments: Memory transfer. Science, Vol. 153, August 5, 1966, p. 658. doi: 10.1126/science.153.3736.658

© Society for Science & the Public 2000 - 2015

Keyword: Learning & Memory
Link ID: 21271 - Posted: 08.08.2015

By Christian Jarrett
One of the saddest things about loneliness is that it leads to what psychologists call a “negative spiral.” People who feel isolated come to dread bad social experiences and they lose faith that it’s possible to enjoy good company. The usual result, as Melissa Dahl recently noted, is more loneliness. This hardly seems adaptive, but experts say it’s because we’ve evolved to enter a self-preservation mode when we’re alone. Without the backup of friends and family, our brains become alert to threat, especially the potential danger posed by strangers. Until now, much of the evidence to support this account has come from behavioral studies. For example, when shown a video depicting a social scene, lonely people spend more time than others looking at signs of social threat, such as a person being ignored by their friends or one person turning their back on another. Unpublished work also shows that lonely people’s attention seems to be grabbed more quickly by words that pertain to social threat, such as rejected or unwanted. Now the University of Chicago’s husband-and-wife research team of Stephanie and John Cacioppo — leading authorities on the psychology and neuroscience of loneliness — have teamed up with their colleague, Stephen Balogh, to provide the first evidence that lonely people’s brains, compared to the non-lonely, are exquisitely alert to the difference between social and nonsocial threats. The finding, reported online in the journal Cortex, supports their broader theory that, for evolutionary reasons, loneliness triggers a cascade of brain-related changes that put us into a socially nervous, vigilant mode. The researchers used a loneliness questionnaire to recruit 38 very lonely people and 32 people who didn’t feel lonely (note that loneliness was defined here as the subjective feeling of isolation, as opposed to the number of friends or close relatives one has). Next, the researchers placed an electrode array of 128 sensors on each of the participants’ heads, allowing them to record the participants’ brain waves using an established technique known as electro-encephalography (EEG) that’s particularly suited to measuring brain activity changes over very short time periods. © 2015, New York Media LLC.

Keyword: Depression; Learning & Memory
Link ID: 21267 - Posted: 08.05.2015

Laura Sanders
A type of brain cell formerly known for its supporting role has landed a glamorous new job. Astrocytes, a type of glial cell known to feed, clean and guide the growth of their flashier nerve cell neighbors, also help nerve cells send electrical transmissions, scientists report in the Aug. 5 Journal of Neuroscience. The results are the latest in scientists’ efforts to uncover the mysterious and important ways in which cells other than nerve cells keep the nervous system humming. Astrocytes deliver nutrients to nerve cells, flush waste out of the brain (SN: 9/22/12) and even help control appetite (SN: 6/28/14). The latest study suggests that these star-shaped cells also help electrical messages move along certain nerve cells’ message-sending axons, a job already attributed to other glial cells called oligodendrocytes and Schwann cells. Courtney Sobieski of Washington University School of Medicine in St. Louis and colleagues grew individual rat nerve cells in a single dish that contained patches of astrocytes. Some nerve cells grew on the patches; others did not. The nerve cells deprived of astrocyte contact showed signs of sluggishness. The researchers think that astrocytes guide nerve cell growth in a way that enables the nerve cells to later fire off quick and precise messages. It’s not clear how the astrocytes do that, but the results suggest that proximity is the key: Astrocytes needed to be close to the nerve cell to help messages move. © Society for Science & the Public 2000 - 2015

Keyword: Glia; Development of the Brain
Link ID: 21264 - Posted: 08.05.2015