Chapter 16.
By BENEDICT CAREY Quick: Which American president served before slavery ended, John Tyler or Rutherford B. Hayes? If you need Google to get the answer, you are not alone. (It is Tyler.) Collective cultural memory — for presidents, for example — works according to the same laws as the individual kind, at least when it comes to recalling historical names and remembering them in a given order, researchers reported on Thursday. The findings suggest that leaders who are well known today, like the elder President George Bush and President Bill Clinton, will be all but lost to public memory in just a few decades. The particulars from the new study, which tested Americans’ ability to recollect the names of past presidents, are hardly jaw-dropping: People tend to recall best the presidents who served recently, as well as the first few in the country’s history. They also remember those who navigated historic events, like the ending of slavery (Abraham Lincoln) and World War II (Franklin D. Roosevelt). But the broader significance of the report — the first to measure forgetfulness over a 40-year period, using a constant list — is that societies collectively forget according to the same formula as, say, a student who has studied a list of words. Culture imitates biology, even though the two systems work in vastly different ways. The new paper was published in the journal Science. “It’s an exciting study, because it mixes history and psychology and finds this one-on-one correspondence” in the way memory functions, said David C. Rubin, a psychologist at Duke University who was not involved in the research. The report is based on four surveys by psychologists now at Washington University in St. Louis, conducted from 1974 to 2014. In the first three, in 1974, 1991 and 2009, Henry L. Roediger III gave college students five minutes to write down as many presidents as they could remember, in order. © 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 20364 - Posted: 11.29.2014
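The "formula" the study refers to combines two classic memory effects: recency (recall falls off steeply with time since a president served) and primacy (the first few presidents stay memorable because of their position at the head of the list). A toy sketch of such a model follows; the functional form and every parameter are invented for illustration, not fitted to the survey data.

```python
import math

def recall_probability(years_since_term, primacy_rank=None,
                       base=0.9, half_life=25.0, primacy_boost=0.6):
    """Toy model of collective recall. All parameters are invented
    for illustration; they are not estimates from the study.

    Recency: recall decays exponentially with time since a president
    left office. Primacy: the first few presidents get an extra boost
    that fades with their position in the historical list.
    """
    recency = base * math.exp(-math.log(2) * years_since_term / half_life)
    primacy = 0.0
    if primacy_rank is not None:
        primacy = primacy_boost * math.exp(-0.5 * (primacy_rank - 1))
    return min(1.0, recency + primacy)

# A recent president is recalled far more often than a mid-19th-century
# one, while Washington (first in the list) stays memorable.
print(recall_probability(6))                    # recent president
print(recall_probability(150))                  # e.g. John Tyler
print(recall_probability(225, primacy_rank=1))  # George Washington
```

A real analysis would estimate the decay parameters from the survey data itself, as the forgetting-curve literature does for individual memory.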
Ewen Callaway Nerve cells that transmit pain, itch and other sensations to the brain have been made in the lab for the first time. Researchers say that the cells will be useful for developing new painkillers and anti-itch remedies, as well as understanding why some people experience unexplained extreme pain and itching. “The short take-home message would be ‘pain and itch in a dish’, and we think that’s very important,” says Kristin Baldwin, a stem-cell scientist at the Scripps Research Institute in La Jolla, California, whose team converted mouse and human cells called fibroblasts into neurons that detect sensations such as pain, itch or temperature [1]. In a second paper [2], a separate team took a similar approach to making pain-sensing cells. Both efforts were published on 24 November in Nature Neuroscience. Peripheral sensory neurons, as these cells are called, produce specialized ‘receptor’ proteins that detect chemical and physical stimuli and convey them to the brain. The receptor that a cell makes determines its properties — some pain-sensing cells respond to chilli oil, for example, and others respond to different pain-causing chemicals. Mutations in the genes encoding these receptors can cause some people to experience chronic pain or, in rare cases, to become impervious to pain. To create these cells in the lab, independent teams led by Baldwin and by Clifford Woolf, a neuroscientist at Boston Children’s Hospital in Massachusetts, identified combinations of proteins that — when expressed in fibroblasts — transformed them into sensory neurons after several days. Baldwin's team identified neurons that make receptors that detect sensations including pain, itch, and temperature, whereas Woolf’s team looked only at pain-detecting cells. Both teams generated cells that resembled neurons in shape and fired in response to capsaicin, which gives chilli peppers their kick, and mustard oil. © 2014 Nature Publishing Group
Keyword: Pain & Touch
Link ID: 20362 - Posted: 11.26.2014
By Linda Searing THE QUESTION Keeping your brain active by working is widely believed to protect memory and thinking skills as we age. Does the type of work matter? THIS STUDY involved 1,066 people who, at an average age of 70, took a battery of tests to measure memory, processing speed and cognitive ability. The jobs they had held were rated by the complexity of dealings with people, data and things. Those whose main jobs required complex work, especially in dealings with people — such as social workers, teachers, managers, graphic designers and musicians — had higher cognitive scores than those who had held jobs requiring less-complex dealings, such as construction workers, food servers and painters. Overall, more-complex occupations were tied to higher cognitive scores, regardless of someone’s IQ, education or environment. WHO MAY BE AFFECTED? Older adults. Cognitive abilities change with age, so it can take longer to recall information or remember where you placed your keys. That is normal and not the same thing as dementia, which involves severe memory loss as well as declining ability to function day to day. Commonly suggested ways to maintain memory and thinking skills include staying socially active, eating healthfully and getting adequate sleep as well as such things as doing crossword puzzles, learning to play a musical instrument and taking varied routes to common destinations when driving.
Keyword: Learning & Memory
Link ID: 20359 - Posted: 11.26.2014
By Sandra G. Boodman “That’s it — I’m done,” Rachel Miller proclaimed, the sting of the neurologist’s judgment fresh as she recounted the just-concluded appointment to her husband. Whatever was wrong with her, Miller decided after that 2009 encounter, she was not willing to risk additional humiliation by seeing another doctor who might dismiss her problems as psychosomatic. The Baltimore marketing executive had spent the previous two years trying to figure out what was causing her bizarre symptoms, some of which she knew made her sound delusional. Her eyes felt “weird,” although her vision was 20/20. Normal sounds seemed hugely amplified: at night when she lay in bed, her breathing and heartbeat were deafening. Water pounding on her back in the shower sounded like a roar. She was plagued by dizziness. “I had started to feel like a person in one of those stories where someone has been committed to a mental hospital by mistake or malice and they desperately try to appear sane,” recalled Miller, now 53. She began to wonder if she really was crazy; numerous tests had ruled out a host of possible causes, including a brain tumor. Continuing to look for answers seemed futile, since all the doctors she had seen had failed to come up with anything conclusive. “My attitude was: If it’s something progressive like MS [multiple sclerosis] or ALS [amyotrophic lateral sclerosis], it’ll get bad enough that someone will eventually figure it out.” Figuring it out would take nearly three more years and was partly the result of an oddity that Miller mentioned to another neurologist, after she lifted her moratorium on seeing doctors.
Link ID: 20353 - Posted: 11.25.2014
By Victoria Colliver Marianne Austin watched her mother go blind from age-related macular degeneration, an eye disease that affects about 10 million older Americans. Now that Austin has been diagnosed with the same condition, she wants to avoid her mother’s experience. “I’ve seen what can happen and the devastation it can cause,” said Austin, 67, of Atherton, who found out she had the disease last year. “I call it having seen the movie. I don’t like that ending, I want to change the movie, and I don’t want to wait 10 years until something is proven in research.” About 10 percent of patients diagnosed with age-related macular degeneration will develop the form of the disease that causes permanent blindness. It’s unclear just how much genetics plays a role, so there’s no definitive way to predict who will progress to that stage or when that would happen. But a team of Stanford doctors think they may have found a way. In a study, published this month in the medical journal Investigative Ophthalmology and Visual Science, researchers analyzed data from 2,146 retinal scans from 244 macular degeneration patients at Stanford from 2008 to 2013. They then created an algorithm that predicted whether a particular patient would be likely to develop the form of the disease that causes blindness within less than a year, three years or up to five years. For those with macular degeneration to go blind, the disease has to advance from what is known as the “dry” form to the “wet” form. The sooner a doctor can notice changes, the better chance there is to save a patient’s vision.
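The article does not describe the Stanford algorithm's internals, but scan-based risk prediction of this kind is commonly built as a classifier over measured retinal features, with the score mapped onto time horizons like the study's one-, three- and five-year buckets. A hypothetical sketch follows; the feature names, weights and cutoffs are all invented, not taken from the published model.

```python
import math

def wet_amd_risk(drusen_area_mm2, retinal_thickness_um, age):
    """Hypothetical logistic risk score -- NOT the published Stanford
    model. Features and weights are invented to show the general shape
    of scan-based risk prediction."""
    z = (0.8 * drusen_area_mm2
         + 0.02 * (retinal_thickness_um - 300.0)
         + 0.05 * (age - 65.0)
         - 3.0)
    return 1.0 / (1.0 + math.exp(-z))

def risk_horizon(p):
    """Map a risk score onto coarse time horizons like the study's
    1-year / 3-year / 5-year buckets (cutoffs invented)."""
    if p >= 0.7:
        return "within 1 year"
    if p >= 0.4:
        return "within 3 years"
    if p >= 0.2:
        return "within 5 years"
    return "low near-term risk"
```

In practice the weights would be fitted to longitudinal scan data, as the Stanford team did with their 2,146 retinal scans.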
By Christof Koch Point to any one organ in the body, and doctors can tell you something about what it does and what happens if that organ is injured by accident or disease or is removed by surgery—whether it be the pituitary gland, the kidney or the inner ear. Yet like the blank spots on maps of Central Africa from the mid-19th century, there are structures whose functions remain unknown despite whole-brain imaging, electroencephalographic recordings that monitor the brain's cacophony of electrical signals and other advanced tools of the 21st century. Consider the claustrum. It is a thin, irregular sheet of cells, tucked below the neocortex, the gray matter that allows us to see, hear, reason, think and remember. It is surrounded on all sides by white matter—the tracts, or wire bundles, that interconnect cortical regions with one another and with other brain regions. The claustra—for there are two of them, one on the left side of the brain and one on the right—lie below the general region of the insular cortex, underneath the temples, just above the ears. They assume a long, thin wisp of a shape that is easily overlooked when inspecting the topography of a brain image. Advanced brain-imaging techniques that look at the white matter fibers coursing to and from the claustrum reveal that it is a neural Grand Central Station. Almost every region of the cortex sends fibers to the claustrum. These connections are reciprocated by other fibers that extend back from the claustrum to the originating cortical region. Neuroanatomical studies in mice and rats reveal a unique asymmetry—each claustrum receives input from both cortical hemispheres but only projects back to the overlying cortex on the same side. Whether or not this is true in people is not known. Curiouser and curiouser, as Alice would have said. © 2014 Scientific American
Link ID: 20350 - Posted: 11.24.2014
By Amy Ellis Nutt Debbie Hall undergoes external brain stimulation at Ohio State's Wexner Medical Center. Hall was partially paralyzed on her left side after a stroke. Doctors are conducting a study to see if a device known as NexStim can “prep” a stroke victim's brain immediately prior to physical therapy so that the therapy will be more effective. (The Ohio State University Wexner Medical Center) Using non-invasive transcranial magnetic stimulation, or TMS, researchers at Ohio State Wexner Medical Center may have found a way to help prep a stroke victim's brain prior to physical therapy to aid a more complete recovery. When one side of the brain is damaged by a stroke, the corresponding healthy part goes into overdrive in order to compensate, said Dr. Marcie Bockbrader, principal investigator of the study. She believes the hyperactivity in the healthy side may actually slow recovery in the injured side. The technology, called NexStim, employs TMS to prepare a stroke patient's brain for physical therapy by sending low-frequency magnetic pulses painlessly through a victim's scalp to suppress activity in the healthy part of the motor cortex. This allows the injured side to make use of more energy during physical therapy, which immediately follows the transcranial magnetic stimulation. "This device targets the overactive side, quieting it down enough, so that through therapies the injured side can learn to express itself again," said Bockbrader, an assistant professor of physical medicine and rehabilitation, in a news release.
Link ID: 20349 - Posted: 11.24.2014
by Hal Hodson Yet another smartwatch launched this week. Called Embrace, it is rather different from the latest offerings from Apple, Samsung and Motorola: it can spot the warning signs of an epileptic seizure. Embrace was developed by Matteo Lai and his team at a firm called Empatica, with the help of Rosalind Picard at the Massachusetts Institute of Technology. It measures the skin's electrical activity as a proxy for changes deep in the brain, and uses a model built on years of clinical data to tell which changes portend a seizure. It also gathers the usual temperature and motion data that smartwatches collect, allowing the wearer to measure physical activity and sleep quality. Empatica launched a crowdfunding campaign on Indiegogo on Tuesday and has already raised more than $120,000. Backers who pledge $169 will receive an Embrace watch. The idea for the wristband came when Picard and her colleagues were running a study on the emotional states of children with autism, measuring skin conductance at the wrist as part of the study. Picard noticed that one of the children had registered a spike in electrical activity that turned out to have happened 20 minutes before they noticed the symptoms of a seizure. "It shocked me when I realised these things were showing up on the wrist," says Picard. The whole point of Embrace is to prevent sudden unexplained death in epilepsy (SUDEP). Its causes are not fully understood, but Picard says they understand enough to know how to reduce the chances of dying after an epileptic seizure. © Copyright Reed Business Information Ltd.
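Empatica has not published the Embrace detection model, which is trained on years of clinical data. But the core signal idea the article describes — flagging a sudden rise in skin conductance against a recent baseline — can be sketched with a simple threshold rule. Everything below (window length, threshold, units) is an invented illustration, not the device's algorithm.

```python
def detect_eda_spikes(samples, window=10, threshold=3.0):
    """Flag indices where skin conductance (electrodermal activity)
    jumps well above a trailing moving-average baseline.

    samples: skin-conductance readings, e.g. in microsiemens.
    Returns the indices of samples exceeding threshold x baseline.
    """
    spikes = []
    for i in range(window, len(samples)):
        baseline = sum(samples[i - window:i]) / window
        if samples[i] > threshold * baseline:
            spikes.append(i)
    return spikes

# A flat signal with one sharp jump: only the jump is flagged.
signal = [1.0] * 20 + [5.0] + [1.0] * 5
print(detect_eda_spikes(signal))  # → [20]
```

A production detector would combine this with the motion and temperature channels the watch also records, and with a trained model rather than a fixed multiplier.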
Kate Szell “I once asked Clara who she was. It was so embarrassing, but she’d had a haircut, so how was I to know?” That’s Rachel, she’s 14 and counts Clara as one of her oldest and best friends. There’s nothing wrong with Rachel’s sight, yet she struggles to recognise others. Why? Rachel is face blind. Most of us take for granted the fact that we recognise someone after a quick glance at their face. We don’t realise we’re doing something very different when we look at a face compared with when we look at anything else. To get a feeling of how peculiar facial recognition is, try recognising people by looking at their hands, instead of their faces. Tricky? That’s exactly how Rachel feels – only she’s not looking at hands, she’s looking straight into someone’s eyes. Specific areas of the brain process facial information. Damage to those areas gives rise to prosopagnosia or “face blindness”: an inability or difficulty with recognising faces. While brain damage-induced prosopagnosia is rare, prosopagnosia itself is not. Studies suggest around 2% of the population could have some form of prosopagnosia. These “developmental” prosopagnosics seem to be born without the ability to recognise faces and don’t acquire it, relying instead on all manner of cues, from gait to hairstyles, to tell people apart. Kirsten Dalrymple from the University of Minnesota is one of a handful of researchers looking into developmental prosopagnosia. Her particular interest is in prosopagnosic children. “Some seem to cope without much of a problem but, for others, it’s a totally different story,” she says. “They can become very socially withdrawn and can also be at risk of walking off with strangers.” © 2014 Guardian News and Media Limited
Link ID: 20347 - Posted: 11.24.2014
By CLYDE HABERMAN The notion that a person might embody several personalities, each of them distinct, is hardly new. The ancient Romans had a sense of this and came up with Janus, a two-faced god. In the 1880s, Robert Louis Stevenson wrote “Strange Case of Dr. Jekyll and Mr. Hyde,” a novella that provided us with an enduring metaphor for good and evil corporeally bound. Modern comic books are awash in divided personalities like the Hulk and Two-Face in the Batman series. Even heroic Superman has his alternating personas. But few instances of the phenomenon captured Americans’ collective imagination quite like “Sybil,” the study of a woman said to have had not two, not three (like the troubled figure in the 1950s’ “Three Faces of Eve”), but 16 different personalities. Alters, psychiatrists call them, short for alternates. As a mass-market book published in 1973, “Sybil” sold in the millions. Tens of millions watched a 1976 television movie version. The story had enough juice left in it for still another television film in 2007. Sybil Dorsett, a pseudonym, became the paradigm of a psychiatric diagnosis once known as multiple personality disorder. These days, it goes by a more anodyne label: dissociative identity disorder. Either way, the strange case of the woman whose real name was Shirley Ardell Mason made itself felt in psychiatrists’ offices across the country. Pre-“Sybil,” the diagnosis was rare, with only about 100 cases ever having been reported in medical journals. Less than a decade after “Sybil” made its appearance, in 1980, the American Psychiatric Association formally recognized the disorder, and the numbers soared into the thousands. People went on television to tell the likes of Jerry Springer and Leeza Gibbons about their many alters. One woman insisted that she had more than 300 identities within her (enough, if you will, to fill the rosters of a dozen major-league baseball teams).
Even “Eve,” whose real name is Chris Costner Sizemore, said in the mid-1970s that those famous three faces were surely an undercount. It was more like 22, she said. © 2014 The New York Times Company
Link ID: 20346 - Posted: 11.24.2014
Christopher Stringer Indeed, skeletal evidence from every inhabited continent suggests that our brains have become smaller in the past 10,000 to 20,000 years. How can we account for this seemingly scary statistic? Some of the shrinkage is very likely related to the decline in humans' average body size during the past 10,000 years. Brain size is scaled to body size because a larger body requires a larger nervous system to service it. As bodies became smaller, so did brains. A smaller body also suggests a smaller pelvic size in females, so selection would have favored the delivery of smaller-headed babies. What explains our shrinking body size, though? This decline is possibly related to warmer conditions on the earth in the 10,000 years after the last ice age ended. Colder conditions favor bulkier bodies because they conserve heat better. As we have acclimated to warmer temperatures, the way we live has also generally become less physically demanding, which overall serves to drive down body weights. Another likely reason for this decline is that brains are energetically expensive and will not be maintained at larger sizes unless it is necessary. The fact that we increasingly store and process information externally—in books, computers and online—means that many of us can probably get by with smaller brains. Some anthropologists have also proposed that larger brains may be less efficient at certain tasks, such as rapid computation, because of longer connection pathways. © 2014 Scientific American
Link ID: 20345 - Posted: 11.24.2014
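The brain-body scaling Stringer invokes is usually modeled as an allometric power law, brain mass ∝ body mass^k with an exponent below 1 (roughly 0.75 across mammals). A sketch with coarse illustrative constants — neither the constant nor the exponent is taken from the article:

```python
def expected_brain_mass_g(body_mass_g, k=0.06, exponent=0.75):
    """Allometric power-law sketch: expected brain mass for a mammal
    of the given body mass. k and exponent are rough illustrative
    values; real fits vary by taxonomic group, and humans sit well
    above the average mammalian line."""
    return k * body_mass_g ** exponent

# The article's point: a ~10% smaller post-ice-age body predicts a
# smaller brain even with no change in relative brain size --
# 0.9 ** 0.75, or roughly 7-8% smaller.
ratio = expected_brain_mass_g(63_000) / expected_brain_mass_g(70_000)
print(ratio)
```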
by Linda Geddes A tapeworm that usually infects dogs, frogs and cats has made its home inside a man's brain. Sequencing its genome showed that it contains around 10 times more DNA than any other tapeworm sequenced so far, which could explain its ability to invade many different species. When a 50-year-old Chinese man was admitted to a UK hospital complaining of headaches, seizures, an altered sense of smell and memory flashbacks, his doctors were stumped. Tests for tuberculosis, syphilis, HIV and Lyme disease were negative, and although an MRI scan showed an abnormal region in the right side of his brain, a biopsy found inflammation, but no tumour. Over the next four years, further MRIs recorded the abnormal region moving across the man's brain, until finally his doctors decided to operate. To their immense surprise, they pulled out a 1 centimetre-long ribbon-shaped worm. It looked like a tapeworm, but was unlike any seen before in the UK, so a sample of its tissue was sent to Hayley Bennett and her colleagues at the Wellcome Trust Sanger Institute in Cambridge, UK. Genetic sequencing identified it as Spirometra erinaceieuropaei, a rare species of tapeworm found in China, South Korea, Japan and Thailand. Just 300 human infections have been reported since 1953, and not all of them in the brain. © Copyright Reed Business Information Ltd.
Keyword: Brain imaging
Link ID: 20344 - Posted: 11.21.2014
By Tara Parker-Pope Most people who drink to get drunk are not alcoholics, suggesting that more can be done to help heavy drinkers cut back, a new government report concludes. The finding, from a government survey of 138,100 adults, counters the conventional wisdom that every “falling-down drunk” must be addicted to alcohol. Instead, the results from the National Survey on Drug Use and Health show that nine out of 10 people who drink too much are not addicts, and can change their behavior with a little — or perhaps a lot of — prompting. “Many people tend to equate excessive drinking with alcohol dependence,” said Dr. Robert Brewer, who leads the alcohol program at the Centers for Disease Control and Prevention. “We need to think about other strategies to address these people who are drinking too much but who are not addicted to alcohol.” Excessive drinking is viewed as a major public health problem that results in 88,000 deaths a year from causes ranging from alcohol poisoning and liver disease to car accidents and other accidental deaths. Excessive drinking is defined as drinking too much at one time or over the course of a week. For men, it’s having five or more drinks in one sitting or 15 drinks or more during a week. For women, it’s four drinks on one occasion or eight drinks over the course of a week. Underage drinkers and women who drink any amount while pregnant also are defined as “excessive drinkers.” Surprisingly, about 29 percent of the population meets the definition for excessive drinking, but 90 percent of them do not meet the definition of alcoholism. That’s good news because it means excessive drinking may be an easier problem to solve than previously believed. © 2014 The New York Times Company
Keyword: Drug Abuse
Link ID: 20342 - Posted: 11.21.2014
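The thresholds quoted in the article are simple enough to state as a rule. A sketch encoding them directly (simplified: the survey's actual coding of the underage and pregnancy criteria is more detailed than an any-drinking test):

```python
def is_excessive_drinking(sex, age, max_drinks_one_occasion,
                          drinks_per_week, pregnant=False):
    """Encode the thresholds quoted in the article.

    sex: 'M' or 'F'. Men: 5+ drinks in one sitting or 15+ per week.
    Women: 4+ in one sitting or 8+ per week. Any drinking while
    underage or pregnant also counts as excessive.
    """
    any_drinking = max_drinks_one_occasion > 0 or drinks_per_week > 0
    if age < 21 and any_drinking:
        return True
    if pregnant and any_drinking:
        return True
    if sex == 'M':
        return max_drinks_one_occasion >= 5 or drinks_per_week >= 15
    return max_drinks_one_occasion >= 4 or drinks_per_week >= 8
```

Note how wide the net is: a man who never binges but averages just over two drinks a day still meets the weekly criterion, which is how 29 percent of the population can qualify while only a tenth of them meet the definition of alcoholism.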
By Jyoti Madhusoodanan Eurasian jays are tricky thieves. They eavesdrop on the noises that other birds make while hiding food in order to steal the stash later, new research shows. Scientists trying to figure out if the jays (Garrulus glandarius) could remember sounds and make use of the information placed trays of two materials—either sand or gravel—in a spot hidden from a listening jay’s view. Other avian participants of the same species, which were given a nut, cached the treat in one of the two trays. Fifteen minutes later, the listening bird was permitted to hunt up the stash. When food lay buried in a less noisy material such as sand, jays searched randomly. But if they heard gravel being tossed around as treats were hidden, they headed to the pebbles to pilfer the goods. Previous studies have shown that jays—like crows, ravens, and other bird burglars that belong to the corvid family—can remember where they saw food being hidden and return to the spot to look for the cache. But these new results, published in Animal Cognition this month, provide the first evidence that these corvids can also recollect sounds to locate and steal stashes of food. In their forest homes, where birds are heard more often than they are seen, this sneaky strategy might give eavesdropping jays a better chance at finding hidden feasts.
Link ID: 20339 - Posted: 11.21.2014
By Emily Underwood WASHINGTON, D.C.—Rapid changes unfold in the brain after a person's hand is amputated. Within days—and possibly even hours—neurons that once processed sensations from the palm and fingers start to shift their allegiances, beginning to fire in response to sensations in other body parts, such as the face. But a hand transplant can bring these neurons back into the fold, restoring the sense of touch nearly back to normal, according to a study presented here this week at the annual conference of the Society for Neuroscience. To date, roughly 85 people worldwide have undergone hand replant or transplant surgery, an 8- to 10-hour procedure in which surgeons reattach the bones, muscles, nerves, blood vessels, and soft tissue between the patient's severed wrist and their own hand or one from a donor, often using a needle finer than a human hair. After surgery, studies have shown that it takes about 2 years for the peripheral nerves to regenerate, with sensation slowly creeping through the palm and into the fingertips at a rate of roughly 2 mm per day, says Scott Frey, a cognitive neuroscientist at the University of Missouri, Columbia. Even once the nerves have regrown, the surgically attached hand remains far less sensitive to touch than the original hand once was. One potential explanation is that the brain's sensory "map" of the body—a series of cortical ridges and folds devoted to processing touch in different body parts—loses its ability to respond to the missing hand in the absence of sensory input, Frey says. If that's true, the brain may need to reorganize that sensory map once again in order to fully restore sensation. © 2014 American Association for the Advancement of Science
By Esther Hsieh A little-known fact: the tongue is directly connected to the brain stem. This anatomical feature is now being harnessed by scientists to improve rehabilitation. A team at the University of Wisconsin–Madison recently found that electrically stimulating the tongue can help patients with multiple sclerosis (MS) improve their gait. MS is an incurable disease in which the insulation around the nerves becomes damaged, disrupting the communication between body and brain. One symptom is loss of muscle control. In a study published in the Journal of Neuro-Engineering and Rehabilitation, Wisconsin neuroscientist Yuri Danilov and his team applied painless electrical impulses to the tip of the tongue of MS patients during physical therapy. Over a 14-week trial, patients who got tongue stimulation improved twice as much on variables such as balance and fluidity as did a control group who did the same regimen without stimulation. The tongue has extensive motor and sensory integration with the brain, Danilov explains. The nerves on the tip of the tongue are directly connected to the brain stem, a crucial hub that directs basic bodily processes. Previous research showed that sending electrical pulses through the tongue activated the neural network for balance; such activation may shore up the circuitry weakened by MS. The team is also using tongue stimulation to treat patients with vision loss, stroke damage and Parkinson's. “We have probably discovered a new way for the neurorehabilitation of many neurological disorders,” Danilov says. © 2014 Scientific American
Keyword: Multiple Sclerosis
Link ID: 20332 - Posted: 11.20.2014
By Gretchen Reynolds Exercise seems to be good for the human brain, with many recent studies suggesting that regular exercise improves memory and thinking skills. But an interesting new study asks whether the apparent cognitive benefits from exercise are real or just a placebo effect — that is, if we think we will be “smarter” after exercise, do our brains respond accordingly? The answer has significant implications for any of us hoping to use exercise to keep our minds sharp throughout our lives. In experimental science, the best, most reliable studies randomly divide participants into two groups, one of which receives the drug or other treatment being studied and the other of which is given a placebo, similar in appearance to the drug, but not containing the active ingredient. Placebos are important, because they help scientists to control for people’s expectations. If people believe that a drug, for example, will lead to certain outcomes, their bodies may produce those results, even if the volunteers are taking a look-alike dummy pill. That’s the placebo effect, and its occurrence suggests that the drug or procedure under consideration isn’t as effective as it might seem to be; some of the work is being done by people’s expectations, not by the medicine. Recently, some scientists have begun to question whether the apparently beneficial effects of exercise on thinking might be a placebo effect. While many studies suggest that exercise may have cognitive benefits, those experiments all have had a notable scientific limitation: They have not used placebos. This issue is not some abstruse scientific debate. If the cognitive benefits from exercise are a result of a placebo effect rather than of actual changes in the brain because of the exercise, then those benefits could be ephemeral and unable in the long term to help us remember how to spell ephemeral. © 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 20329 - Posted: 11.20.2014
James Gorman Evidence has been mounting for a while that birds and other animals can count, particularly when the things being counted are items of food. But most of the research is done under controlled conditions. In a recent experiment with New Zealand robins, Alexis Garland and Jason Low at Victoria University of Wellington tested the birds in a natural setting, giving them no training and no rewards, and showed that they knew perfectly well when a scientist had showed them two mealworms in a box, but then delivered only one. The researchers reported the work this fall in the journal Behavioural Processes. The experiment is intriguing to watch, partly because it looks like a child’s magic trick. The apparatus used is a wooden box that has a sliding drawer. After clearly showing a robin that she was dropping two mealworms in a circular well in the box, Dr. Garland would slide in the drawer. It covered the two worms with an identical-looking circular well containing only one worm. When the researcher moved away and the robin flew down and lifted off a cover, it would find only one worm. The robins pecked intensely at the box, behavior they didn’t show if they found the two worms they were expecting. Earlier experiments had also shown the birds to be good at counting, and Dr. Garland said that one reason might be that they are inveterate thieves. Mates, in particular, steal from one another’s food caches, where they hide perishable prey like worms or insects. “If you’ve got a mate that steals 50 or more percent of your food,” she said, you’d better learn how to keep track of how many mealworms you’ve got. © 2014 The New York Times Company
By Bethany Brookshire WASHINGTON – Moldy houses are hard on the lungs, and new results in mice suggest that they could also be bad for the brain. Inhaling mold spores made mice anxious and forgetful, researchers reported November 15 at the annual meeting of the Society for Neuroscience. Cheryl Harding, a psychologist at the City University of New York, and colleagues dripped low doses of spores from the toxic mold Stachybotrys into mouse noses three times per week. After three weeks, the mice didn’t look sick. But they had trouble remembering a fearful place. The mice were also more anxious than normal counterparts. The anxiety and memory deficits went along with decreases in new brain cells in the hippocampus — a part of the brain that plays a role in memory — compared with control mice. Harding and colleagues also found that the behaviors were linked to increased inflammatory proteins in the hippocampus. Exposure to mold’s toxins and structural proteins may trigger an immune response in the brain. The findings, Harding says, may help explain some of the conditions that people living in moldy buildings complain about, such as anxiety and cognitive problems. C. Harding et al. Mold inhalation, brain inflammation, and behavioral dysfunction. Society for Neuroscience Meeting, Washington, DC, November 15, 2014. © Society for Science & the Public 2000 - 2014.
By John Bohannon If you had the choice between hurting yourself or someone else in exchange for money, how altruistic do you think you’d be? In one infamous experiment, people were quite willing to deliver painful shocks to anonymous victims when asked by a scientist. But a new study that forced people into the dilemma of choosing between pain and profit finds that participants cared more about other people’s well-being than their own. It is hailed as the first hard evidence of altruism for the young field of behavioral economics. Human behavior toward others is hard to predict. On the one hand, we stand out in the animal world for our altruism, often making significant sacrifices to help out a stranger in need. And all but the most antisocial people experience psychological distress at witnessing, let alone causing, pain in others. Yet study after study in the field of behavioral economics has demonstrated that we tend to value our own needs and desires above those of others. For example, researchers have found that just thinking about money makes people behave more selfishly. To try to reconcile the angels and devils of our nature, a team led by Molly Crockett, a psychologist at the University of Oxford in the United Kingdom, combined the classic psychological and economics tools for probing altruism: pain and money. Everyone has their own pain threshold, so the first task was a pain calibration. Researchers administered electric shocks with electrodes attached to the wrists of 160 subjects, starting at an almost imperceptible level and amping up until the subject described the pain as intolerable. (For most people, that threshold for pain is similar to holding your wrist under a stream of 50°C water.) © 2014 American Association for the Advancement of Science.
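The calibration step described above is a classic ascending "method of limits": raise the stimulus in small increments until the participant reports it as intolerable. A sketch, with a callback standing in for the participant's report (the step size, units and ceiling are invented, not from the study):

```python
def calibrate_pain_threshold(tolerates, start=0.1, step=0.1, max_level=10.0):
    """Ascending method-of-limits sketch: increase stimulus intensity
    from a barely perceptible starting level until the participant no
    longer tolerates it, then return that level.

    tolerates: callback taking the current level and returning True
    while the participant still reports the pain as tolerable.
    """
    level = start
    while level < max_level and tolerates(level):
        level += step
    return level

# Simulated participant whose tolerance gives out at level 2.0.
threshold = calibrate_pain_threshold(lambda level: level < 2.0)
print(threshold)
```

Real psychophysics protocols typically repeat the ascent several times, or use a staircase that also descends, to average out response bias.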