Most Recent Links
by Andy Coghlan What would Stuart Little make of it? Mice have been created whose brains are half human. As a result, the animals are smarter than their siblings. The idea is not to mimic fiction, but to advance our understanding of human brain diseases by studying them in whole mouse brains rather than in dishes. The altered mice still have mouse neurons – the "thinking" cells that make up around half of all their brain cells. But practically all the glial cells in their brains, the ones that support the neurons, are human. "It's still a mouse brain, not a human brain," says Steve Goldman of the University of Rochester Medical Center in New York. "But all the non-neuronal cells are human." Goldman's team extracted immature glial cells from donated human fetuses. They injected them into mouse pups where they developed into astrocytes, a star-shaped type of glial cell. Within a year, the mouse glial cells had been completely usurped by the human interlopers. The 300,000 human cells each mouse received multiplied until they numbered 12 million, displacing the native cells. "We could see the human cells taking over the whole space," says Goldman. "It seemed like the mouse counterparts were fleeing to the margins." Astrocytes are vital for conscious thought, because they help to strengthen the connections between neurons, called synapses. Their tendrils are involved in coordinating the transmission of electrical signals across synapses. © Copyright Reed Business Information Ltd.
By CATHERINE SAINT LOUIS Nearly 55 percent of infants nationwide are put to bed with soft blankets or covered by a comforter, even though such bedding raises the chances of suffocation or sudden infant death syndrome, federal researchers reported Monday. Their study, published in the journal Pediatrics, is the first to estimate how many infants sleep with potentially hazardous quilts, bean bags, blankets or pillows. Despite recommendations to avoid putting anything but a baby in a crib, two-thirds of black and Latino parents still use bedding that is both unnecessary and unsafe, the study also found. “I was startled a little bit by the number of people still using bedding in the sleep area,” said Dr. Michael Goodstein, a neonatologist in York, Pa., who serves on a task force on sleep-related infant deaths at the American Academy of Pediatrics. “Sleeping face down on soft bedding increases the risks of SIDS 21-fold.” Among the risk factors for SIDS, “bedding has fallen through the cracks,” said Dr. Thomas G. Keens, the chairman of the California SIDS Advisory Council. “This article is a wake-up call.” The new analysis looked at data gathered from 1993 to 2010 in the National Infant Sleep Position Study, which surveyed a random sample of nearly 19,000 parents by telephone. Use of infant bedding declined roughly 23 percent annually from 1993 to 2000. In recent years, however, the declines have slowed or stalled entirely. From 2001 to 2010, use of inappropriate bedding for white and Hispanic infants declined just 5 to 7 percent annually. There was no decline in the use of such bedding for black infants. Parents in the new study were not asked their reasons for using bedding. Previous research has found that they worry infants will be cold, or that the crib mattress is too hard. © 2014 The New York Times Company
Some teenagers appear to show changes in their brains after one season of playing American football, a small study suggests. Even though players were not concussed during the season, researchers found abnormalities similar to the effects of mild traumatic brain injury. Twenty-four players aged between 16 and 18 were studied and devices on their helmets measured head impacts. The study was presented to the Radiological Society of North America. In recent years, a number of reports have expressed concern about the potential effects on young, developing brains of playing contact sports. These studies have tended to focus on brain changes as a result of concussion. But this study focused on the effects of head impacts on the brain, even when players did not suffer concussion at any point during the season. Using detailed scans of the players' brains before the season began and then again after it ended, the researchers were able to identify slight changes to the white matter of the brain. White matter contains millions of nerve fibres which act as communication cables between the brain's regions. Those players who were hit harder and hit more often were more likely to show these changes in post-season brain scans. Dr Alex Powers, co-author and paediatric neurosurgeon at Wake Forest Baptist Medical Centre in North Carolina, said the changes were a direct result of the hits received by the young players during their football season. BBC © 2014
By BILL PENNINGTON It happens dozens of times in every N.F.L. game. There is a fierce collision, or perhaps a running back is slammed to the ground. Most of the time, all the players rise to their feet uneventfully. Other times, as the pileup unravels, a player gets up slowly. His gait may be unsteady. For decades in the N.F.L., the operative term for the situation was that someone “got dinged.” It was a cute, almost harmless-sounding description of what was often a concussion or a worrying subconcussive blow to the head. But with the N.F.L. agreeing to pay hundreds of millions of dollars to settle a lawsuit brought by about 5,000 former players who said the league hid from them the dangers of repeated hits to the head, a backpedaling league has corrected its lingo and hastily amended its methodology. The N.F.L. now has a concussion management protocol, outlined in an inches-thick document that commands teams to institute a specific, detailed game-day and postconcussion course of action. Once, the treatment of players with head injuries varied from team to team and could be haphazard. Beginning last season, all players suspected of having a head injury — should they lose consciousness from a collision or experience symptoms like a headache, dizziness or disorientation — were required to go through the concussion protocol system. It features a broad cast: a head-injury spotter in the press box, athletic trainers on the bench, doctors and neuro-trauma specialists on the sideline and experts in neuro-cognitive testing in the locker room. The system is far from foolproof — players with serious symptoms remain in games. But as the N.F.L. grapples with a sobering threat to the welfare of its work force, not to mention a public-relations nightmare, the new concussion protocol is meant to establish a systemic, itemized policy on how potential brain injuries should be handled. © 2014 The New York Times Company
Keyword: Brain Injury/Concussion
Link ID: 20372 - Posted: 12.01.2014
By John Edward Terrell We will certainly hear it said many times between now and the 2016 elections that the country’s two main political parties have “fundamental philosophical differences.” But what exactly does that mean? At least part of the schism between Republicans and Democrats is based in differing conceptions of the role of the individual. We find these differences expressed in the frequent heated arguments about crucial issues like health care and immigration. In a broad sense, Democrats, particularly the more liberal among them, are more likely to embrace the communal nature of individual lives and to strive for policies that emphasize that understanding. Republicans, especially libertarians and Tea Party members on the ideological fringe, however, often trace their ideas about freedom and liberty back to Enlightenment thinkers of the 17th and 18th centuries, who argued that the individual is the true measure of human value, and each of us is naturally entitled to act in our own best interests free of interference by others. Self-described libertarians generally also pride themselves on their high valuation of logic and reasoning over emotion. Philosophers from Aristotle to Hegel have emphasized that human beings are essentially social creatures, that the idea of an isolated individual is a misleading abstraction. So it is not just ironic but instructive that modern evolutionary research, anthropology, cognitive psychology and neuroscience have come down on the side of the philosophers who have argued that the basic unit of human social life is not and never has been the selfish, self-serving individual. Contrary to libertarian and Tea Party rhetoric, evolution has made us a powerfully social species, so much so that the essential precondition of human survival is and always has been the individual plus his or her relationships with others.
© 2014 The New York Times Company
Link ID: 20371 - Posted: 12.01.2014
By Anna North What is depression? Anyone who has dealt with the condition knows what it can feel like — but what causes it, what sustains it, and what’s the best way to make it subside? Despite the prevalence of the disorder — in one Centers for Disease Control and Prevention study, 9.1 percent of adults met the criteria for depression — experts haven’t fully answered these questions. And to fully do so, some say we need new ways of thinking about depression entirely. For Turhan Canli, a professor of integrative neuroscience at Stony Brook University, that means looking at the possibility that depression could be caused by an infection. “I’ve always been struck by the fact that the treatment options did not seem to have dramatically improved over the course of decades,” Dr. Canli told Op-Talk. “I always had a feeling that somehow we seem to be missing the actual treatment of the disease.” He was intrigued by research showing a connection between depression and inflammation in the body, and he started to think about the known causes of inflammation — among them pathogens like bacteria, viruses and parasites. In a paper published in the journal Biology of Mood and Anxiety Disorders, he lays out his case for rethinking depression as a response to infection. He notes that the symptoms of depression are similar to those of infection: “Patients experience loss of energy; they commonly have difficulty getting out of bed and lose interest in the world around them. Although our Western conceptualization puts affective symptoms front-and-center, non-Western patients who meet DSM criteria for major depression report primarily somatic symptoms.” © 2014 The New York Times Company
Daniel Freeman and Jason Freeman “Although it is a waste of time to argue with a paranoid patient about his delusions, he may still be persuaded to keep them to himself, to repress them as far as possible and to forgo the aggressive action they might suggest, in general to conduct his life as if they did not exist.” This quote from Clinical Psychiatry, a hugely influential textbook in the 1950s and 1960s, epitomises the way in which unusual mental states were generally understood for much of the 20th century. Delusions (such as paranoid thoughts) and hallucinations (hearing voices, for example) were of interest purely as symptoms of psychosis, or what used to be called madness. Apart from their utility in diagnosis, they were deemed to be meaningless: the incomprehensible effusions of a diseased brain. Or in the jargon: “empty speech acts, whose informational content refers to neither world nor self”. There’s a certain irony here, of course, in experts supposedly dedicated to understanding the way the mind works dismissing certain thoughts as unworthy of attention or explanation. The medical response to these phenomena, which were considered to be an essentially biological problem, was to eradicate them with powerful antipsychotic drugs. This is not to say that other strategies weren’t attempted: in one revealing experiment in the 1970s, patients in a ward for “paranoid schizophrenics” in Vermont, US, were rewarded with tokens for avoiding “delusional talk”. These tokens could be exchanged for items including “meals, extra dessert, visits to the canteen, cigarettes, time off the ward, time in the TV and game room, time in bedroom between 8am and 9pm, visitors, books and magazines, recreation, dances on other wards.” (It didn’t work: most patients modified their behaviour temporarily, but “changes in a patient’s delusional system and general mental status could not be detected by a psychiatrist”.) © 2014 Guardian News and Media Limited
Link ID: 20369 - Posted: 11.29.2014
By Jason G. Goldman A sharp cry pierces the air. Soon a worried mother deer approaches the source of the sound, expecting to find her fawn. But the sound is coming from a speaker system, and the call isn't that of a baby deer at all. It's an infant fur seal's. Because deer and seals do not live in the same habitats, mother deer should not know how baby seal screams sound, reasoned biologists Susan Lingle of the University of Winnipeg and Tobias Riede of Midwestern University, who were running the acoustic experiment. So why did a mother deer react with concern? Over two summers, the researchers treated herds of mule deer and white-tailed deer on a Canadian farm to modified recordings of the cries of a wide variety of infant mammals—elands, marmots, bats, fur seals, sea lions, domestic cats, dogs and humans. By observing how mother deer responded, Lingle and Riede discovered that as long as the fundamental frequency was similar to that of their own infants' calls, those mothers approached the speaker as if they were looking for their offspring. Such a reaction suggests deep commonalities among the cries of most young mammals. (The mother deer did not show concern for white noise, birdcalls or coyote barks.) Lingle and Riede published their findings in October in the American Naturalist. Researchers had previously proposed that sounds made by different animals during similar experiences—when they were in pain, for example—would share acoustic traits. “As humans, we often ‘feel’ for the cry of young animals,” Lingle says. That empathy may arise because emotions are expressed in vocally similar ways among mammals. © 2014 Scientific American
by Aviva Rutkin THERE is only one real rule to conversing with a baby: talking is better than not talking. But that one rule can make a lifetime of difference. That's the message that the US state of Georgia hopes to send with Talk With Me Baby, a public health programme devoted to the art of baby talk. Starting in January, nurses will be trained in the best way to speak to babies to help them learn language, based on what the latest neuroscience says. Then they, along with teachers and nutritionists, will model this good behaviour for the parents they meet. Georgia hopes to expose every child born in 2015 in the Atlanta area to this speaking style; by 2018, the hope is to reach all 130,000 or so newborns across the state. Talk With Me Baby is the latest and largest attempt to provide "language nutrition" to infants in the US – a rich quantity and variety of words supplied at a critical time in the brain's development. Similar initiatives have popped up in Providence, Rhode Island, where children have been wearing high-tech vests that track every word they hear, and Hollywood, where the Clinton Foundation has encouraged television shows like Parenthood and Orange is the New Black to feature scenes demonstrating good baby talk. "The idea is that language is as important to the brain as food is to physical growth," says Arianne Weldon, director of Get Georgia Reading, one of several partner organisations involved in Talk With Me Baby. © Copyright Reed Business Information Ltd.
By Virginia Morell When we listen to someone talking, we hear some sounds that combine to make words and other sounds that convey such things as the speaker’s emotions and gender. The left hemisphere of our brain manages the first task, while the right hemisphere specializes in the second. Dogs also have this kind of hemispheric bias when listening to the sounds of other dogs. But do they have it with human sounds? To find out, two scientists had dogs sit facing two speakers. The researchers then played a recorded short sentence—“Come on, then”—and watched which way the dogs turned. When the animals heard recordings in which individual words were strongly emphasized, they turned to the right—indicating that their left hemispheres were engaged. But when they listened to recordings that had exaggerated intonations, they turned to the left—a sign that the right hemisphere was responding. Thus, dogs seem to process the elements of speech very similarly to the way humans do, the scientists report online today in Current Biology. According to the researchers, the findings support the idea that our canine pals are indeed paying close attention not only to who we are and how we say things, but also to what we say. © 2014 American Association for the Advancement of Science.
By Amy Ellis Nutt Scientists say the "outdoor effect" on nearsighted children is real: natural light is good for the eyes. It's long been thought kids are more at risk of nearsightedness, or myopia, if they spend hours and hours in front of computer screens or fiddling with tiny hand-held electronic devices. Not true, say scientists. But now there is research suggesting that children who are genetically predisposed to the visual deficit can improve their chances of avoiding eyeglasses just by stepping outside. Yep, sunshine is all they need — more specifically, the natural light of outdoors — and 14 hours a week of outdoor light should do it. Why this is the case is not exactly clear. "We don't really know what makes outdoor time so special," said Donald Mutti, the lead researcher of the study from Ohio State University College of Optometry, in a press release. "If we knew, we could change how we approach myopia." What is known is that UVB light (invisible ultraviolet B rays) plays a role in the cellular production of vitamin D, which is believed to help the eyes focus light on the retina. However, the Ohio State researchers think there is another possibility. "Between the ages of five and nine, a child's eye is still growing," said Mutti. "Sometimes this growth causes the distance between the lens and the retina to lengthen, leading to nearsightedness. We think these different types of outdoor light may help preserve the proper shape and length of the eye during that growth period."
By BENEDICT CAREY Quick: Which American president served before slavery ended, John Tyler or Rutherford B. Hayes? If you need Google to get the answer, you are not alone. (It is Tyler.) Collective cultural memory — for presidents, for example — works according to the same laws as the individual kind, at least when it comes to recalling historical names and remembering them in a given order, researchers reported on Thursday. The findings suggest that leaders who are well known today, like the elder President George Bush and President Bill Clinton, will be all but lost to public memory in just a few decades. The particulars from the new study, which tested Americans’ ability to recollect the names of past presidents, are hardly jaw-dropping: People tend to recall best the presidents who served recently, as well as the first few in the country’s history. They also remember those who navigated historic events, like the ending of slavery (Abraham Lincoln) and World War II (Franklin D. Roosevelt). But the broader significance of the report — the first to measure forgetfulness over a 40-year period, using a constant list — is that societies collectively forget according to the same formula as, say, a student who has studied a list of words. Culture imitates biology, even though the two systems work in vastly different ways. The new paper was published in the journal Science. “It’s an exciting study, because it mixes history and psychology and finds this one-on-one correspondence” in the way memory functions, said David C. Rubin, a psychologist at Duke University who was not involved in the research. The report is based on four surveys by psychologists now at Washington University in St. Louis, conducted from 1974 to 2014. In the first three, in 1974, 1991 and 2009, Henry L. Roediger III gave college students five minutes to write down as many presidents as they could remember, in order. © 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 20364 - Posted: 11.29.2014
by Bethany Brookshire We all experience stress, but some handle it better than others. A lot of research has focused on what makes animals and people susceptible to stress and how that, in turn, can trigger depression. It makes sense to study the condition, not the people who don’t experience it. Depression and susceptibility are the broken state. Resilience seems normal by comparison. But resilience is not just the absence of susceptibility. It turns out that a protein called beta-catenin plays an active role in resilience. A new study, from Eric Nestler’s laboratory at the Mount Sinai School of Medicine in New York City, also identifies a large number of new targets that could help scientists understand why some people are susceptible to stress — and how they might be made more resilient. “When people study stress responses, we often just assume that in an animal that’s stressed, there’s an active process that creates these depression-like behaviors,” says Andre Der-Avakian, a neuroscientist at the University of California, San Diego. “But this study and studies from others have shown that resilience is also an active process.” The nucleus accumbens is an area of the brain most often linked with reward and pleasure from items we enjoy, such as food or drugs. But the area also shows changes in people with depression. “It makes sense — here’s a region important in responding to rewards,” Nestler explains. “One of the symptoms of people with depression is that they don’t derive pleasure from things in life.” © Society for Science & the Public 2000 - 2014
Ewen Callaway Nerve cells that transmit pain, itch and other sensations to the brain have been made in the lab for the first time. Researchers say that the cells will be useful for developing new painkillers and anti-itch remedies, as well as understanding why some people experience unexplained extreme pain and itching. “The short take-home message would be ‘pain and itch in a dish’, and we think that’s very important,” says Kristin Baldwin, a stem-cell scientist at the Scripps Research Institute in La Jolla, California, whose team converted mouse and human cells called fibroblasts into neurons that detect sensations such as pain, itch or temperature. In a second paper, a separate team took a similar approach to making pain-sensing cells. Both efforts were published on 24 November in Nature Neuroscience. Peripheral sensory neurons, as these cells are called, produce specialized ‘receptor’ proteins that detect chemical and physical stimuli and convey them to the brain. The receptor that a cell makes determines its properties — some pain-sensing cells respond to chilli oil, for example, and others respond to different pain-causing chemicals. Mutations in the genes encoding these receptors can cause some people to experience chronic pain or, in rare cases, to become impervious to pain. To create these cells in the lab, independent teams led by Baldwin and by Clifford Woolf, a neuroscientist at Boston Children’s Hospital in Massachusetts, identified combinations of proteins that — when expressed in fibroblasts — transformed them into sensory neurons after several days. Baldwin's team identified neurons that make receptors that detect sensations including pain, itch, and temperature, whereas Woolf’s team looked only at pain-detecting cells. Both teams generated cells that resembled neurons in shape and fired in response to capsaicin, which gives chilli peppers their kick, and mustard oil. © 2014 Nature Publishing Group
Keyword: Pain & Touch
Link ID: 20362 - Posted: 11.26.2014
By Amy Ellis Nutt In a novel use of video game playing, researchers at Ohio State have found a Pac-Man-like game, when played repetitively, can improve vision in both children and adults who have "lazy eye" or poor depth perception. In the Pac-Man-style game, players wear red-green 3-D glasses that filter images to the right and left eyes. The lazy or weak eye sees two discs containing vertical, horizontal or diagonal lines superimposed on a background of horizontal lines. The dominant eye sees a screen of only horizontal lines. The player controls the larger, Pac-Man-like disc and chases the smaller one. In another game, the player must match discs with rows based on the orientation of their lines. Teng Leng Ooi, professor of optometry at Ohio State University, presented her research findings at last week's annual meeting of the Society for Neuroscience. Only a handful of test subjects were involved in the experimental training, but all saw weak-eye improvement to 20/20 vision or better and for a period of at least eight months. Lazy eye, or amblyopia, affects between 2 and 3 percent of the U.S. population. The disorder usually occurs in infancy when the neural pathway between the brain and one eye (or sometimes both) fails to fully develop. Often the cause of lazy eye is strabismus, in which the eyes are misaligned or "crossed." To prevent double vision, the brain simply blocks the fuzzy images from one eye, thereby causing incomplete visual development. The result: lazy eye.
By Piercarlo Valdesolo Google “successful Thanksgiving” and you will get a lot of different recommendations. Most you’ve probably heard before: plan ahead, get help, follow certain recipes. But according to new research from Florida State University, enjoying your holiday also requires a key ingredient that few guests consider as they wait to dive face first into the turkey: a belief in free will. What does free will have to do with whether or not Aunt Sally leaves the table in a huff? These researchers argue that belief in free will is essential to experiencing the emotional state that makes Thanksgiving actually about giving thanks: gratitude. Previous research has shown that our level of gratitude for an act depends on three things: 1) the cost to the benefactor (in time, effort or money), 2) the value of the act to the beneficiary, and 3) the sincerity of the benefactor’s intentions. For example, last week my 4-year-old daughter gave me a drawing of our family. This act was costly (she spent time and effort), valuable (I love the way she draws herself bigger than everyone else in the family), and sincere (she drew it because she knew I would like it). But what if I thought that she drew it for a different reason? What if I thought that she was being coerced by my wife? Or if I thought that this was just an assignment at her pre-school? In other words, what if I thought she had no choice but to draw it? I wouldn’t have defiantly thrown it back in her face, but I surely would have felt differently about the sincerity of the action. It would have diminished my gratitude. © 2014 Scientific American
By Linda Searing THE QUESTION Keeping your brain active by working is widely believed to protect memory and thinking skills as we age. Does the type of work matter? THIS STUDY involved 1,066 people who, at an average age of 70, took a battery of tests to measure memory, processing speed and cognitive ability. The jobs they had held were rated by the complexity of dealings with people, data and things. Those whose main jobs required complex work, especially in dealings with people — such as social workers, teachers, managers, graphic designers and musicians — had higher cognitive scores than those who had held jobs requiring less-complex dealings, such as construction workers, food servers and painters. Overall, more-complex occupations were tied to higher cognitive scores, regardless of someone’s IQ, education or environment. WHO MAY BE AFFECTED? Older adults. Cognitive abilities change with age, so it can take longer to recall information or remember where you placed your keys. That is normal and not the same thing as dementia, which involves severe memory loss as well as declining ability to function day to day. Commonly suggested ways to maintain memory and thinking skills include staying socially active, eating healthfully and getting adequate sleep as well as such things as doing crossword puzzles, learning to play a musical instrument and taking varied routes to common destinations when driving.
Keyword: Learning & Memory
Link ID: 20359 - Posted: 11.26.2014
By Anna North The idea that poverty can change the brain has gotten significant attention recently, and not just from those lay readers (a minority, according to recent research) who spend a lot of time thinking about neuroscience. Policy makers and others have begun to apply neuroscientific principles to their thinking about poverty — and some say this could end up harming poor people rather than helping. At The Conversation, the sociologist Susan Sered takes issue with “news reports with headlines like this one: ‘Can Brain Science Help Lift People Out Of Poverty?’” She’s referring to a June story by Rachel Zimmerman at WBUR, about a nonprofit called Crittenton Women’s Union that aims to use neuroscience to help get people out of poverty. Elisabeth Babcock, Crittenton’s chief executive, tells Ms. Zimmerman: “What the new brain science says is that the stresses created by living in poverty often work against us, make it harder for our brains to find the best solutions to our problems. This is a part of the reason why poverty is so ‘sticky.’” And, she adds: “If we’ve been raised in poverty under all this stress, our executive functioning wiring, the actual neurology of our brains, is built differently than if we’re not raised in poverty. It is built to react quickly to danger and threats and not built as much to plan or execute strategies for how we want things to be in the future because the future is so uncertain and planning is so pointless that this wiring isn’t as called for.” Dr. Sered, however, says that applying neuroscience to problems like poverty can sometimes lead to trouble: “Studies showing that trauma and poverty change people’s brains can too easily be read as scientific proof that poor people (albeit through no fault of their own) have inferior brains or that women who have been raped are now brain-damaged.” © 2014 The New York Times Company
By Gary Stix One area of brain science that has drawn intense interest in recent years is the study of what psychologists call reconsolidation—a ponderous technical term that, once translated, means giving yourself a second chance. Memories of our daily experience are formed, often during sleep, by inscribing—or “consolidating”—a record of what happened into neural tissue. Joy at the birth of a child or terror in response to a violent personal assault. A bad memory, once fixed, may replay again and again, turning toxic and all-consuming. For the traumatized, the desire to forget becomes an impossible dream. Reconsolidation allows for a do-over by drawing attention to the emotional and factual content of traumatic experience. In the safety of a therapist’s office, the patient lets demons return, and then the goal is to reshape karma to form a new, more benign memory. The details remain the same, but the power of the old terror to overwhelm and induce psychic paralysis begins to subside. The clinician would say that the memory has undergone a change in “valence”—from negative to neutral and detached. The trick to undertaking successful reconsolidation requires revival of these memories without provoking panic and chaos that can only make things worse. Talk therapies and psycho-pharm may not be enough. One new idea just starting to be explored is the toning down of memories while a patient is fast asleep. © 2014 Scientific American
More than 40 percent of infants in a group who died of sudden infant death syndrome (SIDS) were found to have an abnormality in a key part of the brain, researchers report. The abnormality affects the hippocampus, a brain area that influences such functions as breathing, heart rate, and body temperature, via its neurological connections to the brainstem. According to the researchers, supported by the National Institutes of Health, the abnormality was present more often in infants who died of SIDS than in infants whose deaths could be attributed to known causes. The researchers believe the abnormality may destabilize the brain’s control of breathing and heart rate patterns during sleep, or during the periodic brief arousals from sleep that occur throughout the night. “The new finding adds to a growing body of evidence that brain abnormalities may underlie many cases of sudden infant death syndrome,” said Marian Willinger, Ph.D, special assistant for SIDS at NIH’s Eunice Kennedy Shriver National Institute of Child Health and Human Development, which funded the study. “The hope is that research efforts in this area eventually will provide the means to identify vulnerable infants so that we’ll be able to reduce their risk for SIDS.” SIDS is the sudden death of an infant younger than 1 year of age that is still unexplained after a complete post mortem investigation by a coroner or medical examiner. This investigation includes an autopsy, a review of the death scene, and review of family and medical histories. In the United States, SIDS is the leading cause of death between one month and one year of age. The deaths are associated with an infant’s sleep period.