Most Recent Links
Daniel Freeman and Jason Freeman

“Although it is a waste of time to argue with a paranoid patient about his delusions, he may still be persuaded to keep them to himself, to repress them as far as possible and to forgo the aggressive action they might suggest, in general to conduct his life as if they did not exist.”

This quote from Clinical Psychiatry, a hugely influential textbook in the 1950s and 1960s, epitomises the way in which unusual mental states were generally understood for much of the 20th century. Delusions (such as paranoid thoughts) and hallucinations (hearing voices, for example) were of interest purely as symptoms of psychosis, or what used to be called madness. Apart from their utility in diagnosis, they were deemed to be meaningless: the incomprehensible effusions of a diseased brain. Or in the jargon: “empty speech acts, whose informational content refers to neither world nor self”. There’s a certain irony here, of course, in experts supposedly dedicated to understanding the way the mind works dismissing certain thoughts as unworthy of attention or explanation.

The medical response to these phenomena, which were considered to be an essentially biological problem, was to eradicate them with powerful antipsychotic drugs. This is not to say that other strategies weren’t attempted: in one revealing experiment in the 1970s, patients in a ward for “paranoid schizophrenics” in Vermont, US, were rewarded with tokens for avoiding “delusional talk”. These tokens could be exchanged for items including “meals, extra dessert, visits to the canteen, cigarettes, time off the ward, time in the TV and game room, time in bedroom between 8am and 9pm, visitors, books and magazines, recreation, dances on other wards.” (It didn’t work: most patients modified their behaviour temporarily, but “changes in a patient’s delusional system and general mental status could not be detected by a psychiatrist”.)

© 2014 Guardian News and Media Limited
Link ID: 20369 - Posted: 11.29.2014
By Jason G. Goldman

A sharp cry pierces the air. Soon a worried mother deer approaches the source of the sound, expecting to find her fawn. But the sound is coming from a speaker system, and the call isn't that of a baby deer at all. It's an infant fur seal's.

Because deer and seals do not live in the same habitats, mother deer should not know how baby seal screams sound, reasoned biologists Susan Lingle of the University of Winnipeg and Tobias Riede of Midwestern University, who were running the acoustic experiment. So why did a mother deer react with concern?

Over two summers, the researchers treated herds of mule deer and white-tailed deer on a Canadian farm to modified recordings of the cries of a wide variety of infant mammals—elands, marmots, bats, fur seals, sea lions, domestic cats, dogs and humans. By observing how mother deer responded, Lingle and Riede discovered that as long as the fundamental frequency was similar to that of their own infants' calls, those mothers approached the speaker as if they were looking for their offspring. Such a reaction suggests deep commonalities among the cries of most young mammals. (The mother deer did not show concern for white noise, birdcalls or coyote barks.) Lingle and Riede published their findings in October in the American Naturalist.

Researchers had previously proposed that sounds made by different animals during similar experiences—when they were in pain, for example—would share acoustic traits. “As humans, we often ‘feel’ for the cry of young animals,” Lingle says. That empathy may arise because emotions are expressed in vocally similar ways among mammals.

© 2014 Scientific American
By Aviva Rutkin

There is only one real rule to conversing with a baby: talking is better than not talking. But that one rule can make a lifetime of difference.

That's the message that the US state of Georgia hopes to send with Talk With Me Baby, a public health programme devoted to the art of baby talk. Starting in January, nurses will be trained in the best way to speak to babies to help them learn language, based on what the latest neuroscience says. Then they, along with teachers and nutritionists, will model this good behaviour for the parents they meet. Georgia hopes to expose every child born in 2015 in the Atlanta area to this speaking style; by 2018, the hope is to reach all 130,000 or so newborns across the state.

Talk With Me Baby is the latest and largest attempt to provide "language nutrition" to infants in the US – a rich quantity and variety of words supplied at a critical time in the brain's development. Similar initiatives have popped up in Providence, Rhode Island, where children have been wearing high-tech vests that track every word they hear, and Hollywood, where the Clinton Foundation has encouraged television shows like Parenthood and Orange is the New Black to feature scenes demonstrating good baby talk.

"The idea is that language is as important to the brain as food is to physical growth," says Arianne Weldon, director of Get Georgia Reading, one of several partner organisations involved in Talk With Me Baby.

© Copyright Reed Business Information Ltd.
By Virginia Morell

When we listen to someone talking, we hear some sounds that combine to make words and other sounds that convey such things as the speaker’s emotions and gender. The left hemisphere of our brain manages the first task, while the right hemisphere specializes in the second. Dogs also have this kind of hemispheric bias when listening to the sounds of other dogs. But do they have it with human sounds?

To find out, two scientists had dogs sit facing two speakers. The researchers then played a recorded short sentence—“Come on, then”—and watched which way the dogs turned. When the animals heard recordings in which individual words were strongly emphasized, they turned to the right—indicating that their left hemispheres were engaged. But when they listened to recordings that had exaggerated intonations, they turned to the left—a sign that the right hemisphere was responding.

Thus, dogs seem to process the elements of speech very similarly to the way humans do, the scientists report online today in Current Biology. According to the researchers, the findings support the idea that our canine pals are indeed paying close attention not only to who we are and how we say things, but also to what we say.

© 2014 American Association for the Advancement of Science.
By Amy Ellis Nutt

Scientists say the "outdoor effect" on nearsighted children is real: natural light is good for the eyes.

It's long been thought that kids are more at risk of nearsightedness, or myopia, if they spend hours and hours in front of computer screens or fiddling with tiny hand-held electronic devices. Not true, say scientists. But now there is research suggesting that children who are genetically predisposed to the visual deficit can improve their chances of avoiding eyeglasses just by stepping outside. Yep, sunshine is all they need -- more specifically, the natural light of outdoors -- and 14 hours a week of outdoor light should do it.

Why this is the case is not exactly clear. "We don't really know what makes outdoor time so special," said Donald Mutti, the lead researcher of the study from Ohio State University College of Optometry, in a press release. "If we knew, we could change how we approach myopia."

What is known is that UVB light (invisible ultraviolet B rays) plays a role in the cellular production of vitamin D, which is believed to help the eyes focus light on the retina. However, the Ohio State researchers think there is another possibility. "Between the ages of five and nine, a child's eye is still growing," said Mutti. "Sometimes this growth causes the distance between the lens and the retina to lengthen, leading to nearsightedness. We think these different types of outdoor light may help preserve the proper shape and length of the eye during that growth period."
By Benedict Carey

Quick: Which American president served before slavery ended, John Tyler or Rutherford B. Hayes? If you need Google to get the answer, you are not alone. (It is Tyler.)

Collective cultural memory — for presidents, for example — works according to the same laws as the individual kind, at least when it comes to recalling historical names and remembering them in a given order, researchers reported on Thursday. The findings suggest that leaders who are well known today, like the elder President George Bush and President Bill Clinton, will be all but lost to public memory in just a few decades.

The particulars from the new study, which tested Americans’ ability to recollect the names of past presidents, are hardly jaw-dropping: People tend to recall best the presidents who served recently, as well as the first few in the country’s history. They also remember those who navigated historic events, like the ending of slavery (Abraham Lincoln) and World War II (Franklin D. Roosevelt).

But the broader significance of the report — the first to measure forgetfulness over a 40-year period, using a constant list — is that societies collectively forget according to the same formula as, say, a student who has studied a list of words. Culture imitates biology, even though the two systems work in vastly different ways. The new paper was published in the journal Science.

“It’s an exciting study, because it mixes history and psychology and finds this one-on-one correspondence” in the way memory functions, said David C. Rubin, a psychologist at Duke University who was not involved in the research.

The report is based on four surveys by psychologists now at Washington University in St. Louis, conducted from 1974 to 2014. In the first three, in 1974, 1991 and 2009, Henry L. Roediger III gave college students five minutes to write down as many presidents as they could remember, in order.

© 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 20364 - Posted: 11.29.2014
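The “same formula” the presidents study refers to is not spelled out in this excerpt. Purely as an illustration (an assumption on the editor’s part, not the study’s fitted model), individual forgetting of this kind is classically pictured as an Ebbinghaus-style exponential decay, where the fraction retained falls off smoothly with elapsed time:

```python
import math

def retention(t_years, strength=25.0):
    """Fraction still recalled t_years after an item was current, under a
    toy exponential forgetting curve R(t) = exp(-t / s).
    The decay constant `strength` is illustrative, not fitted to any data."""
    return math.exp(-t_years / strength)

# Recall is perfect at t = 0 and decays smoothly afterwards.
recent = retention(5)    # a recently serving president
distant = retention(50)  # a president half a century back
```

Under this sketch, recall at t = 0 is exactly 1.0 and halves roughly every s·ln 2 years; the study’s claim is that the aggregate societal curve has the same shape as an individual learner’s, not that it follows these particular numbers.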
By Bethany Brookshire

We all experience stress, but some handle it better than others. A lot of research has focused on what makes animals and people susceptible to stress and how that, in turn, can trigger depression. It makes sense to study the condition, not the people who don’t experience it. Depression and susceptibility are the broken state. Resilience seems normal by comparison.

But resilience is not just the absence of susceptibility. It turns out that a protein called beta-catenin plays an active role in resilience. A new study, from Eric Nestler’s laboratory at the Mount Sinai School of Medicine in New York City, also identifies a large number of new targets that could help scientists understand why some people are susceptible to stress — and how they might be made more resilient.

“When people study stress responses, we often just assume that in an animal that’s stressed, there’s an active process that creates these depression-like behaviors,” says Andre Der-Avakian, a neuroscientist at the University of California, San Diego. “But this study and studies from others have shown that resilience is also an active process.”

The nucleus accumbens is an area of the brain most often linked with reward and pleasure from items we enjoy, such as food or drugs. But the area also shows changes in people with depression. “It makes sense — here’s a region important in responding to rewards,” Nestler explains. “One of the symptoms of people with depression is that they don’t derive pleasure from things in life.”

© Society for Science & the Public 2000 - 2014
Ewen Callaway

Nerve cells that transmit pain, itch and other sensations to the brain have been made in the lab for the first time. Researchers say that the cells will be useful for developing new painkillers and anti-itch remedies, as well as for understanding why some people experience unexplained extreme pain and itching.

“The short take-home message would be ‘pain and itch in a dish’, and we think that’s very important,” says Kristin Baldwin, a stem-cell scientist at the Scripps Research Institute in La Jolla, California, whose team converted mouse and human cells called fibroblasts into neurons that detect sensations such as pain, itch or temperature. In a second paper, a separate team took a similar approach to making pain-sensing cells. Both efforts were published on 24 November in Nature Neuroscience.

Peripheral sensory neurons, as these cells are called, produce specialized ‘receptor’ proteins that detect chemical and physical stimuli and convey them to the brain. The receptor that a cell makes determines its properties — some pain-sensing cells respond to chilli oil, for example, and others respond to different pain-causing chemicals. Mutations in the genes encoding these receptors can cause some people to experience chronic pain or, in rare cases, to become impervious to pain.

To create these cells in the lab, independent teams led by Baldwin and by Clifford Woolf, a neuroscientist at Boston Children’s Hospital in Massachusetts, identified combinations of proteins that — when expressed in fibroblasts — transformed them into sensory neurons after several days. Baldwin's team identified neurons that make receptors that detect sensations including pain, itch and temperature, whereas Woolf’s team looked only at pain-detecting cells. Both teams generated cells that resembled neurons in shape and fired in response to capsaicin, which gives chilli peppers their kick, and mustard oil.

© 2014 Nature Publishing Group
Keyword: Pain & Touch
Link ID: 20362 - Posted: 11.26.2014
By Amy Ellis Nutt

In a novel use of video game playing, researchers at Ohio State have found that a Pac-Man-like game, when played repetitively, can improve vision in both children and adults who have "lazy eye" or poor depth perception.

In the Pac-Man-style game, players wear red-green 3-D glasses that filter images to the right and left eyes. The lazy or weak eye sees two discs containing vertical, horizontal or diagonal lines superimposed on a background of horizontal lines. The dominant eye sees a screen of only horizontal lines. The player controls the larger, Pac-Man-like disc and chases the smaller one. In another game, the player must match discs with rows based on the orientation of their lines.

Teng Leng Ooi, professor of optometry at Ohio State University, presented her research findings at last week's annual meeting of the Society for Neuroscience. Only a handful of test subjects were involved in the experimental training, but all saw weak-eye improvement to 20/20 vision or better, sustained for a period of at least eight months.

Lazy eye, or amblyopia, affects between 2 and 3 percent of the U.S. population. The disorder usually occurs in infancy, when the neural pathway between the brain and one eye (or sometimes both) fails to fully develop. Often the cause of lazy eye is strabismus, in which the eyes are misaligned or "crossed." To prevent double vision, the brain simply blocks the fuzzy images from one eye, thereby causing incomplete visual development. The result: lazy eye.
By Piercarlo Valdesolo

Google “successful Thanksgiving” and you will get a lot of different recommendations. Most you’ve probably heard before: plan ahead, get help, follow certain recipes. But according to new research from Florida State University, enjoying your holiday also requires a key ingredient that few guests consider as they wait to dive face first into the turkey: a belief in free will.

What does free will have to do with whether or not Aunt Sally leaves the table in a huff? These researchers argue that belief in free will is essential to experiencing the emotional state that makes Thanksgiving actually about giving thanks: gratitude.

Previous research has shown that our level of gratitude for an act depends on three things: 1) the cost to the benefactor (in time, effort or money), 2) the value of the act to the beneficiary, and 3) the sincerity of the benefactor’s intentions. For example, last week my 4-year-old daughter gave me a drawing of our family. This act was costly (she spent time and effort), valuable (I love the way she draws herself bigger than everyone else in the family), and sincere (she drew it because she knew I would like it).

But what if I thought that she drew it for a different reason? What if I thought that she was being coerced by my wife? Or if I thought that this was just an assignment at her pre-school? In other words, what if I thought she had no choice but to draw it? I wouldn’t have defiantly thrown it back in her face, but I surely would have felt differently about the sincerity of the action. It would have diminished my gratitude.

© 2014 Scientific American
By Linda Searing

THE QUESTION Keeping your brain active by working is widely believed to protect memory and thinking skills as we age. Does the type of work matter?

THIS STUDY involved 1,066 people who, at an average age of 70, took a battery of tests to measure memory, processing speed and cognitive ability. The jobs they had held were rated by the complexity of dealings with people, data and things. Those whose main jobs required complex work, especially in dealings with people — such as social workers, teachers, managers, graphic designers and musicians — had higher cognitive scores than those who had held jobs requiring less-complex dealings, such as construction workers, food servers and painters. Overall, more-complex occupations were tied to higher cognitive scores, regardless of someone’s IQ, education or environment.

WHO MAY BE AFFECTED? Older adults. Cognitive abilities change with age, so it can take longer to recall information or remember where you placed your keys. That is normal and not the same thing as dementia, which involves severe memory loss as well as declining ability to function day to day. Commonly suggested ways to maintain memory and thinking skills include staying socially active, eating healthfully and getting adequate sleep, as well as such things as doing crossword puzzles, learning to play a musical instrument and taking varied routes to common destinations when driving.
Keyword: Learning & Memory
Link ID: 20359 - Posted: 11.26.2014
By Anna North

The idea that poverty can change the brain has gotten significant attention recently, and not just from those lay readers (a minority, according to recent research) who spend a lot of time thinking about neuroscience. Policy makers and others have begun to apply neuroscientific principles to their thinking about poverty — and some say this could end up harming poor people rather than helping.

At The Conversation, the sociologist Susan Sered takes issue with “news reports with headlines like this one: ‘Can Brain Science Help Lift People Out Of Poverty?’” She’s referring to a June story by Rachel Zimmerman at WBUR, about a nonprofit called Crittenton Women’s Union that aims to use neuroscience to help get people out of poverty. Elisabeth Babcock, Crittenton’s chief executive, tells Ms. Zimmerman: “What the new brain science says is that the stresses created by living in poverty often work against us, make it harder for our brains to find the best solutions to our problems. This is a part of the reason why poverty is so ‘sticky.’”

And, she adds: “If we’ve been raised in poverty under all this stress, our executive functioning wiring, the actual neurology of our brains, is built differently than if we’re not raised in poverty. It is built to react quickly to danger and threats and not built as much to plan or execute strategies for how we want things to be in the future because the future is so uncertain and planning is so pointless that this wiring isn’t as called for.”

Dr. Sered, however, says that applying neuroscience to problems like poverty can sometimes lead to trouble: “Studies showing that trauma and poverty change people’s brains can too easily be read as scientific proof that poor people (albeit through no fault of their own) have inferior brains or that women who have been raped are now brain-damaged.”

© 2014 The New York Times Company
By Gary Stix

One area of brain science that has drawn intense interest in recent years is the study of what psychologists call reconsolidation — a ponderous technical term that, once translated, means giving yourself a second chance.

Memories of our daily experience are formed, often during sleep, by inscribing — or “consolidating” — a record of what happened into neural tissue: joy at the birth of a child, or terror in response to a violent personal assault. A bad memory, once fixed, may replay again and again, turning toxic and all-consuming. For the traumatized, the desire to forget becomes an impossible dream.

Reconsolidation allows for a do-over by drawing attention to the emotional and factual content of traumatic experience. In the safety of a therapist’s office, the patient lets the demons return, and the goal is then to reshape the experience into a new, more benign memory. The details remain the same, but the power of the old terror to overwhelm and induce psychic paralysis begins to subside. The clinician would say that the memory has undergone a change in “valence” — from negative to neutral and detached.

The trick to successful reconsolidation is reviving these memories without provoking the panic and chaos that can only make things worse. Talk therapies and psychopharmacology may not be enough. One new idea just starting to be explored is the toning down of memories while a patient is fast asleep.

© 2014 Scientific American
More than 40 percent of infants in a group who died of sudden infant death syndrome (SIDS) were found to have an abnormality in a key part of the brain, researchers report. The abnormality affects the hippocampus, a brain area that influences such functions as breathing, heart rate and body temperature via its neurological connections to the brainstem. According to the researchers, supported by the National Institutes of Health, the abnormality was present more often in infants who died of SIDS than in infants whose deaths could be attributed to known causes. The researchers believe the abnormality may destabilize the brain’s control of breathing and heart rate patterns during sleep, or during the periodic brief arousals from sleep that occur throughout the night.

“The new finding adds to a growing body of evidence that brain abnormalities may underlie many cases of sudden infant death syndrome,” said Marian Willinger, Ph.D., special assistant for SIDS at NIH’s Eunice Kennedy Shriver National Institute of Child Health and Human Development, which funded the study. “The hope is that research efforts in this area eventually will provide the means to identify vulnerable infants so that we’ll be able to reduce their risk for SIDS.”

SIDS is the sudden death of an infant younger than 1 year of age that is still unexplained after a complete post-mortem investigation by a coroner or medical examiner. This investigation includes an autopsy, a review of the death scene, and a review of family and medical histories. In the United States, SIDS is the leading cause of death between one month and one year of age. The deaths are associated with an infant’s sleep period.
By David Tuller

Patients with chronic fatigue syndrome are accustomed to disappointment. The cause of the disorder remains unknown; it can be difficult to diagnose, and treatment options are few. Research suggesting that an infection from a mouse virus may cause it raised hopes among patients a few years ago, but the evidence fell apart under closer scrutiny. Many patients are still told to seek psychiatric help.

But two recent studies — one from investigators at Stanford a few weeks ago and another from a Japanese research team published earlier this year — have found that the brains of people with chronic fatigue syndrome differ from those of healthy people, strengthening the argument that serious physiological dysfunctions are at the root of the condition.

“You’ve got two different groups that have independently said, ‘There’s something going on in the brain that is aberrant,’” said Leonard Jason, a psychologist at DePaul University in Chicago who studies the condition, also called myalgic encephalomyelitis and widely known as M.E./C.F.S. “I think you have a growing sense that this illness should be taken seriously.”

Both studies were small, however, and their results must be replicated before firm conclusions can be drawn. Still, other studies presented at scientific conferences this year also have demonstrated physiological dysfunctions in these patients.

In the most recent study, published by the journal Radiology, researchers at Stanford University compared brain images of 15 patients with the condition to those of 14 healthy people. The scientists found differences in both the white matter, the long, cablelike nerve structures that transmit signals between parts of the brain, and the gray matter, the regions where these signals are processed and interpreted.
The most striking finding was that in people with the disorder, one neural tract in the white matter of the right hemisphere appeared to be abnormally shaped, as if the cablelike nerve structures had crisscrossed or changed in some other way. Furthermore, the most seriously ill patients exhibited the greatest levels of this abnormality. © 2014 The New York Times Company
By Michelle Roberts, Health editor, BBC News online

The brain has a weak spot for Alzheimer's disease and schizophrenia, according to UK scientists who have pinpointed the region using scans. The brain area involved develops late in adolescence and degenerates early during ageing. At the moment, it is difficult for doctors to predict which people might develop either condition.

The findings, in the journal PNAS, hint at a potential way to diagnose those at risk earlier, experts say, although they caution that "much more research is needed into how to bring these exciting discoveries into the clinic".

The Medical Research Council team who carried out the study did MRI brain scans on 484 healthy volunteers aged between eight and 85 years. The researchers, led by Dr Gwenaëlle Douaud of Oxford University, looked at how the brain naturally changes as people age. The images revealed a common pattern - the parts of the brain that were the last to develop were also the first to show signs of age-related decline. These brain regions - a network of nerve cells, or grey matter - co-ordinate "high order" information coming from the different senses, such as sight and sound.

When the researchers looked at scans of patients with Alzheimer's disease and scans of patients with schizophrenia, they found the same brain regions were affected. The findings fit with what other experts have suspected - that although distinct, Alzheimer's and schizophrenia are linked.

Prof Hugh Perry of the MRC said: "Early doctors called schizophrenia 'premature dementia' but until now we had no clear evidence that the same parts of the brain might be associated with two such different diseases. This large-scale and detailed study provides an important, and previously missing, link between development, ageing and disease processes in the brain."

BBC © 2014
By Sandra G. Boodman

“That’s it — I’m done,” Rachel Miller proclaimed, the sting of the neurologist’s judgment fresh as she recounted the just-concluded appointment to her husband. Whatever was wrong with her, Miller decided after that 2009 encounter, she was not willing to risk additional humiliation by seeing another doctor who might dismiss her problems as psychosomatic.

The Baltimore marketing executive had spent the previous two years trying to figure out what was causing her bizarre symptoms, some of which she knew made her sound delusional. Her eyes felt “weird,” although her vision was 20/20. Normal sounds seemed hugely amplified: at night when she lay in bed, her breathing and heartbeat were deafening. Water pounding on her back in the shower sounded like a roar. She was plagued by dizziness.

“I had started to feel like a person in one of those stories where someone has been committed to a mental hospital by mistake or malice and they desperately try to appear sane,” recalled Miller, now 53. She began to wonder if she really was crazy; numerous tests had ruled out a host of possible causes, including a brain tumor.

Continuing to look for answers seemed futile, since all the doctors she had seen had failed to come up with anything conclusive. “My attitude was: If it’s something progressive like MS [multiple sclerosis] or ALS [amyotrophic lateral sclerosis], it’ll get bad enough that someone will eventually figure it out.”

Figuring it out would take nearly three more years and was partly the result of an oddity that Miller mentioned to another neurologist, after she lifted her moratorium on seeing doctors.
Link ID: 20353 - Posted: 11.25.2014
By Chelsea Rice

On December 14, 2012, Adam Lanza shot and killed 20 children and six staff members at Sandy Hook Elementary School in Newtown, Connecticut, before turning the gun on himself. Ever since, America has been wondering: Why?

Today, after investigating and detailing every available record of the 20-year-old Lanza’s life since birth, the Connecticut Office of the Child Advocate released a report that said: We still don’t know what drove him to commit those terrible acts. But we do know he fell through the cracks of the school system, the health care system, and possibly the awareness of his own parents.

Every documented moment of Lanza’s life was evaluated, from mental health records that tracked his social development to school and medical records that outlined his needs—and showed disparities in the services provided to him by the state. The review did not, however, stop at Lanza. It included a review of the laws regarding special education and the confidentiality of personal records in the system, as well as “how these laws implicate professional obligations and practices.” Unredacted state police and law enforcement records were also reviewed, alongside interviews and extensive research with members of the Child Fatality Review Panel who led the initial investigation of that day.

From “earliest childhood,” according to the report, Lanza had “significant developmental challenges,” such as communication and sensory problems, delays in socialization, and repetitive behaviors. Lanza was seen and evaluated by the New Hampshire “Birth to Three” early intervention program when he was almost 3 years old, and referred to special education preschool services.
By Victoria Colliver

Marianne Austin watched her mother go blind from age-related macular degeneration, an eye disease that affects about 10 million older Americans. Now that Austin has been diagnosed with the same condition, she wants to avoid her mother’s experience.

“I’ve seen what can happen and the devastation it can cause,” said Austin, 67, of Atherton, who found out she had the disease last year. “I call it having seen the movie. I don’t like that ending, I want to change the movie, and I don’t want to wait 10 years until something is proven in research.”

About 10 percent of patients diagnosed with age-related macular degeneration will develop the form of the disease that causes permanent blindness. It’s unclear just how much genetics plays a role, so there’s no definitive way to predict who will progress to that stage or when that would happen. But a team of Stanford doctors think they may have found a way.

In a study published this month in the medical journal Investigative Ophthalmology and Visual Science, researchers analyzed data from 2,146 retinal scans from 244 macular degeneration patients at Stanford from 2008 to 2013. They then created an algorithm that predicted whether a particular patient would be likely to develop the form of the disease that causes blindness within less than a year, three years or up to five years.

For those with macular degeneration to go blind, the disease has to advance from what is known as the “dry” form to the “wet” form. The sooner a doctor can notice changes, the better chance there is to save a patient’s vision.
By Christof Koch

Point to any one organ in the body, and doctors can tell you something about what it does and what happens if that organ is injured by accident or disease or is removed by surgery—whether it be the pituitary gland, the kidney or the inner ear. Yet like the blank spots on maps of Central Africa from the mid-19th century, there are structures whose functions remain unknown despite whole-brain imaging, electroencephalographic recordings that monitor the brain's cacophony of electrical signals and other advanced tools of the 21st century.

Consider the claustrum. It is a thin, irregular sheet of cells, tucked below the neocortex, the gray matter that allows us to see, hear, reason, think and remember. It is surrounded on all sides by white matter—the tracts, or wire bundles, that interconnect cortical regions with one another and with other brain regions. The claustra—for there are two of them, one on the left side of the brain and one on the right—lie below the general region of the insular cortex, underneath the temples, just above the ears. They assume a long, thin wisp of a shape that is easily overlooked when inspecting the topography of a brain image.

Advanced brain-imaging techniques that look at the white matter fibers coursing to and from the claustrum reveal that it is a neural Grand Central Station. Almost every region of the cortex sends fibers to the claustrum. These connections are reciprocated by other fibers that extend back from the claustrum to the originating cortical region. Neuroanatomical studies in mice and rats reveal a unique asymmetry—each claustrum receives input from both cortical hemispheres but only projects back to the overlying cortex on the same side. Whether or not this is true in people is not known. Curiouser and curiouser, as Alice would have said.

© 2014 Scientific American
Link ID: 20350 - Posted: 11.24.2014