Most Recent Links
By BENEDICT CAREY Quick: Which American president served before slavery ended, John Tyler or Rutherford B. Hayes? If you need Google to get the answer, you are not alone. (It is Tyler.) Collective cultural memory — for presidents, for example — works according to the same laws as the individual kind, at least when it comes to recalling historical names and remembering them in a given order, researchers reported on Thursday. The findings suggest that leaders who are well known today, like the elder President George Bush and President Bill Clinton, will be all but lost to public memory in just a few decades. The particulars from the new study, which tested Americans’ ability to recollect the names of past presidents, are hardly jaw-dropping: People tend to recall best the presidents who served recently, as well as the first few in the country’s history. They also remember those who navigated historic events, like the ending of slavery (Abraham Lincoln) and World War II (Franklin D. Roosevelt). But the broader significance of the report — the first to measure forgetfulness over a 40-year period, using a constant list — is that societies collectively forget according to the same formula as, say, a student who has studied a list of words. Culture imitates biology, even though the two systems work in vastly different ways. The new paper was published in the journal Science. “It’s an exciting study, because it mixes history and psychology and finds this one-on-one correspondence” in the way memory functions, said David C. Rubin, a psychologist at Duke University who was not involved in the research. The report is based on four surveys by psychologists now at Washington University in St. Louis, conducted from 1974 to 2014. In the first three, in 1974, 1991 and 2009, Henry L. Roediger III gave college students five minutes to write down as many presidents as they could remember, in order. © 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 20364 - Posted: 11.29.2014
By Bethany Brookshire We all experience stress, but some handle it better than others. A lot of research has focused on what makes animals and people susceptible to stress and how that, in turn, can trigger depression. It makes sense to study the condition, not the people who don’t experience it. Depression and susceptibility are the broken state. Resilience seems normal by comparison. But resilience is not just the absence of susceptibility. It turns out that a protein called beta-catenin plays an active role in resilience. A new study, from Eric Nestler’s laboratory at the Mount Sinai School of Medicine in New York City, also identifies a large number of new targets that could help scientists understand why some people are susceptible to stress — and how they might be made more resilient. “When people study stress responses, we often just assume that in an animal that’s stressed, there’s an active process that creates these depression-like behaviors,” says Andre Der-Avakian, a neuroscientist at the University of California, San Diego. “But this study and studies from others have shown that resilience is also an active process.” The nucleus accumbens is an area of the brain most often linked with reward and pleasure from items we enjoy, such as food or drugs. But the area also shows changes in people with depression. “It makes sense — here’s a region important in responding to rewards,” Nestler explains. “One of the symptoms of people with depression is that they don’t derive pleasure from things in life.” © Society for Science & the Public 2000 - 2014
Ewen Callaway Nerve cells that transmit pain, itch and other sensations to the brain have been made in the lab for the first time. Researchers say that the cells will be useful for developing new painkillers and anti-itch remedies, as well as understanding why some people experience unexplained extreme pain and itching. “The short take-home message would be ‘pain and itch in a dish’, and we think that’s very important,” says Kristin Baldwin, a stem-cell scientist at the Scripps Research Institute in La Jolla, California, whose team converted mouse and human cells called fibroblasts into neurons that detect sensations such as pain, itch or temperature [1]. In a second paper [2], a separate team took a similar approach to making pain-sensing cells. Both efforts were published on 24 November in Nature Neuroscience. Peripheral sensory neurons, as these cells are called, produce specialized ‘receptor’ proteins that detect chemical and physical stimuli and convey them to the brain. The receptor that a cell makes determines its properties — some pain-sensing cells respond to chilli oil, for example, and others respond to different pain-causing chemicals. Mutations in the genes encoding these receptors can cause some people to experience chronic pain or, in rare cases, to become impervious to pain. To create these cells in the lab, independent teams led by Baldwin and by Clifford Woolf, a neuroscientist at Boston Children’s Hospital in Massachusetts, identified combinations of proteins that — when expressed in fibroblasts — transformed them into sensory neurons after several days. Baldwin's team identified neurons that make receptors that detect sensations including pain, itch, and temperature, whereas Woolf’s team looked only at pain-detecting cells. Both teams generated cells that resembled neurons in shape and fired in response to capsaicin, which gives chilli peppers their kick, and mustard oil. © 2014 Nature Publishing Group
Keyword: Pain & Touch
Link ID: 20362 - Posted: 11.26.2014
By Amy Ellis Nutt In a novel use of video game playing, researchers at Ohio State have found a Pac-Man-like game, when played repetitively, can improve vision in both children and adults who have "lazy eye" or poor depth perception. In the Pac-Man-style game, players wear red-green 3-D glasses that filter images to the right and left eyes. The lazy or weak eye sees two discs containing vertical, horizontal or diagonal lines superimposed on a background of horizontal lines. The dominant eye sees a screen of only horizontal lines. The player controls the larger, Pac-Man-like disc and chases the smaller one. In another game, the player must match discs with rows based on the orientation of their lines. Teng Leng Ooi, professor of optometry at Ohio State University, presented her research findings at last week's annual meeting of the Society for Neuroscience. Only a handful of test subjects were involved in the experimental training, but all saw the weak eye improve to 20/20 vision or better, with gains lasting at least eight months. Lazy eye, or amblyopia, affects between 2 and 3 percent of the U.S. population. The disorder usually occurs in infancy when the neural pathway between the brain and one eye (or sometimes both) fails to fully develop. Often the cause of lazy eye is strabismus, in which the eyes are misaligned or "crossed." To prevent double vision, the brain simply blocks the fuzzy images from one eye, thereby causing incomplete visual development. The result: lazy eye.
By Piercarlo Valdesolo Google “successful Thanksgiving” and you will get a lot of different recommendations. Most you’ve probably heard before: plan ahead, get help, follow certain recipes. But according to new research from Florida State University, enjoying your holiday also requires a key ingredient that few guests consider as they wait to dive face first into the turkey: a belief in free will. What does free will have to do with whether or not Aunt Sally leaves the table in a huff? These researchers argue that belief in free will is essential to experiencing the emotional state that makes Thanksgiving actually about giving thanks: gratitude. Previous research has shown that our level of gratitude for an act depends on three things: 1) the cost to the benefactor (in time, effort or money), 2) the value of the act to the beneficiary, and 3) the sincerity of the benefactor’s intentions. For example, last week my 4-year-old daughter gave me a drawing of our family. This act was costly (she spent time and effort), valuable (I love the way she draws herself bigger than everyone else in the family), and sincere (she drew it because she knew I would like it). But what if I thought that she drew it for a different reason? What if I thought that she was being coerced by my wife? Or if I thought that this was just an assignment at her pre-school? In other words, what if I thought she had no choice but to draw it? I wouldn’t have defiantly thrown it back in her face, but I surely would have felt differently about the sincerity of the action. It would have diminished my gratitude. © 2014 Scientific American
By Linda Searing THE QUESTION Keeping your brain active by working is widely believed to protect memory and thinking skills as we age. Does the type of work matter? THIS STUDY involved 1,066 people who, at an average age of 70, took a battery of tests to measure memory, processing speed and cognitive ability. The jobs they had held were rated by the complexity of dealings with people, data and things. Those whose main jobs required complex work, especially in dealings with people — such as social workers, teachers, managers, graphic designers and musicians — had higher cognitive scores than those who had held jobs requiring less-complex dealings, such as construction workers, food servers and painters. Overall, more-complex occupations were tied to higher cognitive scores, regardless of someone’s IQ, education or environment. WHO MAY BE AFFECTED? Older adults. Cognitive abilities change with age, so it can take longer to recall information or remember where you placed your keys. That is normal and not the same thing as dementia, which involves severe memory loss as well as declining ability to function day to day. Commonly suggested ways to maintain memory and thinking skills include staying socially active, eating healthfully and getting adequate sleep as well as such things as doing crossword puzzles, learning to play a musical instrument and taking varied routes to common destinations when driving.
Keyword: Learning & Memory
Link ID: 20359 - Posted: 11.26.2014
By Anna North The idea that poverty can change the brain has gotten significant attention recently, and not just from those lay readers (a minority, according to recent research) who spend a lot of time thinking about neuroscience. Policy makers and others have begun to apply neuroscientific principles to their thinking about poverty — and some say this could end up harming poor people rather than helping. At The Conversation, the sociologist Susan Sered takes issue with “news reports with headlines like this one: ‘Can Brain Science Help Lift People Out Of Poverty?’” She’s referring to a June story by Rachel Zimmerman at WBUR, about a nonprofit called Crittenton Women’s Union that aims to use neuroscience to help get people out of poverty. Elisabeth Babcock, Crittenton’s chief executive, tells Ms. Zimmerman: “What the new brain science says is that the stresses created by living in poverty often work against us, make it harder for our brains to find the best solutions to our problems. This is a part of the reason why poverty is so ‘sticky.’” And, she adds: “If we’ve been raised in poverty under all this stress, our executive functioning wiring, the actual neurology of our brains, is built differently than if we’re not raised in poverty. It is built to react quickly to danger and threats and not built as much to plan or execute strategies for how we want things to be in the future because the future is so uncertain and planning is so pointless that this wiring isn’t as called for.” Dr. Sered, however, says that applying neuroscience to problems like poverty can sometimes lead to trouble: “Studies showing that trauma and poverty change people’s brains can too easily be read as scientific proof that poor people (albeit through no fault of their own) have inferior brains or that women who have been raped are now brain-damaged.” © 2014 The New York Times Company
By Gary Stix One area of brain science that has drawn intense interest in recent years is the study of what psychologists call reconsolidation—a ponderous technical term that, once translated, means giving yourself a second chance. Memories of our daily experience are formed, often during sleep, by inscribing—or “consolidating”—a record of what happened into neural tissue. Joy at the birth of a child or terror in response to a violent personal assault. A bad memory, once fixed, may replay again and again, turning toxic and all-consuming. For the traumatized, the desire to forget becomes an impossible dream. Reconsolidation allows for a do-over by drawing attention to the emotional and factual content of traumatic experience. In the safety of a therapist’s office, the patient lets demons return, and then the goal is to reshape karma to form a new, more benign memory. The details remain the same, but the power of the old terror to overwhelm and induce psychic paralysis begins to subside. The clinician would say that the memory has undergone a change in “valence”—from negative to neutral and detached. The trick to successful reconsolidation is reviving these memories without provoking the panic and chaos that can only make things worse. Talk therapies and psycho-pharm may not be enough. One new idea just starting to be explored is the toning down of memories while a patient is fast asleep. © 2014 Scientific American
More than 40 percent of infants in a group who died of sudden infant death syndrome (SIDS) were found to have an abnormality in a key part of the brain, researchers report. The abnormality affects the hippocampus, a brain area that influences such functions as breathing, heart rate, and body temperature, via its neurological connections to the brainstem. According to the researchers, supported by the National Institutes of Health, the abnormality was present more often in infants who died of SIDS than in infants whose deaths could be attributed to known causes. The researchers believe the abnormality may destabilize the brain’s control of breathing and heart rate patterns during sleep, or during the periodic brief arousals from sleep that occur throughout the night. “The new finding adds to a growing body of evidence that brain abnormalities may underlie many cases of sudden infant death syndrome,” said Marian Willinger, Ph.D., special assistant for SIDS at NIH’s Eunice Kennedy Shriver National Institute of Child Health and Human Development, which funded the study. “The hope is that research efforts in this area eventually will provide the means to identify vulnerable infants so that we’ll be able to reduce their risk for SIDS.” SIDS is the sudden death of an infant younger than 1 year of age that is still unexplained after a complete postmortem investigation by a coroner or medical examiner. This investigation includes an autopsy, a review of the death scene, and review of family and medical histories. In the United States, SIDS is the leading cause of death between one month and one year of age. The deaths are associated with an infant’s sleep period.
By David Tuller Patients with chronic fatigue syndrome are accustomed to disappointment. The cause of the disorder remains unknown; it can be difficult to diagnose, and treatment options are few. Research suggesting that an infection from a mouse virus may cause it raised hopes among patients a few years ago, but the evidence fell apart under closer scrutiny. Many patients are still told to seek psychiatric help. But two recent studies — one from investigators at Stanford a few weeks ago and another from a Japanese research team published earlier this year — have found that the brains of people with chronic fatigue syndrome differ from those of healthy people, strengthening the argument that serious physiological dysfunctions are at the root of the condition. “You’ve got two different groups that have independently said, ‘There’s something going on in the brain that is aberrant,’ ” said Leonard Jason, a psychologist at DePaul University in Chicago who studies the condition, also called myalgic encephalomyelitis and widely known as M.E./C.F.S. “I think you have a growing sense that this illness should be taken seriously.” Both studies were small, however, and their results must be replicated before firm conclusions can be drawn. Still, other studies presented at scientific conferences this year also have demonstrated physiological dysfunctions in these patients. In the most recent study, published by the journal Radiology, researchers at Stanford University compared brain images of 15 patients with the condition to those of 14 healthy people. The scientists found differences in both the white matter, the long, cablelike nerve structures that transmit signals between parts of the brain, and the gray matter, the regions where these signals are processed and interpreted. 
The most striking finding was that in people with the disorder, one neural tract in the white matter of the right hemisphere appeared to be abnormally shaped, as if the cablelike nerve structures had crisscrossed or changed in some other way. Furthermore, the most seriously ill patients exhibited the greatest levels of this abnormality. © 2014 The New York Times Company
By Michelle Roberts Health editor, BBC News online The brain has a weak spot for Alzheimer's disease and schizophrenia, according to UK scientists who have pinpointed the region using scans. The brain area involved develops late in adolescence and degenerates early during ageing. At the moment, it is difficult for doctors to predict which people might develop either condition. The findings, in the journal PNAS, hint at a potential way to diagnose those at risk earlier, experts say, although they caution that "much more research is needed into how to bring these exciting discoveries into the clinic". The Medical Research Council team who carried out the study did MRI brain scans on 484 healthy volunteers aged between eight and 85 years. The researchers, led by Dr Gwenaëlle Douaud of Oxford University, looked at how the brain naturally changes as people age. The images revealed a common pattern - the parts of the brain that were the last to develop were also the first to show signs of age-related decline. These brain regions - a network of nerve cells or grey matter - co-ordinate "high order" information coming from the different senses, such as sight and sound. When the researchers looked at scans of patients with Alzheimer's disease and scans of patients with schizophrenia they found the same brain regions were affected. The findings fit with what other experts have suspected - that although distinct, Alzheimer's and schizophrenia are linked. Prof Hugh Perry of the MRC said: "Early doctors called schizophrenia 'premature dementia' but until now we had no clear evidence that the same parts of the brain might be associated with two such different diseases. This large-scale and detailed study provides an important, and previously missing, link between development, ageing and disease processes in the brain." BBC © 2014
By Sandra G. Boodman “That’s it — I’m done,” Rachel Miller proclaimed, the sting of the neurologist’s judgment fresh as she recounted the just-concluded appointment to her husband. Whatever was wrong with her, Miller decided after that 2009 encounter, she was not willing to risk additional humiliation by seeing another doctor who might dismiss her problems as psychosomatic. The Baltimore marketing executive had spent the previous two years trying to figure out what was causing her bizarre symptoms, some of which she knew made her sound delusional. Her eyes felt “weird,” although her vision was 20/20. Normal sounds seemed hugely amplified: at night when she lay in bed, her breathing and heartbeat were deafening. Water pounding on her back in the shower sounded like a roar. She was plagued by dizziness. “I had started to feel like a person in one of those stories where someone has been committed to a mental hospital by mistake or malice and they desperately try to appear sane,” recalled Miller, now 53. She began to wonder if she really was crazy; numerous tests had ruled out a host of possible causes, including a brain tumor. Continuing to look for answers seemed futile, since all the doctors she had seen had failed to come up with anything conclusive. “My attitude was: If it’s something progressive like MS [multiple sclerosis] or ALS [amyotrophic lateral sclerosis], it’ll get bad enough that someone will eventually figure it out.” Figuring it out would take nearly three more years and was partly the result of an oddity that Miller mentioned to another neurologist, after she lifted her moratorium on seeing doctors.
Link ID: 20353 - Posted: 11.25.2014
By Chelsea Rice On December 14, 2012, Adam Lanza shot and killed 20 children and six staff members at Sandy Hook Elementary School in Newtown, Connecticut, before turning the gun on himself. Ever since, America has been wondering: Why? Today, after investigating and detailing every available record of the 20-year-old Lanza’s life since birth, the Connecticut Office of the Child Advocate released a report that said: We still don’t know what drove him to commit those terrible acts. But we do know he fell through the cracks of the school system, the health care system, and possibly the awareness of his own parents. Every documented moment of Lanza’s life was evaluated, from mental health records that tracked his social development to school and medical records that outlined his needs—and showed disparities in the services provided to him by the state. The review did not, however, stop at Lanza. It included a review of the laws regarding special education and the confidentiality of personal records in the system, as well as “how these laws implicate professional obligations and practices.” Unredacted state police and law enforcement records were also reviewed alongside interviews and extensive research with members of the Child Fatality Review Panel who led the initial investigation of that day. From “earliest childhood,” according to the report, Lanza had “significant developmental challenges,” such as communication and sensory problems, delays in socialization, and repetitive behaviors. Lanza was seen and evaluated by the New Hampshire “Birth to Three” early intervention program when he was almost 3 years old, and referred to special education preschool services.
By Victoria Colliver Marianne Austin watched her mother go blind from age-related macular degeneration, an eye disease that affects about 10 million older Americans. Now that Austin has been diagnosed with the same condition, she wants to avoid her mother’s experience. “I’ve seen what can happen and the devastation it can cause,” said Austin, 67, of Atherton, who found out she had the disease last year. “I call it having seen the movie. I don’t like that ending, I want to change the movie, and I don’t want to wait 10 years until something is proven in research.” About 10 percent of patients diagnosed with age-related macular degeneration will develop the form of the disease that causes permanent blindness. It’s unclear just how much genetics plays a role, so there’s no definitive way to predict who will progress to that stage or when that would happen. But a team of Stanford doctors think they may have found a way. In a study, published this month in the medical journal Investigative Ophthalmology and Visual Science, researchers analyzed data from 2,146 retinal scans from 244 macular degeneration patients at Stanford from 2008 to 2013. They then created an algorithm that predicted whether a particular patient would be likely to develop the form of the disease that causes blindness within less than a year, three years or up to five years. For those with macular degeneration to go blind, the disease has to advance from what is known as the “dry” form to the “wet” form. The sooner a doctor can notice changes, the better chance there is to save a patient’s vision.
By Christof Koch Point to any one organ in the body, and doctors can tell you something about what it does and what happens if that organ is injured by accident or disease or is removed by surgery—whether it be the pituitary gland, the kidney or the inner ear. Yet like the blank spots on maps of Central Africa from the mid-19th century, there are structures whose functions remain unknown despite whole-brain imaging, electroencephalographic recordings that monitor the brain's cacophony of electrical signals and other advanced tools of the 21st century. Consider the claustrum. It is a thin, irregular sheet of cells, tucked below the neocortex, the gray matter that allows us to see, hear, reason, think and remember. It is surrounded on all sides by white matter—the tracts, or wire bundles, that interconnect cortical regions with one another and with other brain regions. The claustra—for there are two of them, one on the left side of the brain and one on the right—lie below the general region of the insular cortex, underneath the temples, just above the ears. They assume a long, thin wisp of a shape that is easily overlooked when inspecting the topography of a brain image. Advanced brain-imaging techniques that look at the white matter fibers coursing to and from the claustrum reveal that it is a neural Grand Central Station. Almost every region of the cortex sends fibers to the claustrum. These connections are reciprocated by other fibers that extend back from the claustrum to the originating cortical region. Neuroanatomical studies in mice and rats reveal a unique asymmetry—each claustrum receives input from both cortical hemispheres but only projects back to the overlying cortex on the same side. Whether or not this is true in people is not known. Curiouser and curiouser, as Alice would have said. © 2014 Scientific American
Link ID: 20350 - Posted: 11.24.2014
By Amy Ellis Nutt Debbie Hall undergoes external brain stimulation at Ohio State's Wexner Medical Center. Hall was partially paralyzed on her left side after a stroke. Doctors are conducting a study to see if a device known as NexStim can "prep" a stroke victim's brain immediately prior to physical therapy so that the therapy will be more effective. (The Ohio State University Wexner Medical Center) Using non-invasive transcranial magnetic stimulation, or TMS, researchers at Ohio State Wexner Medical Center may have found a way to help prep a stroke victim's brain prior to physical therapy to aid a more complete recovery. When one side of the brain is damaged by a stroke, the corresponding healthy part goes into overdrive in order to compensate, said Dr. Marcie Bockbrader, principal investigator of the study. She believes the hyperactivity in the healthy side may actually slow recovery in the injured side. The technology, called NexStim, employs TMS to prepare a stroke patient's brain for physical therapy by sending low-frequency magnetic pulses painlessly through a victim's scalp to suppress activity in the healthy part of the motor cortex. This allows the injured side to make use of more energy during physical therapy, which immediately follows the transcranial magnetic stimulation. "This device targets the overactive side, quieting it down enough, so that through therapies the injured side can learn to express itself again," said Bockbrader, an assistant professor of physical medicine and rehabilitation, in a news release.
Link ID: 20349 - Posted: 11.24.2014
by Hal Hodson Yet another smartwatch launched this week. Called Embrace, it is rather different from the latest offerings from Apple, Samsung and Motorola: it can spot the warning signs of an epileptic seizure. Embrace was developed by Matteo Lai and his team at a firm called Empatica, with the help of Rosalind Picard at the Massachusetts Institute of Technology. It measures the skin's electrical activity as a proxy for changes deep in the brain, and uses a model built on years of clinical data to tell which changes portend a seizure. It also gathers the usual temperature and motion data that smartwatches collect, allowing the wearer to measure physical activity and sleep quality. Empatica launched a crowdfunding campaign on Indiegogo on Tuesday and has already raised more than $120,000. Backers who pledge $169 will receive an Embrace watch. The idea for the wristband came when Picard and her colleagues were running a study on the emotional states of children with autism, measuring skin conductance at the wrist as part of the study. Picard noticed that one of the children had registered a spike in electrical activity that turned out to have happened 20 minutes before they noticed the symptoms of a seizure. "It shocked me when I realised these things were showing up on the wrist," says Picard. The whole point of Embrace is to prevent sudden unexplained death in epilepsy (SUDEP). Its causes are not fully understood, but Picard says they understand enough to know how to reduce the chances of dying after an epileptic seizure. © Copyright Reed Business Information Ltd.
Kate Szell “I once asked Clara who she was. It was so embarrassing, but she’d had a haircut, so how was I to know?” That’s Rachel, she’s 14 and counts Clara as one of her oldest and best friends. There’s nothing wrong with Rachel’s sight, yet she struggles to recognise others. Why? Rachel is face blind. Most of us take for granted the fact that we recognise someone after a quick glance at their face. We don’t realise we’re doing something very different when we look at a face compared with when we look at anything else. To get a feeling of how peculiar facial recognition is, try recognising people by looking at their hands, instead of their faces. Tricky? That’s exactly how Rachel feels – only she’s not looking at hands, she’s looking straight into someone’s eyes. Specific areas of the brain process facial information. Damage to those areas gives rise to prosopagnosia or “face blindness”: an inability or difficulty with recognising faces. While brain damage-induced prosopagnosia is rare, prosopagnosia itself is not. Studies suggest around 2% of the population could have some form of prosopagnosia. These “developmental” prosopagnosics seem to be born without the ability to recognise faces and don’t acquire it, relying instead on all manner of cues, from gait to hairstyles, to tell people apart. Kirsten Dalrymple from the University of Minnesota is one of a handful of researchers looking into developmental prosopagnosia. Her particular interest is in prosopagnosic children. “Some seem to cope without much of a problem but, for others, it’s a totally different story,” she says. “They can become very socially withdrawn and can also be at risk of walking off with strangers.” © 2014 Guardian News and Media Limited
Link ID: 20347 - Posted: 11.24.2014
By CLYDE HABERMAN The notion that a person might embody several personalities, each of them distinct, is hardly new. The ancient Romans had a sense of this and came up with Janus, a two-faced god. In the 1880s, Robert Louis Stevenson wrote “Strange Case of Dr. Jekyll and Mr. Hyde,” a novella that provided us with an enduring metaphor for good and evil corporeally bound. Modern comic books are awash in divided personalities like the Hulk and Two-Face in the Batman series. Even heroic Superman has his alternating personas. But few instances of the phenomenon captured Americans’ collective imagination quite like “Sybil,” the study of a woman said to have had not two, not three (like the troubled figure in the 1950s’ “Three Faces of Eve”), but 16 different personalities. Alters, psychiatrists call them, short for alternates. As a mass-market book published in 1973, “Sybil” sold in the millions. Tens of millions watched a 1976 television movie version. The story had enough juice left in it for still another television film in 2007. Sybil Dorsett, a pseudonym, became the paradigm of a psychiatric diagnosis once known as multiple personality disorder. These days, it goes by a more anodyne label: dissociative identity disorder. Either way, the strange case of the woman whose real name was Shirley Ardell Mason made itself felt in psychiatrists’ offices across the country. Pre-“Sybil,” the diagnosis was rare, with only about 100 cases ever having been reported in medical journals. Less than a decade after “Sybil” made its appearance, in 1980, the American Psychiatric Association formally recognized the disorder, and the numbers soared into the thousands. People went on television to tell the likes of Jerry Springer and Leeza Gibbons about their many alters. One woman insisted that she had more than 300 identities within her (enough, if you will, to fill the rosters of a dozen major-league baseball teams).
Even “Eve,” whose real name is Chris Costner Sizemore, said in the mid-1970s that those famous three faces were surely an undercount. It was more like 22, she said. © 2014 The New York Times Company
Link ID: 20346 - Posted: 11.24.2014
Christopher Stringer Indeed, skeletal evidence from every inhabited continent suggests that our brains have become smaller in the past 10,000 to 20,000 years. How can we account for this seemingly scary statistic? Some of the shrinkage is very likely related to the decline in humans' average body size during the past 10,000 years. Brain size is scaled to body size because a larger body requires a larger nervous system to service it. As bodies became smaller, so did brains. A smaller body also suggests a smaller pelvic size in females, so selection would have favored the delivery of smaller-headed babies. What explains our shrinking body size, though? This decline is possibly related to warmer conditions on the earth in the 10,000 years after the last ice age ended. Colder conditions favor bulkier bodies because they conserve heat better. As we have acclimated to warmer temperatures, the way we live has also generally become less physically demanding, which overall serves to drive down body weights. Another likely reason for this decline is that brains are energetically expensive and will not be maintained at larger sizes unless it is necessary. The fact that we increasingly store and process information externally—in books, computers and online—means that many of us can probably get by with smaller brains. Some anthropologists have also proposed that larger brains may be less efficient at certain tasks, such as rapid computation, because of longer connection pathways. © 2014 Scientific American
Link ID: 20345 - Posted: 11.24.2014