Most Recent Links
By Mallory Locklear Men and women show different patterns of drug abuse, with women becoming addicted to some substances much more quickly. Now a study in rats has found that sex hormones can reduce opioid abuse. From studies of other drugs, such as cocaine and alcohol, we know that women are less likely to use these substances than men, but become addicted faster when they do. “There are a lot of data to indicate that women transition from that initial use to having a substance-use disorder much more rapidly,” says Mark Smith, a psychologist at Davidson College, North Carolina. Once addicted, women also seem to have stronger drug cravings. Tracking drug use throughout women’s menstrual cycles suggests that both these differences could be shaped by hormones – with more intense cravings and greater euphoria at particular times in the cycle, says Smith. Craving crash Now Smith’s team has investigated the effects of hormones on opioid addiction in rats. Their findings suggest that hormones such as oestrogen and progesterone may help women to kick the habit. The researchers allowed female rats to self-administer heroin, and measured how much they chose to take at different times in their oestrous cycle – a regular sequence of hormone fluctuations similar to those seen in the menstrual cycle in women. © Copyright Reed Business Information Ltd.
By Virginia Morell There will never be a horse like Mr. Ed, the talking equine TV star. But scientists have discovered that the animals can learn to use another human tool for communicating: pointing to symbols. They join a short list of other species, including some primates, dolphins, and pigeons, with this talent. Scientists taught 23 riding horses of various breeds to look at a display board with three icons, representing wearing or not wearing a blanket. Horses could choose between a “no change” symbol or symbols for “blanket on” or “blanket off.” Previously, their owners made this decision for them. Horses are adept at learning and following signals people give them, and it took these equines an average of 10 days to learn to approach and touch the board and to understand the meaning of the symbols. All 23 horses learned the entire task within 14 days. They were then tested in various weather conditions to see whether they could use the board to tell their trainers about their blanket preferences. The scientists report online in Applied Animal Behaviour Science that the horses did not touch the symbols randomly, but made their choices based on the weather. If it was wet, cold, and windy, they touched the "blanket on" icon; horses that were already wearing a blanket nosed the “no change” image. But when the weather was sunny, the animals touched the "blanket off" symbol; those that weren’t blanketed pressed the “no change” icon. The study’s strong results show that the horses understood the consequences of their choices, say the scientists, who hope that other researchers will use their method to ask horses more questions. © 2016 American Association for the Advancement of Science.
By Michael Price A soft brush that feels like prickly thorns. A vibrating tuning fork that produces no vibration. Not being able to tell which direction body joints are moving without looking at them. Those are some of the bizarre sensations reported by a 9-year-old girl and 19-year-old woman in a new study. The duo, researchers say, shares an extremely rare genetic mutation that may shed light on a so-called “sixth sense” in humans: proprioception, or the body’s awareness of where it is in space. The new work may even explain why some of us are klutzier than others. The patients’ affliction doesn’t have a name. It was discovered by one of the study’s lead authors, pediatric neurologist Carsten Bönnemann at the National Institutes of Health (NIH) in Bethesda, Maryland, who specializes in diagnosing unknown genetic illnesses in young people. He noticed that the girl and the woman shared a suite of physical symptoms, including hips, fingers, and feet that bent at unusual angles. They also had scoliosis, an unusual curvature of the spine. And, significantly, they had difficulty walking, showed an extreme lack of coordination, and couldn’t physically feel objects against their skin. Bönnemann screened their genomes and looked for mutations that they might have in common. One in particular stood out: a catastrophic mutation in PIEZO2, a gene that has been linked to the body’s sense of touch and its ability to perform coordinated movements. At about the same time, in a “very lucky accident,” Bönnemann attended a lecture by Alexander Chesler, a neurologist also at NIH, on PIEZO2. Bönnemann invited Chesler to help study his newly identified patients. © 2016 American Association for the Advancement of Science.
Carl Zimmer Modern humans evolved in Africa roughly 200,000 years ago. But how did our species go on to populate the rest of the globe? The question, one of the biggest in studies of human evolution, has intrigued scientists for decades. In a series of extraordinary genetic analyses published on Wednesday, researchers believe they have found an answer. In the journal Nature, three separate teams of geneticists survey DNA collected from cultures around the globe, many for the first time, and conclude that all non-Africans today trace their ancestry to a single population emerging from Africa between 50,000 and 80,000 years ago. “I think all three studies are basically saying the same thing,” said Joshua M. Akey of the University of Washington, who wrote a commentary accompanying the new work. “We know there were multiple dispersals out of Africa, but we can trace our ancestry back to a single one.” The three teams sequenced the genomes of 787 people, obtaining highly detailed scans of each. The genomes were drawn from people in hundreds of indigenous populations: Basques, African pygmies, Mayans, Bedouins, Sherpas and Cree Indians, to name just a few. The DNA of indigenous populations is essential to understanding human history, many geneticists believe. Yet until now scientists have sequenced entire genomes from very few people outside population centers like Europe and China. © 2016 The New York Times Company
Link ID: 22682 - Posted: 09.22.2016
By Andy Coghlan You made a choice and it didn’t turn out too well. How will your brain ensure you do better next time? It seems there’s a hub in the brain that doles out rewards and punishments to reinforce vital survival skills. “Imagine you go to a restaurant hoping to have a good dinner,” says Bo Li of Cold Spring Harbor Laboratory in New York. “If the food exceeds your expectations, you will likely come back again, whereas you will avoid it in future if the food disappoints.” Li’s team has discovered that a part of the brain’s basal ganglia area, called the habenula-projecting globus pallidus (GPh), plays a crucial role in this process. They trained mice to associate specific sound cues either with a reward of a drink of water or a punishment of a puff of air in the face, and then surprised them by switching them around. When mice expecting a drink were instead punished with a puff of air, GPh neurons became particularly active. But when the mice were unexpectedly rewarded, the activity of these neurons was inhibited. Further experiments revealed that once activated GPh neurons enforce punishment in the brain, reducing levels of the reward chemical dopamine in regions of the brain that plan actions. © Copyright Reed Business Information Ltd.
Keyword: Drug Abuse
Link ID: 22681 - Posted: 09.22.2016
Sara Reardon Two heads are better than one: an idea that a new global brain initiative hopes to take advantage of. In recent years, brain-mapping initiatives have been popping up around the world. They have different goals and areas of expertise, but now researchers will attempt to apply their collective knowledge in a global push to more fully understand the brain. Thomas Shannon, US Under Secretary of State, announced the launch of the International Brain Initiative on 19 September at a meeting that accompanied the United Nations’ General Assembly in New York City. Details — including which US agency will spearhead the programme and who will pay for it — are still up in the air. However, researchers held a separate, but concurrent, meeting hosted by the US National Science Foundation at Rockefeller University to discuss which aspects of the programmes already in existence could be aligned under the global initiative. The reaction was a mixture of concern that attempting to align projects could siphon money and attention from existing initiatives in other countries, and anticipation over the possibilities for advancing our knowledge about the brain. “I thought the most exciting moment in my scientific career was when the president announced the BRAIN Initiative in 2013,” says Cori Bargmann, a neuroscientist at the Rockefeller University in New York City and one of the main architects of the US Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative. “But this was better.” © 2016 Macmillan Publishers Limited
Keyword: Brain imaging
Link ID: 22680 - Posted: 09.22.2016
By Elisabeth Pain BARCELONA, SPAIN—In a bid to win the public's hearts and minds, the Spanish scientific community has pledged to become more transparent about animal research. Ninety research centers, universities, scientific societies, and companies around Spain have adopted a set of standards, launched yesterday by the Confederation of Spanish Scientific Societies (COSCE), on how research organizations should open up communication channels about their use of laboratory animals. They are joining a growing movement for transparency in Europe. Although animal research is generally accepted in Spain as beneficial, “part of the society is opposed to this type of research or isn’t sure about supporting it,” Juan Lerma, a professor at the Institute of Neurosciences of Alicante, Spain, who coordinated a COSCE commission on the use of animal research, wrote in the document. The signatories want to help the public better understand the benefits, costs, and limitations of animal research through a “realistic” description of the expected results, the impact on animals' welfare, and ethical considerations. Among other things, the Spanish organizations pledge to publicly recognize the fact that they're doing animal research, talk clearly about when, how, and why they use animals, allow visitors into their facilities, highlight the contribution of animal research during the dissemination of results, and publicize efforts to replace, reduce, and refine animal research. © 2016 American Association for the Advancement of Science
Keyword: Animal Rights
Link ID: 22679 - Posted: 09.22.2016
By Meredith Wadman While the United Nations General Assembly prepared for its sometimes divisive annual general debate on Monday, a less official United Nations of Brain Projects met nearby in a display of international amity and unbounded enthusiasm for the idea that transnational cooperation can, must, and will, at last, explain the brain. The tribe of some 400 neuroscientists, computational biologists, physicists, physicians, ethicists, government science counselors, and private funders convened at The Rockefeller University on Manhattan’s Upper East Side in New York City. The Coordinating Global Brain Projects gathering was mandated by the U.S. Congress in a 2015 law funding the U.S. Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative. The meeting aimed to synchronize the explosion of big, ambitious neuroscience efforts being launched from Europe to China. Nearly 50 speakers from more than a dozen countries explained how their nations are plumbing brain science; all seemed eager to be part of the as-yet unmapped coordination that they hope will lead to a mellifluous symphony rather than a cacophony of competing chords. “We are really seeing international cooperation at a level that we have not seen before,” said Rockefeller’s Cori Bargmann, a neurobiologist who with Rafael Yuste of Columbia University convened the meeting with the backing of the universities, the National Science Foundation (NSF), and the Kavli Foundation, a private funder of neuroscience and nanoscience. Bargmann and Yuste have been integral to planning the BRAIN Initiative launched by President Barack Obama in the spring of 2013, which, along with the European Human Brain Project, started the new push for large-scale neuroscience initiatives. “This could be historic,” Yuste said. 
“I could imagine out of this meeting that groups of people could get together and start international collaborations the way the astronomers and the physicists have been doing for decades.” © 2016 American Association for the Advancement of Science
Keyword: Brain imaging
Link ID: 22678 - Posted: 09.21.2016
Nicola Davis Tyrannosaur, Breaking the Waves and Schindler’s List might make you reach for the tissues, but psychologists say they have found a reason why traumatic films are so appealing. Researchers at Oxford University say that watching traumatic films boosts feelings of group bonding, as well as increasing pain tolerance by upping levels of feel-good, pain-killing chemicals produced in the brain. “The argument here is that actually, maybe the emotional wringing you get from tragedy triggers the endorphin system,” said Robin Dunbar, a co-author of the study and professor of evolutionary psychology at the University of Oxford. Previous research has found that laughing together, dancing together and working in a team can increase social bonding and heighten pain tolerance through an endorphin boost. “All of those things, including singing and dancing and jogging and laughter, all produce an endorphin kick for the same reason - they are putting the musculature of the body under stress,” said Dunbar. Being harrowed, he adds, could have a similar effect. “It has turned out that the same areas in the brain that deal with physical pain also handle psychological pain,” said Dunbar. Writing in the journal Royal Society Open Science, Dunbar and colleagues describe how they set out to unpick whether our love of storytelling, a device used to share knowledge and cultivate a sense of identity within a group, is underpinned by an endorphin-related bonding mechanism. © 2016 Guardian News and Media Limited
By Rajeev Raizada These brain maps show how accurately it was possible to predict neural activation patterns for new, previously unseen sentences, in different regions of the brain. The brighter the area, the higher the accuracy. The most accurate area, which can be seen as the bright yellow strip, is a region in the left side of the brain known as the Superior Temporal Sulcus. This region achieved statistically significant sentence predictions in 11 out of the 14 people whose brains were scanned. Although that was the most accurate region, several other regions, broadly distributed across the brain, also produced significantly accurate sentence predictions. Credit: University of Rochester graphic / Andrew Anderson and Xixi Wang. Used with permission Words, like people, can achieve a lot more when they work together than when they stand on their own. Words working together make sentences, and sentences can express meanings that are unboundedly rich. How the human brain represents the meanings of sentences has been an unsolved problem in neuroscience, but my colleagues and I recently published work in the journal Cerebral Cortex that casts some light on the question. Here, my aim is to give a bigger-picture overview of what that work was about, and what it told us that we did not know before. To measure people's brain activation, we used fMRI (functional Magnetic Resonance Imaging). When fMRI studies were first carried out, in the early 1990s, they mostly just asked which parts of the brain "light up,” i.e. which brain areas are active when people perform a given task. © 2016 Scientific American
Alva Noë Eaters and cooks know that flavor, in the jargon of neuroscientists, is multi-modal. Taste is all important, to be sure. But so is the look of food and its feel in the mouth — not to mention its odor and the noisy crunch, or juicy squelch, that it may or may not make as we bite into it. The perception of flavor demands that we exercise a suite of not only gustatory, but also visual, olfactory, tactile and auditory sensitivities. Neuroscientists are now beginning to grasp some of the ways the brain enables our impressive perceptual power when it comes to food. Traditionally, scientists represent the brain's sensory function in a map where distinct cortical areas are thought of as serving the different senses. But it is increasingly appreciated that brain activity can't quite be segregated in this way. Cells in visual cortex may be activated by tactile stimuli. This is the case, for example, when Braille readers use their fingers to read. These blind readers aren't seeing with their fingers; rather, they are deploying their visual brains to perceive with their hands. And, in a famous series of studies that had a great influence on my thinking on these matters, Mriganka Sur at MIT showed that animals whose retinas were re-wired surgically to feed directly into auditory cortex do not hear lights and other visible objects presented to the eyes; rather, they see with their auditory brains. The brain is plastic, and different sensory modalities compete continuously for control over populations of cells. An exciting new paper on the gustatory cortex from the laboratory of Alfredo Fontanini at Stony Brook University shows that there are visual-, auditory-, olfactory- and touch-sensitive cells in the gustatory cortex of rats. There are even some cells that respond to stimuli in more than one modality. But what is more remarkable is that when rats learn to associate non-taste qualities — tones, flashes of lights, etc. 
— with food (sucrose in their study), there is a marked transformation in the gustatory cortex. © 2016 npr
By David Z. Hambrick, Fredrik Ullén, Miriam Mosing Elite-level performance can leave us awestruck. This summer, in Rio, Simone Biles appeared to defy gravity in her gymnastics routines, and Michelle Carter seemed to harness super-human strength to win gold in the shot put. Michael Phelps, meanwhile, collected 5 gold medals, bringing his career total to 23. In everyday conversation, we say that elite performers like Biles, Carter, and Phelps must be “naturals” who possess a “gift” that “can’t be taught.” What does science say? Is innate talent a myth? This question is the focus of the new book Peak: Secrets from the New Science of Expertise by Florida State University psychologist Anders Ericsson and science writer Robert Pool. Ericsson and Pool argue that, with the exception of height and body size, the idea that we are limited by genetic factors—innate talent—is a pernicious myth. “The belief that one’s abilities are limited by one’s genetically prescribed characteristics....manifests itself in all sorts of ‘I can’t’ or ‘I’m not’ statements,” Ericsson and Pool write. The key to extraordinary performance, they argue, is “thousands and thousands of hours of hard, focused work.” To make their case, Ericsson and Pool review evidence from a wide range of studies demonstrating the effects of training on performance. In one study, Ericsson and his late colleague William Chase found that, through over 230 hours of practice, a college student was able to increase his digit span—the number of random digits he could recall—from a normal 7 to nearly 80. In another study, the Japanese psychologist Ayako Sakakibara enrolled 24 children from a private Tokyo music school in a training program designed to train “perfect pitch”—the ability to name the pitch of a tone without hearing another tone for reference. With a trainer playing a piano, the children learned to identify chords using colored flags—for example, a red flag for CEG and a green flag for DGH. 
Then, the children were tested on their ability to identify the pitches of individual notes until they reached a criterion level of proficiency. By the end of the study, the children had seemed to acquire perfect pitch. Based on these findings, Ericsson and Pool conclude that the “clear implication is that perfect pitch, far from being a gift bestowed upon only a lucky few, is an ability that pretty much anyone can develop with the right exposure and training.” © 2016 Scientific American
By Carey Goldberg I’d just gotten used to the idea that I’m a walking mountain of microbes. The sizzling field of research into the microbiome — our full complement of bugs — is casting new light on our role as homes to the trillions of bacteria that inhabit each of us. At least most of them are friendly, I figured. But now comes the next microbial shift in my self-image, courtesy of the new book “The Mind-Gut Connection.” My trillions of gut microbes, it seems, are in constant communication with my brain, and there’s mounting evidence that they may affect how I feel — not just physically but emotionally. Does this mean — gulp — that maybe our bugs are driving the bus? I spoke with the book’s author, Dr. Emeran Mayer, professor of medicine and psychiatry at UCLA, executive director of the Oppenheimer Center for Neurobiology of Stress and Resilience and expert in brain-gut microbiome interactions. Edited excerpts: So we’re not only packed with trillions of gut microbes but they’re in constant cross-talk with our brains — that’s the picture? First of all, you have to realize that these are invisible creatures. So even though there are 100 trillion of them living in our gut, you wouldn’t be able to see them with the naked eye. It’s not like something tangible sitting inside of you, like another organ. © Copyright WBUR 2016
Link ID: 22673 - Posted: 09.20.2016
Laura Sanders In growing brains, billions of nerve cells must make trillions of precise connections. As they snake through the brain, nerve cell tendrils called axons use the brain’s stiffness to guide them on their challenging journey, a study of frog nerve cells suggests. The results, described online September 19 in Nature Neuroscience, show that along with chemical guidance signals, the brain’s physical properties help shape its connections. That insight may be key to understanding how nerve cells wire the brain, says study coauthor Kristian Franze. “I strongly believe that it’s not enough to look at chemistry,” says Franze, a mechanobiologist at the University of Cambridge. “We need to look at environmental factors, too.” The notion that physical features help guide axons is gaining momentum, says neuroscientist Samantha Butler of UCLA. “It’s a really intriguing study.” A better understanding of how nerve cells find their targets could help scientists coax new cells to grow after a spinal cord injury or design better materials for nerve cell implants. Franze and colleagues studied nerve cells from the retina of frogs. Experiments on cells in dishes suggested that axons, signal-transmitting tendrils led by tiny pioneering structures called growth cones, grew differently on hard and soft material. Axons grew longer and straighter on stiff surfaces and seemed to meander more on softer material. © Society for Science & the Public 2000 - 2016.
Keyword: Development of the Brain
Link ID: 22672 - Posted: 09.20.2016
By SABRINA TAVERNISE WASHINGTON — The Food and Drug Administration approved the first drug to treat patients with the most common childhood form of muscular dystrophy, a vivid example of the growing power that patients and their advocates wield over the federal government’s evaluation of drugs. The agency’s approval went against the recommendation of its experts. The main clinical trial of the drug was small, involving only 12 boys with the disease known as Duchenne muscular dystrophy, and did not have an adequate control group of boys who had the disease but did not take the drug. A group of independent experts convened by the agency this spring said there was not enough evidence that it was effective. But the vote was close. Large and impassioned groups of patients, including boys in wheelchairs, and their advocates, weighed in. The muscular dystrophy community is well organized and has lobbied for years to win approval for the drug, getting members of Congress to write letters to the agency. A decision on the drug had been delayed for months. The approval was so controversial that F.D.A. employees fought over it, a dispute that was taken to the agency’s commissioner, Dr. Robert M. Califf, who ultimately decided that it would stand. The approval delighted the drug’s advocates and sent the share price of the drug’s maker, Sarepta Therapeutics, soaring. But it was taken as a deeply troubling sign among drug policy experts who believe the F.D.A. has been far too influenced by patient advocates and drug companies, and has allowed the delicate balance in drug approvals to tilt toward speedy decisions based on preliminary data and away from more conclusive evidence of effectiveness and safety. © 2016 The New York Times Company
Researchers at the National Institutes of Health have discovered a two-way link between depression and gestational diabetes. Women who reported feeling depressed during the first two trimesters of pregnancy were nearly twice as likely to develop gestational diabetes, according to an analysis of pregnancy records. Conversely, a separate analysis found that women who developed gestational diabetes were more likely to report postpartum depression six weeks after giving birth, compared to a similar group of women who did not develop gestational diabetes. The study was published online in Diabetologia. Gestational diabetes is a form of diabetes (high blood sugar level) occurring only in pregnancy, which if untreated may cause serious health problems for mother and infant. “Our data suggest that depression and gestational diabetes may occur together,” said the study’s first author, Stefanie Hinkle, Ph.D., staff scientist in the Division of Intramural Population Health Research at the NIH’s Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD). “Until we learn more, physicians may want to consider observing pregnant women with depressive symptoms for signs of gestational diabetes. They also may want to monitor women who have had gestational diabetes for signs of postpartum depression.” Although obesity is known to increase the risk for gestational diabetes, the likelihood of gestational diabetes was higher for non-obese women reporting depression than for obese women with depression.
By Meredith Wadman Last year, in a move to counter charges that it has neglected the health and safety of its players, the National Football League (NFL) tapped Elizabeth “Betsy” Nabel as its first chief health and medical adviser, a paid position to which she told The Boston Globe she devotes about 1 day a month, plus some nights and weekends. (She and NFL have not disclosed her salary.) And last week, Nabel answered Science’s questions on the heels of NFL’s 14 September announcement that it will devote $40 million in new funding to medical research, primarily neuroscience relevant to repetitive head injuries—with grant applications judged by an NFL-convened panel of scientists, rather than by National Institutes of Health (NIH) study sections. Nabel is well known to many medical scientists as the cardiologist who directed the National Heart, Lung, and Blood Institute at NIH, then left that job in 2009 to become president of a prestigious Harvard University–affiliated teaching hospital: Brigham and Women’s Hospital in Boston. Nabel’s new role with NFL came under media scrutiny in May, when a report by Democrats on the House of Representatives Energy and Commerce Committee found that NFL inappropriately tried to influence the way its “unrestricted” donation to NIH was spent. It revealed, for example, that last year Nabel contacted NIH’s neurology institute director Walter Koroshetz to question the objectivity of an NIH study section and of a principal investigator whose team the peer reviewers had just awarded a $16 million grant. Robert Stern and his group at Boston University, with others, were proposing to image the brains and chart the symptoms of scores of college and professional football players across time. 
NFL suggested that the scientists, who have led in establishing the link between repetitive head injury and the neurodegenerative brain disease chronic traumatic encephalopathy (CTE), were not objective; Nabel described them in one email as “a more marginal group” whose influence it would be well to “dilute.” The scientists were to have been paid from $30 million that NFL donated to NIH in 2012. After the league objected to its $16 million going to fund the Boston University–led team—it did offer to fund $2 million of the amount—NIH’s neurology institute ended up wholly funding the 7-year grant with its own money. © 2016 American Association for the Advancement of Science
Keyword: Brain Injury/Concussion
Link ID: 22669 - Posted: 09.20.2016
By CATHERINE SAINT LOUIS Attention deficit disorder is the most common mental health diagnosis among children under 12 who die by suicide, a new study has found. Very few children aged 5 to 11 take their own lives, and little is known about these deaths. The new study, which included deaths in 17 states from 2003 to 2012, compared 87 children aged 5 to 11 who committed suicide with 606 adolescents aged 12 to 14 who did, to see how they differed. The research was published on Monday in the journal Pediatrics. About a third of the children of each group had a known mental health problem. The very young who died by suicide were most likely to have had attention deficit disorder, or A.D.D., with or without accompanying hyperactivity. By contrast, nearly two-thirds of early adolescents who took their lives struggled with depression. Suicide prevention has focused on identifying children struggling with depression; the new study provides an early hint that this strategy may not help the youngest suicide victims. “Maybe in young children, we need to look at behavioral markers,” said Jeffrey Bridge, the paper’s senior author and an epidemiologist at the Research Institute at Nationwide Children’s Hospital in Columbus, Ohio. Jill Harkavy-Friedman, the vice president of research at the American Foundation for Suicide Prevention, agreed. “Not everybody who is at risk for suicide has depression,” even among adults, said Dr. Harkavy-Friedman, who was not involved in the new research. Yet the new research does not definitively establish that attention deficit disorder and attention deficit hyperactivity disorder, or A.D.H.D., are causal risk factors for suicide in children, Dr. Bridge said. Instead, the findings suggest that “suicide is potentially a more impulsive act among children.” © 2016 The New York Times Company
By PAGAN KENNEDY In 1914, The Lancet reported on a clergyman who was found dead in a pool; he had left behind this suicide note: “Another sleepless night, no real sleep for weeks. Oh, my poor brain, I cannot bear the lengthy, dark hours of the night.” I came across that passage with a shock of recognition. Many people think that the worst part of insomnia is the daytime grogginess. But like that pastor, I suffered most in the dark hours after midnight, when my desire for sleep, my raging thirst for it, would drive me into temporary insanity. On the worst nights, my mind would turn into a mad dog that snapped and gnawed itself. Though one in 10 American adults suffer from chronic insomnia, we have yet to answer the most fundamental questions about the affliction. Scientists are still arguing about the mechanisms of sleep and the reasons it fails in seemingly healthy people. There are few — if any — reliable treatments for insomnia. At the same time, medical journals warn that bad sleep can fester into diseases like cancer and diabetes. Deep in the night, those warnings scuttle around my mind like rats. About 18 months ago, during a particularly grueling period, I felt so desperate that I consulted yet another doctor — but all he did was suggest the same drugs that had failed me in the past. I was thrown back once again on my own ways of coping. As a child, I had invented mental games to distract myself. For instance, I would compile a list of things and people that made me happy, starting with words that began with A and moving through the alphabet. One night, I was in the Qs, trying to figure out what to add to quesadillas, queer theory and Questlove. Then, suddenly, the game infuriated me — why, why, why did I have to spend hours doing this? In the red glare of the digital clock, my brain rattled its cage. I prepared for a wave of lunacy. 
But instead of a meltdown, I had a wild idea: What if there was another, easier, way to drive the miserable thoughts from my mind? I began to fantasize about a machine that would do the thinking for me. I pictured it like another brain that would fit on top of my head. The next day, I cobbled together my first insomnia machine. © 2016 The New York Times Company
Link ID: 22667 - Posted: 09.19.2016
By DAVID Z. HAMBRICK and ALEXANDER P. BURGOYNE ARE you intelligent — or rational? The question may sound redundant, but in recent years researchers have demonstrated just how distinct those two cognitive attributes actually are. It all started in the early 1970s, when the psychologists Daniel Kahneman and Amos Tversky conducted an influential series of experiments showing that all of us, even highly intelligent people, are prone to irrationality. Across a wide range of scenarios, the experiments revealed, people tend to make decisions based on intuition rather than reason. In one study, Professors Kahneman and Tversky had people read the following personality sketch for a woman named Linda: “Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.” Then they asked the subjects which was more probable: (A) Linda is a bank teller or (B) Linda is a bank teller and is active in the feminist movement. Eighty-five percent of the subjects chose B, even though logically speaking, A is more probable. (All feminist bank tellers are bank tellers, though some bank tellers may not be feminists.) In the Linda problem, we fall prey to the conjunction fallacy — the belief that the co-occurrence of two events is more likely than the occurrence of one of the events. In other cases, we ignore information about the prevalence of events when judging their likelihood. We fail to consider alternative explanations. We evaluate evidence in a manner consistent with our prior beliefs. And so on. Humans, it seems, are fundamentally irrational. But starting in the late 1990s, researchers began to add a significant wrinkle to that view. As the psychologist Keith Stanovich and others observed, even the Kahneman and Tversky data show that some people are highly rational. 
In other words, there are individual differences in rationality, even if we all face cognitive challenges in being rational. So who are these more rational people? Presumably, the more intelligent people, right? © 2016 The New York Times Company
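The logic behind the Linda problem can be made concrete with a short enumeration. The sketch below uses invented counts for a hypothetical population (they are not from Kahneman and Tversky's data); the point it illustrates is the conjunction rule itself, which holds no matter what numbers are plugged in:

```python
# Conjunction rule: for any two events A and B, P(A and B) <= P(A),
# because everyone who is both a bank teller AND a feminist is,
# by definition, a bank teller. Counts below are hypothetical.
population = 1000              # total people in our imagined population
bank_tellers = 100             # people who are bank tellers (event A)
feminist_tellers = 30          # tellers who are also feminists (A and B)

p_teller = bank_tellers / population                    # P(A)       = 0.10
p_teller_and_feminist = feminist_tellers / population   # P(A and B) = 0.03

# The conjunction describes a subset of the single event, so its
# probability can never exceed that of the single event.
assert p_teller_and_feminist <= p_teller
print(p_teller, p_teller_and_feminist)  # 0.1 0.03
```

Whatever counts are used, the subset relationship guarantees that the conjunction can never be the more probable option, which is why the 85 percent of subjects who chose answer B were reasoning by intuition rather than logic.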