Chapter 16.
By ROBERT KOLKER Reggie Shaw is the man responsible for the most moving portion of “From One Second to the Next,” the director Werner Herzog’s excruciating (even by Werner Herzog standards) 35-minute public service announcement, released last year as part of AT&T’s “It Can Wait” campaign against texting and driving. In the film, Shaw, now in his 20s, recounts the rainy morning in September 2006 that he crossed the line of a Utah highway, knocking into a car containing two scientists, James Furfaro and Keith O’Dell, who were heading to work nearby. Both men were killed. Shaw says he was texting a girlfriend at the time, adding in unmistakable anguish that he can’t even remember what he was texting about. He is next seen taking part in something almost inconceivable: He enters the scene where one of the dead men’s daughters is being interviewed, and receives from that woman a warm, earnest, tearful, cathartic hug.

Reggie Shaw’s redemptive journey — from thoughtless, inadvertent killer to denier of his own culpability to one of the nation’s most powerful spokesmen on the dangers of texting while behind the wheel — was first brought to national attention by Matt Richtel, a reporter for The New York Times, whose series of articles about distracted driving won a Pulitzer Prize in 2010. Now, five years later, in “A Deadly Wandering,” Richtel gives Shaw’s story the thorough, emotional treatment it is due, interweaving a detailed chronicle of the science behind distracted driving. As an instructive social parable, Richtel’s densely reported, at times forced yet compassionate and persuasive book deserves a spot next to “Fast Food Nation” and “To Kill a Mockingbird” in America’s high school curriculums. To say it may save lives is self-evident.

What makes the deaths in this book so affecting is how ordinary they are. Two men get up in the morning. They get behind the wheel. A stranger loses track of his car. They crash. The two men die.
The temptation is to make the tragedy bigger than it is, to invest it with meaning. Which may explain why Richtel wonders early on if Reggie Shaw lied about texting and driving at first because he was in denial, or because technology “can hijack the brain,” polluting his memory. © 2014 The New York Times Company
Link ID: 20124 - Posted: 09.27.2014
By Rachel Feltman With the help of electrical stimulation, a paralyzed rat is "walking" again. It's actually being controlled by a computer that monitors its gait and adjusts it to keep the rat balanced. When a spinal cord is severed, the electrical pulses sent out by the brain to control limb movement are interrupted. With this method of treatment, the rat's leg movements are driven by electrical pulses shot directly into the spinal cord (which has unfortunately been severed in the name of science). Scientists have been working on this method in humans for a while, but have had only moderate success — some subjects have regained sensation and movement in their legs, but haven't walked on their own. In the experiment, described in a paper published Wednesday in Science Translational Medicine, researchers tweaked this use of electrical stimulation: They primed the rats with a drug to boost their ability to respond to the electrical signal. Then, while the rats were placed in treadmill harnesses to support their weight, the researchers trained a camera on their subjects. The camera tracked the rats as they took electrically stimulated steps, and the system corrected their movement in real time. This instant feedback made the system precise enough to get the rats up tiny sets of stairs. MIT Technology Review reports that the team hopes to begin testing with a human volunteer within the next year. If the system works in humans, doctors could prescribe its use in rehabilitation therapy.
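The real-time correction described above (measure the gait with a camera, compare it with a target, nudge the stimulation) is, at its core, a feedback control loop. Below is a minimal proportional-control sketch of that idea; the function names, the gain, the amplitude range, and the toy linear model relating stimulation to step height are all invented for illustration and are not taken from the study.

```python
# A toy proportional feedback loop, loosely analogous to the closed-loop
# stimulation described above. All numbers here are hypothetical.

def correct_stimulation(amplitude, measured_height, target_height,
                        gain=0.1, lo=0.0, hi=1.0):
    """Nudge the stimulation amplitude toward the target step height."""
    error = target_height - measured_height   # positive: foot landed too low
    amplitude += gain * error                 # proportional correction
    return max(lo, min(hi, amplitude))        # clamp to a safe range

# Simulate the loop with a toy model in which step height simply
# scales linearly with stimulation amplitude.
amp = 0.5
for _ in range(50):
    measured = 2.0 * amp                      # hypothetical plant response
    amp = correct_stimulation(amp, measured, target_height=1.4)
# amp settles near 0.7, where the modeled height (2.0 * amp) meets the target
```

With the made-up linear model, the loop converges geometrically to the amplitude that produces the target height; the clamp stands in for the safety limits any real stimulator would enforce.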
by Helen Thomson My, what big eyes you have – you must be trying really hard. A study of how pupils dilate with physical effort could allow us to make strenuous tasks seem easier by zapping specific areas of the brain. We know pupils dilate with mental effort – when we think about a difficult maths problem, for example. To see if this was also true of physical exertion, Alexandre Zenon at the Catholic University of Louvain in Belgium measured the pupils of 18 volunteers as they squeezed a device which reads grip strength. Sure enough, the more force they exerted, the larger their pupils. To see whether pupil size was related to actual or perceived effort, the volunteers were asked to squeeze the device with four different grip strengths. Various tests enabled the researchers to tell how much effort participants felt they used, from none at all to the most effort possible. Comparing the results from both sets of experiments suggested that pupil dilation correlated more closely with perceived effort than actual effort. The fact that both mental effort and perceived physical effort are reflected in pupil size suggests there is a common representation of effort in the brain, says Zenon. To see where in the brain this might be, the team looked at which areas were active while similar grip tasks were being performed. Zenon says they were able to identify areas within the supplementary motor cortex – which plays a role in movement – associated with how effortful a task is perceived to be. © Copyright Reed Business Information Ltd.
Link ID: 20121 - Posted: 09.27.2014
By Roni Caryn Rabin When I was in college, my father David started walking with an odd, barely perceptible limp. He was in his mid-40s, a gregarious physician, teacher and researcher who was always upbeat. He told his four kids that he had a “back problem” — a deliberately vague cover story that I, for one, was willing to believe. I had never heard of the real culprit — amyotrophic lateral sclerosis, or A.L.S. In fact, no one had. A.L.S. was a disease in the shadows. During my father’s life, it didn’t even have its own advocacy organization. This was the early ’80s, long before support groups and the Internet and a colored ribbon for every cause. And it was way before ice bucket challenges. My parents continued to use their code — “back problem” — to talk about the disease. They used it to protect my younger sisters, who were about to start high school, but I think they were also protecting themselves. My mother was also a physician, and they both knew exactly what lay ahead. Saying “A.L.S.” out loud was too threatening. But soon there was no getting around it. My father’s legs were getting weaker, his muscles were wasting, and he started relying on a cane to get around. I was 19, and my mother and I were out running errands one afternoon when she pulled the car over to the curb and stopped. She told me the truth. This was no slipped disc. She laid it all out for me in black and white: A.L.S. is a progressive, degenerative neurological disease that causes paralysis in the entire body. It’s fatal. There is no cure. It sounded like something from a horror movie. Over the next five years, as my father’s health deteriorated, he remained remarkably determined. He ate a high-protein diet and swam laps every day in an attempt to maintain his muscle and fend off the atrophy caused by the disease. He kept on swimming laps in our next-door neighbor’s pool, even when he had to use a walker — and later a wheelchair — to get there. © 2014 The New York Times Company
Keyword: ALS-Lou Gehrig's Disease
Link ID: 20120 - Posted: 09.27.2014
By Jennifer Cutraro and Michael Gonchar Marijuana is illegal under federal law in the United States. Yet 23 states and the District of Columbia permit some form of marijuana consumption for medical purposes, and, as of this year, two states allow its recreational use. As national policy evolves on this issue, The New York Times editorial board this summer published a six-part series calling for legalization. In this lesson, we pull together those opinion pieces as well as many other Times articles, graphics and videos to offer starting points for science, social studies and English teachers aiming to use the debate as an opportunity for learning, research and discussion. Like other crops, marijuana is largely cultivated — legally and illegally — in greenhouse-type “grow houses” and on farms. And like other crops, marijuana comes from a plant — cannabis, originally found in the wild and cultivated over thousands of years. Have students research the history of cannabis, from its origins in South and Central Asia to its introduction to the Americas. How have people used the different parts of the plant throughout history? Then, have students work in groups to annotate a map of the world, tracing the history of marijuana cultivation. Marijuana is best known for its psychoactive properties. But how does marijuana bring about these sensations, and how else does it behave in the body? To answer these questions, students might research how the active compounds in marijuana affect the body at the level of the cell, and draw parallels with how other drugs act in the body. As is the case with many other drugs — from legal, over-the-counter medications to illegal street drugs, like heroin — the active compounds interact with locations on the surfaces of cells called receptors.
Cell surface receptors provide a means for cells to receive information and input from the environment; when a molecule attaches, or binds, to a cell surface receptor, it triggers a series of events inside the cell, like the release of hormones, neurotransmitters or other molecules. A discussion about marijuana’s effects on the body might dovetail nicely with a broader class discussion or review of cell biology, the makeup and function of the cell membrane, and the function of neurotransmitters. © 2014 The New York Times Company
Keyword: Drug Abuse
Link ID: 20118 - Posted: 09.27.2014
Some people don't just work — they text, Snapchat, check Facebook and Tinder, listen to music and work. And a new study reveals those multitaskers have brains that look different from those of people who stick to one task. Researchers at the University of Sussex scanned 75 adults with fMRI to examine their gray matter. Those who admitted to multitasking with a variety of electronic devices at once had less dense gray matter in their anterior cingulate cortexes (ACC). This region supports executive function, such as working memory, reasoning, planning and execution. There is no way of knowing if people with smaller anterior cingulate cortexes are more likely to multitask or if multitasking shrinks gray matter. It is even possible that multitasking makes our brains more efficient, said Dr. Gary Small, director of UCLA's Memory and Aging Research Center at the Semel Institute for Neuroscience and Human Behavior, who was not involved in the study. "When you exercise the brain … it becomes effective at performing a mental task," he said. While previous research has shown that multitasking leads to more mistakes, Small said such research remains important to our understanding of something we're all guilty of doing.
Link ID: 20115 - Posted: 09.25.2014
by Greg Laden I heard yesterday that my friend and former advisor Irven DeVore died. He was important, amazing, charming, difficult, harsh, brilliant, fun, annoying. My relationship to him as an advisee and a friend was complex, important to me for many years, and formative. For those who don’t know, he was instrumental in developing several subfields of anthropology, including behavioral biology, primate behavioral studies, hunter-gatherer research, and even ethnoarchaeology. He was a cultural anthropologist who realized during his first field season that a) he was not cut out to be a cultural anthropologist and b) most of the other cultural anthropologists were not either. Soon after, he became Sherwood Washburn’s student and independently invented the field study of complex social behavior in primates (though some others were heading in that direction at the same time), producing his famous work on the baboons of Kenya’s Nairobi National Park. For many years, what students learned about primate behavior, they learned from that work. Later he and Richard Lee, along with John Yellen, Alison Brooks, Henry Harpending, and others started up the study of Ju/’hoansi Bushmen along the Namibian/Botswana border. One of the outcomes of that work was the famous Wenner-Gren conference and volume called “Man the Hunter.” That volume has two roles in the history of anthropology. First, it launched modern forager studies. Second, it became one of the more maligned books in the field of anthropology. I have yet to meet a single person who has a strong criticism of that book that is not based on having not read it. For many years, much of what students learned about human foragers, they learned from that work.
Link ID: 20114 - Posted: 09.25.2014
By SAM BORDEN Bellini, a Brazilian soccer star who led the team that won the 1958 World Cup and was honored with a statue outside the Estádio do Maracanã in Rio de Janeiro, had a degenerative brain disease linked to dozens of boxers and American football players when he died in March at age 83. At the time, his death was attributed to complications related to Alzheimer’s disease. But researchers now say he had an advanced case of chronic traumatic encephalopathy, or C.T.E., which is caused by repeated blows to the head and has symptoms similar to those of Alzheimer’s. C.T.E. can be diagnosed only posthumously, and few brains of former soccer players have been examined. Bellini is the second known case, according to Dr. Ann McKee, a neuropathologist at Boston University and the Veterans Affairs Medical Center in Bedford, Mass., who assisted in examining Bellini’s brain. McKee was also involved this year when researchers found C.T.E. in the brain of a 29-year-old man from New Mexico who had played soccer semiprofessionally. McKee said in an interview that she was aware of a third former soccer player who had C.T.E. but that she was not yet authorized to publicly identify the person. As C.T.E. began to gain widespread attention about six years ago, it was often thought of as an American problem. Many of the early cases of the disease, for which there is no known cure, were connected to boxers and American football players. © 2014 The New York Times Company
Keyword: Brain Injury/Concussion
Link ID: 20110 - Posted: 09.24.2014
by Jennifer Viegas Harems — where a group of females share a single mate — can be sexual bliss for the male, but the arrangement poses many challenges for him, according to a new study. Male leaders of harems are often overworked and tired, finds the study, published in the latest issue of Royal Society Open Science. Gelada baboons exemplify the problems. "Being a gelada leader male is fairly exhausting," co-author David Pappano told Discovery News. "In order to keep the females within his harem happy, gelada leader males spend a lot of time grooming them." "When bachelors are around, leader males often engage in costly displays — running around, climbing up a tree, and producing a very loud (ee-yow) display call," added Pappano, who is an NSF postdoctoral research fellow at Princeton University's Department of Ecology and Evolutionary Biology. He co-authored the paper with Jacinta Beehner. © 2014 Discovery Communications, LLC.
Keyword: Sexual Behavior
Link ID: 20108 - Posted: 09.24.2014
By Nicholas Bakalar Average waist circumference — but not body mass index — increased significantly in the United States between 1999 and 2012, a new study reports. Abdominal obesity — a “beer belly” or “beer gut” — is caused by fat around the internal organs. It is one of the indicators of metabolic syndrome, a group of five conditions that raises the risk for heart disease and diabetes. After adjusting for age, the overall mean waist circumference increased to 38.7 inches in 2012 from 37.5 in 1999. The increases were significant for men, women, non-Hispanic whites, non-Hispanic blacks and Mexican-Americans. They were greatest among non-Hispanic whites in their 40s, and non-Hispanic black men in their 30s. “I would encourage people to keep track of their waists,” said the lead author of the study, Dr. Earl S. Ford, a medical officer with the Centers for Disease Control and Prevention. “Standing on the scale every day is all good and well, but you can have a steady weight and still have an expanding waist. And that should be a signal for people to start looking at their diet and physical activity.” In 2012, 54.2 percent of Americans had abdominal obesity (defined as an age-adjusted waist circumference of more than 40 inches for men and more than 34.6 for women) compared with 46.4 percent in 1999. The study was published in JAMA. © 2014 The New York Times Company
Link ID: 20103 - Posted: 09.23.2014
by Dan Jones The vast majority of people think we have free will and are the authors of our own life stories. But if neuroscientists were one day able to predict our every action based on brain scans, would people abandon this belief in droves? A new study concludes that such knowledge would not by itself be enough to shake our confidence in our own volition. Many neuroscientists, such as the late Francis Crick, have argued that our sense of free will is no more than the behaviour of a vast assembly of nerve cells. This is tied to the idea of determinism, which has it that every effect is preceded by a cause, with cause and effect connected by physical laws. This is why the behaviour of physical systems can be predicted – even the brain, in principle. As author Sam Harris puts it: "If determinism is true, the future is set – and this includes all our future states of mind and our subsequent behaviour." If people lost their belief in their own free will, that would have important consequences for how we think about moral responsibility, and even how we behave. For example, numerous studies have shown that when people are led to reject free will they are more likely to cheat, and are also less bothered about punishing other wrongdoers. For those who argue that what we know about neuroscience is incompatible with free will, predicting what our brain is about to do should reveal the illusory nature of free will, and lead people to reject it. Experimental philosopher Eddy Nahmias at Georgia State University in Atlanta dubs this view "willusionism". He recently set out to test it. © Copyright Reed Business Information Ltd.
Link ID: 20102 - Posted: 09.22.2014
By CLYDE HABERMAN When it came to pharmacological solutions to life’s despairs, Aldous Huxley was ahead of the curve. In Huxley’s 1932 novel about a dystopian future, the Alphas, Betas and others populating his “Brave New World” have at their disposal a drug called soma. A little bit of it chases the blues away: “A gramme” — Huxley was English, remember, spelling included — “is better than a damn.” With a swallow, negative feelings are dispelled. Prozac, the subject of this week’s video documentary from Retro Report, is hardly soma. But its guiding spirit is not dissimilar: A few milligrams of this drug are preferable to the many damns that lie at the core of some people’s lives. Looking back at Prozac’s introduction by Eli Lilly and Company in 1988, and hopscotching to today, the documentary explores the enormous influence, both chemical and cultural, that Prozac and its brethren have had in treating depression, a concern that gained new resonance with the recent suicide of the comedian Robin Williams. In the late 1980s and the 90s, Prozac was widely viewed as a miracle pill, a life preserver thrown to those who felt themselves drowning in the high waters of mental anguish. It was the star in a class of new pharmaceuticals known as S.S.R.I.s — selective serotonin reuptake inhibitors. Underlying their use is a belief that depression is caused by a shortage of the neurotransmitter serotonin. Pump up the levels of this brain chemical and, voilà, the mood lifts. Indeed, millions have embraced Prozac, and swear by it. Depression left them emotionally paralyzed, they say. Now, for the first time in years, they think clearly and can embrace life. Pharmacological merits aside, the green-and-cream pill was also a marvel of commercial branding, down to its market-tested name. Its chemical name is fluoxetine hydrochloride, not the most felicitous of terms. A company called Interbrand went to work for Eli Lilly and came up with Prozac. “Pro” sounds positive. 
Professional, too. “Ac”? That could signify action. As for the Z, it suggests a certain strength, perhaps with a faint high-techy quality. © 2014 The New York Times Company
Link ID: 20098 - Posted: 09.22.2014
By Jocelyn Kaiser A virus that shuttles a therapeutic gene into cells has strengthened the muscles, improved the motor skills, and lengthened the lifespan of mice afflicted with two neuromuscular diseases. The approach could one day help people with a range of similar disorders, from muscular dystrophy to amyotrophic lateral sclerosis, or ALS. Many of these diseases involve defective neuromuscular junctions—the interface between neurons and muscle cells where brain signals tell muscles to contract. In one such disease, a form of familial limb-girdle myasthenia, people carry two defective copies of the gene called DOK7, which codes for a protein that’s needed to form such junctions. Their hip and shoulder muscles atrophy over many years, and some eventually have trouble breathing or end up in a wheelchair. Mice similarly missing a properly working Dok7 gene are severely underweight and die within a few weeks. In the new study, researchers led by molecular biologist Yuji Yamanashi of the University of Tokyo first injected young mice engineered to have defective Dok7 with a harmless virus carrying a good copy of the Dok7 gene, which is expressed only in muscle. Within about 7 weeks, the rodents recovered. Their muscle cells cranked out the DOK7 protein, and under a microscope their muscles had larger neuromuscular junctions than those of untreated mice with defective Dok7. What’s more, the mice grew to a healthy body weight and had essentially normal scores on tests of motor skills and muscle strength. © 2014 American Association for the Advancement of Science.
By John Horgan On this blog, in my book The End of War and elsewhere (see Further Reading and Viewing), I have knocked the deep roots theory of war, which holds that war stems from an instinct deeply embedded in the genes of our male ancestors. Inter-community killings are rare among chimpanzees and non-existent among bonobos, according to a new report in Nature, undercutting the theory that the roots of war extend back to the common ancestor of humans and chimps. Proponents of this theory—notably primatologist Richard Wrangham—claim it is supported by observations of inter-community killings by chimpanzees, Pan troglodytes, our closest genetic relatives. Skeptics, including anthropologists Robert Sussman and Brian Ferguson, have pointed out that chimpanzee violence might be not an adaptation but a response to environmental circumstances, such as human encroachment. This “human impacts” hypothesis is rejected in a new report in Nature by a coalition of 30 primatologists, including Wrangham and lead author Michael Wilson. In “Lethal aggression in Pan is better explained by adaptive strategies than human impacts,” Wilson et al. analyze 152 killings in 18 chimpanzee communities and find “little correlation with human impacts.” Given that the primary interest in chimp violence is its alleged support of the deep-roots theory, it might seem odd, at first, that Wilson et al. do not mention human warfare. Actually, this omission is wise, because the Nature report undermines the deep-roots theory of war, and establishes that the “human impact” issue is a red herring. © 2014 Scientific American,
I’m an epileptic. It’s not how I define myself, but I am writing about epilepsy, so I think pointing out the fact that I am speaking from experience is acceptable. I may not define myself by my epilepsy, but it’s a big part of my life. It affects my life on a daily basis. Because of the epilepsy I can’t drive, can’t pull all-nighters or get up really early, just in case I have a seizure. It’s frustrating at times, though I will gladly milk the not-getting-up-early thing when I can, eg bin day. But whereas I’ve grown up with it, having been diagnosed when I was 17, most people I’ve met don’t understand it. You mention the fact that you’re epileptic to some people and they look at you like they’re a robot you’ve just asked to explain the concept of love; they adopt a sort of “DOES NOT COMPUTE!” expression. They often don’t know what to say, or do, or even what epilepsy is, and often spend the rest of the conversation searching their data banks for information on what to do if I have a seizure, like “Do I … put a spoon in his mouth?” For the record: no, you don’t. If putting a spoon in an epileptic’s mouth helped, then we would be prescribed a constant supply of Fruit Corners. So let me put you at ease. No one expects you to know that much about epilepsy (unless you’re responsible for treating it). There are many different types, with many different causes. Not everyone has seizures, and often those who do, when given the correct meds, can live pretty much fit-free lives. © 2014 Guardian News and Media Limited
Link ID: 20091 - Posted: 09.18.2014
By Katy Waldman In the opening chapter of Book 1 of My Struggle, by Karl Ove Knausgaard, the 8-year-old narrator sees a ghost in the waves. He is watching a televised report of a rescue effort at sea—“the sky is overcast, the gray-green swell heavy but calm”—when suddenly, on the surface of the water, “the outline of a face emerges.” We might guess from this anecdote that Karl, our protagonist, is both creative and troubled. His limber mind discerns patterns in chaos, but the patterns are illusions. “The lunatic, the lover, and the poet,” Shakespeare wrote, “have such seething brains, such shaping fantasies.” Their imaginations give “to airy nothing a local habitation and a name.” A seething brain can be a great asset for an artist, but, like Knausgaard’s churning, gray-green swell, it can be dangerous too. Inspired metaphors, paranormal beliefs, conspiracy theories, and delusional episodes may all exist on a single spectrum, recent research suggests. The name for the concept that links them is apophenia. A German scientist named Klaus Conrad coined apophanie (from the Greek apo, away, and phaenein, to show) in 1958. He was describing the acute stage of schizophrenia, during which unrelated details seem saturated in connections and meaning. Unlike an epiphany—a true intuition of the world’s interconnectedness—an apophany is a false realization. Swiss psychologist Peter Brugger introduced the term into English when he penned a chapter in a 2001 book on hauntings and poltergeists. Apophenia, he said, was a weakness of human cognition: the “pervasive tendency … to see order in random configurations,” an “unmotivated seeing of connections,” the experience of “delusion as revelation.” On the phone he unveiled his favorite formulation yet: “the tendency to be overwhelmed by meaningful coincidences.” © 2014 The Slate Group LLC.
Link ID: 20088 - Posted: 09.18.2014
By Virginia Morell Living in a complex social world—one with shifting alliances and competitors—is often cited as the key reason humans, dolphins, and spotted hyenas evolved large brains. Now, researchers say that social complexity also underlies the braininess of parrots, which have big brains relative to their body size. To understand the social lives of these birds, the scientists observed wild populations of monk parakeets (Myiopsitta monachus), a small parrot, in Argentina and captive ones in Florida. They recorded how often the birds (pictured) were seen with other individuals and how they interacted—and then analyzed the parakeets’ social networks. The birds, the researchers report online today in The Auk: Ornithological Advances, prefer to spend time with one specific individual, usually their mate. In the captive populations, the birds also had strong associations with one or two other individuals, numerous more moderate relationships, and only a few that were weak. The scientists also recorded aggressive interactions among the captive birds, revealing that monk parakeets have a dominance hierarchy based on which birds won or lost confrontations. Thus, the parakeets’ society has layers of relationships, similar to those documented in other big-brained animals. Living in such a society requires that the birds recognize and remember others, and whether they are friend or foe—mental tasks that are thought to be linked to the evolution of significant cognitive skills. © 2014 American Association for the Advancement of Science
Link ID: 20087 - Posted: 09.18.2014
By Daniel A. Yudkin If you’re reading this at a desk, do me a favor. Grab a pen or pencil and hold the end between your teeth so it doesn’t touch your lips. As you read on, stay that way—science suggests you’ll find this article more amusing if you do. Why? Notice that holding a pencil in this manner puts your face in the shape of a smile. And research in psychology says that the things we do—smiling at a joke, giving a gift to a friend, or even running from a bear—influence how we feel. This idea—that actions affect feelings—runs counter to how we generally think about our emotions. Ask average folks how emotions work—about the causal relationship between feelings and behavior—and they’ll say we smile because we’re happy, we run because we’re afraid. But work by such psychologists as Fritz Strack, Antonio Damasio, and Joe LeDoux shows the truth is often the reverse: what we feel is actually the product, not the cause, of what we do. It’s called “somatic feedback.” Only after we act do we deduce, by seeing what we just did, how we feel. This bodes well, at first blush, for anyone trying to change their emotions for the better. All you’d need to do is act like the kind of person you want to be, and that’s who you’ll become. (Call it the Bobby McFerrin philosophy: “Aren’t happy? Don’t worry. Just smile!”) But new research, published in the Journal of Experimental Social Psychology by Aparna Labroo, Anirban Mukhopadhyay, and Ping Dong, suggests there may be limits to our ability to proactively manage our own well-being. The team ran a series of studies examining whether more smiling led to more happiness. One study asked people how much smiling they had done that day, and how happy they currently felt. Other studies manipulated the amount of smiling people actually did, either by showing them a series of funny pictures or by replicating a version of the pencil-holding experiment.
As expected, across these experiments, the researchers found that the more people smiled, the happier they reported being. © 2014 Scientific American
Link ID: 20085 - Posted: 09.17.2014
By Douglas Main Researchers have created a blood test that they have used to accurately diagnose depression in a small sample of people, and they hope that with time and funding it could be used on a widespread basis. It is the first blood test—and thus the first “objective” gauge—for any type of mental disorder in adults, says study co-author Eva Redei, a neuroscientist at Northwestern University in Evanston, Ill. Outside experts caution, however, that the results are preliminary, and not close to ready for use in the doctor’s office. Meanwhile, diagnosing depression the “old-fashioned way,” through an interview, works quite well, and should take only 10 to 15 minutes, says Todd Essig, a clinical psychologist in New York. But many doctors are increasingly overburdened and often not reimbursed for taking the time to talk to their patients, he says. The test works by measuring blood levels of nine different types of RNA, a molecule that carries out instructions encoded in DNA. Besides accurately diagnosing depression, which affects perhaps 10 percent of American adults and is becoming more common, the technique may also be able to tell who could benefit from talk therapy and who may be vulnerable to the condition in the first place. In a study describing the test, published in the journal Translational Psychiatry, the scientists recruited 32 patients who were diagnosed with depression using a clinical interview, the standard technique. They also recruited 32 non-depressed patients to participate as a control group. © 2014 Newsweek LLC
Link ID: 20084 - Posted: 09.17.2014
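Computationally, a panel test like the one above reduces to a small classification problem: nine numbers in, one label out. As a toy illustration only, here is a nearest-centroid sketch of that kind of computation; the marker values, profiles, and the method itself are invented, since the summary does not describe the study's actual statistical model.

```python
# Toy nearest-centroid classifier over a hypothetical 9-marker RNA panel.
# All numbers are made up for illustration.

def centroid(samples):
    """Per-marker mean across a list of 9-value profiles."""
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(9)]

def classify(profile, depressed_centroid, control_centroid):
    """Label a profile by its nearer centroid (squared Euclidean distance)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearer_depressed = (dist2(profile, depressed_centroid)
                        < dist2(profile, control_centroid))
    return "depressed" if nearer_depressed else "control"

# Invented training profiles: the "depressed" group runs high on the
# first few markers, the control group runs low across the board.
depressed = [[5, 4, 5, 3, 2, 1, 1, 2, 1], [6, 5, 4, 3, 2, 2, 1, 1, 1]]
controls = [[1, 1, 2, 1, 2, 1, 1, 2, 1], [2, 1, 1, 2, 1, 1, 2, 1, 1]]

c_dep, c_con = centroid(depressed), centroid(controls)
label = classify([5, 5, 5, 3, 2, 1, 1, 2, 1], c_dep, c_con)
```

Real biomarker studies use far more careful statistics (cross-validation, regularization, larger samples), which is precisely why outside experts in the piece call a 32-versus-32 result preliminary.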
By Neuroskeptic Today, we are thinking – and talking – about the brain more than ever before. It is widely said that neuroscience has much to teach psychiatry, cognitive science, economics, and other fields. Practical applications of brain science are proposed in politics, law enforcement and education. The brain is everywhere. This “Neuro Turn” has, however, not always been accompanied by a critical attitude. We ought to be skeptical of any claims regarding the brain, because it remains a mystery – we fundamentally do not understand how it works. Yet much neuro-discourse seems to assume that the brain is almost a solved problem already. For example, media stories about neuroscience commonly contain simplistic misunderstandings – such as the tendency to over-interpret neural activation patterns as practical guides to human behavior. For instance, recently we have heard claims that because fMRI finds differences in the brain activity of some violent offenders, their criminal tendencies must be innate and unchangeable – with clear implications for rehabilitation. Neuroscientists are well aware of the faults in lay discourse about the brain – and are increasingly challenging them, e.g. on social media. Unfortunately, the same misunderstandings also exist within neuroscience itself. For example, I argue that much of cognitive neuroscience is based on – or only makes sense given – the popular misunderstanding that brain activity has a psychological ‘meaning’. In fact, in the vast majority of cases we just do not know what a given difference in brain activity means. Thus, many research studies based on finding differences in fMRI activity maps across groups or conditions are not really helping us to understand the brain at all – they only provide a canvas onto which we project our misunderstandings.
Keyword: Brain imaging
Link ID: 20082 - Posted: 09.17.2014