Chapter 18. Attention and Higher Cognition
George Paxinos Many people today believe they possess a soul. While conceptions of the soul differ, many would describe it as an “invisible force that appears to animate us”. It’s often believed the soul can survive death and is intimately associated with a person’s memories, passions and values. Some argue the soul has no mass, takes no space and is localised nowhere. But as a neuroscientist and psychologist, I have no use for the soul. On the contrary, all functions attributable to this kind of soul can be explained by the workings of the brain. Psychology is the study of behaviour. To carry out their work of modifying behaviour, such as in treating addiction, phobia, anxiety and depression, psychologists do not need to assume people have souls. For psychologists, it is not so much that souls do not exist as that there is no need for them. It is said psychology lost its soul in the 1930s. By then, the discipline had fully become a science, relying on experimentation and control rather than introspection. What is the soul? It is not only religious thinkers who have proposed that we possess a soul. Some of the most notable proponents have been philosophers, such as Plato (424-348 BCE) and René Descartes in the 17th century. Plato believed we do not learn new things but recall things we knew before birth. For this to be so, he concluded, we must have a soul. Centuries later, Descartes wrote his thesis Passions of the Soul, in which he argued there was a distinction between the mind, which he described as a “thinking substance”, and the body, “the extended substance”. © 2010–2016, The Conversation US, Inc.
Link ID: 22692 - Posted: 09.26.2016
By CATHERINE SAINT LOUIS Attention deficit disorder is the most common mental health diagnosis among children under 12 who die by suicide, a new study has found. Very few children aged 5 to 11 take their own lives, and little is known about these deaths. The new study, which included deaths in 17 states from 2003 to 2012, compared 87 children aged 5 to 11 who died by suicide with 606 adolescents aged 12 to 14 who did, to see how they differed. The research was published on Monday in the journal Pediatrics. About a third of the children in each group had a known mental health problem. The very young who died by suicide were most likely to have had attention deficit disorder, or A.D.D., with or without accompanying hyperactivity. By contrast, nearly two-thirds of early adolescents who took their lives struggled with depression. Suicide prevention has focused on identifying children struggling with depression; the new study provides an early hint that this strategy may not help the youngest suicide victims. “Maybe in young children, we need to look at behavioral markers,” said Jeffrey Bridge, the paper’s senior author and an epidemiologist at the Research Institute at Nationwide Children’s Hospital in Columbus, Ohio. Jill Harkavy-Friedman, the vice president of research at the American Foundation for Suicide Prevention, agreed. “Not everybody who is at risk for suicide has depression,” even among adults, said Dr. Harkavy-Friedman, who was not involved in the new research. Yet the new research does not definitively establish that attention deficit disorder and attention deficit hyperactivity disorder, or A.D.H.D., are causal risk factors for suicide in children, Dr. Bridge said. Instead, the findings suggest that “suicide is potentially a more impulsive act among children.” © 2016 The New York Times Company
By DAVID Z. HAMBRICK and ALEXANDER P. BURGOYNE ARE you intelligent — or rational? The question may sound redundant, but in recent years researchers have demonstrated just how distinct those two cognitive attributes actually are. It all started in the early 1970s, when the psychologists Daniel Kahneman and Amos Tversky conducted an influential series of experiments showing that all of us, even highly intelligent people, are prone to irrationality. Across a wide range of scenarios, the experiments revealed, people tend to make decisions based on intuition rather than reason. In one study, Professors Kahneman and Tversky had people read the following personality sketch for a woman named Linda: “Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.” Then they asked the subjects which was more probable: (A) Linda is a bank teller or (B) Linda is a bank teller and is active in the feminist movement. Eighty-five percent of the subjects chose B, even though, logically speaking, A is more probable. (All feminist bank tellers are bank tellers, though some bank tellers may not be feminists.) In the Linda problem, we fall prey to the conjunction fallacy — the belief that the co-occurrence of two events is more likely than the occurrence of one of the events alone. In other cases, we ignore information about the prevalence of events when judging their likelihood. We fail to consider alternative explanations. We evaluate evidence in a manner consistent with our prior beliefs. And so on. Humans, it seems, are fundamentally irrational. But starting in the late 1990s, researchers began to add a significant wrinkle to that view. As the psychologist Keith Stanovich and others observed, even the Kahneman and Tversky data show that some people are highly rational. 
In other words, there are individual differences in rationality, even if we all face cognitive challenges in being rational. So who are these more rational people? Presumably, the more intelligent people, right? © 2016 The New York Times Company
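The conjunction rule that trips people up in the Linda problem can be demonstrated in a few lines. The probabilities below are invented purely for illustration; the point is only that the joint event ("teller and feminist") can never occur more often than the single event ("teller"):

```python
import random

# Monte Carlo sketch of the conjunction rule: P(A and B) <= P(A).
# A = "is a bank teller", B = "is an active feminist". The base rates
# are hypothetical and the two traits are treated as independent.
random.seed(0)

p_teller = 0.05    # hypothetical P(bank teller)
p_feminist = 0.30  # hypothetical P(active feminist)

trials = 100_000
teller = 0
teller_and_feminist = 0
for _ in range(trials):
    is_teller = random.random() < p_teller
    is_feminist = random.random() < p_feminist
    teller += is_teller
    teller_and_feminist += is_teller and is_feminist

print(teller / trials)               # ≈ 0.05
print(teller_and_feminist / trials)  # ≈ 0.015, never above P(teller)
```

However the base rates are chosen, the joint count is a subset of the single count, which is exactly why answer (A) can never be less probable than answer (B).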
By Colin Barras Subtract 8 from 52. Did you see the calculation in your head? While a leading theory suggests our visual experiences are linked to our understanding of numbers, a study of people who have been blind from birth suggests the opposite. The link between vision and number processing is strong. Sighted people can estimate the number of people in a crowd just by looking, for instance, while children who can mentally rotate an object and correctly imagine how it might look from a different angle often develop better mathematical skills. “It’s actually hard to think of a situation when you might process numbers through any modality other than vision,” says Shipra Kanjlia at Johns Hopkins University in Baltimore, Maryland. But blind people can do maths too. To understand how they might compensate for their lack of visual experience, Kanjlia and her colleagues asked 36 volunteers – 17 of whom had been blind from birth – to do simple mental arithmetic inside an fMRI scanner. To level the playing field, the sighted participants wore blindfolds. We know that a region of the brain called the intraparietal sulcus (IPS) is involved in number processing in sighted people, and brain scans revealed that the same area is similarly active in blind people too. “It’s really surprising,” says Kanjlia. “It turns out brain activity is remarkably similar, at least in terms of classic number processing.” This may mean our understanding of how to handle numbers is entirely independent of visual experience – suggesting we are all born with a natural understanding of numbers, an idea many researchers find difficult to accept. © Copyright Reed Business Information Ltd.
Dean Burnett You remember that time a children’s TV presenter, one who has been working in children’s television for decades and is now employed on a channel aimed at under-8-year-olds, decided to risk it all and say one of the worst possible swear words on a show for pre-schoolers that he is famous for co-hosting? Remember how he took a huge risk for no appreciable gain and uttered a context-free profanity to an audience of toddlers? How he must have wanted to swear on children’s TV but paradoxically didn’t want anyone to notice so “snuck it in” as part of a song, where it would be more ambiguous? How all the editors and regulators at the BBC happened to completely miss it and allow it to be aired? Remember this happening? Well, you shouldn’t, because it clearly didn’t. No presenter and/or channel would risk their whole livelihood in such a pointless, meaningless way, especially not the ever-pressured BBC. And, yet, an alarming number of people do think it happened. Apparently, there have been some “outraged parents” who are aghast at the whole thing. This seems reasonable in some respects; if your toddler was subjected to extreme cursing then as a parent you probably would object. On the other hand, if your very small child is able to recognise strong expletives, then perhaps misheard lyrics on cheerful TV shows aren’t the most pressing issue in their life. Regardless, a surprising number of people report that they did genuinely “hear” the c-word. This is less likely to be due to a TV presenter having some sort of extremely fleeting breakdown, and more likely due to the quirks and questionable processing of our senses by our powerful yet imperfect brains. © 2016 Guardian News and Media Limited
André Corrêa d’Almeida and Amanda Sue Grossi Development. Poverty. Africa. These are just three words on a page – almost no information at all – but how many realities did our readers just conjure? And how many thoughts filled the spaces in-between? Cover yourselves. Your biases are showing. In the last few decades, groundbreaking work by psychologists and behavioural economists has exposed unconscious biases in the way we think. And as the World Bank’s 2015 World Development Report points out, development professionals are not immune to these biases. There is a real possibility that seemingly unbiased and well-intentioned development professionals are capable of making consequential mistakes, with significant impacts upon the lives of others, namely the poor. The problem arises when mindsets are just that – set. As the work of Daniel Kahneman and Amos Tversky has shown, development professionals – like people generally – have two systems of thinking: the automatic and the deliberative. In the automatic mode, instead of performing complex rational calculations every time we need to make a decision, we rely on pre-existing mental models and shortcuts. These are based on assumptions we create throughout our lives and that stem from our experiences and education. More often than not, these mental models are incomplete and shortcuts can lead us down the wrong path. Thinking automatically then becomes thinking harmfully. © 2016 Guardian News and Media Limited
Link ID: 22653 - Posted: 09.15.2016
Laura Sanders By sneakily influencing brain activity, scientists changed people’s opinions of faces. This covert neural sculpting relied on a sophisticated brain training technique in which people learn to direct their thoughts in specific ways. The results, published September 8 in PLOS Biology, support the idea that neurofeedback methods could help reveal how the brain’s behavior gives rise to perceptions and emotions. What’s more, the technique may ultimately prove useful for easing traumatic memories and treating disorders such as depression. The research is still at an early stage, says neurofeedback researcher Michelle Hampson of Yale University, but, she notes, “I think it has great promise.” Takeo Watanabe of Brown University and colleagues used functional MRI to measure people’s brain activity in an area called the cingulate cortex as participants saw pictures of faces. After participants had rated each face, a computer algorithm sorted their brain responses into patterns that corresponded to faces they liked and faces they disliked. With this knowledge in hand, the researchers then attempted to change people’s face preferences by subtly nudging brain activity in the cingulate cortex. In step 2 of the experiment, participants returned to the fMRI scanner and saw an image of a face that they had previously rated as neutral. Just after that, they were shown a disk. The goal, the participants were told, was simple: make the disk bigger by using their brains. They had no idea that the only way to make the disk grow was to think in a very particular way. © Society for Science & the Public 2000–2016
By Karen Zusi At least one type of social learning, or the ability to learn from observing others’ actions, is processed by individual neurons within a region of the human brain called the rostral anterior cingulate cortex (rACC), according to a study published today (September 6) in Nature Communications. The work is the first direct analysis in humans of the neuronal activity that encodes information about others’ behavior. “The idea [is] that there could be an area that’s specialized for processing things about other people,” says Matthew Apps, a neuroscientist at the University of Oxford who was not involved with the study. “How we think about other people might use distinct processes from how we might think about ourselves.” During the social learning experiments, the University of California, Los Angeles (UCLA) and CalTech–based research team recorded the activity of individual neurons in the brains of epilepsy patients. The patients were undergoing a weeks-long procedure at the Ronald Reagan UCLA Medical Center in which their brains were implanted with electrodes to locate the origin of their epileptic seizures. Access to this patient population was key to the study. “It’s a very rare dataset,” says Apps. “It really does add a lot to the story.” With data streaming out of the patients’ brains, the researchers taught the subjects to play a card game on a laptop. Each turn, the patients could select from one of two decks of face-down cards: the cards either gave $10 or $100 in virtual winnings, or subtracted $10 or $100. In one deck, 70 percent of the cards were winning cards, while in the other only 30 percent were. The goal was to rack up the most money. © 1986-2016 The Scientist
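The two-deck card game described above is easy to sketch in code. A minimal simulation, assuming (the article does not specify this) that the $10 and $100 outcomes are equally likely within each deck:

```python
import random

# Sketch of the UCLA/CalTech card task: one deck wins on 70% of draws,
# the other on only 30%. Winning cards pay +$10 or +$100; losing cards
# cost -$10 or -$100. The 50/50 split between $10 and $100 is assumed.
def draw(win_prob, rng):
    amount = rng.choice([10, 100])
    return amount if rng.random() < win_prob else -amount

def expected_value(win_prob, trials=100_000, seed=0):
    rng = random.Random(seed)
    total = sum(draw(win_prob, rng) for _ in range(trials))
    return total / trials

print(expected_value(0.70))  # ≈ +22 per draw: the deck worth learning to prefer
print(expected_value(0.30))  # ≈ -22 per draw: the deck worth learning to avoid
```

Under these assumptions the good deck is worth about $22 per draw and the bad deck costs about the same, so a patient who learns the deck contingencies (from their own outcomes or from watching another player) racks up more virtual money.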
Chris Chambers One of the most compelling impressions in everyday life is that wherever we look, we “see” everything that is happening in front of us – much like a camera. But this impression is deceiving. In reality our senses are bombarded by continual waves of stimuli, triggering an avalanche of sensations that far exceed the brain’s capacity. To make sense of the world, the brain needs to determine which sensations are the most important for our current goals, focusing resources on the ones that matter and throwing away the rest. These computations are astonishingly complex, and what makes attention even more remarkable is just how effortless it is. The mammalian attention system is perhaps the most efficient and precisely tuned junk filter we know of, refined through millions of years of annoying siblings (and some evolution). Attention is amazing but no system is ever perfect. Our brain’s computational reserves are large but not infinite, and under the right conditions we can “break it” and peek behind the curtain. This isn’t just a fun trick – understanding these limits can yield important insights into psychology and neurobiology, helping us to diagnose and treat impairments that follow brain injury and disease. Thanks to over a hundred years of psychology research, it’s relatively easy to reveal attention in action. One way is through the phenomenon of change blindness. When we think of the term “blindness” we tend to assume a loss of vision caused by damage to the eye or optic nerves. But change blindness is completely normal and is caused by maxing out your attentional capacity. © 2016 Guardian News and Media Limited
A new study by investigators at Brigham and Women's Hospital in collaboration with researchers at the Universities of York and Leeds in the UK and MD Anderson Cancer Center in Texas puts to the test anecdotes about experienced radiologists' ability to sense when a mammogram is abnormal. In a paper published August 29 in the Proceedings of the National Academy of Sciences, visual attention researchers showed radiologists mammograms for half a second and found that they could identify abnormal mammograms at better than chance levels. They further tested this ability through a series of experiments to explore what signal may alert radiologists to the presence of a possible abnormality, in the hopes of using these insights to improve breast cancer screening and early detection. "Radiologists can have 'hunches' after a first look at a mammogram. We found that these hunches are based on something real in the images. It's really striking that in the blink of an eye, an expert can pick up on something about that mammogram that indicates abnormality," said Jeremy Wolfe, PhD, senior author of the study and director of the Visual Attention Laboratory at BWH. "Not only that, but they can detect something abnormal in the other breast, the breast that does not contain a lesion." In the clinic, radiologists carefully evaluate mammograms and may use computer-automated systems to help screen the images. Although they would never assess an image in half a second in the clinic, the ability of experts to extract the "gist" of an image quickly suggests that there may be detectable signs of breast cancer that radiologists rapidly pick up. Copyright 2016 ScienceDaily
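What "better than chance" means here can be illustrated with a one-sided binomial test. The counts below are invented for the demo, not the study's actual data:

```python
from math import comb

# If a reader classifies n half-second flashes as normal/abnormal and
# guessing would succeed 50% of the time, the exact binomial tail gives
# the probability of doing at least this well by luck alone.
def binom_p_value(successes, n, p=0.5):
    """P(X >= successes) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(successes, n + 1))

# Hypothetical example: 65 correct out of 100 trials.
p = binom_p_value(65, 100)
print(p)  # ≈ 0.002: very unlikely under pure guessing
```

A tail probability this small is what lets researchers call a half-second "hunch" a real signal rather than luck; in practice, signal-detection measures such as d′ are often used instead of raw accuracy, since they separate sensitivity from response bias.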
By STEVE SILBERMAN In the late 1930s, Charles Bradley, the director of a home for “troublesome” children in Rhode Island, had a problem. The field of neuroscience was still in its infancy, and one of the few techniques available to allow psychiatrists like Bradley to ponder the role of the brain in emotional disorders was a procedure that required replacing a volume of cerebrospinal fluid in the patient’s skull with air. This painstaking process allowed any irregularities to stand out clearly in X-ray images, but many patients suffered excruciating headaches that lasted for weeks afterward. Meanwhile, a pharmaceutical company called Smith, Kline & French was facing a different sort of problem. The firm had recently acquired the rights to sell a powerful stimulant then called “benzedrine sulfate” and was trying to create a market for it. Toward that end, the company made quantities of the drug available at no cost to doctors who volunteered to run studies on it. Bradley was a firm believer that struggling children needed more than a handful of pills to get better; they also needed psychosocial therapy and the calming and supportive environment that he provided at the home. But he took up the company’s offer, hoping that the drug might eliminate his patients’ headaches. It did not. But the Benzedrine did have an effect that was right in line with Smith, Kline & French’s aspirations for its new product: The drug seemed to boost the children’s eagerness to learn in the classroom while making them more amenable to following the rules. The drug seemed to calm the children’s mood swings, allowing them to become, in the words of their therapists, more “attentive” and “serious,” able to complete their schoolwork and behave. Bradley was amazed that Benzedrine, a forerunner of Ritalin and Adderall, was such a great normalizer, turning typically hard-to-manage kids into models of compliance and decorum. 
But even after marveling at the effects of the drug, he maintained that medication should be considered for children only in addition to other forms of therapy. © 2016 The New York Times Company
By Usha Lee McFarling @ushamcfarling LOS ANGELES — A team of physicians and neuroscientists on Wednesday reported the successful use of ultrasound waves to “jump start” the brain of a 25-year-old man recovering from coma — and plan to launch a much broader test of the technique, in hopes of finding a way to help at least some of the tens of thousands of patients in vegetative states. The team, based at the University of California, Los Angeles, cautions that the evidence so far is thin: They have no way to know for sure whether the ultrasound stimulation made the difference for their young patient, or whether he spontaneously recovered by coincidence shortly after the therapy. But the region of the brain they targeted with the ultrasound — the thalamus — has previously been shown to be important in restoring consciousness. In 2007, a 38-year-old man who had been minimally conscious for six years regained some functions after electrodes were implanted in his brain to stimulate the thalamus. The ultrasound technique is a “good idea” that merits further study, said Dr. Nicholas Schiff, a pioneer in the field of using brain stimulation to restore consciousness who conducted the 2007 study. “It’s intriguing and it’s an interesting possibility,” said Schiff, a neuroscientist at Weill Cornell Medicine. The UCLA procedure used an experimental device, about the size of a teacup saucer, to focus ultrasonic waves on the thalamus, two walnut-sized bulbs in the center of the brain that serve as a critical hub for information flow and help regulate consciousness and sleep.
Link ID: 22606 - Posted: 08.27.2016
Laura Sanders Brain scientists Eric Jonas and Konrad Kording had grown skeptical. They weren’t convinced that the sophisticated, big data experiments of neuroscience were actually accomplishing anything. So they devised a devilish experiment. Instead of studying the brain of a person, or a mouse, or even a lowly worm, the two used advanced neuroscience methods to scrutinize the inner workings of another information processor — a computer chip. The unorthodox experimental subject, the MOS 6502, is the same chip that dazzled early tech junkies and kids alike in the 1980s by powering Donkey Kong, Space Invaders and Pitfall, as well as the Apple I and II computers. Of course, these experiments were rigged. The scientists already knew everything about how the 6502 works. “The beauty of the microprocessor is that unlike anything in biology, we understand it on every level,” says Jonas, of the University of California, Berkeley. A barrel-hurling gorilla is the enemy in Donkey Kong, a video game powered by the MOS 6502 microprocessor. Along with Space Invaders and Pitfall, this game served as the “behavior” in a recent experiment. Using a simulation of the MOS 6502, Jonas and Kording, of Northwestern University in Chicago, studied the behavior of electricity-moving transistors, along with aspects of the chip’s connections and its output, to reveal how it handles information. Since they already knew what the outcomes should be, they were actually testing the methods. By the end of their experiments, Jonas and Kording had discovered almost nothing. © Society for Science & the Public 2000–2016
Keyword: Brain imaging
Link ID: 22597 - Posted: 08.24.2016
By Jessica Hamzelou Feel like you’ve read this before? Most of us have experienced the eerie familiarity of déjà vu, and now the first brain scans of this phenomenon have revealed why – it’s a sign of our brain checking its memory. Déjà vu was thought to be caused by the brain making false memories, but research by Akira O’Connor at the University of St Andrews, UK, and his team now suggests this is wrong. Exactly how déjà vu works has long been a mystery, partly because its fleeting and unpredictable nature makes it difficult to study. To get around this, O’Connor and his colleagues developed a way to trigger the sensation of déjà vu in the lab. The team’s technique uses a standard method to trigger false memories. It involves telling a person a list of related words – such as bed, pillow, night, dream – but not the key word linking them together, in this case, sleep. When the person is later quizzed on the words they’ve heard, they tend to believe they have also heard “sleep” – a false memory. To create the feeling of déjà vu, O’Connor’s team first asked people if they had heard any words beginning with the letter “s”. The volunteers replied that they hadn’t. This meant that when they were later asked if they had heard the word sleep, they were able to remember that they couldn’t have, but at the same time, the word felt familiar. “They report having this strange experience of déjà vu,” says O’Connor. © Copyright Reed Business Information Ltd.
By MIKE SACKS You’ve seen me. I know you have. I’m the guy wearing gloves on the subway in October. Or even into April. Perhaps I’m wearing just one glove, allowing my naked hand to turn the pages of a book. No big deal. Just another one-gloved commuter, heading home. If it’s crowded, you may have noticed me doing my best to “surf,” sans contact, until the car comes to a stop, in which case I may knock into a fellow passenger. Aboveground you may have seen me acting the gentleman, opening doors for others with a special paper towel I carry in my front left pocket for just such a momentous occasion. No? How about that guy walking quickly ahead of you, the one impishly avoiding sidewalk cracks? Or perhaps you’ve noticed a stranger who turns and makes eye contact with you for seemingly no reason. You may have asked, “You got a problem?” Oh, I definitely have a problem. But it has nothing to do with you, sir or madam. (And, yes, even in my thoughts I refer to you as “sir” and “madam.”) The problem here is what multiple doctors have diagnosed as obsessive-compulsive disorder. You may refer to it by its kicky abbreviation, O.C.D. I prefer to call it Da Beast. Da Beast is a creature I have lived with since I was 11, a typical age for O.C.D. to snarl into one’s life without invitation or warning. According to the International O.C.D. Foundation, roughly one in 100 adults suffers from the disorder. Each of us has his or her own obsessive thoughts and fears to contend with. My particular beast of burden is a fear of germs and sickness. It’s a popular one, perhaps the most common. © 2016 The New York Times Company
Keyword: OCD - Obsessive Compulsive Disorder
Link ID: 22541 - Posted: 08.11.2016
By ABBY GOODNOUGH TUSCALOOSA, Ala. — Roslyn Lewis was at work at a dollar store here in Tuscaloosa, pushing a heavy cart of dog food, when something popped in her back: an explosion of pain. At the emergency room the next day, doctors gave her Motrin and sent her home. Her employer paid for a nerve block that helped temporarily, numbing her lower back, but she could not afford more injections or physical therapy. A decade later, the pain radiates to her right knee and remains largely unaddressed, so deep and searing that on a recent day she sat stiffly on her couch, her curtains drawn, for hours. The experience of African-Americans, like Ms. Lewis, and other minorities illustrates a problem as persistent as it is complex: Minorities tend to receive less treatment for pain than whites, and suffer more disability as a result. While an epidemic of prescription opioid abuse has swept across the United States, African-Americans and Hispanics have been affected at much lower rates than whites. Researchers say minority patients use fewer opioids, and they offer a thicket of possible explanations, including a lack of insurance coverage and a greater reluctance among members of minority groups to take opioid painkillers even if they are prescribed. But the researchers have also found evidence of racial bias and stereotyping in recognizing and treating pain among minorities, particularly black patients. “We’ve done a good job documenting that these disparities exist,” said Salimah Meghani, a pain researcher at the University of Pennsylvania. “We have not done a good job doing something about them.” Dr. Meghani’s 2012 analysis of 20 years of published research found that blacks were 34 percent less likely than whites to be prescribed opioids for conditions such as backaches, abdominal pain and migraines, and 14 percent less likely to receive opioids for pain from traumatic injuries or surgery. © 2016 The New York Times Company
By EUGENE M. CARUSO, ZACHARY C. BURNS and BENJAMIN A. CONVERSE Watching slow-motion footage of an event can certainly improve our judgment of what happened. But can it also impair judgment? This question arose in the 2009 murder trial of a man named John Lewis, who killed a police officer during an armed robbery of a Dunkin’ Donuts in Philadelphia. Mr. Lewis pleaded guilty; the only question for the jury was whether the murder resulted from a “willful, deliberate and premeditated” intent to kill or — as Mr. Lewis argued — from a spontaneous, panicked reaction to seeing the officer enter the store unexpectedly. The key piece of evidence was a surveillance video of the shooting, which the jury saw both in real time and in slow motion. The jury found that Mr. Lewis had acted with premeditation, and he was sentenced to death. Mr. Lewis appealed the decision, arguing that the slow-motion video was prejudicial. Specifically, he claimed that watching the video in slow motion artificially stretched the relevant time period and created a “false impression of premeditation.” Did it? We recently conducted a series of experiments whose results are strikingly consistent with that claim. Our studies, published this week in the Proceedings of the National Academy of Sciences, show that seeing replays of an action in slow motion leads viewers to believe that the actor had more time to think before acting than he actually did. The result is that slow motion makes actions seem more intentional, more premeditated. In one of our studies, participants watched surveillance video of a fatal shooting that occurred outside a convenience store during an armed robbery. We gave them a set of instructions similar to those given to the jurors in Mr. Lewis’s case, asking them to decide whether the crime was premeditated or not. We assigned half our participants to watch the video in slow motion and the other half to watch it at regular speed. © 2016 The New York Times Company
Link ID: 22525 - Posted: 08.08.2016
By BENEDICT CAREY Solving a hairy math problem might send a shudder of exultation along your spinal cord. But scientists have historically struggled to deconstruct the exact mental alchemy that occurs when the brain successfully leaps the gap from “Say what?” to “Aha!” Now, using an innovative combination of brain-imaging analyses, researchers have captured four fleeting stages of creative thinking in math. In a paper published in Psychological Science, a team led by John R. Anderson, a professor of psychology and computer science at Carnegie Mellon University, demonstrated a method for reconstructing how the brain moves from understanding a problem to solving it, including the time the brain spends in each stage. The imaging analysis found four stages in all: encoding (downloading), planning (strategizing), solving (performing the math), and responding (typing out an answer). “I’m very happy with the way the study worked out, and I think this precision is about the limit of what we can do” with the brain imaging tools available, said Dr. Anderson, who wrote the report with Aryn A. Pyke and Jon M. Fincham, both also at Carnegie Mellon. To capture these quicksilver mental operations, the team first taught 80 men and women how to interpret a set of math symbols and equations they had not seen before. The underlying math itself wasn’t difficult, mostly addition and subtraction, but manipulating the newly learned symbols required some thinking. The research team could vary the problems to burden specific stages of the thinking process — some were hard to encode, for instance, while others extended the length of the planning stage. The scientists used two techniques of M.R.I. data analysis to sort through what the participants’ brains were doing. One technique tracked the neural firing patterns during the solving of each problem; the other identified significant shifts from one kind of mental state to another. 
The subjects solved 88 problems each, and the research team analyzed the imaging data from those solved successfully. © 2016 The New York Times Company
By ERICA GOODE You are getting sleepy. Very sleepy. You will forget everything you read in this article. Hypnosis has become a common medical tool, used to reduce pain, help people stop smoking and cure them of phobias. But scientists have long argued about whether the hypnotic “trance” is a separate neurophysiological state or simply a product of a hypnotized person’s expectations. A study published on Thursday by Stanford researchers offers some evidence for the first explanation, finding that some parts of the brain function differently under hypnosis than during normal consciousness. The study was conducted with functional magnetic resonance imaging, a scanning method that measures blood flow in the brain. It found changes in activity in brain areas that are thought to be involved in focused attention, the monitoring and control of the body’s functioning, and the awareness and evaluation of a person’s internal and external environments. “I think we have pretty definitive evidence here that the brain is working differently when a person is in hypnosis,” said Dr. David Spiegel, a professor of psychiatry and behavioral sciences at Stanford who has studied the effectiveness of hypnosis. Functional imaging is a blunt instrument and the findings can be difficult to interpret, especially when a study is looking at activity levels in many brain areas. Still, Dr. Spiegel said, the findings might help explain the intense absorption, lack of self-consciousness and suggestibility that characterize the hypnotic state. © 2016 The New York Times Company
By ANNA WEXLER EARLIER this month, in the journal Annals of Neurology, four neuroscientists published an open letter to practitioners of do-it-yourself brain stimulation. These are people who stimulate their own brains with low levels of electricity, largely for purposes like improved memory or learning ability. The letter, which was signed by 39 other researchers, outlined what is known and unknown about the safety of such noninvasive brain stimulation, and asked users to give careful consideration to the risks. For the last three years, I have been studying D.I.Y. brain stimulators. Their conflict with neuroscientists offers a fascinating case study of what happens when experimental tools normally kept behind the closed doors of academia — in this case, transcranial direct current stimulation — are appropriated for use outside them. Neuroscientists began experimenting in earnest with transcranial direct current stimulation about 15 years ago. In such stimulation, electric current is administered at levels that are hundreds of times less than those used in electroconvulsive therapy. To date, more than 1,000 peer-reviewed studies of the technique have been published. Studies have suggested, among other things, that the stimulation may be beneficial for treating problems like depression and chronic pain as well as enhancing cognition and learning in healthy individuals. The device scientists use for stimulation is essentially a nine-volt battery attached to two wires that are connected to electrodes placed at various spots on the head. A crude version can be constructed with just a bit of electrical know-how. Consequently, as reports of the effects of the technique began to appear in scientific journals and in newspapers, people began to build their own devices at home. By late 2011 and early 2012, diagrams, schematics and videos began to appear online. © 2016 The New York Times Company
Link ID: 22471 - Posted: 07.23.2016