Chapter 14. Attention and Consciousness

Jon Hamilton Impulsive children become thoughtful adults only after years of improvements to the brain's information highways, a team reports in Current Biology. A study of nearly 900 young people ages 8 to 22 found that the ability to control impulses, stay on task and make good decisions increased steadily over that span as the brain remodeled its information pathways to become more efficient. The finding helps explain why these abilities, known collectively as executive function, take so long to develop fully, says Danielle Bassett, an author of the study and an associate professor of bioengineering at the University of Pennsylvania. "A child's ability to run or to see is very well developed by the time they're 8," she says. "However, their ability to inhibit inappropriate responses is not something that's well developed until well into the 20s." The results also suggest it may be possible to identify adolescents at risk of problems related to poor executive function, says Joshua Gordon, director of the National Institute of Mental Health, which helped fund the study. These include "all kinds of disorders such as substance abuse, depression and schizophrenia," he says. The study is part of an effort to understand the brain changes underlying the development of executive function. It used a technology called diffusion imaging that reveals the fibers that make up the brain's information highways. © 2017 npr

Keyword: ADHD; Development of the Brain
Link ID: 23668 - Posted: 05.27.2017

By Angela Chen What happens when you look up and see a ball headed toward you? Without even thinking about it, you flinch. That might be because our brains are constantly living our lives in fast-forward, playing out the action in our heads before it happens. Humans have to navigate, and respond to, an environment that is always changing. Our brain compensates for this by constantly making predictions about what’s going to happen, says Mattias Ekman, a researcher at Radboud University Nijmegen in the Netherlands. We’ve known this for a while, but these predictions are usually associative. An example: if you see a hamburger, your brain might predict that there will be fries nearby. In a study published today in the journal Nature Communications, Ekman and other scientists focused instead on how the brain predicts motion. They used brain scans to track what happened as participants observed a moving dot. First, 29 volunteers looked at a white dot the size of a ping-pong ball. The dot went from left to right and then reversed direction. The volunteers watched the dot for about five minutes while scientists scanned their brains with ultra-fast fMRI. This way, the researchers knew what pattern of brain activity was evoked in the visual cortex as the volunteers watched the dot. After these five minutes, the researchers showed only the beginning of the sequence to the volunteers. Here, the scans showed that the brain “autocompletes” the full sequence — and it does so at twice the rate of the actual event. So if a dot took two seconds to go across the screen, the brain predicted the entire sequence in one second. “You’re actually already trying to predict what’s going to happen,” says Ekman. “These predictions are hypothetical, so in a way you’re trying to generate new memories that match the future.” © 2017 Vox Media, Inc.

Keyword: Attention
Link ID: 23653 - Posted: 05.24.2017

Jon Hamilton It took an explosion and 13 pounds of iron to usher in the modern era of neuroscience. In 1848, a 25-year-old railroad worker named Phineas Gage was blowing up rocks to clear the way for a new rail line in Cavendish, Vt. He would drill a hole, place an explosive charge, then pack in sand using a 13-pound metal bar known as a tamping iron. But in this instance, the metal bar created a spark that touched off the charge. That, in turn, "drove this tamping iron up and out of the hole, through his left cheek, behind his eye socket, and out of the top of his head," says Jack Van Horn, an associate professor of neurology at the Keck School of Medicine at the University of Southern California. Gage didn't die. But the tamping iron destroyed much of his brain's left frontal lobe, and Gage's once even-tempered personality changed dramatically. "He is fitful, irreverent, indulging at times in the grossest profanity, which was not previously his custom," wrote John Martyn Harlow, the physician who treated Gage after the accident. This sudden personality transformation is why Gage shows up in so many medical textbooks, says Malcolm Macmillan, an honorary professor at the Melbourne School of Psychological Sciences and the author of An Odd Kind of Fame: Stories of Phineas Gage. "He was the first case where you could say fairly definitely that injury to the brain produced some kind of change in personality," Macmillan says. © 2017 npr

Keyword: Attention; Emotions
Link ID: 23643 - Posted: 05.22.2017

By MARTIN E. P. SELIGMAN and JOHN TIERNEY We are misnamed. We call ourselves Homo sapiens, the “wise man,” but that’s more of a boast than a description. What makes us wise? What sets us apart from other animals? Various answers have been proposed — language, tools, cooperation, culture, tasting bad to predators — but none is unique to humans. What best distinguishes our species is an ability that scientists are just beginning to appreciate: We contemplate the future. Our singular foresight created civilization and sustains society. It usually lifts our spirits, but it’s also the source of most depression and anxiety, whether we’re evaluating our own lives or worrying about the nation. Other animals have springtime rituals for educating the young, but only we subject them to “commencement” speeches grandly informing them that today is the first day of the rest of their lives. A more apt name for our species would be Homo prospectus, because we thrive by considering our prospects. The power of prospection is what makes us wise. Looking into the future, consciously and unconsciously, is a central function of our large brain, as psychologists and neuroscientists have discovered — rather belatedly, because for the past century most researchers have assumed that we’re prisoners of the past and the present. Behaviorists thought of animal learning as the ingraining of habit by repetition. Psychoanalysts believed that treating patients was a matter of unearthing and confronting the past. Even when cognitive psychology emerged, it focused on the past and present — on memory and perception. But it is increasingly clear that the mind is mainly drawn to the future, not driven by the past. Behavior, memory and perception can’t be understood without appreciating the central role of prospection. We learn not by storing static records but by continually retouching memories and imagining future possibilities. Our brain sees the world not by processing every pixel in a scene but by focusing on the unexpected. © 2017 The New York Times Company

Keyword: Attention; Learning & Memory
Link ID: 23641 - Posted: 05.20.2017

By Clare Wilson Seeing shouldn’t always be believing. We all have blind spots in our vision, but we don’t notice them because our brains fill the gaps with made-up information. Now subtle tests show that we trust this “fake vision” more than the real thing. If the brain works like this in other ways, it suggests we should be less trusting of the evidence from our senses, says Christoph Teufel of Cardiff University, who wasn’t involved in the study. “Perception is not providing us with a [true] representation of the world,” he says. “It is contaminated by what we already know.” The blind spot is caused by a patch at the back of each eye where there are no light-sensitive cells, just a gap where neurons exit the eye on their way to the brain. We normally don’t notice blind spots because our two eyes can fill in for each other. When vision is obscured in one eye, the brain makes up what’s in the missing area by assuming that whatever is in the regions around the spot continues inwards. But do we subconsciously know that this filled-in vision is less trustworthy than real visual information? Benedikt Ehinger of the University of Osnabrück in Germany and his colleagues set out to answer this question by asking 100 people to look at a picture of a circle of vertical stripes, which contained a small patch of horizontal stripes. The circle was positioned so that with one eye obscured, the patch of horizontal stripes fell within the other eye’s blind spot. As a result, the circle appeared as though there was no patch and the vertical stripes were continuous. © Copyright New Scientist Ltd.

Keyword: Vision; Attention
Link ID: 23640 - Posted: 05.20.2017

Sarah Boseley in Porto A crinkly plate, designed with ridges that cunningly reduce the amount of food it holds, may be heading for the market to help people concerned about their weight eat less. The plate is the brainchild of a Latvian graphic designer, Nauris Cinovics, of the Art Academy of Latvia, who is working with a Latvian government agency to develop the idea and hopes to trial it soon. It may look like just another arty designer plate, but it is intended to play tricks on the mind. “My idea is to make food appear bigger than it is. If you make the plate three-dimensional [with the ridges and troughs] it actually looks like there is the same amount of food as on a normal plate – but there is less of it,” said Cinovics. “You are tricking the brain into thinking you are eating more.” The plate will be made of clear glass and could turn eating dinner into a more complex and longer process than it usually is. Negotiating the folds in the glass, where pieces of fish or stray carrots may lurk, will slow the speed at which people get through their meal. Cinovics has also designed heavy cutlery, with the idea of making eating more of a labour – and therefore one that lasts longer. His knife, fork and spoon weigh 1.3kg each. “We tested this and it took 11 minutes to finish a meal with this cutlery rather than seven minutes,” he said.

Keyword: Obesity; Attention
Link ID: 23639 - Posted: 05.20.2017

By Bret Stetka For many hours a day they pluck dirt, debris and bugs from each other’s fur. Between grooming sessions they travel in troops to search for food. When ignored by mom, they throw tantrums; when not ignored by zoo-goers, they throw feces. Through these behaviors, monkeys demonstrate they understand the meaning of social interactions with other monkeys. They recognize when their peers are grooming one another and infer social rank from seeing such actions within their group. But it has long been unclear how the brains of our close evolutionary relatives actually process what they observe of these social situations. New findings published Thursday in Science offer a clue. A team of researchers from The Rockefeller University has identified a network in the monkey brain dedicated exclusively to analyzing social interactions. And they believe this network could be akin to the social circuitry of the human brain. In the new work—led by Winrich Freiwald, an associate professor of neurosciences and behavior—four rhesus macaques viewed videos of various social and physical interactions while undergoing functional magnetic resonance imaging. (Monkeys love watching TV, so they paid attention.) They were shown clips of monkeys interacting, as well as performing tasks on their own. They also watched videos of various physical interactions among inanimate objects. © 2017 Scientific American

Keyword: Attention; Evolution
Link ID: 23637 - Posted: 05.19.2017

Katherine Isbister The fidget spinner craze has been sweeping elementary and middle schools. As of May 17, every one of the top 10 best-selling toys on Amazon was a form of the hand-held toy people can spin and do tricks with. Kids and parents are even making them for themselves using 3D printers and other more homespun crafting techniques. But some teachers are banning them from classrooms, and experts challenge the idea that spinners are good for conditions like ADHD and anxiety. Meanwhile, the Kickstarter online fundraising campaign for the Fidget Cube – another popular fidget toy in 2017 – raised an astounding US$6.4 million, and the cubes can be seen on the desks of hipsters and techies across the globe. My research group has taken a deep look at how people use fidget items over the last several years. What we found tells us that these items are not a fad that will soon disappear. Despite sometimes being an annoying distraction for others, fidget items can have some practical uses for adults; our inquiry into their usefulness for children is underway. Fidgeting didn’t start with the spinner craze. If you’ve ever clicked a ballpoint pen again and again, you’ve used a fidget item. As part of our work, we’ve asked people what items they like to fidget with and how and when they use them. (We’re compiling their answers online and welcome additional contributions.) © 2010–2017, The Conversation US, Inc.

Keyword: ADHD; Attention
Link ID: 23630 - Posted: 05.18.2017

By Helen Thomson People in a minimally conscious state have been “woken” for a whole week after a brief period of brain stimulation. The breakthrough suggests we may be on the verge of creating a device that can be used at home to help people with disorders of consciousness communicate with friends and family. People with severe brain trauma can fall into a coma. If they begin to show signs of arousal but not awareness, they are said to be in a vegetative state. If they then show fluctuating signs of awareness but cannot communicate, they are described as being minimally conscious. In 2014, Steven Laureys at the University of Liège in Belgium and his colleagues discovered that 13 people with minimal consciousness and two people in a vegetative state could temporarily show new signs of awareness when given mild electrical stimulation. The people in the trial received transcranial direct current stimulation (tDCS), which uses low-level electrical stimulation to make neurons more or less likely to fire. This was applied once over an area of the brain called the prefrontal cortex, which is involved in “higher” cognitive functions such as consciousness. Soon after, they showed signs of consciousness, including moving their hands or following instructions using their eyes. Two people were even able to answer questions for 2 hours by moving their bodies, before drifting back into their previous state. © Copyright New Scientist Ltd.

Keyword: Consciousness
Link ID: 23610 - Posted: 05.13.2017

By Reuters People with attention-deficit/hyperactivity disorder are at increased risk of motor-vehicle accidents, but that risk is significantly reduced when they are taking ADHD medication, a 10-year study finds. The researchers estimate that 1 in 5 of the accidents among more than 2 million people with ADHD during the study period could have been avoided if these individuals had been receiving medication the entire time. “The patients should be aware of the potential risk of [crashes], and seek specific treatment advice from their doctors if they experience difficulties in driving from their condition,” said lead author Zheng Chang, of the Karolinska Institute in Stockholm. Chang noted that motor-vehicle crashes kill more than 1.25 million people around the world each year. ADHD is a common disorder with symptoms that include poor sustained attention, impaired impulse control and hyperactivity, he added. Past studies have found that people with ADHD are at an increased risk for crashes and that medication may reduce symptoms and ultimately improve driving skills. To examine the risk of crashes with ADHD and how it is influenced by medication, the researchers analyzed U.S. commercial health insurance claims between 2005 and 2014. They identified 2,319,450 adults with an ADHD diagnosis, half of whom were older than 33. About 1.9 million of them received at least one prescription to treat their ADHD during the study period. © 1996-2017 The Washington Post

Keyword: ADHD; Attention
Link ID: 23609 - Posted: 05.13.2017

By Agata Blaszczak-Boxe We tend to be worse at telling apart faces of other races than those of our own race, studies have found. Now research shows some people are completely blind to features that make other-race faces distinct. Such an impairment could have important implications for eyewitness testimony in situations involving other-race suspects. The ability to distinguish among members of one's own race varies wildly: some people can tell strangers apart effortlessly, whereas others cannot even recognize the faces of their own family and friends (a condition known as prosopagnosia). Psychologist Lulu Wan of the Australian National University and her colleagues wanted to quantify the distribution of abilities for recognizing other-race faces. They asked 268 Caucasians born and raised in Australia to memorize a series of six Asian faces and conducted the same experiment, involving Caucasian faces, with a group of 176 Asians born and raised in Asia who moved to Australia to attend university. In 72 trials, every participant was then shown sets of three faces and had to point to the one he or she had learned in the memorization task. The authors found that 26 Caucasian and 10 Asian participants—8 percent of the collective study population—did so badly on the test that they met the criteria for clinical-level impairment. “We know that we are poor at recognizing other-race faces,” says Jim Tanaka, a professor of psychology at the University of Victoria in British Columbia, who was not involved in the research. “This study shows just how poor some people are.” Those individuals “would be completely useless in terms of their legal value as an eyewitness,” says study co-author Elinor McKone, a professor of psychology at the Australian National University. The world's legal systems do not, however, take into account individual differences in other-race face recognition, she notes. © 2017 Scientific American
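The “8 percent” figure follows directly from the counts reported above; here is a minimal arithmetic sketch in Python (the variable names are ours, and the study’s clinical-impairment criterion itself is not reproduced here):

    # Checking the reported "8 percent" figure from the counts in the excerpt.
    caucasian_tested = 268     # Caucasians who memorized Asian faces
    asian_tested = 176         # Asians who memorized Caucasian faces
    impaired = 26 + 10         # participants meeting clinical-level impairment

    total = caucasian_tested + asian_tested                  # 444 participants overall
    print(f"{impaired}/{total} = {impaired / total:.1%}")    # 36/444 = 8.1%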

Keyword: Attention
Link ID: 23602 - Posted: 05.11.2017

Ian Sample Science editor It isn’t big and it isn’t clever. But the benefits, known to anyone who has moved home, climbed a mountain, or pushed a broken-down car, have finally been confirmed: according to psychologists, swearing makes you stronger. The upside of letting profanities fly emerged from a series of experiments with people who repeated either a swear word or a neutral word as they pounded away on an exercise bike, or performed a simple hand-grip test. When people cursed their way through the half-minute bike challenge, their peak power rose by 24 watts on average, according to the study. In the 10-second grip task, swearers boosted their strength by the equivalent of 2.1kg, researchers found. “In the short period of time we looked at there are benefits from swearing,” said Richard Stephens, a psychologist at Keele University, who presented the results at the British Psychological Society meeting in Brighton. Stephens enrolled 29 people aged about 21 for the cycling test, and 52 people with a typical age of 19 for the hand-grip test. All were asked to choose a swearword to repeat in the studies, based on a term they might utter if they banged their head. For the neutral word, the volunteers were asked to pick a word they might use to describe a table, such as “wooden” or “brown”. © 2017 Guardian News and Media Limited

Keyword: Attention; Language
Link ID: 23575 - Posted: 05.05.2017

Long assumed to be a mere “relay,” an often-overlooked egg-like structure in the middle of the brain turns out to play a pivotal role in tuning up thinking circuitry. A trio of studies in mice funded by the National Institutes of Health is revealing that the thalamus sustains the ability to distinguish categories and hold thoughts in mind. By manipulating the activity of thalamus neurons, scientists were able to control an animal’s ability to remember how to find a reward. In the future, the thalamus might even become a target for interventions to reduce cognitive deficits in psychiatric disorders such as schizophrenia, researchers say. “If the brain works like an orchestra, our results suggest the thalamus may be its conductor,” explained Michael Halassa, M.D., Ph.D., of New York University (NYU) Langone Medical Center, a BRAINS Award grantee of the NIH’s National Institute of Mental Health (NIMH) and also a grantee of the National Institute of Neurological Disorders and Stroke (NINDS). “It helps ensembles play in sync by boosting their functional connectivity.” Three independent teams of investigators — led by Halassa; by Joshua Gordon, M.D., Ph.D., formerly of Columbia University, New York City, now NIMH director, in collaboration with Christoph Kellendonk, Ph.D., of Columbia; and by Karel Svoboda, Ph.D., of the Howard Hughes Medical Institute Janelia Research Campus, Ashburn, Virginia, in collaboration with Charles Gerfen, Ph.D., of the NIMH Intramural Research Program — reported on the newfound role for the thalamus online May 3, 2017, in the journals Nature and Nature Neuroscience.

Keyword: Attention; Learning & Memory
Link ID: 23571 - Posted: 05.04.2017

By Colleen Kimmett Dr. Rebecca Carey admits to being a little embarrassed about what her son, Mark, eats every day. Hamburger patties for breakfast, or bacon. A pack of raisins and a cookie for lunch; a turkey and cheese sandwich “if I’m lucky,” says Carey, but it usually comes back home. His favorite dinner is fish cakes and pasta, but all vegetables remain firmly untouched. It’s the kind of diet—low in fruits and vegetables, high in carbs—that a doctor like her might caution against. But it’s also low in milk, sugar, and artificial food additives — all things Carey believes worsen 10-year-old Mark’s attention deficit hyperactivity disorder, or ADHD, symptoms. Twice a day, in the morning at their home in Newburgh, Ind., and from the school nurse at lunch, he takes a vitamin and mineral supplement, which helps make up for the lack of veggies. It’s been six months on this diet, which Carey researched herself and tested out on Mark, and in that time he has transitioned off his ADHD medication. It wasn’t all smooth sailing; there were fights in the candy section of the grocery store, and Carey struggled to find quick, high-protein breakfasts. “But honestly, I would never go back,” she said. Carey is not the only one who’s trying this approach. Medication and therapy remain the most effective treatments for ADHD. But driven by concerns about the short- and long-term side effects of psychiatric medications on children, some parents are looking for ways to keep their kids on lower doses of the drugs, or to quit the drugs entirely. © 2017 Scientific American

Keyword: ADHD
Link ID: 23568 - Posted: 05.04.2017

By Christof Koch Imagine you are an astronaut, untethered from your safety line, adrift in space. Your damaged radio lets you hear mission control's repeated attempts to contact you, but your increasingly desperate cries of “I'm here, I'm here” go unacknowledged—you are unable to signal that you're alive but injured. After days and weeks of fruitless pleas from your loved ones, their messages cease. You become lost to the world. How long do you keep your sanity when you are locked in your own echo chamber? Days? Months? Years? This nightmarish scenario is vividly described by British neuroscientist Adrian Owen in his upcoming book Into the Gray Zone (Scribner). Taking my evening bath while dipping into its opening pages, I put the book down only after finishing, hours later, with the water cold. The story of communicating with the most impaired neurological patients at a greater distance from us than an astronaut lost in space is told by Owen in a most captivating manner. A professor at Western University in Ontario, Canada, Owen pioneered brain-imaging technology to establish what islands of awareness persist in patients with severe disorders of consciousness. These people are bedridden and seriously disabled, unable to speak or otherwise articulate their mental state following traumatic brain injury, encephalitis, meningitis, stroke, or drug or alcohol intoxication. Two broad groups can be distinguished among those who do not quickly succumb to their injuries. Vegetative state patients, in the first group, cycle in and out of sleep. When they are awake, their eyes are open, but attempts to establish bedside communication with them—“if you hear me, squeeze my hand or look down”—meet only with failure. These patients can move their eyes or head, swallow and yawn but never in an intentional manner. Nothing is left but surviving brain stem reflexes. With proper nursing care to avoid bedsores and infections, these individuals can live for years. © 2017 Scientific American

Keyword: Consciousness
Link ID: 23563 - Posted: 05.02.2017

By Mo Costandi The world is an unpredictable place. But the brain has evolved a way to cope with the everyday uncertainties it encounters—rather than present us with those uncertainties, it resolves them into a realistic model of the world. The body’s central controller predicts every contingency, using its stored database of past experiences, to minimize the element of surprise. Take vision, for example: we rarely see objects in their entirety, but our brains fill in the gaps to make a best guess at what we are seeing—and these predictions are usually an accurate reflection of reality. The same is true of hearing, and neuroscientists have now identified a predictive-text-like brain mechanism that helps us to anticipate what is coming next when we hear someone speaking. The findings, published this week in PLoS Biology, advance our understanding of how the brain processes speech. They also provide clues about how language evolved, and could even lead to new ways of diagnosing a variety of neurological conditions more accurately. The new study builds on earlier findings that monkeys and human infants can implicitly learn to recognize artificial grammar, or the rules by which sounds in a made-up language are related to one another. Neuroscientist Yukiko Kikuchi of Newcastle University in England and her colleagues played sequences of nonsense speech sounds to macaques and humans. Consistent with the earlier findings, Kikuchi and her team found both species quickly learned the rules of the language’s artificial grammar. After this initial learning period the researchers played more sound sequences—some of which violated the fabricated grammatical rules. They used microelectrodes to record responses from hundreds of individual neurons, as well as from large populations of neurons that process sound information. In this way they were able to compare the responses to both types of sequence and determine the similarities between the two species’ reactions. © 2017 Scientific American
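To make “artificial grammar” concrete, here is a toy sketch in Python: a made-up set of transition rules over nonsense syllables, with a check that flags sequences violating those rules. The syllables and rules are invented for illustration and are not Kikuchi’s actual stimuli; a sequence that breaks a transition rule plays the same role here as the study’s rule-violating sound sequences.

    # A toy artificial grammar: each syllable may only be followed by the
    # syllables listed for it. Invented for illustration; not the study's stimuli.
    GRAMMAR = {
        "tu": {"pi", "ro"},
        "pi": {"ro"},
        "ro": {"tu", "pi"},
    }

    def is_grammatical(seq):
        """Return True if every adjacent pair of syllables obeys the rules."""
        return all(b in GRAMMAR.get(a, set()) for a, b in zip(seq, seq[1:]))

    print(is_grammatical(["tu", "pi", "ro", "tu"]))  # True: every transition is legal
    print(is_grammatical(["tu", "ro", "ro"]))        # False: "ro" -> "ro" breaks a rule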

Keyword: Language; Attention
Link ID: 23558 - Posted: 05.01.2017

By Thomas MacMillan “Time” is the most common noun in the English language, Dean Buonomano tells us on the first page of his new book, Your Brain Is a Time Machine: The Neuroscience and Physics of Time. But despite our fixation with time, and its obvious centrality in our lives, we still struggle to fully understand it. From a psychology perspective, for instance, time seems to flow by, sometimes slowly — like when we’re stuck in line at the DMV — and sometimes quickly — like when we’re lost in an engrossing novel. But from a physics perspective, time may be simply another dimension in the universe, like length, height, or width. Buonomano, a professor of neuroscience at UCLA, lays out the latest, best theories about how we understand time, illuminating a fundamental aspect of being human. The human brain, he writes, is a time machine that allows us to mentally travel backward and forward, to plan for the future and agonizingly regret the past like no other animal. And, he argues, our brains are time machines like clocks are time machines: constantly tracking the passage of time, whether it’s circadian rhythms that tell us when to go to sleep, or microsecond calculations that allow us to hear the difference between “They gave her cat-food” and “They gave her cat food.” In an interview with Science of Us, Buonomano spoke about planning for the future as a basic human activity, the limits of be-here-now mindfulness, and the inherent incompatibility between physicists’ and neuroscientists’ understanding of the nature of time. I finished reading your book late last night and went to bed sort of planning our interview today, and then woke up at about 3:30 a.m. ready to do the interview, with my head full of insistent thoughts about questions that I should ask you. So was that my brain being a — maybe malfunctioning — time machine? I think this is consistent with the notion that the brain is an organ that’s future-oriented. As far as survival goes, the evolutionary value of the brain is to act in the present to ensure survival in the future, whether survival means figuring out a good place to get food, or doing an interview, I suppose. © New York Media LLC

Keyword: Attention; Consciousness
Link ID: 23537 - Posted: 04.26.2017

By Daniel Barron Earlier this month, JAMA Psychiatry published a groundbreaking editorial. A group of psychiatrists led by David Ross described how and why post-traumatic stress disorder (PTSD) should be clinically evaluated from a neuroscience framework. The fact that this editorial was published in one of psychiatry’s leading journals is no small feat. Psychiatry houses a large and powerful contingent that argues neuroscience has little clinical relevance. The relevance of neuroscience to psychiatry was the subject of a recent Op-Ed debate in the New York Times: “There’s Such a Thing as Too Much Neuroscience” was rebutted with “More Neuroscience, Not Less.” This specific debate—and the dense politics as a whole—exists because competing frameworks are vying for competing funding, a conflict that pre-dates Freud’s departure from neurology. That the relevance of neuroscience to psychiatry is still questioned is blatantly outlandish: what organ do psychiatrists treat if not the brain? And what framework could possibly be more relevant than neuroscience to understanding brain dysfunction? In his editorial, Ross tactfully presented his case for neuroscience, describing the obvious choice for a clinical framework as one “perspective,” making a delicate intellectual curtsey while supporting his case with data. Ross discussed five “key neuroscience themes” (read: lines of evidence from burgeoning sub-fields) relevant to understanding and treating PTSD: fear conditioning, dysregulated circuits, memory reconsolidation, and epigenetic and genetic considerations. Each theme accounts for the diverse biological, psychological and social factors involved in PTSD—which is to say, these factors all have some effect on brain mechanisms. Most importantly, Ross describes how a mechanistic approach allows clinicians to trace the specific causes of PTSD to specific treatments that can target those causes. © 2017 Scientific American

Keyword: Schizophrenia; Depression
Link ID: 23536 - Posted: 04.26.2017

By Cormac McCarthy I call it the Kekulé Problem because among the myriad instances of scientific problems solved in the sleep of the inquirer Kekulé’s is probably the best known. He was trying to arrive at the configuration of the benzene molecule and not making much progress when he fell asleep in front of the fire and had his famous dream of a snake coiled in a hoop with its tail in its mouth—the ouroboros of mythology—and woke exclaiming to himself: “It’s a ring. The molecule is in the form of a ring.” Well. The problem of course—not Kekulé’s but ours—is that since the unconscious understands language perfectly well or it would not understand the problem in the first place, why doesnt it simply answer Kekulé’s question with something like: “Kekulé, it’s a bloody ring.” To which our scientist might respond: “Okay. Got it. Thanks.” Why the snake? That is, why is the unconscious so loathe to speak to us? Why the images, metaphors, pictures? Why the dreams, for that matter. A logical place to begin would be to define what the unconscious is in the first place. To do this we have to set aside the jargon of modern psychology and get back to biology. The unconscious is a biological system before it is anything else. To put it as pithily as possible—and as accurately—the unconscious is a machine for operating an animal. All animals have an unconscious. If they didnt they would be plants. We may sometimes credit ours with duties it doesnt actually perform. Systems at a certain level of necessity may require their own mechanics of governance. Breathing, for instance, is not controlled by the unconscious but by the pons and the medulla oblongata, two systems located in the brainstem. Except of course in the case of cetaceans, who have to breathe when they come up for air. An autonomous system wouldnt work here. The first dolphin anesthetized on an operating table simply died. (How do they sleep? With half of their brain alternately.) But the duties of the unconscious are beyond counting. Everything from scratching an itch to solving math problems. © 2017 NautilusThink Inc

Keyword: Language; Consciousness
Link ID: 23525 - Posted: 04.22.2017

Tara García Mathewson You saw the pictures in science class—a profile view of the human brain, sectioned by function. The piece at the very front, right behind where a forehead would be if the brain were actually in someone’s head, is the prefrontal cortex. It handles problem-solving, goal-setting, and task execution. And it works with the limbic system, which connects to it and sits closer to the center of the brain. The limbic system processes emotions and triggers emotional responses, in part because it stores long-term memory. When a person lives in poverty, a growing body of research suggests, the limbic system is constantly sending fear and stress messages to the prefrontal cortex, which overloads its ability to solve problems, set goals, and complete tasks in the most efficient ways. This happens to everyone at some point, regardless of social class. The overload can be prompted by any number of things, including an overly stressful day at work or a family emergency. People in poverty, however, have the added burden of ever-present stress. They are constantly struggling to make ends meet and often bracing themselves against class bias that adds extra strain or even trauma to their daily lives. And the science is clear—when brain capacity is used up on these worries and fears, there simply isn’t as much bandwidth for other things. Economic Mobility Pathways, or EMPath, has built its whole service-delivery model around this science, which it described in its 2014 report, “Using Brain Science to Design New Pathways Out of Poverty.” The Boston nonprofit started out as Crittenton Women’s Union, a merger of two of the city’s oldest women-serving organizations, both of which focused on improving the economic self-sufficiency of families. It continues that work with a new name and a burgeoning focus on intergenerational mobility. © 2017 by The Atlantic Monthly Group.

Keyword: Development of the Brain; Learning & Memory
Link ID: 23514 - Posted: 04.20.2017