Chapter 18. Attention and Higher Cognition


By Todd E. Feinberg, Jon Mallatt Consciousness seems mysterious. By this we mean that while life in general can be explained by physics, chemistry and biology, it seems that whenever one tries to explain the relationship between the brain and the subjective events that are experienced as feelings—what philosophers often refer to as “qualia”—something appears to be “left out” of the explanation. This apparent divide between the brain and subjective experience is what philosopher Joseph Levine famously called the “explanatory gap,” and how to bridge that gap is what philosopher David Chalmers termed the “hard problem of consciousness.” We study primary consciousness, the most basic type of sensory experience: the ability to have any experience or feeling at all, what philosopher Thomas Nagel called “something it is like to be” in his famous 1974 paper “What is it like to be a bat?” Over the last few years, we have tried to “demystify” primary consciousness by combining neural and philosophical aspects of the problem into a unified view of how feelings are created in a natural, biological way. Our analysis leads us to the view that the puzzle of consciousness and the explanatory gap actually has two related aspects, an ontological aspect and an epistemic aspect, and that both have a natural and scientific explanation. First, we consider the ontological aspect of the problem. This part of the puzzle entails what philosopher John Searle called the “ontological subjectivity” of consciousness: the idea that consciousness has a unique and fundamentally “first-person” ontology—or mode of being—in that feelings only exist when experienced by an animal subject. The implication of this view is that no objective scientific explanation, no matter how complete, would “explain away” the neurobiologically unique subjective feelings that are associated with certain brain states—in other words, how things feel.
The challenge here is to explain this unique aspect of feelings in a way that is consistent with an entirely scientific worldview, and to do so without invoking any new or fundamentally “mysterious” physical principles. © 2018 Scientific American

Keyword: Consciousness
Link ID: 25585 - Posted: 10.17.2018

By Frankie Schembri Humans are awful at estimating a person’s age based on their face alone. This can lead not only to uncomfortable social situations, but also to critical errors in criminal investigations and enforcing age-based restrictions on such things as alcohol and gambling. New research shows people are usually off by about 8 years, and their estimate might be shaped by the last face they saw. To conduct the study, researchers collected 3968 pictures of consenting participants from the Australian Passport Office—31 men and 31 women at each age from 7 through 70. Then, they showed 81 people photographs of a man and woman at each age in a random sequence, and asked them to guess their ages. The faces above are computer-generated averages of more than 100 pictures from the study of people aged 19 to 22, 50 to 53, and 63 to 66. Volunteers consistently guessed that young faces were several years older than they actually were and that older faces were several years younger than they actually were, the team reports today in Royal Society Open Science. The results also showed that people’s estimates were affected by the previous face they had viewed—if they had just seen a young face, they usually lowballed the next face’s age, and vice versa. © 2018 American Association for the Advancement of Science
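The pattern the study reports—young faces guessed older, old faces guessed younger—is the classic signature of estimates being pulled toward the middle of the range. A minimal sketch of that idea, with the 0.7 weighting and the range midpoint chosen purely for illustration (they are not parameters from the paper):

```python
# Toy model of the age-estimation bias: each guess blends the true age with
# the midpoint of the 7-70 age range used in the study. The pull strength
# (0.7) is an illustrative assumption, not a fitted value.
AGE_RANGE = (7, 70)
MIDPOINT = sum(AGE_RANGE) / 2  # 38.5

def estimate(true_age, pull=0.7):
    """Guess = weighted blend of the true age and the range midpoint."""
    return pull * true_age + (1 - pull) * MIDPOINT

for age in (20, 38, 65):
    err = estimate(age) - age
    print(f"true {age:2d} -> guessed {estimate(age):.1f} (error {err:+.1f})")
```

Under this toy model, a 20-year-old face is overestimated and a 65-year-old face underestimated, matching the direction of the bias the researchers observed; the sequential effect they describe could be modeled the same way, by blending in the previous face's age as well.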

Keyword: Attention
Link ID: 25581 - Posted: 10.17.2018

By Frankie Schembri Think of all the faces you know. As you flick through your mental Rolodex, your friends, family, and co-workers probably come first—along with celebrities—followed by the faces of the nameless strangers you encounter during your daily routine. But how many faces can the human Rolodex store? To ballpark the size of the average person’s “facial vocabulary,” researchers gave 25 people 1 hour to list as many faces from their personal lives as possible, and then another hour to do the same with famous faces, like those of actors, politicians, and musicians. If the participants couldn’t remember a person’s name, but could imagine their face, they used a descriptive phrase like “the high school janitor,” or “the actress from Friends with the haircut.” People came up with lots of faces during the first minutes of the test, but the rate of remembrance dropped over the course of the hour. By graphing this relationship and extrapolating it to when most people would run out of faces, the researchers estimated the number of faces an average person can recall from memory. To figure out how many additional faces people recognized but were unable to recall without prompting, researchers showed the participants photographs of 3441 celebrities, including Barack Obama and Tom Cruise. To qualify as “knowing” a face, the participants had to recognize two different photos of each person. © 2018 American Association for the Advancement of Science
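The extrapolation step described above—graphing the slowing rate of remembrance and projecting it to the point where people run out of faces—can be sketched with a saturating curve whose asymptote is the estimated "facial vocabulary." The data and the exponential form below are invented for illustration; the paper's actual model may differ:

```python
import math

# Hypothetical cumulative recall counts (faces recalled by minute t),
# generated from a saturating curve N(t) = A * (1 - exp(-t / tau)).
TRUE_A, TRUE_TAU = 500, 20
data = [(t, TRUE_A * (1 - math.exp(-t / TRUE_TAU))) for t in range(1, 61)]

def sse(A, tau):
    """Sum of squared errors between the model curve and the data."""
    return sum((A * (1 - math.exp(-t / tau)) - n) ** 2 for t, n in data)

# Brute-force grid search for the best-fitting asymptote A (the estimated
# vocabulary size) and time constant tau.
best = min(((A, tau) for A in range(100, 1001, 10) for tau in range(5, 51)),
           key=lambda p: sse(*p))
print("estimated vocabulary:", best[0], "faces; tau:", best[1], "minutes")
```

The asymptote A is the quantity of interest: even though no one lists every face they know within the hour, the curve's flattening pins down where the count would level off.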

Keyword: Attention
Link ID: 25552 - Posted: 10.10.2018

By John Horgan I’m already getting pushback against my free online book Mind-Body Problems: Science, Subjectivity & Who We Really Are. Tom Clark knocks me for not giving more credit to straightforward materialism, or naturalism, as he prefers to call it. Meanwhile, Deepak Chopra, while defending an anti-materialistic view, compares my pluralistic approach “to giving every player in a junior soccer match a trophy.” Good one, Deepak! See “Discussion” for these and other comments. It was precisely because people have divergent views of the mind-body problem that I decided to write a book about it. The mind-body problem is the knottiest of all mysteries. It encompasses puzzles such as consciousness (which David Chalmers calls “the hard problem”), free will, the self, morality and the meaning of life (which Owen Flanagan, a subject of my book, calls “the really hard problem”). Another way of posing the mind-body problem is simply by asking, Who are we, really? Sages as diverse as Buddha, Plato, Kant and Douglas Hofstadter (to whom I devote a chapter of Mind-Body Problems) have offered answers to this question. In the early 1990s, Francis Crick said that science had finally given us the tools to solve the problem once and for all. In his 1994 book The Astonishing Hypothesis, he spells out the implications of his ultra-materialistic creed: “You,” your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules. As Lewis Carroll’s Alice might have phrased it: “You’re nothing but a pack of neurons.” © 2018 Scientific American

Keyword: Consciousness
Link ID: 25543 - Posted: 10.08.2018

By Emily Underwood The ornately folded outer layer of the human brain, the cerebral cortex, has long received nearly all the credit for our ability to perform complex cognitive tasks such as composing a sonata, imagining the plot of a novel or reflecting on our own thoughts. One explanation for how we got these abilities is that the cortex rapidly expanded relative to body size as primates evolved — the human cortex has 10 times the surface area of a monkey’s cortex, for example, and 1,000 times that of a mouse. But the cortex is not the only brain region that has gotten bigger and more complex throughout evolution. Nestled beneath the cortex, a pair of egg-shaped structures called the thalamus has also grown, and its wiring became much more intricate as mammals diverged from reptiles. The thalamus — from the Greek thalamos, or inner chamber — transmits 98 percent of sensory information to the cortex, including vision, taste, touch and balance; the only sense that doesn’t pass through this brain region is smell. The thalamus also conducts motor signals and relays information from the brain stem to the cortex, coordinating shifts in consciousness such as waking up and falling asleep. Scientists have known for decades that the thalamus faithfully transmits information about the visual world from the retina to the cortex, leading to the impression that it is largely a messenger of sensory information rather than a center of complex cognition itself. But that limited, passive view of the thalamus is outdated, maintains Michael Halassa, a neuroscientist at the Massachusetts Institute of Technology who recently coauthored (with Ralf D. Wimmer and Rajeev V. Rikhye) an article in the Annual Review of Neuroscience exploring the thalamus’s role. © 2018 Annual Reviews, Inc

Keyword: Attention
Link ID: 25542 - Posted: 10.08.2018

By Michael Price Alien limb syndrome isn’t as extraterrestrial as it sounds—but it’s still pretty freaky. Patients complain that one of their hands has gone “rogue,” reaching for things without their knowledge. “They sit on their hand trying to get it not to move,” says Ryan Darby, a neurologist and neuroscientist at Vanderbilt University in Nashville. “They’re not crazy. They know there’s not something controlling their arm, that it’s not possessed. But they really feel like they don’t have control.” Now, a study analyzing the locations of brain lesions in these patients—and in those who have akinetic mutism, in which people can scratch an itch and chew food placed into their mouths without being aware they’ve initiated these movements—is shedding light on how our brains know what’s going on with our bodies. The work shows how neuroscience is beginning to approach elements of the biological nature of free will. “I think it's really nice work, carefully done and thoughtfully presented,” says Kevin Mitchell, a neurogeneticist at Trinity College in Dublin who studies perception and who wasn’t involved in the study. Philosophers have wrestled with questions of free will—that is, whether we are active drivers or passive observers of our decisions—for millennia. Neuroscientists tap-dance around it, asking instead why most of us feel like we have free will. They do this by looking at rare cases in which people seem to have lost it. © 2018 American Association for the Advancement of Science

Keyword: Consciousness
Link ID: 25520 - Posted: 10.02.2018

By John Horgan I just finished Tao Lin’s new book Trip: Psychedelics, Alienation, and Change, and I have some things to say about it. I’m a Lin fan. He first came to my attention in 2013 when he mailed me his novel Taipei, which mentions a trippy scene in The End of Science. Taipei is a lightly fictionalized memoir that details a young writer’s consumption of drugs, including uppers, downers, heroin, cannabis and a smattering of psychedelics, sometimes all in combination. Lin writes with a deadpan hyper-realism so acute that he makes other fiction and non-fiction seem phony. Even when he’s funny, Lin is bleak, but there’s something exhilarating about the precision with which he describes the world, other people, the swirl of his thoughts and emotions. He’s like a stoned American version of Norwegian memoirist/novelist Karl Ove Knausgaard, author of My Struggle. Lin also reminds me of Jack Kerouac, who in On the Road and Dharma Bums desperately chases epiphanies in an effort to escape his tormented self. By the time I finished Taipei, I was worried about the author, who seems to be in a state of terminal despair. Lin was apparently worried too. Trip recounts how he pulls himself out of his “zombie-like and depressed” funk by immersing himself in the writings and online talks of psychedelic visionary Terence McKenna. © 2018 Scientific American

Keyword: Consciousness
Link ID: 25513 - Posted: 10.01.2018

By Bret Stetka More often than not a trip to Las Vegas is not a financially sound decision. And yet every year over 40 million people hand over their cash to the city’s many towering casinos, hoping the roulette ball rattles to a stop on black. Gambling and other forms of risk-taking appear to be hardwired into our psyche. Humans at least as far back as Mesopotamia have rolled the dice, laying their barley, bronze and silver on the line, often against miserable odds. According to gambling industry consulting company H2 Gambling Capital, Americans alone lose nearly $120 billion a year to games of chance. Now a set of neuroscience findings is closer than ever to figuring out why. Ongoing research is helping illuminate the biology of risky behaviors—studies that may one day lead to interventions for vices like compulsive gambling. The recent results show an explanation is more complex than looking at dysfunctional reward circuitry, the network of brain regions that fire in response to pleasing stimuli like sex and drugs. Risking loss on a slim chance of thrill or reward involves a complex dance of decision-making and emotion. A new study by a team from Johns Hopkins University appears to have identified a region of the brain that plays a critical role in risky decisions. In the study, published September 20 in Current Biology, the authors analyzed the behavior of rhesus monkeys, whose brain structure and function are similar to our own. And like us, they are risk-takers, too. © 2018 Scientific American

Keyword: Drug Abuse; Attention
Link ID: 25481 - Posted: 09.22.2018

By Kelly Servick PHILADELPHIA, PENNSYLVANIA—While artificial intelligence (AI) has been busy trouncing humans at Go and spawning eerily personable Alexas, some neuroscientists have harbored a different hope: that the types of algorithms driving those technologies can also yield some insight into the squishy, wet computers in our skulls. At the Conference on Cognitive Computational Neuroscience here this month, researchers presented new tools for comparing data from living brains with readouts from computational models known as deep neural networks. Such comparisons might offer up new hypotheses about how humans process sights and sounds, understand language, or navigate the world. “People have fantasized about that since the 1980s,” says Josh McDermott, a computational neuroscientist at the Massachusetts Institute of Technology (MIT) in Cambridge. Until recently, AI couldn’t come close to human performance on tasks such as recognizing sounds or classifying images. But deep neural networks, loosely inspired by the brain, have logged increasingly impressive performances, especially on visual tasks. That “brings the question back to mind,” says neuroscientist Chris Baker of the National Institute of Mental Health in Bethesda, Maryland. Deep neural networks work by passing information between computational “nodes” that are arranged in successive layers. The systems hone skills on huge sets of data; for networks that classify images, that usually means collections of labeled photos. Performance improves with feedback as the systems repeatedly adjust the strengths of the connections between nodes. © 2018 American Association for the Advancement of Science
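The mechanics described above—information passed between nodes arranged in layers, with connection strengths repeatedly adjusted by feedback—can be shown in a few lines. This is a minimal sketch, not any model from the conference: a tiny two-layer network learns the XOR task by gradient descent on a squared error, with all sizes, seeds and learning rates chosen purely for illustration:

```python
import numpy as np

# A minimal layered network: inputs flow through a hidden layer of nodes to
# an output node, and feedback (gradient descent on the error) repeatedly
# adjusts the connection strengths between layers.
rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # XOR inputs
y = np.array([[0.], [1.], [1.], [0.]])                  # XOR targets

W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)  # input -> hidden strengths
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)  # hidden -> output strengths

def forward(X):
    h = np.tanh(X @ W1 + b1)                 # hidden-layer node activities
    out = 1 / (1 + np.exp(-(h @ W2 + b2)))   # output node (sigmoid)
    return h, out

initial_loss = float(np.mean((forward(X)[1] - y) ** 2))

lr = 0.5
for _ in range(2000):
    h, out = forward(X)
    d_out = (out - y) * out * (1 - out)      # error signal at the output
    d_h = (d_out @ W2.T) * (1 - h ** 2)      # error propagated backward
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(0)
    W1 -= lr * (X.T @ d_h);  b1 -= lr * d_h.sum(0)

final_loss = float(np.mean((forward(X)[1] - y) ** 2))
print(f"loss before training: {initial_loss:.3f}, after: {final_loss:.3f}")
```

The repeated adjustment of `W1` and `W2` is the "feedback" step the article refers to; deep networks used in the brain-comparison work are the same idea scaled up to many layers and millions of connections.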

Keyword: Attention; Vision
Link ID: 25466 - Posted: 09.18.2018

By John Horgan It is the central mystery of existence, the one toward which all other mysteries converge. Schopenhauer called it the world knot. Descartes often gets credit for posing it first, but Socrates pondered it millennia earlier, as did Buddha and other Eastern sages. I’m talking about the mind-body problem, which encompasses the riddles of consciousness, the self, free will, morality, the meaning of life. Modern scientists and philosophers often make the mind-body problem seem hopelessly esoteric, a topic only for experts. Hard-core materialists insist it is a pseudo-problem, which vanishes once you jettison archaic concepts like “the self” and “free will.” Actually, the mind-body problem is quite real, simple and urgent. You face it whenever you wonder who you really are. Long before I heard of it, I was obsessed with the mind-body problem. I touch on it, directly or indirectly, in my previous four books, even The End of War, the epilogue of which is called “In Defense of Free Will.” Writing hasn’t been cathartic. The more I write about the mind-body problem, the more it grips me. In 2015, after attending a workshop on a weird new theory of consciousness, I started looking at the mind-body problem in a new way. Our responses to the mind-body problem will always be emotional as well as rational, a matter of taste as much as truth. We can’t escape our subjectivity when we try to solve the riddle of ourselves. So I conjectured. © 2018 Scientific American

Keyword: Consciousness
Link ID: 25457 - Posted: 09.17.2018

By Abraham Loeb Scientific discoveries substantiate our awe when faced with the richness and universality of the laws of nature. But science falls short of explaining this natural order and why it exists in the first place. This is where philosophy comes to the rescue. Science seeks to understand how the universe works, just as we might try to figure out the mechanics of a sophisticated engine. Philosophy, by contrast, addresses questions that transcend the functionality of nature, as we might pursue the complementary task of figuring out why the engine is constructed in a particular way. As a scientist, I am surprised at the degree of organization the universe exhibits; the same laws that govern its earliest moments—something we know from observations of the most distant galaxies and most ancient radiation—also preside over what we find today in laboratories on Earth. This should not be taken for granted. We could have witnessed a fragmented reality, one in which different regions of spacetime obey different sets of laws or even behave chaotically with no rational explanation. By studying the physical constituents of an engine, one acquires a better understanding of how it works but not necessarily the purpose for its existence. Metaphysical thinking can supplement science in territories not accessible to empirical inquiry. Within these domains, philosophy can build on scientific knowledge rather than yield to it. © 2018 Scientific American

Keyword: Consciousness
Link ID: 25447 - Posted: 09.13.2018

Jon Hamilton Kids with ADHD are easily distracted. Barn owls are not. So a team at Johns Hopkins University in Baltimore is studying these highly focused predatory birds in an effort to understand the brain circuits that control attention. The team's long-term goal is to figure out what goes wrong in the brains of people with attention problems, including attention deficit hyperactivity disorder. "We think we have the beginnings of an answer," says Shreesh Mysore, an assistant professor who oversees the owl lab at Hopkins. The answer, he says, appears to involve an ancient brain area with special cells that tell us what to ignore. Mysore explains his hypothesis from one of the owl rooms in his basement lab. He has a distraught bird perched on his forearm. And as he talks, he tries to soothe the animal. The owl screeches, flaps and digs its talons into the elbow-length leather glove that Mysore wears for protection. He covers the bird's eyes with his free hand and hugs the animal to his chest. The owl, no longer able to focus on the movements of his human visitors, goes quiet. When it comes to paying attention, barn owls have a lot in common with people, Mysore says. "Essentially, a brain decides at any instant: What is the most important piece of information for behavior or survival?" he says. "And that is the piece of information that gets attended to, that drives behavior." © 2018 npr

Keyword: ADHD; Attention
Link ID: 25441 - Posted: 09.12.2018

By Elena Pasquinelli Ten years ago technology writer Nicholas Carr published an article in the Atlantic entitled “Is Google Making Us Stupid?” He strongly suspected the answer was “yes.” Himself less and less able to focus, remember things or absorb more than a few pages of text, he accused the Internet of radically changing people’s brains. And that is just one of the grievances leveled against the Internet and the various devices we use to access it–including cell phones, tablets, game consoles and laptops. Often the complaints target video games that involve fighting or war, arguing that they cause players to become violent. But digital devices also have fervent defenders—in particular the promoters of brain-training games, who claim that their offerings can help improve attention, memory and reflexes. Who, if anyone, is right? The answer is less straightforward than you might think. Take Carr’s accusation. As evidence, he quoted findings of neuroscientists who showed that the brain is more plastic than previously understood. In other words, it has the ability to reprogram itself over time, which could account for the Internet’s effect on it. Yet in a 2010 opinion piece in the Los Angeles Times, psychologists Christopher Chabris, then at Union College, and Daniel J. Simons of the University of Illinois at Urbana-Champaign rebutted Carr’s view: “There is simply no experimental evidence to show that living with new technologies fundamentally changes brain organization in a way that affects one’s ability to focus,” they wrote. And the debate goes on. © 2018 Scientific American

Keyword: Learning & Memory; Attention
Link ID: 25440 - Posted: 09.12.2018

By Rachel Bluth The number of children diagnosed with attention-deficit/hyperactivity disorder (ADHD) has reached more than 10 percent, a significant increase during the past 20 years, according to a new study. The rise was most pronounced in minority groups, suggesting that better access to health insurance and mental-health treatment through the Affordable Care Act (ACA) may have played some role in the increase. The rate of diagnosis doubled in girls, although it was still much lower than in boys. But the researchers say they found no evidence confirming frequent complaints that the condition is overdiagnosed or misdiagnosed. The United States has significantly more instances of ADHD than other developed countries, which researchers said has led some to think Americans are overdiagnosing children. Wei Bao, the lead author of the study, said in an interview that a review of studies around the world doesn’t support that. “I don’t think overdiagnosis is the main issue,” he said. Nonetheless, those doubts persist. Stephen Hinshaw, who co-authored a 2014 book called “The ADHD Explosion: Myths, Medication, Money, and Today’s Push for Performance,” compared ADHD to depression. He said in an interview that neither condition has unequivocal biological markers, which makes it hard to determine whether a person has the condition. Symptoms of ADHD can include inattention, fidgety behavior and impulsivity. © 1996-2018 The Washington Post

Keyword: ADHD; Development of the Brain
Link ID: 25434 - Posted: 09.11.2018

By: Richard Restak, M.D. Editor’s Note: Unthinkable’s author, a British neuroscientist, tracked down nine people with rare brain disorders to tell their stories. From the man who thinks he's a tiger to the doctor who feels the pain of others just by looking at them to a woman who hears music that’s not there, their experiences illustrate how the brain can shape our lives in unexpected and, in some cases, brilliant and alarming ways. Several years ago, science writer Helen Thomson, consultant to New Scientist and contributor to the Washington Post and Nature, decided to travel around the world to interview people with "the most extraordinary brains." In the process, as described in Unthinkable: An Extraordinary Journey Through the World's Strangest Brains (Ecco/Harper Collins 2018), Thomson discovered that "by putting their lives side-by-side, I was able to create a picture of how the brain functions in us all. Through their stories, I uncovered the mysterious manner in which the brain can shape our lives in unexpected—and, in some cases, brilliant and alarming ways." Thomson wasn't just learning about the most extraordinary brains in the world, but in the process was "uncovering the secrets of my own." During her journey Thomson encounters Bob, who can remember days from 40 years ago with as much clarity and detail as yesterday; Sharon, who has lost her navigational abilities and on occasion becomes lost in her own home; Tommy, who, after a ruptured aneurysm that damaged his left temporal lobe, underwent a total personality change; Sylvia, an otherwise normal retired school teacher who experiences near constant musical hallucinations; and Louise, who is afflicted with a permanent sense of detachment from herself and everyone around her. 
Beyond skillfully portraying each of these and other fascinating individuals, Thomson places them in historical and scientific context: when neuroscientists first encountered similar patients, along with past and current explanations of what has gone amiss in their brains. © 2018 The Dana Foundation

Keyword: Attention
Link ID: 25420 - Posted: 09.07.2018

By Bahar Gholipour Milena Canning can see steam rising from a coffee cup but not the cup. She can see her daughter’s ponytail swing from side to side, but she can’t see her daughter. Canning is blind, yet moving objects somehow find a way into her perception. Scientists studying her condition say it could reveal secrets about how humans process vision in general. Canning was 29 when a stroke destroyed her entire occipital lobe, the brain region housing the visual system. The event left her sightless, but one day she saw a flash of light from a metallic gift bag next to her. Her doctors told her she was hallucinating. Nevertheless, “I thought there must be something happening within my brain [allowing me to see],” she says. She went from doctor to doctor until she met Gordon Dutton, an ophthalmologist in Glasgow, Scotland. Dutton had encountered this mystery before—in a 1917 paper by neurologist George Riddoch describing brain-injured World War I soldiers. To help enhance Canning’s motion-based vision, Dutton prescribed her a rocking chair. Canning is one of a handful of people who have been diagnosed with the “Riddoch phenomenon,” the ability to perceive motion while blind to other visual stimuli. Jody Culham, a neuroscientist at Western University in Ontario, and her colleagues launched a 10-year investigation into Canning’s remarkable vision and published the results online in May in Neuropsychologia. The team confirmed that Canning was able to detect motion and its direction. She could see a hand moving toward her, but she could not tell a thumbs-up from a thumbs-down. She was also able to navigate around obstacles, reach and grasp, and catch a ball thrown at her. © 2018 Scientific American

Keyword: Attention; Vision
Link ID: 25409 - Posted: 09.01.2018

Megan Molteni It’s been more than a century since Spanish neuroanatomist Santiago Ramón y Cajal won the Nobel Prize for illustrating the way neurons allow you to walk, talk, think, and be. In the intervening hundred years, modern neuroscience hasn’t progressed that much in how it distinguishes one kind of neuron from another. Sure, the microscopes are better, but brain cells are still primarily defined by two labor-intensive characteristics: how they look and how they fire. Which is why neuroscientists around the world are rushing to adopt new, more nuanced ways to characterize neurons. Sequencing technologies, for one, can reveal how cells with the same exact DNA turn their genes on or off in unique ways—and these methods are beginning to reveal that the brain is a more diverse forest of bristling nodes and branching energies than even Ramón y Cajal could have imagined. On Monday, an international team of researchers introduced the world to a new kind of neuron, which, at this point, is believed to exist only in the human brain. The long nerve fibers known as axons of these densely bundled cells bulge in a way that reminded their discoverers of a rose without its petals—so much so that they named them “rose hip cells.” Described in the latest issue of Nature Neuroscience, these new neurons might use their specialized shape to control the flow of information from one region of the brain to another. “They can really act as a sort of brake on the system,” says Ed Lein, an investigator at the Allen Institute for Brain Science—home to several ambitious brain mapping projects—and one of the lead authors on the study. Neurons come in two basic flavors: Excitatory cells send information to the cells next to them, while inhibitory cells slow down or stop excitatory cells from firing. Rose hip cells belong to this latter type, and based on their physiology, seem to be a particularly potent current-curber. © 2018 Condé Nast

Keyword: Attention; Consciousness
Link ID: 25391 - Posted: 08.28.2018

By Susana Martinez-Conde Research has shown that the experience of pain is highly subjective: people feel more or less pain, in identical physical situations, as a function of their mood and attention. This flexibility showcases the potential for cognitive manipulations to decrease the pain associated with a variety of pathologies. As an example, the virtual-reality game “Snow World” (in which players shoot snowballs to defeat snowman Frosty and his penguins) reportedly works better than morphine at counteracting the pain of patients in burn units. Other studies have indicated that virtual reality manipulations of the patient’s own body can also help ameliorate pain: an experiment conducted by neuroscientist Maria Victoria Sanchez-Vives and her team at the University of Barcelona in Spain showed that heat applied to experimental participants’ wrists felt more painful when their virtual arms turned red than when they turned blue or green. Following on this tradition, a study published in PeerJ last month showed that visuotactile illusions can help ease the pain experienced by patients suffering from knee osteoarthritis. According to lead author Tasha Stanton, from the University of South Australia, the idea for the study originated from her observation that “people with knee osteoarthritis have an altered perception of their own body. [Their affected knee] often feels too big, and they also have changes to the way that touch and movement information is represented in the brain.” She hypothesized that patients may “respond to illusions that change the way their knee looks.” © 2018 Scientific American,

Keyword: Pain & Touch; Attention
Link ID: 25390 - Posted: 08.28.2018

By Anouk Bercht, Steven Laureys Steven Laureys greets me with a smile as I enter his office overlooking the hills of Liège. Although his phone rings constantly, he takes the time to talk to me about the fine points of what consciousness is and how to identify it in patients who seem to lack it. Doctors from all over Europe send their apparently unconscious patients to Laureys—a clinician and researcher at the University of Liège—for comprehensive testing. To provide proper care, physicians and family members need to know whether patients have some degree of awareness. At the same time, these patients add to Laureys’ understanding. The interview has been edited for clarity. What is consciousness? It is difficult enough to define “life,” even more so to define “conscious” life. There is no single definition. But of course, in clinical practice we need unambiguous criteria. In that setting, everyone needs to know what we mean by an “unconscious” patient. Consciousness is not “all or nothing.” We can be more or less awake, more or less conscious. Consciousness is often underestimated; much more is going on in the brains of newborns, animals and coma patients than we think. So how is it possible to study something as complex as consciousness? There are a number of ways to go about it, and the technology we have at our disposal is crucial in this regard. For example, without brain scanners we would know much, much less than we now do. We study the damaged brains of people who have at least partially lost consciousness. We examine what happens during deep sleep, when people temporarily lose consciousness. © 2018 Scientific American

Keyword: Consciousness
Link ID: 25375 - Posted: 08.24.2018

By Caroline Williams I‘m not the kind of girl who jumps into a strange man’s car and hopes for the best. Especially when a quick Google stalk reveals him to be recovering from an addiction to methamphetamine. But having been assured by someone I trust that he was “one of the good guys,” I accepted his offer of a ride to the airport and … hoped for the best. In hindsight I’m glad I did. WHAT I LEFT OUT is a recurring feature in which book authors are invited to share anecdotes and narratives that, for whatever reason, did not make it into their final manuscripts. In this installment, Caroline Williams shares a story that was left out of “My Plastic Brain: One Woman’s Yearlong Journey to Discover if Science Can Improve Her Mind,” published by Prometheus Books. Some books make it sound so easy: Change the way you think, and hey presto, you can become a different person. After many months talking to scientists about brain change, it was this journey that prompted me to think more deeply about what that actually meant. I was in Lawrence, Kansas, researching a book that I hoped would apply the latest science to make real, measurable, and lasting changes to my brain. I wanted to learn, among other things, how to concentrate better and to overcome my irrational anxieties about life. I was in Kansas to try to boost my powers of creativity. Copyright 2018 Undark

Keyword: Learning & Memory; Depression
Link ID: 25349 - Posted: 08.18.2018