Chapter 18. Attention and Higher Cognition




By Erik Parens Will advances in neuroscience move reasonable people to abandon the idea that criminals deserve to be punished? Some researchers working at the intersection of psychology, neuroscience and philosophy think the answer is yes. Their reasoning is straightforward: if the idea of deserving punishment depends upon the idea that criminals freely choose their actions, and if neuroscience reveals that free choice is an illusion, then we can see that the idea of deserving punishment is nonsense. As Joshua Greene and Jonathan Cohen speculated in a 2004 essay: “new neuroscience will undermine people’s common sense, libertarian conception of free will and the retributivist thinking that depends on it, both of which have heretofore been shielded by the inaccessibility of sophisticated thinking about the mind and its neural basis.” This past summer, Greene and several other colleagues did empirical work that appears to confirm that 2004 speculation. The new work finds that when university students learn about “the neural basis of behavior” — quite simply, the brain activity underlying human actions — they become less supportive of the idea that criminals deserve to be punished. According to the study’s authors, once students are led to question the concept of free will — understood as the idea that humans “can generate spontaneous choices and actions not determined by prior events” — they begin to find the idea of “just deserts” untenable. “When genuine choice is deemed impossible, condemnation is less justified,” the authors write. Still, just as we need two eyes that integrate slightly different information about one scene to achieve visual depth perception, we need to view ourselves through two lenses to gain a greater depth of understanding of ourselves. © 2014 The New York Times Company

Keyword: Consciousness; Aggression
Link ID: 20131 - Posted: 09.29.2014

By ROBERT KOLKER Reggie Shaw is the man responsible for the most moving portion of “From One Second to the Next,” the director Werner Herzog’s excruciating (even by Werner Herzog standards) 35-minute public service announcement, released last year as part of AT&T’s “It Can Wait” campaign against texting and driving. In the film, Shaw, now in his 20s, recounts the rainy morning in September 2006 that he crossed the line of a Utah highway, knocking into a car containing two scientists, James Furfaro and Keith O’Dell, who were heading to work nearby. Both men were killed. Shaw says he was texting a girlfriend at the time, adding in unmistakable anguish that he can’t even remember what he was texting about. He is next seen taking part in something almost inconceivable: He enters the scene where one of the dead men’s daughters is being interviewed, and receives from that woman a warm, earnest, tearful, cathartic hug. Reggie Shaw’s redemptive journey — from thoughtless, inadvertent killer to denier of his own culpability to one of the nation’s most powerful spokesmen on the dangers of texting while behind the wheel — was first brought to national attention by Matt Richtel, a reporter for The New York Times, whose series of articles about distracted driving won a Pulitzer Prize in 2010. Now, five years later, in “A Deadly Wandering,” Richtel gives Shaw’s story the thorough, emotional treatment it is due, interweaving a detailed chronicle of the science behind distracted driving. As an instructive social parable, Richtel’s densely reported, at times forced yet compassionate and persuasive book deserves a spot next to “Fast Food Nation” and “To Kill a Mockingbird” in America’s high school curriculums. To say it may save lives is self-evident. What makes the deaths in this book so affecting is how ordinary they are. Two men get up in the morning. They get behind the wheel. A stranger loses track of his car. They crash. The two men die. The temptation is to make the tragedy bigger than it is, to invest it with meaning. Which may explain why Richtel wonders early on if Reggie Shaw lied about texting and driving at first because he was in denial, or because technology “can hijack the brain,” polluting his memory. © 2014 The New York Times Company

Keyword: Attention
Link ID: 20124 - Posted: 09.27.2014

Some people don't just work — they text, Snapchat, check Facebook and Tinder, listen to music and work. And a new study reveals those multitaskers have brains that look different from those of people who stick to one task. Researchers at the University of Sussex scanned the brains of 75 adults with MRI to examine their gray matter. Those who admitted to multitasking with a variety of electronic devices at once had less dense gray matter in the anterior cingulate cortex (ACC). This region supports executive functions such as working memory, reasoning, planning and execution. There is no way of knowing whether people with smaller anterior cingulate cortices are more likely to multitask or whether multitasking shrinks gray matter. It could even be that our brains become more efficient from multitasking, said Dr. Gary Small, director of UCLA’s Memory and Aging Research Center at the Semel Institute for Neuroscience and Human Behavior, who was not involved in the study. “When you exercise the brain … it becomes effective at performing a mental task,” he said. While previous research has shown that multitasking leads to more mistakes, Small said the research remains important to our understanding of something we’re all guilty of doing.

Keyword: Attention
Link ID: 20115 - Posted: 09.25.2014

By Katy Waldman In the opening chapter of Book 1 of My Struggle, by Karl Ove Knausgaard, the 8-year-old narrator sees a ghost in the waves. He is watching a televised report of a rescue effort at sea—“the sky is overcast, the gray-green swell heavy but calm”—when suddenly, on the surface of the water, “the outline of a face emerges.” We might guess from this anecdote that Karl, our protagonist, is both creative and troubled. His limber mind discerns patterns in chaos, but the patterns are illusions. “The lunatic, the lover, and the poet,” Shakespeare wrote, “have such seething brains, such shaping fantasies.” Their imaginations give “to airy nothing a local habitation and a name.” A seething brain can be a great asset for an artist, but, like Knausgaard’s churning, gray-green swell, it can be dangerous too. Inspired metaphors, paranormal beliefs, conspiracy theories, and delusional episodes may all exist on a single spectrum, recent research suggests. The name for the concept that links them is apophenia. A German scientist named Klaus Conrad coined apophanie (from the Greek apo, away, and phaenein, to show) in 1958. He was describing the acute stage of schizophrenia, during which unrelated details seem saturated in connections and meaning. Unlike an epiphany—a true intuition of the world’s interconnectedness—an apophany is a false realization. Swiss psychologist Peter Brugger introduced the term into English when he penned a chapter in a 2001 book on hauntings and poltergeists. Apophenia, he said, was a weakness of human cognition: the “pervasive tendency … to see order in random configurations,” an “unmotivated seeing of connections,” the experience of “delusion as revelation.” On the phone he unveiled his favorite formulation yet: “the tendency to be overwhelmed by meaningful coincidences.” © 2014 The Slate Group LLC.

Keyword: Attention
Link ID: 20088 - Posted: 09.18.2014

by Bethany Brookshire Most of us wish we ate better. I know I certainly do. But when hunger strikes, and you’re standing in line at the grab-and-go food joint, that salad seems really lackluster sitting next to that tasty-looking cookie. I can’t help but think that my diet — and my waistline — would look a lot better if I just craved lettuce a little more. Now a new study shows that although we may never cease to love cookies, we might be able to make that carrot a little more appealing. In overweight people, a behavioral intervention was associated with changes in how their brains responded to high- and low-calorie foods. The small pilot study is intriguing, but with just 13 participants, a larger study is needed before scientists will know if training the brain can make us abstain. “Everyone responds more strongly to high-calorie foods than low-calorie foods. It’s just normal,” says study coauthor Susan Roberts, a behavioral nutrition scientist from Tufts University in Medford, Mass. While most people prefer brownies over beets, people who are overweight or obese have a harder time avoiding high-calorie foods, she says. “When someone becomes overweight, there’s a dampening effect on a number of brain structures, including the reward system,” she says. “It’s harder to enjoy food generally, and so when someone becomes overweight, they really want to eat those high-calorie foods, because those are the foods that activate reward systems to the biggest extent.” Craving is a particular issue. Craving is distinct from hunger and focuses on a particular food, often foods that are high calorie. Other studies show that people who are obese have more cravings than those who are not. © Society for Science & the Public 2000 - 2014

Keyword: Obesity
Link ID: 20086 - Posted: 09.17.2014

By Megan Allison Diagnoses of attention deficit hyperactivity disorder (ADHD) are on the rise. The Centers for Disease Control and Prevention calculated that by 2011, 11 percent of children had been diagnosed with ADHD, and 6.1 percent of all US children were taking an ADHD medication. But could a solution be as simple as exercise? A study published this month in the Journal of Abnormal Child Psychology found that aerobic activity sessions before school helped children with ADHD with their moods and attention spans. The study involved a group of just over 200 students in kindergarten through second grade at schools in Indiana and Vermont. For 12 weeks, the students did 31 minutes of physical activity each morning before school; a control group spent the same time on classroom activities. Although all the students showed improvement, the authors noted that the exercise especially helped kids with ADHD. “It benefits all the kids, but I definitely see where it helps the kids with ADHD a lot,” said Jill Fritz, a fourth-grade teacher in Jacksonville, Fla., in an interview with The Wall Street Journal. “It really helps them get back on track and get focused.” In the Boston area, Dr. Sarah Sparrow Benes, Program Director of Physical and Health Education Programs at Boston University, teaches elementary and special educators how to use movement as a classroom learning strategy. She finds that all students can benefit from exercise.

Keyword: ADHD
Link ID: 20083 - Posted: 09.17.2014

Ewen Callaway A dozen volunteers watched Alfred Hitchcock for science while lying motionless in a magnetic-resonance scanner. Another participant, a man who has lived in a vegetative state for 16 years, showed brain activity remarkably similar to that of the healthy volunteers — suggesting that plot structure had an impact on him. The study is published in this week's Proceedings of the National Academy of Sciences. The film, a 1961 episode of the TV show Alfred Hitchcock Presents that had been condensed to 8 minutes, is a study in suspense. In it, a 5-year-old totes a partially loaded revolver — which she thinks is a toy — around her suburban neighbourhood, shouting “bang” each time she aims at someone and squeezes the trigger. While the study participants watched the film, researchers monitored their brain activity by functional magnetic resonance imaging (fMRI). All 12 healthy participants showed similar patterns of activity, particularly in parts of the brain that have been linked to higher cognition (frontal and parietal regions) as well as in regions involved in processing sensory information (auditory and visual cortices). One behaviourally non-responsive person, a 20-year-old woman, showed patterns of brain activity only in sensory areas. But another person, a 34-year-old man who has been in a vegetative state since he was 18, had patterns of brain activity in the executive and sensory brain areas similar to those of the healthy subjects. “It was actually indistinguishable from a healthy participant watching the movie,” says Adrian Owen, a neuroscientist at the University of Western Ontario in London, Canada (see: 'Neuroscience: The mind reader'). © 2014 Nature Publishing Group

Keyword: Consciousness
Link ID: 20080 - Posted: 09.16.2014

By GARY GUTTING Sam Harris is a neuroscientist and prominent “new atheist,” who along with others like Richard Dawkins, Daniel Dennett and Christopher Hitchens helped put criticism of religion at the forefront of public debate in recent years. In two previous books, “The End of Faith” and “Letter to a Christian Nation,” Harris argued that theistic religion has no place in a world of science. In his latest book, “Waking Up,” his thought takes a new direction. While still rejecting theism, Harris nonetheless makes a case for the value of “spirituality,” which he bases on his experiences in meditation. I interviewed him recently about the book and some of the arguments he makes in it. Gary Gutting: A common basis for atheism is naturalism — the view that only science can give a reliable account of what’s in the world. But in “Waking Up” you say that consciousness resists scientific description, which seems to imply that it’s a reality beyond the grasp of science. Have you moved away from an atheistic view? Sam Harris: I don’t actually argue that consciousness is “a reality” beyond the grasp of science. I just think that it is conceptually irreducible — that is, I don’t think we can fully understand it in terms of unconscious information processing. Consciousness is “subjective”— not in the pejorative sense of being unscientific, biased or merely personal, but in the sense that it is intrinsically first-person, experiential and qualitative. The only thing in this universe that suggests the reality of consciousness is consciousness itself. Many philosophers have made this argument in one way or another — Thomas Nagel, John Searle, David Chalmers. And while I don’t agree with everything they say about consciousness, I agree with them on this point. © 2014 The New York Times Company

Keyword: Consciousness
Link ID: 20056 - Posted: 09.10.2014

By Smitha Mundasad Health reporter, BBC News More than 300 people a year in the UK and Ireland report they have been conscious during surgery - despite being given general anaesthesia. In the largest study of its kind, scientists suggest this happens in one in every 19,000 operations. They found episodes were more likely when women were given general anaesthesia for Caesarean sections or when patients were given certain drugs. Experts say that though such episodes are rare, much more needs to be done to prevent them. Led by the Royal College of Anaesthetists and the Association of Anaesthetists of Great Britain and Ireland, researchers studied three million operations over a period of one year. More than 300 people reported they had experienced some level of awareness during surgery. Most episodes were short-lived and occurred before surgery started or after operations were completed. But some 41% of cases resulted in long-term psychological harm. Patients described a variety of experiences - from panic and pain to choking - though not all episodes caused concern. The most alarming were feelings of paralysis and being unable to communicate, the researchers say. One patient, who wishes to remain anonymous, described her experience of routine orthodontic surgery at the age of 12. She said: "I could hear voices around me and I realised with horror that I had woken up in the middle of the operation but couldn't move a muscle. BBC © 2014

Keyword: Consciousness
Link ID: 20055 - Posted: 09.10.2014

By Jena McGregor We've all heard the conventional wisdom for better managing our time and organizing our professional and personal lives. Don't try to multitask. Turn the email and Facebook alerts off to help stay focused. Make separate to-do lists for tasks that require a few minutes, a few hours and long-term planning. But what's grounded in real evidence and what's not? In his new book The Organized Mind, Daniel Levitin — a McGill University professor of psychology and behavioral neuroscience — explores how having a basic understanding of the way the brain works can help us think about organizing our homes, our businesses, our time and even our schools in an age of information overload. We spoke with Levitin about why multitasking never works, what images of good leaders' brains actually look like, and why email and Twitter are so incredibly addicting. The following transcript of our conversation has been edited for length and clarity. Q. What was your goal in writing this book? A. Neuroscientists have learned a lot in the last 10 or 15 years about how the brain organizes information, and why we pay attention to some things and forget others. But most of this information hasn't trickled down to the average reader. There are a lot of books about how to get organized and a lot of books about how to be better and more productive at business, but I don't know of one that grounds any of these in the science.

Keyword: Attention
Link ID: 20049 - Posted: 09.09.2014

by Chris Higgins Neuroscientists have pinpointed where imagination hides in the brain and found it to be functionally distinct from related processes such as memory. The team from Brigham Young University (BYU) in Utah -- including undergraduate student Stefania Ashby, who proposed the research -- used functional magnetic resonance imaging (fMRI) to observe brain activity while subjects were remembering specific experiences and imagining themselves in novel ones. "I was thinking a lot about planning for my own future and imagining myself in the future, and I started wondering how memory and imagination work together," Ashby said. "I wondered if they were separate or if imagination is just taking past memories and combining them in different ways to form something I've never experienced before." The two processes of remembering and imagining have previously been proposed to be the same cognitive task, and so thought to be carried out by the same areas of the brain. However, the experiments devised by Ashby and her mentor (and coauthor), BYU professor Brock Kirwan, have refuted these ideas. The studies -- published in the journal Cognitive Neuroscience -- required participants to submit 60 photographs of previous life events, which were used to create prompts for the "remember" sections. Participants then completed a questionnaire, used to determine which scenarios would be most novel to them and would therefore force them to rely on imagination, before going into the MRI scanner. Then, under fMRI testing, the subjects were prompted with various scenarios, and the areas of their brain that became active during each scenario were correlated with the scene's familiarity -- pure memory, or imagination. © Condé Nast UK 2014

Keyword: Learning & Memory
Link ID: 20026 - Posted: 09.03.2014

by Tom Siegfried René Descartes was a very clever thinker. He proved his own existence, declaring that because he thought, he must exist: “I think, therefore I am.” But the 17th-century philosopher-mathematician-scientist committed a serious mental blunder when he decided that the mind doing the thinking was somehow separate from the brain it lived in. Descartes believed that thought was insubstantial, transmitted from the ether to the pineal gland, which played the role of something like a Wi-Fi receiver embedded deep in the brain. Thereafter mind-brain dualism became the prevailing prejudice. Nowadays, though, everybody with a properly working brain realizes that the mind and brain are coexistent. Thought processes and associated cognitive mental activity all reflect the physics and chemistry of cells and molecules inhabiting the brain’s biological tissue. Many people today do not realize, though, that there’s a modern version of Descartes’ mistaken dichotomy. Just as he erroneously believed the mind was distinct from the brain, some scientists have mistakenly conceived of the brain as distinct from the body. Much of the early research in artificial intelligence, for instance, modeled the brain as a computer, seeking to replicate mental life as information processing, converting inputs to outputs by logical rules. But even if such a machine could duplicate the circuitry of the brain, it would be missing essential peripheral input from an attached body. Actual intelligence requires both body and brain, as the neurologist Antonio Damasio pointed out in his 1994 book, Descartes’ Error. “Mental activity, from its simplest aspects to its most sublime, requires both brain and body proper,” Damasio wrote. © Society for Science & the Public 2000 - 2013.

Keyword: Consciousness
Link ID: 20002 - Posted: 08.27.2014

By Matthew H. Schneps “There are three types of mathematicians, those who can count and those who can’t.” Bad joke? You bet. But what makes this amusing is that the joke is triggered by our perception of a paradox, a breakdown in mathematical logic that activates regions of the brain located in the right prefrontal cortex. These regions are sensitive to the perception of causality and alert us to situations that are suspect or fishy — possible sources of danger where a situation just doesn’t seem to add up. Many of the famous etchings by the artist M.C. Escher activate a similar response because they depict scenes that violate causality. His famous “Waterfall” shows a water wheel powered by water pouring down from a wooden flume. The water turns the wheel, and is redirected uphill back to the mouth of the flume, where it can once again pour over the wheel, in an endless cycle. The drawing shows us a situation that violates pretty much every law of physics on the books, and our brain perceives this logical oddity as amusing — a visual joke. The trick that makes Escher’s drawings intriguing is a geometric construction psychologists refer to as an “impossible figure,” a line-form suggesting a three-dimensional object that could never exist in our experience. Psychologists, including a team led by Catya von Károlyi of the University of Wisconsin-Eau Claire, have used such figures to study human cognition. When the team asked people to pick out impossible figures from similarly drawn illustrations that did not violate causality, they were surprised to discover that some people were faster at this than others. And most surprising of all, among those who were the fastest were those with dyslexia. © 2014 Scientific American

Keyword: Dyslexia
Link ID: 19978 - Posted: 08.20.2014

Helen Shen For most adults, adding small numbers requires little effort, but for some children, it can take all ten fingers and a lot of time. Research published online on 17 August in Nature Neuroscience suggests that changes in the hippocampus — a brain area associated with memory formation — could help to explain how children eventually pick up efficient strategies for mathematics, and why some children learn more quickly than others. Vinod Menon, a developmental cognitive neuroscientist at Stanford University in California, and his colleagues presented single-digit addition problems to 28 children aged 7–9, as well as to 20 adolescents aged 14–17 and 20 young adults. Consistent with previous psychology studies, the children relied heavily on counting out the sums, whereas adolescents and adults tended to draw on memorized information to calculate the answers. The researchers saw this developmental change begin to unfold when they tested the same children at two time points, about one year apart. As the children aged, they began to move away from counting on fingers towards memory-based strategies, as measured by their own accounts and by decreased lip and finger movements during the task. Using functional magnetic resonance imaging (fMRI) to scan the children's brains, the team observed increased activation of the hippocampus between the first and second time point. Neural activation decreased in parts of the prefrontal and parietal cortices known to be involved in counting, suggesting that the same calculations had begun to engage different neural circuits. © 2014 Nature Publishing Group

Keyword: Development of the Brain
Link ID: 19971 - Posted: 08.18.2014

Ever wonder why it’s hard to focus after a bad night’s sleep? Using mice and flashes of light, scientists show that just a few nerve cells in the brain may control the switch between internal thoughts and external distractions. The study, partly funded by the National Institutes of Health, may be a breakthrough in understanding how a critical part of the brain, called the thalamic reticular nucleus (TRN), influences consciousness. “Now we may have a handle on how this tiny part of the brain exerts tremendous control over our thoughts and perceptions,” said Michael Halassa, M.D., Ph.D., assistant professor at New York University’s Langone Medical Center and a lead investigator of the study. “These results may be a gateway into understanding the circuitry that underlies neuropsychiatric disorders.” The TRN is a thin layer of nerve cells on the surface of the thalamus, a center located deep inside the brain that relays information from the body to the cerebral cortex. The cortex is the outer, multi-folded layer of the brain that controls numerous functions, including one’s thoughts, movements, language, emotions, memories, and visual perceptions. TRN cells are thought to act as switchboard operators that control the flow of information relayed from the thalamus to the cortex. To understand how the switches may work, Dr. Halassa and his colleagues studied the firing patterns of TRN cells in mice during sleep and arousal, two states with very different information-processing needs. The results, published in Cell, suggest that the TRN has many switchboard operators, each dedicated to controlling specific lines of communication. Using this information, the researchers could alter the attention span of mice.

Keyword: Sleep
Link ID: 19965 - Posted: 08.16.2014

By Piercarlo Valdesolo In the summer of 2009 I tried to cure homemade sausages in my kitchen. One of the challenges of such a practice is preventing the growth of undesirable molds and bacteria that cause diseases such as botulism. My wife was not on board with this plan, skeptical of my ability to safely execute the procedure. And so began many weeks of being peppered with warnings, relevant articles and concerned looks. When the time came for my first bite, nerves were high. My throat itched. My heart raced. My vision blurred. I had been botulized! Halfway through our walk to the hospital I regained my composure. Of course I had not been instantaneously struck by an incredibly rare disease that, by the way, takes at least 12 hours after consumption to manifest and does not share many symptoms with your garden-variety anxiety attack. My experience had been shaped by my mindset. A decade of learning about the psychological power of expectations could not inoculate me from its effect. Psychologists know that beliefs about how experiences should affect us can bring about the expected outcomes. Though these “placebo effects” have primarily been studied in the context of pharmaceutical interventions (e.g., patients reporting pain relief after receiving saline they believed to be an analgesic), recent research has shown their strength in a variety of domains. Tell people that their job has exercise benefits and they will lose more weight than coworkers who hold no such belief. Convince people of a correlation between athleticism and visual acuity and they will show better vision after working out. Trick people into believing they are consuming caffeine and their vigilance and cognitive functioning increase. Some evidence shows that such interventions can even mitigate the negative effects of other experiences. For example, consuming placebo caffeine alleviates the cognitive consequences of sleep deprivation. © 2014 Scientific American

Keyword: Sleep
Link ID: 19951 - Posted: 08.13.2014

By Nathan Collins Time zips by when you're having fun and passes slowly when you're not—except when you are depressed, in which case your time-gauging abilities are pretty accurate. Reporting in PLOS ONE, researchers in England and Ireland asked 39 students—18 with mild depression—to estimate the duration of tones lasting between two and 65 seconds and to produce tones of specified lengths of time. Happier students overestimated intervals by 16 percent and produced tones that were short by 13 percent, compared with depressed students' 3 percent underestimation and 8 percent overproduction. The results suggest that depressive realism, a phenomenon in which depressed people perceive themselves more accurately (and less positively) than typical individuals, may extend to aspects of thought beyond self-perception—in this case, time. The researchers speculate that mindfulness treatments may be effective for depression, partly because they help depressed people focus on the moment, rather than its passing. © 2014 Scientific American

Keyword: Depression
Link ID: 19942 - Posted: 08.12.2014

By Gary Stix A gamma wave is a rapid electrical oscillation in the brain. A scan of the academic literature shows that gamma waves may be involved with learning, memory and attention—and, when perturbed, may play a part in schizophrenia, epilepsy, Alzheimer’s, autism and ADHD. Quite a list, and one of the reasons that these brainwaves, cycling at 25 to 80 times per second, persist as an object of fascination to neuroscientists. Despite lingering interest, much remains elusive when trying to figure out how gamma waves are produced by specific molecules within neurons—and what the oscillations do to facilitate communication along the brain’s trillions and trillions of connections. A group of researchers at the Salk Institute in La Jolla, California, has looked beyond the preeminent brain cell—the neuron—to achieve new insights about gamma waves. At one time, neuroscience textbooks depicted astrocytes as a kind of pit crew for neurons, providing metabolic support and other functions for the brain’s rapid-firing information-processing components. In recent years, that picture has changed as new studies have found that astrocytes, like neurons, also have an alternate identity as information processors. This research demonstrates astrocytes’ ability to spritz chemicals known as neurotransmitters that communicate with other brain cells. Given that both neurons and astrocytes perform some of the same functions, it has been difficult to tease out what specifically astrocytes are up to. Hard evidence for what these nominal cellular support players might contribute in forming memories or focusing attention has been lacking. © 2014 Scientific American

Keyword: Attention
Link ID: 19939 - Posted: 08.12.2014

By DANIEL J. LEVITIN THIS month, many Americans will take time off from work to go on vacation, catch up on household projects and simply be with family and friends. And many of us will feel guilty for doing so. We will worry about all of the emails piling up at work, and in many cases continue to compulsively check email during our precious time off. But beware the false break. Make sure you have a real one. The summer vacation is more than a quaint tradition. Along with family time, mealtime and weekends, it is an important way that we can make the most of our beautiful brains. Every day we’re assaulted with facts, pseudofacts, news feeds and jibber-jabber, coming from all directions. According to a 2011 study, on a typical day, we take in the equivalent of about 174 newspapers’ worth of information, five times as much as we did in 1986. As the world’s 21,274 television stations produce some 85,000 hours of original programming every day (by 2003 figures), we watch an average of five hours of television per day. For every hour of YouTube video you watch, there are 5,999 hours of new video just posted! If you’re feeling overwhelmed, there’s a reason: The processing capacity of the conscious mind is limited. This is a result of how the brain’s attentional system evolved. Our brains have two dominant modes of attention: the task-positive network and the task-negative network (they’re called networks because they comprise distributed networks of neurons, like electrical circuits within the brain). The task-positive network is active when you’re actively engaged in a task, focused on it, and undistracted; neuroscientists have taken to calling it the central executive. The task-negative network is active when your mind is wandering; this is the daydreaming mode. These two attentional networks operate like a seesaw in the brain: when one is active the other is not. © 2014 The New York Times Company

Keyword: Attention
Link ID: 19936 - Posted: 08.11.2014

By KATHARINE Q. SEELYE SPARTA, N.J. — When Gail Morris came home late one night after taking her daughter to college, she saw her teenage son, Alex, asleep on the sofa in the family room. Nothing seemed amiss. An unfinished glass of apple juice sat on the table. She tucked him in under a blanket and went to bed. The next morning, he would not wake up. He was stiff and was hardly breathing. Over the next several hours, Ms. Morris was shocked to learn that her son had overdosed on heroin. She was told he would not survive. He did survive, but barely. He was in a coma for six weeks. He went blind and had no function in his arms or legs. He could not speak or swallow. Hospitalized for 14 months, Alex, who is 6-foot-1, dropped to 90 pounds. One of his doctors said that Alex had come as close to dying as anyone he knew who had not actually died. Most people who overdose on heroin either die or fully recover. But Alex plunged into a state that was neither dead nor functional. There are no national statistics on how often opioid overdose leads to cases like Alex’s, but doctors say they worry that with the dramatic increase in heroin abuse and overdoses, they will see more such outcomes. “I would expect that we will,” said Dr. Nora Volkow, director of the National Institute on Drug Abuse. “They are starting to report isolated cases like this. And I would not be surprised if you have more intermediate cases with more subtle impairment.” More than 660,000 Americans used heroin in 2012, the federal government says, double the number of five years earlier. Officials attribute much of the increase to a crackdown on prescription painkillers, prompting many users to turn to heroin, which is cheaper and easier to get than other opioids. © 2014 The New York Times Company

Keyword: Consciousness
Link ID: 19935 - Posted: 08.11.2014