Chapter 18. Attention and Higher Cognition


Staring down a packed room at the Hyatt Regency Hotel in downtown San Francisco this March, Randy Gallistel gripped a wooden podium, cleared his throat, and presented the neuroscientists sprawled before him with a conundrum. “If the brain computed the way people think it computes,” he said, “it would boil in a minute.” All that information would overheat our CPUs. Humans have been trying to understand the mind for millennia, and metaphors from technology—like cortical CPUs—are one of the ways that we do it. Maybe it’s comforting to frame a mystery in the familiar. In ancient Greece, the brain was a hydraulics system, pumping the humors; in the 18th century, philosophers drew inspiration from the mechanical clock. Early 20th-century neuroscientists described neurons as electric wires or phone lines, passing signals like Morse code. And now, of course, the favored metaphor is the computer, with its hardware and software standing in for the biological brain and the processes of the mind. In this technology-ridden world, it’s easy to assume that the seat of human intelligence is similar to our increasingly smart devices. But the reliance on the computer as a metaphor for the brain might be getting in the way of advancing brain research. As Gallistel continued his presentation to the Cognitive Neuroscience Society, he described the problem with the computer metaphor. If memory works the way most neuroscientists think it does—by altering the strength of connections between neurons—then storing all that information would be far too energy-intensive, especially if memories are encoded as Shannon information: high-fidelity signals represented in binary.
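
To put a number on that last idea: “Shannon information” is the standard information-theoretic measure of message content, H = -Σ p·log2(p) bits per symbol. The Python sketch below computes it for a toy message; it is purely illustrative and is not a calculation from Gallistel’s talk, and the message string and resulting figures are arbitrary stand-ins.

```python
# Illustrative sketch only: the standard Shannon measure the article alludes to.
# Not a calculation from Gallistel's talk; the message is an arbitrary stand-in.
import math
from collections import Counter

def shannon_entropy_bits(message: str) -> float:
    """Average information per symbol: H = -sum(p * log2(p))."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

msg = "the quick brown fox jumps over the lazy dog"
h = shannon_entropy_bits(msg)
print(f"{h:.2f} bits/symbol; ~{h * len(msg):.0f} bits for the whole message")
```

On this framing, Gallistel’s worry is that maintaining many such bits at high fidelity in synaptic strengths would carry a steep metabolic cost.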

Keyword: Learning & Memory; Consciousness
Link ID: 23764 - Posted: 06.23.2017

Kerin Higa After surgery to treat her epilepsy severed the connection between the two halves of her brain, Karen's left hand took on a mind of its own, acting against her will to undress or even to slap her. Amazing, to be sure. But what may be even more amazing is that most people who have split-brain surgery don't notice anything different at all. But there's more to the story than that. In the 1960s, a young neuroscientist named Michael Gazzaniga began a series of experiments with split-brain patients that would change our understanding of the human brain forever. Working in the lab of Roger Sperry, who later won a Nobel Prize for his work, Gazzaniga discovered that the two halves of the brain experience the world quite differently. When Gazzaniga and his colleagues flashed a picture in a patient's right visual field, the information was processed in the left side of the brain and the split-brain patient could easily describe the scene verbally. But when a picture was flashed in the left visual field, which is processed by the right side of the brain, the patient would report seeing nothing. If allowed to respond nonverbally, however, the right brain could adeptly point at or draw what had been shown in the left visual field. So the right brain knew what it was seeing; it just couldn't talk about it. These experiments showed for the first time that each brain hemisphere has specialized tasks. In this third episode of Invisibilia, hosts Alix Spiegel and Hanna Rosin talk to several people who are trying to change their other self, including a man who confronts his own biases and a woman who has a rare condition that causes one of her hands to take on a personality of its own. © 2017 npr

Keyword: Consciousness; Laterality
Link ID: 23749 - Posted: 06.17.2017

by Helen Thompson Paper wasps have a knack for recognizing faces, and a new study adds to our understanding of what that means in a wasp’s brain. Most wasps of a given species look the same, but some species of paper wasp (Polistes sp.) display varied colors and markings. Recognizing these patterns is at the core of the wasps’ social interactions. One species, Polistes fuscatus, is especially good at detecting differences in faces — even better than it is at detecting other patterns. To zero in on the roots of this ability, biologist Ali Berens of Georgia Tech and her colleagues set up recognition exercises of faces and basic patterns for P. fuscatus wasps and P. metricus wasps — a species that doesn’t naturally recognize faces but can be trained to do so in the lab. After the training, scientists extracted DNA from the wasps’ brains and looked at which genes were active. The researchers found 237 genes that were at play only in P. fuscatus during facial recognition tests. A few of the genes have been linked to honeybee visual learning, and some correspond to brain signaling with the neurotransmitters serotonin and tachykinin. In the brain, picking up on faces goes beyond basic pattern learning, the researchers conclude June 14 in the Journal of Experimental Biology. It’s possible that some of the same genes also play a broader role in how organisms such as humans and sheep tell one face from another. © Society for Science & the Public 2000 - 2017

Keyword: Attention
Link ID: 23742 - Posted: 06.15.2017

Maria Temming Fascination with faces is nature, not nurture, suggests a new study of third-trimester fetuses. Scientists have long known that babies like looking at faces more than other objects. But research published online June 8 in Current Biology offers evidence that this preference develops before birth. In the first-ever study of prenatal visual perception, fetuses were more likely to move their heads to track facelike configurations of light projected into the womb than nonfacelike shapes. Past research has shown that newborns pay special attention to faces, even if a “face” is stripped down to its bare essentials — for instance, a triangle of three dots: two up top for eyes, one below for a mouth or nose. This preoccupation with faces is considered crucial to social development. “The basic tendency to pick out a face as being different from other things in your environment, and then to actually look at it, is the first step to learning who the important people are in your world,” says Scott Johnson, a developmental psychologist at UCLA who was not involved in the study. Using 4-D ultrasound, the researchers watched how fetuses at 34 weeks’ gestation reacted to seeing facelike triangles compared with seeing triangles with one dot above and two below. They projected triangles of red light in both configurations through a mother’s abdomen into the fetus’s peripheral vision. Then, they slid the light across the mom’s belly, away from the fetus’s line of sight, to see if it would turn its head to continue looking at the image. © Society for Science & the Public 2000 - 2017

Keyword: Development of the Brain; Attention
Link ID: 23726 - Posted: 06.09.2017

Alex Burmester When you need to remember a phone number, a shopping list or a set of instructions, you rely on what psychologists and neuroscientists refer to as working memory. It’s the ability to hold and manipulate information in mind over brief intervals. It’s for things that are important to you in the present moment, but not 20 years from now. Researchers believe working memory is central to the functioning of the mind. It correlates with many more general abilities and outcomes – things like intelligence and scholastic attainment – and is linked to basic sensory processes. Given its central role in our mental life, and the fact that we are conscious of at least some of its contents, working memory may become important in our quest to understand consciousness itself. Psychologists and neuroscientists focus on different aspects as they investigate working memory: Psychologists try to map out the functions of the system, while neuroscientists focus more on its neural underpinnings. Here’s a snapshot of where the research stands currently.

How much working memory do we have? Capacity is limited – we can keep only a certain amount of information “in mind” at any one time. But researchers debate the nature of this limit. Many suggest that working memory can store a limited number of “items” or “chunks” of information. These could be digits, letters, words or other units. Research has shown that the number of items that can be held in memory can depend on the type of item – flavors of ice cream on offer versus digits of pi. © 2010–2017, The Conversation US, Inc.
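
As a toy illustration of the chunking idea, the same ten digits can be held as ten separate items or as three larger chunks. The 3-3-4 grouping below is simply the familiar US phone-number split, chosen for this example rather than taken from the article.

```python
# Toy illustration of "chunking" in working memory: the same ten digits,
# held as ten separate items versus three larger chunks. The 3-3-4 split
# is just the familiar US phone-number grouping, used here as an example.
def chunk(digits: str, sizes: list[int]) -> list[str]:
    chunks, i = [], 0
    for size in sizes:
        chunks.append(digits[i:i + size])
        i += size
    return chunks

number = "4155550123"
print(list(number))              # ['4', '1', '5', ...] -> ten items to hold
print(chunk(number, [3, 3, 4]))  # ['415', '555', '0123'] -> three chunks
```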

Keyword: Learning & Memory; Attention
Link ID: 23711 - Posted: 06.06.2017

Laurel Hamers A monkey’s brain builds a picture of a human face somewhat like a Mr. Potato Head — piecing it together bit by bit. The code that a monkey’s brain uses to represent faces relies not on groups of nerve cells tuned to specific faces — as has been previously proposed — but on a population of about 200 cells that code for different sets of facial characteristics. Added together, the information contributed by each nerve cell lets the brain efficiently capture any face, researchers report June 1 in Cell. “It’s a turning point in neuroscience — a major breakthrough,” says Rodrigo Quian Quiroga, a neuroscientist at the University of Leicester in England who wasn’t part of the work. “It’s a very simple mechanism to explain something as complex as recognizing faces.” Until now, Quiroga says, the leading explanation for the way the primate brain recognizes faces proposed that individual nerve cells, or neurons, respond to certain types of faces (SN: 6/25/05, p. 406). A system like that might work for the few dozen people with whom you regularly interact. But accounting for all of the peripheral people encountered in a lifetime would require a lot of neurons. It now seems that the brain might have a more efficient strategy, says Doris Tsao, a neuroscientist at Caltech. Tsao and coauthor Le Chang used statistical analyses to identify 50 variables that accounted for the greatest differences between 200 face photos. Those variables represented somewhat complex changes in the face — for instance, the hairline rising while the face becomes wider and the eyes become more widely set. © Society for Science & the Public 2000 - 2017
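
To make the “50 variables” idea concrete, here is a minimal sketch of that kind of dimensionality reduction using PCA on random stand-in data. The study’s actual features and analysis pipeline differed, so treat this purely as an illustration of the concept, not the authors’ method.

```python
# Minimal sketch of describing faces with ~50 shared axes of variation.
# PCA on random stand-in "photos"; the study's actual features and
# pipeline differed, so this is conceptual only.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(seed=0)
faces = rng.normal(size=(200, 4096))  # 200 stand-in face images, 64x64 pixels each

pca = PCA(n_components=50)
codes = pca.fit_transform(faces)      # each face becomes 50 coordinates

print(codes.shape)                    # (200, 50)
# On the population-code view, each face-selective neuron fires in
# proportion to a weighted combination of such axes, so reading out a
# few hundred neurons is enough to capture any face.
```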

Keyword: Attention
Link ID: 23701 - Posted: 06.02.2017

Giuseppe Gangarossa Would it be possible to lead a normal existence without a social life? Probably not: sociability is fundamental to individuals, and social interaction shapes our lives, enhancing quality of life and improving the stability of communities. Impaired sociability is a classic symptom observed in many neuropsychiatric disorders, including autism, schizophrenia, depression, anxiety and generalized fear. Interestingly, many studies have pointed to the medial prefrontal cortex (mPFC), a brain area located in the ventromedial part of the frontal lobe, as a key region involved in the neural bases of sociability (Valk et al., 2015; Treadway et al., 2015; Frith et al., 2007). The prelimbic cortex (PL) and the infralimbic cortex (IL), two subregions of the mPFC, have been strongly implicated in the neural mechanisms underlying sociability, since isolation rearing in rats results in impaired social behavior and structural modifications in the PL and IL. Isolation rearing is a neurodevelopmental manipulation that produces neurochemical, structural and behavioral alterations in rodents that are in many ways consistent with psychiatric disorders such as schizophrenia, anxiety and depression. In particular, it has been shown that isolation rearing can alter the volume of the mPFC, the dendritic length and the spine density of pyramidal neurons. However, the detailed mechanisms involved in sociability disorders remain elusive and poorly understood. A recent article published in PLOS ONE by Minami and colleagues aimed to measure neural activity in the PL and IL of control and isolated rats during social interaction, in order to determine whether there is neural activity related to social behavior in these areas.

Keyword: Attention
Link ID: 23688 - Posted: 06.01.2017

By Alice Klein A DRUG normally used to treat narcolepsy and excessive daytime sleepiness also seems to improve the symptoms of attention deficit hyperactivity disorder (ADHD). The finding supports the idea that ADHD might be a sleep disorder. People who have been diagnosed with ADHD find it difficult to concentrate and are generally hyperactive. But many with the condition also find it difficult to fall asleep and stay asleep at night, and feel drowsy during the day. Could this mean ADHD is a type of sleep disorder? After all, the brain pathways involved in paying attention have also been linked to sleep. And there’s some evidence of similarly disrupted patterns of chemical signalling in the brains of people with sleep disorders and ADHD. One suggestion is that the circadian rhythm that controls our sleep-wake cycle over each 24-hour period may be misaligned in people with ADHD, causing them to be sleepy or alert at the wrong times. This idea inspired Eric Konofal at Robert-Debré Hospital in Paris to try using a drug for narcolepsy and excessive daytime sleepiness to treat ADHD. Mazindol mimics the effects of a brain chemical called orexin, which modulates wakefulness and appetite. Orexin works as a stimulant to keep us awake, and is lacking in people with narcolepsy, who tend to fall asleep at inappropriate times.

Keyword: ADHD; Sleep
Link ID: 23681 - Posted: 05.31.2017

Rebecca Hersher Diagnosing attention deficit hyperactivity disorder can be difficult. The symptoms of the disorder, as defined by the Diagnostic and Statistical Manual, or DSM, have changed multiple times. Even if you know what to look for, many of the symptoms are pretty general, including things like trouble focusing and a tendency to interrupt people. Discerning the difference between people who have a problem and those who are just distracted requires real expertise. Which is why many people were excited when earlier this year a World Health Organization advisory group endorsed a six-question screening test that a study published in the Journal of the American Medical Association reported could reliably identify adults with ADHD. A lot of people were intrigued by the seeming simplicity of the screening. We reported on it, including one implication of the study's findings: that there could be a significant population of U.S. adults with undiagnosed ADHD. But that may not be the case, and even if it is, some ADHD researchers say the six-question screening test is not necessarily the simple diagnostic solution its proponents hope it will be. "Despite the questions put out by WHO and mentioned in JAMA, in America if your talents and temperament don't match your goals and aspirations, that incongruity generates a series of feelings or behaviors that match quite nicely the diagnostic criteria in the DSM-V," explains Dr. Lawrence Diller, a behavioral pediatrician and ADHD specialist who has been following trends in ADHD diagnosis and medication since the mid-1990s. © 2017 npr

Keyword: ADHD
Link ID: 23677 - Posted: 05.30.2017

A daily 30-minute regimen designed to help elderly surgery patients stay oriented can cut the rate of postoperative delirium by more than half and help them return home sooner, according to a trial among 377 volunteers in Taipei. After they were moved out of an intensive care unit, 15.1 percent of patients given conventional treatment experienced delirium. But when hospital workers got patients moving faster, helped them brush their teeth, gave them facial exercises and talked to them in ways that helped them understand what was happening, the delirium rate was just 6.6 percent. And while the patients who didn’t get the intervention typically stayed in the hospital for 14 days, those who did were discharged an average of two days sooner. The study “draws needed attention to delirium,” which can cause problems when confused patients, for example, try to extricate themselves from the tubes and equipment needed to recover, said Lillian Kao, acute care surgery chief for McGovern Medical School at the University of Texas Health Science Center in Houston, who wasn’t involved with the study. Estimates of delirium’s prevalence vary widely, ranging from 13 percent to 50 percent among people who have non-heart surgery, according to an editorial accompanying the study, which appears in JAMA Surgery. © 1996-2017 The Washington Post

Keyword: Alzheimers; Attention
Link ID: 23674 - Posted: 05.29.2017

Jon Hamilton Impulsive children become thoughtful adults only after years of improvements to the brain's information highways, a team reports in Current Biology. A study of nearly 900 young people ages 8 to 22 found that the ability to control impulses, stay on task and make good decisions increased steadily over that span as the brain remodeled its information pathways to become more efficient. The finding helps explain why these abilities, known collectively as executive function, take so long to develop fully, says Danielle Bassett, an author of the study and an associate professor of bioengineering at the University of Pennsylvania. "A child's ability to run or to see is very well developed by the time they're 8," she says. "However, their ability to inhibit inappropriate responses is not something that's well developed until well into the 20s." The results also suggest it may be possible to identify adolescents at risk of problems related to poor executive function, says Joshua Gordon, director of the National Institute of Mental Health, which helped fund the study. These include "all kinds of disorders such as substance abuse, depression and schizophrenia," he says. The study is part of an effort to understand the brain changes underlying the development of executive function. It used a technology called diffusion imaging that reveals the fibers that make up the brain's information highways. © 2017 npr

Keyword: ADHD; Development of the Brain
Link ID: 23668 - Posted: 05.27.2017

by Angela Chen (@chengela) What happens when you look up and see a ball headed toward you? Without even thinking about it, you flinch. That might be because our brains are constantly living our lives in fast-forward, playing out the action in our head before it happens. Humans have to navigate, and respond to, an environment that is always changing. Our brain compensates for this by constantly making predictions about what’s going to happen, says Mattias Ekman, a researcher at Radboud University Nijmegen in the Netherlands. We’ve known this for a while, but these predictions are usually associative. An example: if you see a hamburger, your brain might predict that there will be fries nearby. In a study published today in the journal Nature Communications, Ekman and other scientists focused instead on how the brain predicts motion. So they used brain scans to track what happened as participants observed a moving dot. First, 29 volunteers looked at a white dot the size of a ping-pong ball. The dot went from left to right and then reversed direction. The volunteers watched the dot for about five minutes while scientists scanned their brains with ultra-fast fMRI. This way, the researchers knew which pattern of brain activity was activated in the visual cortex while the volunteers watched the dot. After these five minutes, the researchers showed only the beginning of the sequence to the volunteers. Here, the scans showed that the brain “autocompletes” the full sequence — and it does so at twice the rate of the actual event. So if a dot took two seconds to go across the screen, the brain predicted the entire sequence in one second. “You’re actually already trying to predict what’s going to happen,” says Ekman. “These predictions are hypothetical, so in a way you’re trying to generate new memories that match the future.” © 2017 Vox Media, Inc.

Keyword: Attention
Link ID: 23653 - Posted: 05.24.2017

Jon Hamilton It took an explosion and 13 pounds of iron to usher in the modern era of neuroscience. In 1848, a 25-year-old railroad worker named Phineas Gage was blowing up rocks to clear the way for a new rail line in Cavendish, Vt. He would drill a hole, place an explosive charge, then pack in sand using a 13-pound metal bar known as a tamping iron. But in this instance, the metal bar created a spark that touched off the charge. That, in turn, "drove this tamping iron up and out of the hole, through his left cheek, behind his eye socket, and out of the top of his head," says Jack Van Horn, an associate professor of neurology at the Keck School of Medicine at the University of Southern California. Gage didn't die. But the tamping iron destroyed much of his brain's left frontal lobe, and Gage's once even-tempered personality changed dramatically. "He is fitful, irreverent, indulging at times in the grossest profanity, which was not previously his custom," wrote John Martyn Harlow, the physician who treated Gage after the accident. This sudden personality transformation is why Gage shows up in so many medical textbooks, says Malcolm Macmillan, an honorary professor at the Melbourne School of Psychological Sciences and the author of An Odd Kind of Fame: Stories of Phineas Gage. "He was the first case where you could say fairly definitely that injury to the brain produced some kind of change in personality," Macmillan says. © 2017 npr

Keyword: Attention; Emotions
Link ID: 23643 - Posted: 05.22.2017

By MARTIN E. P. SELIGMAN and JOHN TIERNEY We are misnamed. We call ourselves Homo sapiens, the “wise man,” but that’s more of a boast than a description. What makes us wise? What sets us apart from other animals? Various answers have been proposed — language, tools, cooperation, culture, tasting bad to predators — but none is unique to humans. What best distinguishes our species is an ability that scientists are just beginning to appreciate: We contemplate the future. Our singular foresight created civilization and sustains society. It usually lifts our spirits, but it’s also the source of most depression and anxiety, whether we’re evaluating our own lives or worrying about the nation. Other animals have springtime rituals for educating the young, but only we subject them to “commencement” speeches grandly informing them that today is the first day of the rest of their lives. A more apt name for our species would be Homo prospectus, because we thrive by considering our prospects. The power of prospection is what makes us wise. Looking into the future, consciously and unconsciously, is a central function of our large brain, as psychologists and neuroscientists have discovered — rather belatedly, because for the past century most researchers have assumed that we’re prisoners of the past and the present. Behaviorists thought of animal learning as the ingraining of habit by repetition. Psychoanalysts believed that treating patients was a matter of unearthing and confronting the past. Even when cognitive psychology emerged, it focused on the past and present — on memory and perception. But it is increasingly clear that the mind is mainly drawn to the future, not driven by the past. Behavior, memory and perception can’t be understood without appreciating the central role of prospection. We learn not by storing static records but by continually retouching memories and imagining future possibilities. Our brain sees the world not by processing every pixel in a scene but by focusing on the unexpected. © 2017 The New York Times Company

Keyword: Attention; Learning & Memory
Link ID: 23641 - Posted: 05.20.2017

By Clare Wilson Seeing shouldn’t always be believing. We all have blind spots in our vision, but we don’t notice them because our brains fill the gaps with made-up information. Now subtle tests show that we trust this “fake vision” more than the real thing. If the brain works like this in other ways, it suggests we should be less trusting of the evidence from our senses, says Christoph Teufel of Cardiff University, who wasn’t involved in the study. “Perception is not providing us with a [true] representation of the world,” he says. “It is contaminated by what we already know.” The blind spot is caused by a patch at the back of each eye where there are no light-sensitive cells, just a gap where neurons exit the eye on their way to the brain. We normally don’t notice blind spots because our two eyes can fill in for each other. When vision is obscured in one eye, the brain makes up what’s in the missing area by assuming that whatever is in the regions around the spot continues inwards. But do we subconsciously know that this filled-in vision is less trustworthy than real visual information? Benedikt Ehinger of the University of Osnabrück in Germany and his colleagues set out to answer this question by asking 100 people to look at a picture of a circle of vertical stripes, which contained a small patch of horizontal stripes. The circle was positioned so that with one eye obscured, the patch of horizontal stripes fell within the other eye’s blind spot. As a result, the circle appeared as though there was no patch and the vertical stripes were continuous. © Copyright New Scientist Ltd.

Keyword: Vision; Attention
Link ID: 23640 - Posted: 05.20.2017

Sarah Boseley in Porto A crinkly plate, designed with ridges that cunningly reduce the amount of food it holds, may be heading for the market to help people concerned about their weight eat less. The plate is the brainchild of a Latvian graphic designer, Nauris Cinovics, from the Art Academy of Latvia, who is working with a Latvian government agency to develop the idea and hopes to trial it soon. It may look like just another arty designer plate, but it is intended to play tricks with the mind. “My idea is to make food appear bigger than it is. If you make the plate three-dimensional [with the ridges and troughs] it actually looks like there is the same amount of food as on a normal plate – but there is less of it,” said Cinovics. “You are tricking the brain into thinking you are eating more.” The plate will be made of clear glass and could turn eating dinner into a more complex and longer process than it usually is for most of us. Negotiating the folds in the glass where pieces of fish or stray carrots may lurk will slow down the speed with which people get through their meal. Cinovics has also designed heavy cutlery, with the idea of making eating more of a labour, one that therefore lasts longer. His knife, fork and spoon weigh 1.3 kg each. “We tested this and it took 11 minutes to finish a meal with this cutlery rather than seven minutes,” he said.

Keyword: Obesity; Attention
Link ID: 23639 - Posted: 05.20.2017

By Bret Stetka For many hours a day they pluck dirt, debris and bugs from each other’s fur. Between grooming sessions they travel in troops to search for food. When ignored by mom, they throw tantrums; when not ignored by zoo-goers, they throw feces. Through these behaviors, monkeys demonstrate they understand the meaning of social interactions with other monkeys. They recognize when their peers are grooming one another and infer social rank from seeing such actions within their group. But it has long been unclear how the brains of our close evolutionary relatives actually process what they observe of these social situations. New findings published Thursday in Science offer a clue. A team of researchers from The Rockefeller University has identified a network in the monkey brain dedicated exclusively to analyzing social interactions. And they believe this network could be akin to human brains’ social circuitry. In the new work—led by Winrich Freiwald, an associate professor of neurosciences and behavior—four rhesus macaques viewed videos of various social and physical interactions while undergoing functional magnetic resonance imaging. (Monkeys love watching TV, so they paid attention.) They were shown clips of monkeys interacting, as well as performing tasks on their own. They also watched videos of various physical interactions among inanimate objects. © 2017 Scientific American

Keyword: Attention; Evolution
Link ID: 23637 - Posted: 05.19.2017

Katherine Isbister The fidget spinner craze has been sweeping elementary and middle schools. As of May 17 every one of the top 10 best-selling toys on Amazon was a form of the hand-held toy people can spin and do tricks with. Kids and parents are even making them for themselves using 3D printers and other more homespun crafting techniques. But some teachers are banning them from classrooms. And experts challenge the idea that spinners are good for conditions like ADHD and anxiety. Meanwhile, the Kickstarter online fundraising campaign for the Fidget Cube – another popular fidget toy in 2017 – raised an astounding US$6.4 million, and can be seen on the desks of hipsters and techies across the globe. My research group has taken a deep look at how people use fidget items over the last several years. What we found tells us that these items are not a fad that will soon disappear. Despite sometimes being an annoying distraction for others, fidget items can have some practical uses for adults; our inquiry into their usefulness for children is underway. Fidgeting didn’t start with the spinner craze. If you’ve ever clicked a ballpoint pen again and again, you’ve used a fidget item. As part of our work, we’ve asked people what items they like to fidget with and how and when they use them. (We’re compiling their answers online and welcome additional contributions.) © 2010–2017, The Conversation US, Inc.

Keyword: ADHD; Attention
Link ID: 23630 - Posted: 05.18.2017

By Helen Thomson People in a minimally conscious state have been “woken” for a whole week after a brief period of brain stimulation. The breakthrough suggests we may be on the verge of creating a device that can be used at home to help people with disorders of consciousness communicate with friends and family. People with severe brain trauma can fall into a coma. If they begin to show signs of arousal but not awareness, they are said to be in a vegetative state. If they then show fluctuating signs of awareness but cannot communicate, they are described as being minimally conscious. In 2014, Steven Laureys at the University of Liège in Belgium and his colleagues discovered that 13 people in a minimally conscious state and two people in a vegetative state could temporarily show new signs of awareness when given mild electrical stimulation. The people in the trial received transcranial direct current stimulation (tDCS), which uses low-level electrical stimulation to make neurons more or less likely to fire. This was applied once over an area of the brain called the prefrontal cortex, which is involved in “higher” cognitive functions such as consciousness. Soon after, they showed signs of consciousness, including moving their hands or following instructions using their eyes. Two people were even able to answer questions for 2 hours by moving their body, before drifting back into their previous state. © Copyright New Scientist Ltd.

Keyword: Consciousness
Link ID: 23610 - Posted: 05.13.2017

By Reuters People with attention-deficit/hyperactivity disorder are at increased risk of motor-vehicle accidents, but that risk is significantly reduced when they are taking ADHD medication, a 10-year study finds. The researchers estimate that 1 in 5 of the accidents among more than 2 million people with ADHD during the study period could have been avoided if these individuals had been receiving medication the entire time. “The patients should be aware of the potential risk of [crashes], and seek specific treatment advice from their doctors if they experience difficulties in driving from their condition,” said lead author Zheng Chang, of the Karolinska Institute in Stockholm. Chang said that motor-vehicle crashes kill more than 1.25 million people around the world each year. ADHD is a common disorder whose symptoms include poor sustained attention, impaired impulse control and hyperactivity, he added. Past studies have found that people with ADHD are at an increased risk for crashes and that medication may reduce symptoms and ultimately improve driving skills. To examine the risk of crashes with ADHD and how it is influenced by medication, the researchers analyzed U.S. commercial health insurance claims between 2005 and 2014. They identified 2,319,450 adults with an ADHD diagnosis, half of whom were older than 33. About 1.9 million of them received at least one prescription to treat their ADHD during the study period. © 1996-2017 The Washington Post

Keyword: ADHD; Attention
Link ID: 23609 - Posted: 05.13.2017