Chapter 14. Attention and Consciousness




By Sam Wong Students who take Adderall to improve their test scores may get a slight benefit, but it’s mainly a placebo effect. Adderall is a combination of the stimulants amphetamine and dextroamphetamine and is used to treat attention deficit hyperactivity disorder (ADHD). It is also growing in popularity as a study drug in the US, where around a third of college students are thought to have used prescription stimulants for non-medical reasons. But does it work? Rachel Fargason, a psychiatrist at the University of Alabama, Birmingham, says the idea of stimulants as cognitive enhancers didn’t tally with her experience of patients who had been diagnosed incorrectly. “If they didn’t have ADHD, the stimulants generally didn’t help them cognitively,” she says. To investigate further, Fargason’s team set up a trial with 32 people between the ages of 19 and 30, none of whom had ADHD. Each participant took a batch of cognitive tests four times. On two of these occasions they were given 10 milligrams of Adderall; on the other two they were given a placebo. With each treatment, they were once told they were getting medication and once told they were getting a placebo. © Copyright New Scientist Ltd.
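
The trial just described is a balanced placebo design: what participants actually received (Adderall or placebo) is crossed with what they were told they received, giving four test sessions per person. Below is a minimal Python sketch of that design; the variable names, the toy scores and the scoring contrast are hypothetical, added only to make the four conditions and the expectancy comparison explicit.

from itertools import product

# Balanced placebo design: 2 (received: 10 mg Adderall vs. placebo)
# x 2 (told: "medication" vs. "placebo"); one cognitive-test session per cell.
RECEIVED = ["adderall_10mg", "placebo"]
TOLD = ["told_medication", "told_placebo"]

def session_conditions():
    """Enumerate the four sessions each participant completes."""
    return list(product(RECEIVED, TOLD))

def expectancy_effect(mean_scores):
    """Toy contrast: average effect of being told 'medication',
    collapsed over what was actually received.
    mean_scores maps (received, told) -> mean test score."""
    told_med = sum(mean_scores[(r, "told_medication")] for r in RECEIVED) / 2
    told_pla = sum(mean_scores[(r, "told_placebo")] for r in RECEIVED) / 2
    return told_med - told_pla

if __name__ == "__main__":
    for received, told in session_conditions():
        print(received, told)
    # Purely illustrative numbers: slightly higher scores when told "medication".
    toy = {("adderall_10mg", "told_medication"): 52, ("adderall_10mg", "told_placebo"): 50,
           ("placebo", "told_medication"): 51, ("placebo", "told_placebo"): 49}
    print("expectancy effect:", expectancy_effect(toy))

The mirror-image contrast, collapsing over what participants were told, would estimate the pharmacological effect of the drug in the same way.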

Keyword: ADHD; Drug Abuse
Link ID: 23858 - Posted: 07.21.2017

By PERRI KLASS, M.D. We want to believe we’re raising our kids to think for themselves, and not to do dumb or unhealthy things just because the cool kids are doing them. But research shows that when it comes to smoking, children are heavily influenced by some of the folks they consider the coolest of the cool: actors in movies. “There’s a dose-response relationship: The more smoking kids see onscreen, the more likely they are to smoke,” said Dr. Stanton Glantz, a professor and director of the University of California, San Francisco, Center for Tobacco Control Research and Education. He is one of the authors of a new study that found that popular movies are showing more tobacco use onscreen. “The evidence shows it’s the largest single stimulus” for smoking, he said; “it overpowers good parental role modeling, it’s more powerful than peer influence or even cigarette advertising.” He said that epidemiological studies have shown that if you control for all the other risk factors of smoking (whether parents smoke, attitudes toward risk taking, socioeconomic status, and so on), younger adolescents who are more heavily exposed to smoking on film are two to three times as likely to start smoking, compared with the kids who are more lightly exposed. Those whose parents smoke are more likely to smoke, he said, but exposure to smoking in movies can overcome the benefit of having nonsmoking parents. In one study, the children of nonsmoking parents with heavy exposure to movie smoking were as likely to smoke as the children of smoking parents with heavy movie exposure. To Dr. Glantz, and the other people who study this topic, that makes smoking in movies an “environmental toxin,” a factor endangering children. © 2017 The New York Times Company

Keyword: Drug Abuse; Attention
Link ID: 23844 - Posted: 07.18.2017

Tim Adams Henry Marsh made the decision to become a neurosurgeon after he had witnessed his three-month-old son survive the complex removal of a brain tumour. For two decades he was the senior consultant in the Atkinson Morley wing at St George’s hospital in London, one of the country’s largest specialist brain surgery units. He pioneered techniques in operating on the brain under local anaesthetic and was the subject of the BBC documentary Your Life in Their Hands. His first book, Do No Harm: Stories of Life, Death, and Brain Surgery, was published in 2014 to great acclaim, and became a bestseller across the world. Marsh retired from full-time work at St George’s in 2015, though he continues with long-standing surgical roles at hospitals in Ukraine and Nepal. He is also an avid carpenter. Earlier this year he published a second volume of memoir, Admissions: a Life in Brain Surgery, in which he looks back on his career as he takes up a “retirement project” of renovating a decrepit lock-keeper’s cottage near where he grew up in Oxfordshire. He lives with his second wife, the social anthropologist and author Kate Fox. They have homes in Oxford, and in south London, which is where the following conversation took place. Have you officially retired now? Well, I still do one day a week for the NHS, though apparently they want a “business case” for it, so I’m not getting paid at present. Yes, well, people talk about the mind-matter problem – it’s not a problem for me: mind is matter. That’s not being reductionist. It is actually elevating matter. We don’t even begin to understand how electrochemistry and nerve cells generate thought and feeling. We have not the first idea. The relation of neurosurgery to neuroscience is a bit like the relationship between plumbing and quantum mechanics.

Keyword: Consciousness
Link ID: 23842 - Posted: 07.17.2017

By BENEDICT CAREY Keith Conners, whose work with hyperactive children established the first standards for diagnosing and treating what is now known as attention deficit hyperactivity disorder, or A.D.H.D. — and who late in life expressed misgivings about how loosely applied that label had become — died on July 5 in Durham, N.C. He was 84. His wife, Carolyn, said the cause was heart failure. The field of child psychiatry was itself still young when Dr. Conners joined the faculty of the Johns Hopkins University School of Medicine in the early 1960s as a clinical psychologist. Children with emotional and behavioral problems often got a variety of diagnoses, depending on the clinic, and often ended up being given strong tranquilizers as treatment. Working with Dr. Leon Eisenberg, a prominent child psychiatrist, Dr. Conners focused on a group of youngsters who were chronically restless, hyperactive and sometimes aggressive. Doctors had recognized this type — “hyperkinesis,” it was called, or “minimal brain dysfunction” — but Dr. Conners combined existing descriptions and, using statistical analysis, focused on the core symptoms. The 39-item questionnaire he devised, called the Conners Rating Scale, quickly became the worldwide standard for assessing the severity of such problems and measuring improvement. It was later abbreviated to 10 items, giving child psychiatry a scientific foothold and anticipating by more than a decade the kind of checklists that would come to define all psychiatric diagnosis. He used his scale to study the effects of stimulant drugs on hyperactive children. Doctors had known since the 1930s that amphetamines could, paradoxically, calm such youngsters; a Rhode Island doctor, Charles Bradley, had published a well-known report detailing striking improvements in attention and academic performance among many children at a children’s inpatient home he ran near Providence. But it was a series of rigorous studies by Dr. Conners, in the 1960s and ’70s, that established stimulants — namely Dexedrine and Ritalin — as the standard treatments. © 2017 The New York Times Company

Keyword: ADHD
Link ID: 23833 - Posted: 07.14.2017

Deborah Orr Most people know about SSRIs, the antidepressant drugs that stop the brain from re-absorbing too much of the serotonin we produce, to regulate mood, anxiety and happiness. And a lot of people know about these drugs first hand, for the simple reason that they have used them. Last year, according to NHS Digital, no fewer than 64.7m antidepressant prescriptions were given in England alone. In a decade, the number of prescriptions has doubled. On Tuesday I joined the throng, and popped my first Citalopram. It was quite a thing – not least because, like an idiot, I dropped my pill about 90 minutes before curtain up for the Royal Shakespeare Company’s production of The Tempest at the Barbican. That’s right. This isn’t just mental illness: this is metropolitan-elite mental illness. It was a pretty overwhelming theatrical experience. The first indication that something was up came as I approached my local tube station. I noticed that I was in a state of extreme dissociation, walking along looking as though I was entirely present in the world yet feeling completely detached from it. I had drifted into total mental autopilot. Luckily, I was able to recognise my fugue. It’s a symptom of my condition, which, as I’ve written before, is complex post-traumatic stress disorder. The drug-induced dissociation was more intense than I’m used to when it’s happening naturally. I use the word advisedly. Much of what is thought of as illness is actually an extreme and sensible protective reaction to unbearable interventions from outside the self. © 2017 Guardian News and Media Limited

Keyword: Depression; Attention
Link ID: 23818 - Posted: 07.09.2017

Hannah Devlin A Catholic priest, a rabbi and a Buddhist walk into a bar and order some magic mushrooms. It may sound like the first line of a bad joke, but this scenario is playing out in one of the first scientific investigations into the effects of psychedelic drugs on religious experience – albeit in a laboratory rather than a bar. Scientists at Johns Hopkins University in Baltimore have enlisted two dozen religious leaders from a wide range of denominations to participate in a study in which they will be given two powerful doses of psilocybin, the active ingredient in magic mushrooms. Dr William Richards, a psychologist at Johns Hopkins University in Baltimore, Maryland, who is involved in the work, said: “With psilocybin these profound mystical experiences are quite common. It seemed like a no-brainer that they might be of interest, if not valuable, to clergy.” The experiment, which is currently under way, aims to assess whether a transcendental experience makes the leaders more effective and confident in their work and how it alters their religious thinking. Despite most organised religions frowning on the use of illicit substances, Catholic, Orthodox and Presbyterian priests, a Zen Buddhist and several rabbis were recruited. The team has yet to persuade a Muslim imam or Hindu priest to take part, but “just about all the other bases are covered,” according to Richards. After preliminary screening, including medical and psychological tests, the participants have been given two powerful doses of psilocybin in two sessions, one month apart. © 2017 Guardian News and Media Limited

Keyword: Drug Abuse; Attention
Link ID: 23814 - Posted: 07.09.2017

By Anil Ananthaswamy To understand human consciousness, we need to know why it exists in the first place. New experimental evidence suggests it may have evolved to help us learn and adapt to changing circumstances far more rapidly and effectively. We used to think consciousness was a uniquely human trait, but neuroscientists now believe we share it with many other animals, including mammals, birds and octopuses. While plants and arguably some animals like jellyfish seem able to respond to the world around them without any conscious awareness, many other animals consciously experience and perceive their environment. In the 19th century, Thomas Henry Huxley and others argued that such consciousness is an “epiphenomenon” – a side effect of the workings of the brain that has no causal influence, the way a steam whistle has no effect on the way a steam engine works. More recently, neuroscientists have suggested that consciousness enables us to integrate information from different senses or keep such information active for long enough in the brain that we can experience the sight and sound of a car passing by, for example, as one unified perception, even though sound and light travel at different speeds. © Copyright New Scientist Ltd.

Keyword: Consciousness; Learning & Memory
Link ID: 23785 - Posted: 06.28.2017

By THERESE HUSTON “Does being over 40 make you feel like half the man you used to be?” Ads like that have led to a surge in the number of men seeking to boost their testosterone. The Food and Drug Administration reports that prescriptions for testosterone supplements have risen to 2.3 million from 1.3 million in just four years. There is such a condition as “low-T,” or hypogonadism, which can cause fatigue and diminished sex drive, and it becomes more common as men age. But according to a study published in JAMA Internal Medicine, half of the men taking prescription testosterone don’t have a deficiency. Many are just tired and want a lift. But they may not be doing themselves any favors. It turns out that the supplement isn’t entirely harmless: Neuroscientists are uncovering evidence suggesting that when men take testosterone, they make more impulsive — and often faulty — decisions. Researchers have shown for years that men tend to be more confident about their intelligence and judgments than women, believing that solutions they’ve generated are better than they actually are. This hubris could be tied to testosterone levels, and new research by Gideon Nave, a cognitive neuroscientist at the University of Pennsylvania, along with Amos Nadler at Western University in Ontario, reveals that high testosterone can make it harder to see the flaws in one’s reasoning. How might heightened testosterone lead to overconfidence? One possible explanation lies in the orbitofrontal cortex, a region just behind the eyes that’s essential for self-evaluation, decision making and impulse control. The neuroscientists Pranjal Mehta at the University of Oregon and Jennifer Beer at the University of Texas, Austin, have found that people with higher levels of testosterone have less activity in their orbitofrontal cortex. Studies show that when that part of the brain is less active, people tend to be overconfident in their reasoning abilities. It’s as though the orbitofrontal cortex is your internal editor, speaking up when there’s a potential problem with your work. Boost your testosterone and your editor goes reassuringly (but misleadingly) silent. © 2017 The New York Times Company

Keyword: Hormones & Behavior; Attention
Link ID: 23776 - Posted: 06.26.2017

Staring down a packed room at the Hyatt Regency Hotel in downtown San Francisco this March, Randy Gallistel gripped a wooden podium, cleared his throat, and presented the neuroscientists sprawled before him with a conundrum. “If the brain computed the way people think it computes,” he said, “it would boil in a minute.” All that information would overheat our CPUs. Humans have been trying to understand the mind for millennia. And metaphors from technology—like cortical CPUs—are one of the ways that we do it. Maybe it’s comforting to frame a mystery in the familiar. In ancient Greece, the brain was a hydraulics system, pumping the humors; in the 18th century, philosophers drew inspiration from the mechanical clock. Early neuroscientists from the 20th century described neurons as electric wires or phone lines, passing signals like Morse code. And now, of course, the favored metaphor is the computer, with its hardware and software standing in for the biological brain and the processes of the mind. In this technology-ridden world, it’s easy to assume that the seat of human intelligence is similar to our increasingly smart devices. But the reliance on the computer as a metaphor for the brain might be getting in the way of advancing brain research. As Gallistel continued his presentation to the Cognitive Neuroscience Society, he described the problem with the computer metaphor. If memory works the way most neuroscientists think it does—by altering the strength of connections between neurons—storing all that information would be way too energy-intensive, especially if memories are encoded in Shannon information, high fidelity signals encoded in binary.

Keyword: Learning & Memory; Consciousness
Link ID: 23764 - Posted: 06.23.2017

Kerin Higa After surgery to treat her epilepsy severed the connection between the two halves of her brain, Karen's left hand took on a mind of its own, acting against her will to undress or even to slap her. Amazing, to be sure. But what may be even more amazing is that most people who have split-brain surgery don't notice anything different at all. But there's more to the story than that. In the 1960s, a young neuroscientist named Michael Gazzaniga began a series of experiments with split-brain patients that would change our understanding of the human brain forever. Working in the lab of Roger Sperry, who later won a Nobel Prize for his work, Gazzaniga discovered that the two halves of the brain experience the world quite differently. When Gazzaniga and his colleagues flashed a picture in a patient's right visual field, the information was processed in the left side of the brain and the split-brain patient could easily describe the scene verbally. But when a picture was flashed in the left visual field, which is processed by the right side of the brain, the patient would report seeing nothing. If allowed to respond nonverbally, however, the right brain could adeptly point at or draw what was seen in the left visual field. So the right brain knew what it was seeing; it just couldn't talk about it. These experiments showed for the first time that each brain hemisphere has specialized tasks. In this third episode of Invisibilia, hosts Alix Spiegel and Hanna Rosin talk to several people who are trying to change their other self, including a man who confronts his own biases and a woman who has a rare condition that causes one of her hands to take on a personality of its own. © 2017 npr

Keyword: Consciousness; Laterality
Link ID: 23749 - Posted: 06.17.2017

by Helen Thompson Paper wasps have a knack for recognizing faces, and a new study adds to our understanding of what that means in a wasp’s brain. Most wasps of a given species look the same, but some species of paper wasp (Polistes sp.) display varied colors and markings. Recognizing these patterns is at the core of the wasps’ social interactions. One species, Polistes fuscatus, is especially good at detecting differences in faces — even better than they are at detecting other patterns. To zero in on the roots of this ability, biologist Ali Berens of Georgia Tech and her colleagues set up recognition exercises of faces and basic patterns for P. fuscatus wasps and P. metricus wasps — a species that doesn’t naturally recognize faces but can be trained to do so in the lab. After the training, scientists extracted DNA from the wasps’ brains and looked at which bits of DNA or genes were active. The researchers found 237 genes that were at play only in P. fuscatus during facial recognition tests. A few of the genes have been linked to honeybee visual learning, and some correspond to brain signaling with the neurotransmitters serotonin and tachykinin. In the brain, picking up on faces goes beyond basic pattern learning, the researchers conclude June 14 in the Journal of Experimental Biology. It’s possible that some of the same genes also play a broader role in how organisms such as humans and sheep tell one face from another. © Society for Science & the Public 2000 - 2017

Keyword: Attention
Link ID: 23742 - Posted: 06.15.2017

Maria Temming Fascination with faces is nature, not nurture, suggests a new study of third-trimester fetuses. Scientists have long known that babies like looking at faces more than other objects. But research published online June 8 in Current Biology offers evidence that this preference develops before birth. In the first-ever study of prenatal visual perception, fetuses were more likely to move their heads to track facelike configurations of light projected into the womb than nonfacelike shapes. Past research has shown that newborns pay special attention to faces, even if a “face” is stripped down to its bare essentials — for instance, a triangle of three dots: two up top for eyes, one below for a mouth or nose. This preoccupation with faces is considered crucial to social development. “The basic tendency to pick out a face as being different from other things in your environment, and then to actually look at it, is the first step to learning who the important people are in your world,” says Scott Johnson, a developmental psychologist at UCLA who was not involved in the study. Using a 4-D ultrasound, the researchers watched how 34-week-old fetuses reacted to seeing facelike triangles compared with seeing triangles with one dot above and two below. They projected triangles of red light in both configurations through a mother’s abdomen into the fetus’s peripheral vision. Then, they slid the light across the mom’s belly, away from the fetus’s line of sight, to see if it would turn its head to continue looking at the image. © Society for Science & the Public 2000 - 2017

Keyword: Development of the Brain; Attention
Link ID: 23726 - Posted: 06.09.2017

Alex Burmester When you need to remember a phone number, a shopping list or a set of instructions, you rely on what psychologists and neuroscientists refer to as working memory. It’s the ability to hold and manipulate information in mind, over brief intervals. It’s for things that are important to you in the present moment, but not 20 years from now. Researchers believe working memory is central to the functioning of the mind. It correlates with many more general abilities and outcomes – things like intelligence and scholastic attainment – and is linked to basic sensory processes. Given its central role in our mental life, and the fact that we are conscious of at least some of its contents, working memory may become important in our quest to understand consciousness itself. Psychologists and neuroscientists focus on different aspects as they investigate working memory: Psychologists try to map out the functions of the system, while neuroscientists focus more on its neural underpinnings. Here’s a snapshot of where the research stands currently. How much working memory do we have? Capacity is limited – we can keep only a certain amount of information “in mind” at any one time. But researchers debate the nature of this limit. Many suggest that working memory can store a limited number of “items” or “chunks” of information. These could be digits, letters, words or other units. Research has shown that the number of bits that can be held in memory can depend on the type of item – flavors of ice cream on offer versus digits of pi. © 2010–2017, The Conversation US, Inc.
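
As a toy illustration of the "chunks" idea (not taken from the article; the phone number and the grouping size of three are assumptions made only for the example), re-coding the same digits into larger familiar units reduces how many items have to be held in mind at once:

def chunk(digits, size=3):
    """Group a digit string into fixed-size chunks, the way a phone
    number is usually remembered in groups rather than digit by digit."""
    return [digits[i:i + size] for i in range(0, len(digits), size)]

phone = "4155552671"        # 10 separate digits = 10 items to hold in mind
groups = chunk(phone, 3)    # ['415', '555', '267', '1'] = 4 items to hold

print(len(phone), "digits ->", len(groups), "chunks:", groups)

The information itself is unchanged; only the number of units competing for the limited capacity changes.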

Keyword: Learning & Memory; Attention
Link ID: 23711 - Posted: 06.06.2017

Laurel Hamers A monkey’s brain builds a picture of a human face somewhat like a Mr. Potato Head — piecing it together bit by bit. The code that a monkey’s brain uses to represent faces relies not on groups of nerve cells tuned to specific faces — as has been previously proposed — but on a population of about 200 cells that code for different sets of facial characteristics. Added together, the information contributed by each nerve cell lets the brain efficiently capture any face, researchers report June 1 in Cell. “It’s a turning point in neuroscience — a major breakthrough,” says Rodrigo Quian Quiroga, a neuroscientist at the University of Leicester in England who wasn’t part of the work. “It’s a very simple mechanism to explain something as complex as recognizing faces.” Until now, Quiroga says, the leading explanation for the way the primate brain recognizes faces proposed that individual nerve cells, or neurons, respond to certain types of faces (SN: 6/25/05, p. 406). A system like that might work for the few dozen people with whom you regularly interact. But accounting for all of the peripheral people encountered in a lifetime would require a lot of neurons. It now seems that the brain might have a more efficient strategy, says Doris Tsao, a neuroscientist at Caltech. Tsao and coauthor Le Chang used statistical analyses to identify 50 variables that accounted for the greatest differences between 200 face photos. Those variables represented somewhat complex changes in the face — for instance, the hairline rising while the face becomes wider and the eyes become further-set. © Society for Science & the Public 2000 - 2017.
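
The "statistical analyses" mentioned above are, in spirit, a dimensionality reduction: finding a small set of axes that together account for most of the variation across the 200 face photos. Here is a rough Python sketch of that idea using principal component analysis; the random array is only a stand-in for real face measurements, and the figure of 50 components is the one quantity taken from the article.

import numpy as np
from sklearn.decomposition import PCA

# Stand-in data: 200 face photos, each summarized by a vector of
# shape/appearance measurements (400 made-up features per face).
rng = np.random.default_rng(0)
faces = rng.normal(size=(200, 400))

# Find the 50 axes that capture the greatest differences between faces.
pca = PCA(n_components=50)
face_codes = pca.fit_transform(faces)   # shape (200, 50): one 50-number code per face

# Any face can then be approximated by summing its 50 components,
# loosely analogous to adding up the contributions of ~200 face cells.
approximation = pca.inverse_transform(face_codes)

print(face_codes.shape, "explained variance:",
      round(float(pca.explained_variance_ratio_.sum()), 2))

On real face data the leading components would capture far more of the variance than they do on random numbers; the point here is only the shape of the analysis, not the result.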

Keyword: Attention
Link ID: 23701 - Posted: 06.02.2017

Giuseppe Gangarossa Could it be possible to lead a normal existence without a social life? Sociability is an important aspect of individual life, and social interaction shapes our lives. In fact, social interaction enhances quality of life and improves the stability of communities. Impaired sociability is a classical symptom observed in many neuropsychiatric disorders including autism, schizophrenia, depression, anxiety and generalized fear. Interestingly, many studies have pointed to the medial prefrontal cortex (mPFC), a brain area located in the ventromedial part of the frontal lobe, as a key region involved in the neural bases of sociability (Valk et al., 2015; Treadway et al., 2015; Frith et al., 2007). The prelimbic cortex (PL) and the infralimbic cortex (IL), two subregions of the mPFC, have been strongly suggested to play an important role in the neural mechanisms underlying sociability, as isolation rearing in rats results in impaired social behavior and structural modifications in the PL and IL. Isolation rearing is a neurodevelopmental manipulation that produces neurochemical, structural, and behavioral alterations in rodents that in many ways are consistent with psychiatric disorders such as schizophrenia, anxiety and depression. In particular, it has been shown that isolation rearing can alter the volume of the mPFC, the dendritic length and the spine density of pyramidal neurons. However, the detailed mechanisms involved in sociability disorders remain elusive and poorly understood. A recent article published in PLOS ONE by Minami and colleagues aimed at measuring neural activity in the PL and IL of control and isolated rats during social interaction, in order to determine whether there is neural activity related to social behavior in these areas.

Keyword: Attention
Link ID: 23688 - Posted: 06.01.2017

By Alice Klein A DRUG normally used to treat narcolepsy and excessive daytime sleepiness also seems to improve symptoms of attention deficit hyperactivity disorder (ADHD). The finding supports the idea that ADHD might be a sleep disorder. People who have been diagnosed with ADHD find it difficult to concentrate and are generally hyperactive. But many with the condition also find it difficult to fall asleep and stay asleep at night, and feel drowsy during the day. Could this mean ADHD is a type of sleep disorder? After all, the brain pathways involved in paying attention have also been linked to sleep. And there’s some evidence of similarly disrupted patterns of chemical signalling in the brains of people with sleep disorders and ADHD. One suggestion is that the circadian rhythm that controls our sleep-wake cycle over each 24-hour period may be misaligned in people with ADHD, causing them to be sleepy or alert at the wrong times. This idea inspired Eric Konofal at Robert-Debré Hospital in Paris to try using a drug for narcolepsy and excessive daytime sleepiness to treat ADHD. Mazindol mimics the effects of a brain chemical called orexin, which modulates wakefulness and appetite. Orexin works as a stimulant to keep us awake, and is lacking in people with narcolepsy, who tend to fall asleep at inappropriate times.

Keyword: ADHD; Sleep
Link ID: 23681 - Posted: 05.31.2017

Rebecca Hersher Diagnosing attention deficit hyperactivity disorder can be difficult. The symptoms of the disorder, as defined by the Diagnostic and Statistical Manual, or DSM, have changed multiple times. Even if you know what to look for, many of the symptoms are pretty general, including things like trouble focusing and a tendency to interrupt people. Discerning the difference between people who have a problem and those who are just distracted requires real expertise. Which is why many people were excited when earlier this year a World Health Organization advisory group endorsed a six-question screening test that a study published in the Journal of the American Medical Association reported could reliably identify adults with ADHD. A lot of people were intrigued by the seeming simplicity of the screening. We reported on it, including one implication of the study's findings: that there could be a significant population of U.S. adults with undiagnosed ADHD. But that may not be the case, and even if it is, some ADHD researchers say the six-question screening test is not necessarily the simple diagnostic solution its proponents hope it will be. "Despite the questions put out by WHO and mentioned in JAMA, in America if your talents and temperament don't match your goals and aspirations, that incongruity generates a series of feelings or behaviors that match quite nicely the diagnostic criteria in the DSM-V," explains Dr. Lawrence Diller, a behavioral pediatrician and ADHD specialist who has been following trends in ADHD diagnosis and medication since the mid-1990s. © 2017 npr

Keyword: ADHD
Link ID: 23677 - Posted: 05.30.2017

A daily 30-minute regimen designed to help elderly surgery patients stay oriented can cut the rate of postoperative delirium in half and help them return home sooner, according to a test among 377 volunteers in Taipei. After they were moved out of an intensive care unit, 15.1 percent given conventional treatment experienced delirium. But when hospital workers got patients moving faster, helped them brush their teeth, gave them facial exercises and talked to them in ways to help them understand what was happening, the delirium rate was just 6.6 percent. And while the patients who didn’t get the intervention typically stayed in the hospital for 14 days, those who did were discharged an average two days sooner. The study “draws needed attention to delirium,” which can cause problems when confused patients, for example, try to extricate themselves from the tubes and equipment needed to recover, said Lillian Kao, acute care surgery chief for McGovern Medical School at the University of Texas Health Science Center in Houston, who wasn’t involved with the study. Estimates of delirium’s prevalence vary widely, ranging from 13 percent to 50 percent among people who have non-heart surgery, according to an editorial accompanying the study, which appears in JAMA Surgery. © 1996-2017 The Washington Post

Keyword: Alzheimers; Attention
Link ID: 23674 - Posted: 05.29.2017

Jon Hamilton Impulsive children become thoughtful adults only after years of improvements to the brain's information highways, a team reports in Current Biology. A study of nearly 900 young people ages 8 to 22 found that the ability to control impulses, stay on task and make good decisions increased steadily over that span as the brain remodeled its information pathways to become more efficient. The finding helps explain why these abilities, known collectively as executive function, take so long to develop fully, says Danielle Bassett, an author of the study and an associate professor of bioengineering at the University of Pennsylvania. "A child's ability to run or to see is very well developed by the time they're 8," she says. "However, their ability to inhibit inappropriate responses is not something that's well developed until well into the 20s." The results also suggest it may be possible to identify adolescents at risk of problems related to poor executive function, says Joshua Gordon, director of the National Institute of Mental Health, which helped fund the study. These include "all kinds of disorders such as substance abuse, depression and schizophrenia," he says. The study is part of an effort to understand the brain changes underlying the development of executive function. It used a technology called diffusion imaging that reveals the fibers that make up the brain's information highways. © 2017 npr

Keyword: ADHD; Development of the Brain
Link ID: 23668 - Posted: 05.27.2017

by Angela Chen What happens when you look up and see a ball headed toward you? Without even thinking about it, you flinch. That might be because our brains are constantly living our lives in fast-forward, playing out the action in our head before it happens. Humans have to navigate, and respond to, an environment that is always changing. Our brain compensates for this by constantly making predictions about what’s going to happen, says Mattias Ekman, a researcher at Radboud University Nijmegen in the Netherlands. We’ve known this for a while, but these predictions are usually associative. An example: if you see a hamburger, your brain might predict that there will be fries nearby. In a study published today in the journal Nature Communications, Ekman and other scientists focused instead on how the brain predicts motion. So they used brain scans to track what happened as participants observed a moving dot. First, 29 volunteers looked at a white dot the size of a ping-pong ball. The dot went from left to right and then reversed directions. The volunteers watched the dot for about five minutes while scientists scanned their brains with ultra-fast fMRI. This way, the researchers knew what pattern of brain activity was activated in the visual cortex while the volunteers watched the dot. After these five minutes, the researchers showed only the beginning of the sequence to the volunteers. Here, the scans showed that the brain “autocompletes” the full sequence — and it does it at twice the rate of the actual event. So if a dot took two seconds to go across the screen, the brain predicted the entire sequence in one second. “You’re actually already trying to predict what’s going to happen,” says Ekman. “These predictions are hypothetical, so in a way you’re trying to generate new memories that match the future.” © 2017 Vox Media, Inc.

Keyword: Attention
Link ID: 23653 - Posted: 05.24.2017