Chapter 14. Attention and Consciousness


Not much is definitively proven about consciousness, the awareness of one’s existence and surroundings, other than that it’s somehow linked to the brain. But theories as to how, exactly, grey matter generates consciousness are challenged when a fully conscious man is found to be missing most of his brain. Several years ago, a 44-year-old Frenchman went to the hospital complaining of mild weakness in his left leg. It was discovered then that his skull was filled largely by fluid, leaving just a thin perimeter of actual brain tissue. And yet the man was a married father of two and a civil servant with an IQ of 75, below average in intelligence but not mentally disabled. Doctors believe the man’s brain slowly eroded over 30 years due to a buildup of fluid in the brain’s ventricles, a condition known as “hydrocephalus.” His hydrocephalus was treated when he was an infant with a shunt, which drains the fluid into the bloodstream, but the shunt was removed when he was 14 years old. Over the following decades, the fluid accumulated, leaving less and less space for his brain. While this may seem medically miraculous, it also poses a major challenge for cognitive psychologists, says Axel Cleeremans of the Université Libre de Bruxelles.

Keyword: Intelligence; Consciousness
Link ID: 22430 - Posted: 07.13.2016

By Sunita Sah A popular remedy for a conflict of interest is disclosure — informing the buyer (or the patient, etc.) of the potential bias of the seller (or the doctor, etc.). Disclosure is supposed to act as a warning, alerting consumers to their adviser’s stake in the matter so they can process the advice accordingly. But as several recent studies I conducted show, there is an underappreciated problem with disclosure: It often has the opposite of its intended effect, not only increasing bias in advisers but also making advisees more likely to follow biased advice. When I worked as a physician, I witnessed how bias could arise from numerous sources: gifts or sponsorships from the pharmaceutical industry; compensation for performing particular procedures; viewing our own specialties as delivering more effective treatments than others’ specialties. Although most physicians, myself included, tend to believe that we are invulnerable to bias, thus making disclosures unnecessary, regulators insist on them, assuming that they work effectively. To some extent, they do work. Disclosing a conflict of interest — for example, a financial adviser’s commission or a physician’s referral fee for enrolling patients into clinical trials — often reduces trust in the advice. But my research has found that people are still more likely to follow this advice because the disclosure creates increased pressure to follow the adviser’s recommendation. It turns out that people don’t want to signal distrust to their adviser or insinuate that the adviser is biased, and they also feel pressure to help satisfy their adviser’s self-interest. Instead of functioning as a warning, disclosure can become a burden on advisees, increasing pressure to take advice they now trust less. © 2016 The New York Times Company

Keyword: Attention
Link ID: 22416 - Posted: 07.09.2016

By Emily Rosenzweig Life deals most of us a consistent stream of ego blows, be they failures at work, social slights, or unrequited love. Social psychology has provided decades of insight into just how adept we are at defending ourselves against these psychic threats. We discount negative feedback, compare ourselves favorably to those who are worse off than us, attribute our failures to others, place undue value on our own strengths, and devalue opportunities denied to us, all in service of protecting and restoring our sense of self-worth. As a group, this array of motivated mental processes that support mood repair and ego defense has been called the “psychological immune system.” Particularly striking to social psychologists is our ability to remain blind to our use of these motivated strategies, even when it is apparent to others just how biased we are. However, there are times when we either cannot remain blind to our own psychological immune processes, or where we may find ourselves consciously wanting to use them expressly for the purpose of restoring our ego or our mood. What then? Can we believe a conclusion we reach even when we know that we arrived at it in a biased way? For example, imagine you’ve recently gone through a breakup and want to get over your ex. You decide to make a mental list of all of their character flaws in an effort to feel better about the relationship ending. A number of prominent social psychologists have suggested you’re out of luck—knowing that you’re focusing only on your ex’s worst qualities prevents you from believing the conclusion you’ve come to that you’re better off without him or her. In essence, they argue that we must remain blind to our own biased mental processes in order to reap their ego-restoring benefits. And in many ways this closely echoes the position that philosophers like Alfred Mele have taken about the possibility of agentic self-deception. © 2016 Scientific American

Keyword: Attention; Consciousness
Link ID: 22404 - Posted: 07.07.2016

George Johnson A paper in The British Medical Journal in December reported that cognitive behavioral therapy — a means of coaxing people into changing the way they think — is as effective as Prozac or Zoloft in treating major depression. In ways no one understands, talk therapy reaches down into the biological plumbing and affects the flow of neurotransmitters in the brain. Other studies have found similar results for “mindfulness” — Buddhist-inspired meditation in which one’s thoughts are allowed to drift gently through the head like clouds reflected in still mountain water. Findings like these have become so commonplace that it’s easy to forget their strange implications. Depression can be treated in two radically different ways: by altering the brain with chemicals, or by altering the mind by talking to a therapist. But we still can’t explain how mind arises from matter or how, in turn, mind acts on the brain. This longstanding conundrum — the mind-body problem — was succinctly described by the philosopher David Chalmers at a recent symposium at The New York Academy of Sciences. “The scientific and philosophical consensus is that there is no nonphysical soul or ego, or at least no evidence for that,” he said. Descartes’s notion of dualism — mind and body as separate things — has long receded from science. The challenge now is to explain how the inner world of consciousness arises from the flesh of the brain. © 2016 The New York Times Company

Keyword: Consciousness
Link ID: 22397 - Posted: 07.05.2016

Mo Costandi There’s much more to visual perception than meets the eye. What we see is not merely a matter of patterns of light falling on the retina, but rather is heavily influenced by so-called ‘top-down’ brain mechanisms, which can alter the visual information, and other types of sensory information, that enters the brain before it even reaches our conscious awareness. A striking example of this is a phenomenon called inattentional blindness, whereby narrowly focusing one’s attention on one visual stimulus makes us oblivious to other stimuli, even though they may otherwise be glaringly obvious, as demonstrated by the infamous ‘Invisible Gorilla’ study. Now researchers say they have discovered another extreme form of blindness, in which people fail to notice an unexpected image, even when shown by itself and staring them in the face. Marjan Persuh and Robert Melara of the City University of New York designed two experiments to investigate whether people’s prior expectations could block their awareness of meaningful and important visual stimuli. In the first, they recruited 20 student volunteers and asked them to perform a visual discrimination task. They were shown a series of images, consisting of successive pairs of faces, each of which was presented for half a second on a computer screen, and asked to indicate whether each pair showed faces of people of the same or different sex. Towards the end of each session, the participants were presented with a simple shape, which flashed onto the screen for one tenth of a second. They were then asked if they had seen anything new and, after replying, were told that a shape had indeed appeared, and asked to select the correct one from a display of four. This shape recognition task was then repeated in one final control trial. © 2016 Guardian News and Media Limited

Keyword: Attention
Link ID: 22394 - Posted: 07.04.2016

By Clare Wilson People who meditate are more aware of their unconscious brain activity – or so a new take on a classic “free will” experiment suggests. The results hint that the feeling of conscious control over our actions can vary – and provide more clues to understanding the complex nature of free will. The famous experiment that challenged our notions of free will was first done in 1983 by neuroscientist Benjamin Libet. It involved measuring electrical activity in someone’s brain while asking them to press a button, whenever they like, while they watch a special clock that allows them to note the time precisely. Typically, people feel like they decide to press the button about 200 milliseconds before their finger moves – but the electrodes reveal activity in the part of their brain that controls movement occurs a further 350 milliseconds before they feel they make that decision. This suggests that in fact it is the unconscious brain that “decides” when to press the button. In the new study, a team at the University of Sussex in Brighton, UK, did a slimmed-down version of the experiment (omitting the brain electrodes), with 57 volunteers, 11 of whom regularly practised mindfulness meditation. The meditators had a longer gap in time between when they felt like they decided to move their finger and when it physically moved – 149 compared with 68 milliseconds for the other people. © Copyright Reed Business Information Ltd.
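
The arithmetic of the Libet timeline is easy to lose in prose. The sketch below is a minimal illustration using only the figures quoted above; the variable names and print-out are ours, not the study's, and the clock's zero point is simply the moment the finger moves.

```python
# Timeline of a Libet-style trial, in milliseconds relative to the moment
# the finger moves (t = 0). Figures are those quoted in the article.
movement = 0
felt_decision = movement - 200             # decision reported ~200 ms before moving
readiness_potential = felt_decision - 350  # motor-area activity a further 350 ms earlier

print(f"brain activity begins: {readiness_potential} ms")  # -550 ms
print(f"felt decision:         {felt_decision} ms")        # -200 ms
print(f"finger moves:          {movement} ms")

# The Sussex variant measured the felt-decision-to-movement gap directly:
gap_ms = {"meditators": 149, "non-meditators": 68}
print(f"meditators reported deciding "
      f"{gap_ms['meditators'] - gap_ms['non-meditators']} ms earlier, "
      "relative to movement, than non-meditators")
```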

Keyword: Consciousness
Link ID: 22369 - Posted: 06.28.2016

By Moshe Bar A friend of mine has a bad habit of narrating his experiences as they are taking place. I tease him for being a bystander in his own life. To be fair, we all fail to experience life to the fullest. Typically, our minds are too occupied with thoughts to allow complete immersion even in what is right in front of us. Sometimes, this is O.K. I am happy not to remember passing a long stretch of my daily commute because my mind has wandered and my morning drive can be done on autopilot. But I do not want to disappear from too much of life. Too often we eat meals without tasting them, look at something beautiful without seeing it. An entire exchange with my daughter (please forgive me) can take place without my being there at all. Recently, I discovered how much we overlook, not just about the world, but also about the full potential of our inner life, when our mind is cluttered. In a study published in this month’s Psychological Science, the graduate student Shira Baror and I demonstrate that the capacity for original and creative thinking is markedly stymied by stray thoughts, obsessive ruminations and other forms of “mental load.” Many psychologists assume that the mind, left to its own devices, is inclined to follow a well-worn path of familiar associations. But our findings suggest that innovative thinking, not routine ideation, is our default cognitive mode when our minds are clear. In a series of experiments, we gave participants a free-association task while simultaneously taxing their mental capacity to different degrees. In one experiment, for example, we asked half the participants to keep in mind a string of seven digits, and the other half to remember just two digits. While the participants maintained these strings in working memory, they were given a word (e.g., shoe) and asked to respond as quickly as possible with the first word that came to mind (e.g., sock). © 2016 The New York Times Company

Keyword: Attention
Link ID: 22360 - Posted: 06.25.2016

By Nancy Szokan Let’s begin by defining something psychologists call “ego depletion.” This is the idea that all of us have only a certain amount of self-control, and if we use up too much in one part of our lives, we will have less to use in others. An early example came from a 1998 study in which participants were tempted with a chocolate treat before being given a difficult puzzle: Those who resisted the temptation seemed to have used up some of their willpower, because they gave up on the puzzle faster than the treat eaters. There have been many subsequent studies about ego depletion, including its apparent effects on physical performance: In 2012, athletes who were given a difficult mental task before a physical challenge exhibited less determination to do well on the sports test than those who hadn’t done the puzzle. But recently a replication study (in which researchers repeat a published experiment to see if they come up with the same results) tested more than 2,000 participants at 24 labs and found the ego depletion effect to be very small or nonexistent. Which, as Lea Winerman reports, has led such psychologists as Michael Inzlicht of the University of Toronto to a crisis of confidence. Maybe, he thinks, ego depletion and the other social psychological effects he has made a career of studying are “proven” by unreliable research. “I used to think there were errors, but that the errors were minor and it was fine,” Winerman quotes Inzlicht as saying in the June issue of Monitor on Psychology, a publication of the American Psychological Association. “But as I started surveying the field, I started thinking we’ve been making some major mistakes.”
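
A “very small” effect has a precise meaning here: replication projects typically report a standardized mean difference (Cohen's d) near zero. The sketch below shows how that statistic is computed; the persistence scores are hypothetical stand-ins, not the replication study's data.

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Standardized difference between two independent group means."""
    pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical seconds-spent-on-puzzle scores for a depleted and a control
# group; NOT the replication's actual data.
d = cohens_d(m1=120, s1=40, n1=1000, m2=124, s2=40, n2=1000)
print(f"d = {d:.2f}")  # -0.10: 'very small' by conventional benchmarks
```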

Keyword: Attention
Link ID: 22337 - Posted: 06.20.2016

By Tanya Lewis The human brain may wind down when asleep, but it doesn’t lose all responsiveness. Researchers from the École Normale Supérieure in Paris and their colleagues recently used electroencephalography (EEG) to monitor the brains of volunteers listening to recordings of spoken words, which they were asked to classify as either objects or animals. Participants were able to classify words during light non-REM (NREM) sleep, but not during either deep NREM sleep or REM sleep, according to a study published today (June 14) in The Journal of Neuroscience. “With an elegant experimental design and sophisticated analyses of neural activity, [the authors] demonstrate the extent to which the sleeping brain is able to process sensory information, depending on sleep depth [or] stage,” Thomas Schreiner of the University of Fribourg in Switzerland, who was not involved in the study, wrote in an email to The Scientist. During sleep, the brain is thought to block out external stimuli through a gating mechanism at the level of the thalamus. But experiments dating back to the 1960s have shown that certain types of stimuli, such as hearing one’s name, can filter through and trigger awakening. However, the mechanisms that allow the brain to selectively take in information during sleep remain unknown. “When we fall asleep, it’s pretty similar to a coma because we lose consciousness of our self and of the [outside] world,” study coauthor Thomas Andrillon, a neuroscientist at the École Normale Supérieure, told The Scientist. The question was “whether the brain could still monitor what was going on around, just to be sure the environment was still safe,” he added. © 1986-2016 The Scientist

Keyword: Sleep; Attention
Link ID: 22334 - Posted: 06.18.2016

Alva Noë Sometimes the mind wanders. Thoughts pop into consciousness. Ideas or images are present when just a moment before they were not. Scientists recently have been turning their attention to making sense of this. One natural picture of the phenomenon goes something like this. Typically, our thoughts and feelings are shaped by what we are doing, by what there is around us. The world captures our attention and compels our minds this way or that. What explains the fact that you think of a red car when there is a red car in front of you is, well, the red car. And similarly, it is that loud noise that causes you to orient yourself to the commotion that is producing it. In such cases, we might say, the mind is coupled to the world around it and the world, in a way, plays us the way a person might play a piano. But sometimes, even without going to sleep, we turn away from the world. We turn inward. We are contemplative or detached. We decouple ourselves from the environment and we are set free, as it were, to let our minds play themselves. This natural picture has gained some support from the discovery of the so-called Default Mode Network. The DMN is a network of neural systems whose activation seems to be suppressed by active engagement with the world around us; the DMN, in contrast, is activated (or rather, it tends to return to baseline levels of activity) precisely when we detach ourselves from what's going on around us. The DMN is the brain running in neutral. One of the leading hypotheses to explain mind-wandering and the emergence of spontaneous thoughts is that this is the result of the operation of the brain's Default Mode Network. © 2016 npr

Keyword: Attention
Link ID: 22331 - Posted: 06.18.2016

By Devi Shastri Calling someone a “bird brain” might not be the zinger of an insult you thought it was: A new study shows that—by the total number of forebrain neurons—some birds are much brainier than we thought. The study, published online today in the Proceedings of the National Academy of Sciences, found that 28 bird species have more neurons in their pallial telencephalons, the brain region responsible for higher level learning, than mammals with similar-sized brains. Parrots and songbirds in particular packed in the neurons, with parrots (like the gray parrot, above) ranging from 227 million to 3.14 billion, and songbirds—including the notoriously intelligent crow—from 136 million to 2.17 billion. That’s about twice as many neurons as primates with brains of the same mass and four times as many as rodent brains of the same mass. To come up with their count, the researchers dissected the bird brains and then dissolved them in a detergent solution, ensuring that the cells were suspended in what neuroscientist Suzana Herculano-Houzel of Vanderbilt University in Nashville calls “brain soup.” This allowed them to label, count, and estimate how many neurons were in a particular brain region. The region that they focused on allows some birds to hone skills like tool use, planning for the future, learning birdsong, and mimicking human speech. One surprising finding was that the neurons were much smaller than expected, with shorter and more compact connections between cells. The team’s next step is to examine whether these neurons started out small or instead shrank in order to keep the birds light enough for flight. One thing, at least, is clear: It’s time to find a new insult for your less brainy friends. © 2016 American Association for the Advancement of Science
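
The comparison turns on simple ratios at matched brain mass. The sketch below restates the article's two ratios; the primate baseline count is an arbitrary illustrative figure, not a measured value.

```python
# Ratios quoted above, at equal brain mass. The primate baseline is an
# arbitrary illustration, not a count from the study.
primate = 1.0e9        # neurons in a primate brain of some mass m (illustrative)
bird = 2 * primate     # parrots/songbirds: ~2x primates at the same mass
rodent = bird / 4      # birds have ~4x rodents, so rodents get bird / 4

for label, n in [("bird", bird), ("primate", primate), ("rodent", rodent)]:
    print(f"{label:>7}: {n:.1e} neurons")

# Same mass but more neurons implies smaller, more densely packed cells,
# matching the authors' observation of shorter, more compact connections.
```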

Keyword: Evolution; Animal Communication
Link ID: 22315 - Posted: 06.14.2016

Michael Graziano Ever since Charles Darwin published On the Origin of Species in 1859, evolution has been the grand unifying theory of biology. Yet one of our most important biological traits, consciousness, is rarely studied in the context of evolution. Theories of consciousness come from religion, from philosophy, from cognitive science, but not so much from evolutionary biology. Maybe that’s why so few theories have been able to tackle basic questions such as: What is the adaptive value of consciousness? When did it evolve and what animals have it? The Attention Schema Theory (AST), developed over the past five years, may be able to answer those questions. The theory suggests that consciousness arises as a solution to one of the most fundamental problems facing any nervous system: Too much information constantly flows in to be fully processed. The brain evolved increasingly sophisticated mechanisms for deeply processing a few select signals at the expense of others, and in the AST, consciousness is the ultimate result of that evolutionary sequence. If the theory is right—and that has yet to be determined—then consciousness evolved gradually over the past half billion years and is present in a range of vertebrate species. Even before the evolution of a central brain, nervous systems took advantage of a simple computing trick: competition. Neurons act like candidates in an election, each one shouting and trying to suppress its fellows. At any moment only a few neurons win that intense competition, their signals rising up above the noise and impacting the animal’s behavior. This process is called selective signal enhancement, and without it, a nervous system can do almost nothing. © 2016 by The Atlantic Monthly Group
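
The “election” metaphor describes what modellers call winner-take-all dynamics: mutual suppression plus a little self-excitation leaves only the strongest signal active. The toy loop below is a generic illustration of selective signal enhancement, not Graziano's actual model; the growth and inhibition constants are arbitrary.

```python
def winner_take_all(signals, growth=0.1, inhibition=0.3, steps=50):
    """Each unit excites itself slightly and is suppressed in proportion
    to its rivals' summed activity; activity is clamped to [0, 1]."""
    x = list(signals)
    for _ in range(steps):
        total = sum(x)
        x = [min(1.0, max(0.0, xi * (1 + growth) - inhibition * (total - xi)))
             for xi in x]
    return x

# Four competing signals: only the strongest rises above the noise.
print(winner_take_all([0.9, 0.7, 0.5, 0.3]))  # -> [1.0, 0.0, 0.0, 0.0]
```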

Keyword: Consciousness; Evolution
Link ID: 22306 - Posted: 06.09.2016

By Rachel Feltman Archerfish are already stars of the animal kingdom for their stunning spit-takes. They shoot high-powered water jets from their mouths to stun prey, making them one of just a few fish species known to use tools. But by training Toxotes chatareus to direct those jets of spit at certain individuals, scientists have shown that the little guys have another impressive skill: They seem to be able to distinguish one human face from another, something never before witnessed in fish and spotted just a few times in non-human animals. The results, published Tuesday in the Nature journal Scientific Reports, could help us understand how humans got so good at telling each other apart. Or how most people got to be good at that, anyway. I'm terrible at it. It's generally accepted that the fusiform gyrus, a brain structure located in the neocortex, allows humans to tell one another apart with a speed and accuracy that other species can't manage. But there's some debate over whether human faces are so innately complex, and distinguishing them so much harder than other feats of memory or pattern recognition, that this region of the brain evolved especially as a necessary facilitator of the skill. Birds, which have been shown to distinguish humans from one another, have the same structure. But some researchers still think that facial recognition might be something that humans learn — it's not an innate skill — and that the fusiform gyrus is just the spot where we happen to process all the necessary information.

Keyword: Attention; Evolution
Link ID: 22299 - Posted: 06.08.2016

By Clare Wilson We’ve all been there: after a tough mental slog your brain feels as knackered as your body does after a hard workout. Now we may have pinpointed one of the brain regions worn out by a mentally taxing day – and it seems to also affect our willpower, so perhaps we should avoid making important decisions when mentally fatigued. Several previous studies have suggested that our willpower is a finite resource, and if it gets depleted in one way – like finishing a difficult task – we find it harder to make other good choices, like resisting a slice of cake. In a small trial, Bastien Blain at INSERM in Paris and his colleagues asked volunteers to spend six hours doing tricky memory tasks, while periodically choosing either a small sum of cash now or a larger amount after a delay. As the day progressed, people became more likely to act on impulse and to pick an immediate reward. This didn’t happen in the groups that spent time doing easier memory tasks, reading or gaming. For those engaged in difficult work, fMRI brain scans showed a decrease in activity in the middle frontal gyrus, a brain area involved in decision-making. “That suggests this region is becoming less excitable, which could be impairing people’s ability to resist temptation,” says Blain. It’s involved in decisions like ‘Shall I have a beer with my friends tonight, or shall I save money to buy a bike next month?’ he says. Previous research has shown that children with more willpower in a similar type of choice test involving marshmallows end up as more successful adults, by some measures. “Better impulse control predicts your eventual wealth and health,” says Blain. The idea that willpower can be depleted is contentious as some researchers have failed to replicate others’ findings. © Copyright Reed Business Information Ltd.
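
The now-versus-later choice in this task is usually modelled with hyperbolic discounting, in which a delayed amount A at delay D is worth V = A / (1 + kD), with a larger k meaning steeper, more impulsive discounting. A minimal sketch, with illustrative k values; the study's actual model and parameters may differ.

```python
def discounted_value(amount, delay_days, k):
    """Hyperbolic discounting: V = A / (1 + k * D)."""
    return amount / (1 + k * delay_days)

def choose(now, later, delay_days, k):
    """Pick whichever option has the higher (discounted) value."""
    return "take now" if now >= discounted_value(later, delay_days, k) else "wait"

# Same offer, two discount rates: a higher k (more impulsive, as after a
# mentally taxing day) flips the choice. Both k values are illustrative only.
for label, k in [("rested", 0.005), ("fatigued", 0.05)]:
    print(label, "->", choose(now=20, later=30, delay_days=30, k=k))
# rested   -> wait      (30 / (1 + 0.15) ~ 26.1 > 20)
# fatigued -> take now  (30 / (1 + 1.5)  = 12.0 < 20)
```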

Keyword: Attention; Learning & Memory
Link ID: 22292 - Posted: 06.07.2016

By Hanoch Ben-Yami Adam Bear opens his article, “What Neuroscience Says about Free Will,” by mentioning a few cases such as pressing snooze on the alarm clock or picking a shirt out of the closet. He continues with an assertion about these cases, and with a question: In each case, we conceive of ourselves as free agents, consciously guiding our bodies in purposeful ways. But what does science have to say about the true source of this experience? This is a bad start. To be aware of ourselves as free agents is not to have an experience. There’s no special tickle which tells you you’re free, no "freedom itch." Rather, to be aware of the fact that you acted freely is, among other things, to know that had you preferred to do something else in those circumstances, you would have done it. And in many circumstances we clearly know that this is the case, so in many circumstances we are aware that we act freely. No experience is involved, and so far there’s no question in Bear’s article for science to answer. Continuing with his alleged experience, Bear writes: …the psychologists Dan Wegner and Thalia Wheatley made a revolutionary proposal: The experience of intentionally willing an action, they suggested, is often nothing more than a post hoc causal inference that our thoughts caused some behavior. More than a revolutionary proposal, this is an additional confusion. What might "intentionally willing an action" mean? Is it to be contrasted with non-intentionally willing an action? But what could this stand for? © 2016 Scientific American

Keyword: Consciousness
Link ID: 22282 - Posted: 06.04.2016

By David Shultz We still may not know what causes consciousness in humans, but scientists are at least learning how to detect its presence. A new application of a common clinical test, the positron emission tomography (PET) scan, seems to be able to differentiate between minimally conscious brains and those in a vegetative state. The work could help doctors figure out which brain trauma patients are the most likely to recover—and even shed light on the nature of consciousness. “This is really cool what these guys did here,” says neuroscientist Nicholas Schiff at Cornell University, who was not involved in the study. “We’re going to make great use of it.” PET scans work by introducing a small amount of radionuclides into the body. These radioactive compounds act as a tracer and naturally emit subatomic particles called positrons over time, and the gamma rays indirectly produced by this process can be detected by imaging equipment. The most common PET scan uses fluorodeoxyglucose (FDG) as the tracer in order to show how glucose concentrations change in tissue over time—a proxy for metabolic activity. Compared with other imaging techniques, PET scans are relatively cheap and easy to perform, and are routinely used to survey for cancer, heart problems, and other diseases. In the new study, researchers used FDG-PET scans to analyze the resting cerebral metabolic rate—the amount of energy being used by the tissue—of 131 patients with a so-called disorder of consciousness and 28 healthy controls. Disorders of consciousness can refer to a wide range of problems, ranging from a full-blown coma to a minimally conscious state in which patients may experience brief periods where they can communicate and follow instructions. Between these two extremes, patients may be said to be in a vegetative state or exhibit unresponsive wakefulness, characterized by open eyes and basic reflexes, but no signs of awareness. Most disorders of consciousness result from head trauma, and where someone falls on the consciousness continuum is typically determined by the severity of the injury. © 2016 American Association for the Advancement of Science
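
In effect, the scan feeds a one-dimensional classifier: normalize each patient's resting cortical metabolism against healthy controls and compare it with a cutoff. The sketch below is only a schematic of that logic; the threshold and the patient values are hypothetical, not figures from the study.

```python
# Schematic of classifying consciousness state from a normalized FDG-PET
# resting metabolic rate. THRESHOLD and sample values are hypothetical.
THRESHOLD = 0.45  # hypothetical fraction of healthy-control metabolism

def classify(metabolic_rate_fraction):
    """Return a coarse consciousness label from a normalized FDG-PET rate."""
    if metabolic_rate_fraction >= THRESHOLD:
        return "minimally conscious (or better)"
    return "vegetative / unresponsive wakefulness"

for patient, rate in {"A": 0.62, "B": 0.38}.items():
    print(f"Patient {patient}: {classify(rate)}")
```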

Keyword: Consciousness; Brain imaging
Link ID: 22260 - Posted: 05.28.2016

By Amina Zafar Tragically Hip frontman Gord Downie's resilience and openness about his terminal glioblastoma and his plans to tour could help to reduce stigma and improve awareness, some cancer experts say. Tuesday's news revealed that the singer has an aggressive form of cancer that originated in his brain. An MRI scan last week showed the tumour has responded well to surgery, radiation and chemotherapy, doctors said. "I was quickly impressed by Gord's resilience and courage," Downie's neuro-oncologist, Dr. James Perry of Sunnybrook Health Sciences Centre, told a news conference. Perry said it's daunting for many of his patients to reveal the diagnosis to their family, children and co-workers. "The news today, while sad, also creates for us in brain tumour research an unprecedented opportunity to create awareness and to create an opportunity for fundraising for research that's desperately needed to improve the odds for all people with this disease," Perry said. "Gord's courage in coming forward with his diagnosis will be a beacon for all patients with glioblastoma in Canada. They will see a survivor continuing with his craft despite its many challenges." ©2016 CBC/Radio-Canada.

Keyword: Glia
Link ID: 22251 - Posted: 05.26.2016

Dean Burnett A recent report by the National Obesity Forum stated that official advice about low-fat diets is wrong. As ever, there’s now heated debate over how valid/accurate this claim is. But let’s step back a moment and ask a revealing question: why do official government dietary guidelines even exist? Why are they necessary? From an entirely logical position, eating food fulfils several requirements. It provides the energy to do things, helps us build up stores of energy for when needed, and provides the materials required to build and maintain our bodies. Therefore, the human body requires a regular intake of nutrients, vitamins and calories to maintain day-to-day functioning. As a result, the human body has developed an intricate digestive system to monitor and regulate our food intake. The digestive system is quite cool. It has a sophisticated nervous system that can operate pretty much independently, so is often regarded as separate from the main one, leading some to describe it as a “second brain”, there to encourage, monitor and process the consumption and digestion of food. It also utilises hormones, namely leptin and ghrelin, which decrease and increase appetite respectively depending on how much food the body has/needs. It’s a painstakingly complex and precise system that’s evolved over aeons to make sure we eat what and when we need to, and get the most out of our food. However, at some point the human brain got involved, then everything went to hell. This is why we can now be presented with foodstuffs we’re repeatedly told are unhealthy, even dangerous, and say “Thanks. Extra chilli sauce on mine, please”.

Keyword: Obesity; Attention
Link ID: 22247 - Posted: 05.25.2016

By Lisa Rapaport (Reuters Health) - Attention deficit hyperactivity disorder (ADHD), usually diagnosed in children, may show up for the first time in adulthood, two recent studies suggest. And not only can ADHD appear for the first time after childhood, but the symptoms for adult-onset ADHD may be different from symptoms experienced by kids, the researchers found. “Although the nature of symptoms differs somewhat between children and adults, all age groups show impairments in multiple domains – school, family and friendships for kids and school, occupation, marriage and driving for adults,” said Stephen Faraone, a psychiatry researcher at SUNY Upstate Medical University in Syracuse, New York and author of an editorial accompanying the two studies in JAMA Psychiatry. Faraone cautions, however, that some newly diagnosed adults might have had undetected ADHD as children. Support from parents and teachers or high intelligence, for example, might prevent ADHD symptoms from emerging earlier in life. It’s not clear whether study participants “were completely free of psychopathology prior to adulthood,” Faraone said in an email. One of the studies, from Brazil, tracked more than 5,200 people born in 1993 until they were 18 or 19 years old. © 2016 Scientific American

Keyword: ADHD; Development of the Brain
Link ID: 22245 - Posted: 05.25.2016

Stephen Cave For centuries, philosophers and theologians have almost unanimously held that civilization as we know it depends on a widespread belief in free will—and that losing this belief could be calamitous. Our codes of ethics, for example, assume that we can freely choose between right and wrong. In the Christian tradition, this is known as “moral liberty”—the capacity to discern and pursue the good, instead of merely being compelled by appetites and desires. The great Enlightenment philosopher Immanuel Kant reaffirmed this link between freedom and goodness. If we are not free to choose, he argued, then it would make no sense to say we ought to choose the path of righteousness. Today, the assumption of free will runs through every aspect of American politics, from welfare provision to criminal law. It permeates the popular culture and underpins the American dream—the belief that anyone can make something of themselves no matter what their start in life. As Barack Obama wrote in The Audacity of Hope, American “values are rooted in a basic optimism about life and a faith in free will.” So what happens if this faith erodes? The sciences have grown steadily bolder in their claim that all human behavior can be explained through the clockwork laws of cause and effect. This shift in perception is the continuation of an intellectual revolution that began about 150 years ago, when Charles Darwin first published On the Origin of Species. Shortly after Darwin put forth his theory of evolution, his cousin Sir Francis Galton began to draw out the implications: If we have evolved, then mental faculties like intelligence must be hereditary. But we use those faculties—which some people have to a greater degree than others—to make decisions. So our ability to choose our fate is not free, but depends on our biological inheritance. © 2016 by The Atlantic Monthly Group.

Keyword: Consciousness
Link ID: 22228 - Posted: 05.18.2016