Chapter 14. Attention and Consciousness


By Quassim Cassam Most people wonder at some point in their lives how well they know themselves. Self-knowledge seems a good thing to have, but hard to attain. To know yourself would be to know such things as your deepest thoughts, desires and emotions, your character traits, your values, what makes you happy and why you think and do the things you think and do. These are all examples of what might be called “substantial” self-knowledge, and there was a time when it would have been safe to assume that philosophy had plenty to say about the sources, extent and importance of self-knowledge in this sense. Not any more. With few exceptions, philosophers of self-knowledge nowadays have other concerns. Here’s an example of the sort of thing philosophers worry about: suppose you are wearing socks and believe you are wearing socks. How do you know that that’s what you believe? Notice that the question isn’t: “How do you know you are wearing socks?” but rather “How do you know you believe you are wearing socks?” Knowledge of such beliefs is seen as a form of self-knowledge. Other popular examples of self-knowledge in the philosophical literature include knowing that you are in pain and knowing that you are thinking that water is wet. For many philosophers the challenge is to explain how these types of self-knowledge are possible. This is usually news to non-philosophers. Most of them imagine that philosophy tries to answer the Big Questions, and “How do you know you believe you are wearing socks?” doesn’t sound much like one of them. If knowing that you believe you are wearing socks qualifies as self-knowledge at all — and even that isn’t obvious — it is self-knowledge of the most trivial kind. Non-philosophers find it hard to figure out why philosophers would be more interested in trivial than in substantial self-knowledge. © 2014 The New York Times Company

Keyword: Consciousness
Link ID: 20402 - Posted: 12.08.2014

By Piercarlo Valdesolo Google “successful Thanksgiving” and you will get a lot of different recommendations. Most you’ve probably heard before: plan ahead, get help, follow certain recipes. But according to new research from Florida State University, enjoying your holiday also requires a key ingredient that few guests consider as they wait to dive face first into the turkey: a belief in free will. What does free will have to do with whether or not Aunt Sally leaves the table in a huff? These researchers argue that belief in free will is essential to experiencing the emotional state that makes Thanksgiving actually about giving thanks: gratitude. Previous research has shown that our level of gratitude for an act depends on three things: 1) the cost to the benefactor (in time, effort or money), 2) the value of the act to the beneficiary, and 3) the sincerity of the benefactor’s intentions. For example, last week my 4-year-old daughter gave me a drawing of our family. This act was costly (she spent time and effort), valuable (I love the way she draws herself bigger than everyone else in the family), and sincere (she drew it because she knew I would like it). But what if I thought that she drew it for a different reason? What if I thought that she was being coerced by my wife? Or if I thought that this was just an assignment at her pre-school? In other words, what if I thought she had no choice but to draw it? I wouldn’t have defiantly thrown it back in her face, but I surely would have felt differently about the sincerity of the action. It would have diminished my gratitude. © 2014 Scientific American

Keyword: Consciousness; Emotions
Link ID: 20360 - Posted: 11.26.2014

By Christof Koch Point to any one organ in the body, and doctors can tell you something about what it does and what happens if that organ is injured by accident or disease or is removed by surgery—whether it be the pituitary gland, the kidney or the inner ear. Yet like the blank spots on maps of Central Africa from the mid-19th century, there are structures whose functions remain unknown despite whole-brain imaging, electroencephalographic recordings that monitor the brain's cacophony of electrical signals and other advanced tools of the 21st century. Consider the claustrum. It is a thin, irregular sheet of cells, tucked below the neocortex, the gray matter that allows us to see, hear, reason, think and remember. It is surrounded on all sides by white matter—the tracts, or wire bundles, that interconnect cortical regions with one another and with other brain regions. The claustra—for there are two of them, one on the left side of the brain and one on the right—lie below the general region of the insular cortex, underneath the temples, just above the ears. They assume a long, thin wisp of a shape that is easily overlooked when inspecting the topography of a brain image. Advanced brain-imaging techniques that look at the white matter fibers coursing to and from the claustrum reveal that it is a neural Grand Central Station. Almost every region of the cortex sends fibers to the claustrum. These connections are reciprocated by other fibers that extend back from the claustrum to the originating cortical region. Neuroanatomical studies in mice and rats reveal a unique asymmetry—each claustrum receives input from both cortical hemispheres but only projects back to the overlying cortex on the same side. Whether or not this is true in people is not known. Curiouser and curiouser, as Alice would have said. © 2014 Scientific American

Keyword: Consciousness
Link ID: 20350 - Posted: 11.24.2014

Kate Szell “I once asked Clara who she was. It was so embarrassing, but she’d had a haircut, so how was I to know?” That’s Rachel; she’s 14 and counts Clara as one of her oldest and best friends. There’s nothing wrong with Rachel’s sight, yet she struggles to recognise others. Why? Rachel is face blind. Most of us take for granted the fact that we recognise someone after a quick glance at their face. We don’t realise we’re doing something very different when we look at a face compared with when we look at anything else. To get a feeling of how peculiar facial recognition is, try recognising people by looking at their hands, instead of their faces. Tricky? That’s exactly how Rachel feels – only she’s not looking at hands, she’s looking straight into someone’s eyes. Specific areas of the brain process facial information. Damage to those areas gives rise to prosopagnosia or “face blindness”: an inability or difficulty with recognising faces. While brain damage-induced prosopagnosia is rare, prosopagnosia itself is not. Studies suggest around 2% of the population could have some form of prosopagnosia. These “developmental” prosopagnosics seem to be born without the ability to recognise faces and don’t acquire it, relying instead on all manner of cues, from gait to hairstyles, to tell people apart. Kirsten Dalrymple from the University of Minnesota is one of a handful of researchers looking into developmental prosopagnosia. Her particular interest is in prosopagnosic children. “Some seem to cope without much of a problem but, for others, it’s a totally different story,” she says. “They can become very socially withdrawn and can also be at risk of walking off with strangers.” © 2014 Guardian News and Media Limited

Keyword: Attention
Link ID: 20347 - Posted: 11.24.2014

By CLYDE HABERMAN The notion that a person might embody several personalities, each of them distinct, is hardly new. The ancient Romans had a sense of this and came up with Janus, a two-faced god. In the 1880s, Robert Louis Stevenson wrote “Strange Case of Dr. Jekyll and Mr. Hyde,” a novella that provided us with an enduring metaphor for good and evil corporeally bound. Modern comic books are awash in divided personalities like the Hulk and Two-Face in the Batman series. Even heroic Superman has his alternating personas. But few instances of the phenomenon captured Americans’ collective imagination quite like “Sybil,” the study of a woman said to have had not two, not three (like the troubled figure in the 1950s’ “Three Faces of Eve”), but 16 different personalities. Alters, psychiatrists call them, short for alternates. As a mass-market book published in 1973, “Sybil” sold in the millions. Tens of millions watched a 1976 television movie version. The story had enough juice left in it for still another television film in 2007. Sybil Dorsett, a pseudonym, became the paradigm of a psychiatric diagnosis once known as multiple personality disorder. These days, it goes by a more anodyne label: dissociative identity disorder. Either way, the strange case of the woman whose real name was Shirley Ardell Mason made itself felt in psychiatrists’ offices across the country. Pre-“Sybil,” the diagnosis was rare, with only about 100 cases ever having been reported in medical journals. Less than a decade after “Sybil” made its appearance, in 1980, the American Psychiatric Association formally recognized the disorder, and the numbers soared into the thousands. People went on television to tell the likes of Jerry Springer and Leeza Gibbons about their many alters. One woman insisted that she had more than 300 identities within her (enough, if you will, to fill the rosters of a dozen major-league baseball teams). Even “Eve,” whose real name is Chris Costner Sizemore, said in the mid-1970s that those famous three faces were surely an undercount. It was more like 22, she said. © 2014 The New York Times Company

Keyword: Consciousness
Link ID: 20346 - Posted: 11.24.2014

By MAX BEARAK MUMBAI, India — The young man sat cross-legged atop a cushioned divan on an ornately decorated stage, surrounded by other Jain monks draped in white cloth. His lip occasionally twitched, his hands lay limp in his lap, and for the most part his eyes were closed. An announcer repeatedly chastised the crowd for making even the slightest noise. From daybreak until midafternoon, members of the audience approached the stage, one at a time, to show the young monk a random object, pose a math problem, or speak a word or phrase in one of at least six different languages. He absorbed the miscellany silently, letting it slide into his mind, as onlookers in their seats jotted everything down on paper. After six hours, the 500th and last item was uttered — it was the number 100,008. An anxious hush descended over the crowd. And the monk opened his eyes and calmly recalled all 500 items, in order, detouring only once to fill in a blank he had momentarily set aside. When he was done, and the note-keepers in the audience had confirmed his achievement, the tense atmosphere dissolved and the announcer led the crowd in a series of triumphant chants. The opportunity to witness the feat of memory drew a capacity crowd of 6,000 to the Sardar Vallabhbhai Patel stadium in Mumbai on Sunday. The exhibition was part of a campaign to encourage schoolchildren to use meditation to build brainpower, as Jain monks have done for centuries in India, a country drawn both toward ancient religious practices and more recent ambitions. But even by Jain standards, the young monk — Munishri Ajitchandrasagarji, 24 — is something special. His guru, P. P. Acharya Nayachandrasagarji, said no other monk in many years had come close to his ability. © 2014 The New York Times Company

Keyword: Learning & Memory; Attention
Link ID: 20334 - Posted: 11.20.2014

James Gorman Evidence has been mounting for a while that birds and other animals can count, particularly when the things being counted are items of food. But most of the research is done under controlled conditions. In a recent experiment with New Zealand robins, Alexis Garland and Jason Low at Victoria University of Wellington tested the birds in a natural setting, giving them no training and no rewards, and showed that they knew perfectly well when a scientist had shown them two mealworms in a box, but then delivered only one. The researchers reported the work this fall in the journal Behavioural Processes. The experiment is intriguing to watch, partly because it looks like a child’s magic trick. The apparatus used is a wooden box that has a sliding drawer. After clearly showing a robin that she was dropping two mealworms in a circular well in the box, Dr. Garland would slide in the drawer. It covered the two worms with an identical-looking circular well containing only one worm. When the researcher moved away and the robin flew down and lifted off a cover, it would find only one worm. The robins pecked intensely at the box, behavior they didn’t show if they found the two worms they were expecting. Earlier experiments had also shown the birds to be good at counting, and Dr. Garland said that one reason might be that they are inveterate thieves. Mates, in particular, steal from one another’s food caches, where they hide perishable prey like worms or insects. “If you’ve got a mate that steals 50 or more percent of your food,” she said, you’d better learn how to keep track of how many mealworms you’ve got. © 2014 The New York Times Company

Keyword: Intelligence; Evolution
Link ID: 20324 - Posted: 11.18.2014

By Anna North Do you devour the latest neuroscience news, eager to learn more about how your brain works? Or do you click past it to something else, something more applicable to your life? If you’re in the latter camp, you may be in the majority. A new study suggests that many people just don’t pay that much attention to brain science, and its findings may raise a question: Is “neuro-literacy” really necessary? At Wired, Christian Jarrett writes, “It feels to me like interest in the brain has exploded.” He cites the prevalence of the word “brain” in headlines as well as “the emergence of new fields such as neuroleadership, neuroaesthetics and neuro-law.” But as a neuroscience writer, he notes, he may be “heavily biased” — and in fact, some research “suggests neuroscience has yet to make an impact on most people’s everyday lives.” For instance, he reports, Cliodhna O’Connor and Helene Joffe recently interviewed 48 Londoners about brain science for a paper published in the journal Science Communication. Anyone who thinks we live in an era of neuro-fixation may find the results a bit of a shock. Said one participant in the research: “Science of the brain? I haven’t a clue. Nothing at all. I’d be lying if I said there was.” Another: “Brain research I understand, an image of, I don’t know, a monkey or a dog with like the top of their head off and electrodes and stuff on their brain.” And another: “I might have seen it on the news or something, you know, some report of some description. But because they probably mentioned the word ‘science,’ or ‘We’re going to go now to our science correspondent Mr. Lala,’ that’s probably when I go, okay, it’s time for me to make a cup of tea.” According to the study authors, 71 percent of respondents “took pains to convey that neuroscience was not salient in their day-to-day life: it was ‘just not really on my radar.’” Some respondents associated brain research with scientists in white coats or with science classes (asked to free-associate about the term “brain research,” one respondent drew a mean-faced stick figure labeled “cross teacher”). And 42 percent saw science as something alien to them, removed from their own lives. © 2014 The New York Times Company

Keyword: Miscellaneous
Link ID: 20315 - Posted: 11.15.2014

By ALAN SCHWARZ CONCORD, Calif. — Every time Matthias is kicked out of a school or day camp for defying adults and clashing with other children, his mother, Joelle Kendle, inches closer to a decision she dreads. With each morning of arm-twisting and leg-flailing as she tries to get him dressed and out the door for first grade, the temptation intensifies. Ms. Kendle is torn over whether to have Matthias, just 6 and already taking the stimulant Adderall for attention deficit hyperactivity disorder, go on a second and more potent medication: the antipsychotic Risperdal. Her dilemma is shared by a steadily rising number of American families who are using multiple psychotropic drugs — stimulants, antipsychotics, antidepressants and others — to temper their children’s troublesome behavior, even though many doctors who mix such medications acknowledge that little is known about the overall benefits and risks for children. In 2012 about one in 54 youngsters ages 6 through 17 covered by private insurance was taking at least two psychotropic medications — a rise of 44 percent in four years, according to Express Scripts, which processes prescriptions for 85 million Americans. Academic studies of children covered by Medicaid have also found higher rates and growth. Combined, the data suggest that about one million children are currently taking various combinations of psychotropics. Risks of antipsychotics alone, for example, are known to include substantial weight gain and diabetes. Stimulants can cause appetite suppression, insomnia and, far more infrequently, hallucinations. Some combinations of medication classes, like antipsychotics and antidepressants, have shown improved benefits (for psychotic depression) but also heightened risks (for heart rhythm disturbances). But this knowledge has been derived substantially from studies in adults — children are rarely studied because of concerns about safety and ethics — leaving many experts worried that the use of multiple psychotropics in youngsters has not been explored fully. There is also debate over whether the United States Food and Drug Administration’s database of patients’ adverse drug reactions reliably monitors the hazards of psychotropic drug combinations, primarily because only a small fraction of cases are ever reported. Some clinicians are left somewhat queasy about relying mostly on anecdotal reports of benefit and harm. © 2014 The New York Times Company

Keyword: ADHD; Schizophrenia
Link ID: 20314 - Posted: 11.15.2014

By Paula Span A few days after I wrote about conditions that can mimic dementia, reader Sue Murray emailed me from Westchester County. Her subject line: “Have you heard of Charles Bonnet Syndrome?” I hadn’t, and until about six months ago, neither had Ms. Murray. Her mother Elizabeth, who is 91, has glaucoma and macular degeneration, and has been gradually losing her vision, Ms. Murray explained. So at first, her family was excited when Elizabeth seemed to be seeing things more clearly. Maybe, they thought, her vision was returning. But the things she was seeing — patterns and colors, strangers, a green man — weren’t there. She insisted that “there were people in the cellar, people on the porch, people in the house,” Ms. Murray said. “She’d point and say, ‘Don’t you see them?’ And she’d get mad when we didn’t.” Elizabeth and her husband Victor, 95, live in Connecticut, in a house they bought 50 years ago. For a while, the Green Man, as Elizabeth began calling him, seemed to have moved in, too. “She’d start hiding things in the closet so the Green Man wouldn’t take them,” Ms. Murray said. “There wasn’t any real fear; it was just, ‘Look at that!’” Elizabeth’s ophthalmologist promptly supplied the name for this condition: Charles Bonnet Syndrome, named for a Swiss philosopher who described such visual hallucinations in the 18th century. “We were relieved,” said Ms. Murray. What they feared, of course, was mental illness or dementia. “To have an eye doctor say, ‘I’m familiar with this,’ it’s still jarring but it’s not so terrible.” Bonnet Syndrome (pronounced Boh-NAY) isn’t terribly rare, it turns out. Oliver Sacks described several cases in his 2012 book, “Hallucinations.” Dr. Abdhish Bhavsar, a clinical spokesperson for the American Academy of Ophthalmology and a retina specialist in Minneapolis, estimates that he has probably seen about 200 patients with the syndrome over 17 years of practice. © 2014 The New York Times Company

Keyword: Vision; Attention
Link ID: 20297 - Posted: 11.10.2014

By Meeri Kim Patients suffering from pagophagia compulsively crave and chomp on ice, even scraping buildup off freezer walls for a fix. The disorder appears to be caused by an iron deficiency, and supplements of the mineral tend to ease the cravings. But what is it about ice that makes it so irresistible? A new study proposes that, like a strong cup of coffee, ice may give those with insufficient iron a much-needed mental boost. Fatigue is the most common symptom of iron-deficiency anemia, which occurs when the body can’t produce enough oxygen-carrying hemoglobin because of low iron. “I had a friend who was suffering from iron-deficiency anemia who was just crunching through massive amounts of ice a day,” said study author Melissa Hunt, a clinical psychologist at the University of Pennsylvania. “She said: ‘It’s like a cup of coffee. I don’t feel awake until I have a cup of ice in my hand.’ ” Hunt and her colleagues had both anemic and healthy subjects complete a standardized, 22-minute attention test commonly used to diagnose attention deficit hyperactivity disorder. Just before the test, participants were given either a cup of ice or lukewarm water to consume. Iron-deficient subjects who had sipped on water performed far more sluggishly on the test than controls, as expected. But those who ate ice beforehand did just as well as their healthy counterparts. For healthy subjects, having a cup of ice instead of water appeared to make no difference in test performance. “It’s not like craving a dessert. It’s more like needing a cup of coffee or that cigarette,” Hunt said.

Keyword: Attention
Link ID: 20296 - Posted: 11.10.2014

By Katy Waldman How much control do you have over how much control you think you have? The researchers Michael R. Ent and Roy F. Baumeister have been studying what makes a person more or less likely to believe in free will. Is it a deep connection to the philosophy of David Hume? An abiding faith in divine omnipotence? Try a really, really full bladder. In an online survey, 81 adults ages 18 to 70 reported the extent to which they felt hungry, tired, desirous of sex, and desirous of a toilet. They then rated the extent to which they considered themselves in command of their destinies. People experiencing intense physical needs were less likely to say they believed in free will. People who were not inexplicably taking an online survey while desperately holding in their pee (or starving, or wanting sex, or trying to stay awake) mostly claimed that the universe had handed them the keys to their lives. Also, people who brought their laptops with them into the bathroom to fill out the survey reported that they were God. (I kid on that last part.) Ent and Baumeister also used a survey to take the free will temperature of 23 people with panic disorder, 16 people with epilepsy, and 35 healthy controls. Those suffering from the two conditions—both of which can unpredictably plunge the mind into chaos—tended to put less stock in the notion of mental autonomy. There was a third experiment, too. I said earlier that people not taking an online survey while jonesing for various creature comforts mostly claimed that they wore the metaphysical pants. However, despite robust results for horniness, fatigue, and needing-to-go-ness, Ent and Baumeister didn’t initially see much correlation between people’s philosophical visions and their hunger levels. So they re-administered the survey to 112 new volunteers, some of whom were dieting and some of whom were not. © 2014 The Slate Group LLC.

Keyword: Consciousness
Link ID: 20294 - Posted: 11.10.2014

By Greg Miller [Image caption: This robot causes people to experience the illusory sensation of someone standing behind them. © Alain Herzog/EPFL] People who’ve stared death in the face and lived to tell about it—mountain climbers who’ve made a harrowing descent, say, or survivors of the World Trade Center attacks—sometimes report that just when their situation seemed impossible, a ghostly presence appeared. People with schizophrenia and certain types of neurological damage sometimes report similar experiences, which scientists call, aptly, “feeling of presence.” Now a team of neuroscientists says it has identified a set of brain regions that seems to be involved in generating this illusion. Better yet, they’ve built a robot that can cause ordinary people to experience it in the lab. The team was led by Olaf Blanke, a neurologist and neuroscientist at the Swiss Federal Institute of Technology in Lausanne. Blanke has a long-standing interest in creepy illusions of bodily perception. Studying these bizarre phenomena, he says, could point to clues about the biology of mental illness and the mechanisms of human consciousness. In 2006, for example, Blanke and colleagues published a paper in Nature that had one of the best titles you’ll ever see in a scientific journal: “Induction of an illusory shadow person.” In that study, they stimulated the brain of a young woman who was awaiting brain surgery for severe epilepsy. Surgeons had implanted electrodes on the surface of her brain to monitor her seizures, and when the researchers passed a mild current through the electrodes, stimulating a small region at the intersection of the temporal and parietal lobes of her brain, she experienced what she described as a shadowy presence lurking nearby, mimicking her own posture. [Image caption: Colored areas indicate regions of overlap in the lesions of neurological patients who experienced feeling of presence illusions.] © 2014 Condé Nast.

Keyword: Attention; Emotions
Link ID: 20290 - Posted: 11.08.2014

By Dwayne Godwin and Jorge Cham © 2014 Scientific American

Keyword: Consciousness; Robotics
Link ID: 20287 - Posted: 11.08.2014

By Lindsey Konkel and Environmental Health News New York City children exposed in the womb to high levels of pollutants in vehicle exhaust had a five times higher risk of attention problems at age 9, according to research by Columbia University scientists published Wednesday. The study adds to earlier evidence that mothers' exposures to polycyclic aromatic hydrocarbons (PAHs), which are emitted by the burning of fossil fuels and other organic materials, are linked to children's behavioral problems associated with Attention Deficit Hyperactivity Disorder (ADHD). “Our research suggests that environmental factors may be contributing to attention problems in a significant way,” said Frederica Perera, an environmental health scientist at Columbia’s Mailman School of Public Health who was the study's lead author. About one in 10 U.S. kids is diagnosed with ADHD, according to the Centers for Disease Control and Prevention. Children with ADHD are at greater risk of poor academic performance, risky behaviors and lower earnings in adulthood, the researchers wrote. “Air pollution has been linked to adverse effects on attention span, behavior and cognitive functioning in research from around the globe. There is little question that air pollutants may pose a variety of potential health risks to children of all ages, possibly beginning in the womb,” said Dr. Andrew Adesman, chief of developmental and behavioral pediatrics at Steven & Alexandra Cohen Children’s Medical Center of New York. He did not participate in the new study. © 2014 Scientific American

Keyword: ADHD; Neurotoxins
Link ID: 20285 - Posted: 11.06.2014

by Helen Thomson A man with the delusional belief that an impostor has taken his wife's place is helping shed light on how we recognise loved ones. Capgras syndrome is a rare condition in which a person insists that someone they are close to – most commonly a spouse – has been replaced by an impostor. Sometimes they even believe that a much-loved pet has also been replaced by a lookalike. Anecdotal evidence suggests that people with Capgras only misidentify the people that they are closest to. Chris Fiacconi at Western University in London, Ontario, Canada, and his team wanted to explore this. They performed recognition tests and brain scans on two male volunteers with dementia – one who had Capgras, and one who didn't – and compared the results with those of 10 healthy men of a similar age. For months, the man with Capgras believed that his wife had been replaced by an impostor and was resistant to any counterargument, often asking his son why he was so convinced that the woman was his mother. First the team tested whether or not the volunteers could recognise celebrities they would have been familiar with throughout their lifetime, such as Marilyn Monroe. Volunteers were presented with celebrities' names, voices or pictures, and asked if they recognised them and, if so, how much information they could recall about that person. The man with Capgras was more likely to misidentify the celebrities by face or voice compared with the volunteer without Capgras, or the 10 healthy men. None of the volunteers had problems identifying celebrities by name (Frontiers in Human Neuroscience, doi.org/wrw). © Copyright Reed Business Information Ltd.

Keyword: Attention; Consciousness
Link ID: 20284 - Posted: 11.06.2014

By Christian Jarrett It feels to me like interest in the brain has exploded. I’ve seen huge investments in brain science by the USA and Europe (the BRAIN Initiative and the Human Brain Project), I’ve read about the rise in media coverage of neuroscience, and above all, I’ve noticed how journalists and bloggers now often frame stories as being about the brain as opposed to the person. Look at these recent headlines: “Why your brain loves storytelling” (Harvard Business Review); “How Netflix is changing our brains” (Forbes); and “Why your brain wants to help one child in need — but not millions” (NPR). There are hundreds more, and in each case, the headline could be about “you” but the writer chooses to make it about “your brain”. Consider too the emergence of new fields such as neuroleadership, neuroaesthetics and neuro-law. It was only a matter of time before someone announced that we’re in the midst of a neurorevolution. In 2009 Zach Lynch did that, publishing The Neuro Revolution: How Brain Science is Changing Our World. Having said all that, I’m conscious that my own perspective is heavily biased. I earn my living writing about neuroscience and psychology. I’m vigilant for all things brain. Maybe the research investment and brain-obsessed media headlines are largely irrelevant to the general public. I looked into this question recently and was surprised by what I found. There’s not a lot of research, but that which exists (such as this, on the teen brain) suggests neuroscience has yet to make an impact on most people’s everyday lives. Indeed, Myth #20 in my new book Great Myths of the Brain is “Neuroscience is transforming human self-understanding”. WIRED.com © 2014 Condé Nast.

Keyword: Attention
Link ID: 20282 - Posted: 11.06.2014

By RICHARD A. FRIEDMAN ATTENTION deficit hyperactivity disorder is now the most prevalent psychiatric illness of young people in America, affecting 11 percent of them at some point between the ages of 4 and 17. The rates of both diagnosis and treatment have increased so much in the past decade that you may wonder whether something that affects so many people can really be a disease. And for a good reason. Recent neuroscience research shows that people with A.D.H.D. are actually hard-wired for novelty-seeking — a trait that had, until relatively recently, a distinct evolutionary advantage. Compared with the rest of us, they have sluggish and underfed brain reward circuits, so much of everyday life feels routine and understimulating. To compensate, they are drawn to new and exciting experiences and get famously impatient and restless with the regimented structure that characterizes our modern world. In short, people with A.D.H.D. may not have a disease, so much as a set of behavioral traits that don’t match the expectations of our contemporary culture. From the standpoint of teachers, parents and the world at large, the problem with people with A.D.H.D. looks like a lack of focus and attention and impulsive behavior. But if you have the “illness,” the real problem is that, to your brain, the world that you live in essentially feels not very interesting. One of my patients, a young woman in her early 20s, is prototypical. “I’ve been on Adderall for years to help me focus,” she told me at our first meeting. Before taking Adderall, she found sitting in lectures unendurable and would lose her concentration within minutes. Like many people with A.D.H.D., she hankered for exciting and varied experiences and also resorted to alcohol to relieve boredom. But when something was new and stimulating, she had laserlike focus. I knew that she loved painting and asked her how long she could maintain her interest in her art. “No problem. I can paint for hours at a stretch.” Rewards like sex, money, drugs and novel situations all cause the release of dopamine in the reward circuit of the brain, a region buried deep beneath the cortex. Aside from generating a sense of pleasure, this dopamine signal tells your brain something like, “Pay attention, this is an important experience that is worth remembering.” © 2014 The New York Times Company

Keyword: ADHD; Learning & Memory
Link ID: 20272 - Posted: 11.03.2014

Maanvi Singh How does a sunset work? We love to look at one, but Jolanda Blackwell wanted her eighth-graders to really think about it, to wonder and question. So Blackwell, who teaches science at Oliver Wendell Holmes Junior High in Davis, Calif., had her students watch a video of a sunset on YouTube as part of a physics lesson on motion. "I asked them: 'So what's moving? And why?' " Blackwell says. The students had a lot of ideas. Some thought the sun was moving; others, of course, knew that a sunset is the result of the Earth spinning around on its axis. Once she got the discussion going, the questions came rapid-fire. "My biggest challenge usually is trying to keep them patient," she says. "They just have so many burning questions." Students asking questions and then exploring the answers. That's something any good teacher lives for. And at the heart of it all is curiosity. Blackwell, like many others teachers, understands that when kids are curious, they're much more likely to stay engaged. But why? What, exactly, is curiosity and how does it work? A study published in the October issue of the journal Neuron suggests that the brain's chemistry changes when we become curious, helping us better learn and retain information. © 2014 NPR

Keyword: Learning & Memory; Attention
Link ID: 20271 - Posted: 11.03.2014

By C. NATHAN DeWALL How many words does it take to know you’re talking to an adult? In “Peter Pan,” J. M. Barrie needed just five: “Do you believe in fairies?” Such belief requires magical thinking. Children suspend disbelief. They trust that events happen with no physical explanation, and they equate an image of something with its existence. Magical thinking was Peter Pan’s key to eternal youth. The ghouls and goblins that will haunt All Hallows’ Eve on Friday also require people to take a leap of faith. Zombies wreak terror because children believe that the once-dead can reappear. At haunted houses, children dip their hands in buckets of cold noodles and spaghetti sauce. Even if you tell them what they touched, they know they felt guts. And children surmise that with the right Halloween makeup, costume and demeanor, they can frighten even the most skeptical adult. We do grow up. We get jobs. We have children of our own. Along the way, we lose our tendencies toward magical thinking. Or at least we think we do. Several streams of research in psychology, neuroscience and philosophy are converging on an uncomfortable truth: We’re more susceptible to magical thinking than we’d like to admit. Consider the quandary facing college students in a clever demonstration of magical thinking. An experimenter hands you several darts and instructs you to throw them at different pictures. Some depict likable objects (for example, a baby); others are neutral (for example, a face-shaped circle). Would your performance differ if you lobbed darts at a baby? It would. Performance plummeted when people threw the darts at the baby. Laura A. King, the psychologist at the University of Missouri who led this investigation, notes that research participants have a “baseless concern that a picture of an object shares an essential relationship with the object itself.” Paul Rozin, a psychology professor at the University of Pennsylvania, argues that these studies demonstrate the magical law of similarity. Our minds subconsciously associate an image with an object. When something happens to the image, we experience a gut-level intuition that the object has changed as well. © 2014 The New York Times Company

Keyword: Attention
Link ID: 20253 - Posted: 10.28.2014