Chapter 18. Attention and Higher Cognition




Scientists have come up with a questionnaire they say should help diagnose a condition called face blindness. Prosopagnosia, as doctors call it, affects around two in every 100 people in the UK and is the inability to recognise people by their faces alone. In its most extreme form, people cannot even recognise their family or friends. Milder forms, while still distressing, can be tricky to diagnose, which is why tests are needed. People with prosopagnosia often use non-facial cues to recognise others, such as their hairstyle, clothes, voice, or distinctive features. Some may be unaware they have the condition, instead believing they have a "bad memory for faces". But prosopagnosia is entirely unrelated to intelligence or broader memory ability. One [anonymous] person with prosopagnosia explains: "My biggest problem is seeing the difference between ordinary-looking people, especially faces with few specific traits. I work at a hospital with an awful lot of employees and I often introduce myself to colleagues with whom I have worked several times before. I also often have problems recognising my next-door neighbour, even though we have been neighbours for eight years now. She often changes clothes, hairstyle and hair colour. When I strive to recognise people, I try to use technical clues like clothing, hairstyle, scars, glasses, their dialect and so on." Doctors can use computer-based tests to see if people can spot famous faces and memorise and recognise a set of unfamiliar faces. And now Drs Richard Cook and Punit Shah, of City University London and King's College London, have come up with a 20-item questionnaire to help measure the severity of someone's face blindness. © 2015 BBC

Keyword: Attention
Link ID: 21598 - Posted: 11.04.2015

By Christian Jarrett Neuroscientists, for obvious reasons, are really interested in finding out what’s different about the brains of people with unpleasant personalities, such as narcissists, or unsavory habits, like porn addiction. Their hope is that by studying these people’s brains we might learn more about the causes of bad character, and ways to helpfully intervene. Now to the list of character flaws that've received the brain-scanner treatment we can apparently add sexism — a new Japanese study published in Scientific Reports claims to have found its neurological imprint. The researchers wanted to know whether there is something different about certain individuals’ brains that potentially predisposes them to sexist beliefs and attitudes (of course, as with so much neuroscience research like this, it’s very hard to disentangle whether any observed brain differences are the cause or consequence of the trait or behavior that’s being studied, a point I’ll come back to). More specifically, they were looking to see if people who publicly endorse gender inequality have brains that are anatomically different from people who believe in gender equality. In short, it seems the answer is yes. Neuroscientist Hikaru Takeuchi at Tohoku University and his colleagues have identified two brain areas where people who hold sexist attitudes have different levels of gray-matter density (basically, a measure of how many brain cells are packed into a given area), as compared with people who profess a belief in gender equality (their study doesn’t speak to any subconsciously held sexist beliefs). What’s more, these neural differences were correlated with psychological characteristics that could help explain some people’s sexist beliefs. © 2015, New York Media LLC.

Keyword: Attention; Emotions
Link ID: 21579 - Posted: 10.29.2015

By ALEX HUTCHINSON WHEN marketing researchers at the University of Pennsylvania’s Wharton School rigged shopping carts at a major East Coast supermarket with motion-tracking radio-frequency tags, they unwittingly stumbled on a metaphor for our path through the aisles of life. Route data from more than 1,000 shoppers, matched to their purchases at checkout, revealed a clear pattern: Drop a bunch of kale into your cart and you’re more likely to head next to the ice cream or beer section. The more “virtuous” products you have in your basket, the stronger your temptation to succumb to vice. Such hedonic balancing acts are neither unpredictable — who, after all, hasn’t rewarded themselves with a piece of cake or an extra beer after a killer workout? — nor inherently bad. But an emerging body of research into what psychologists call the “licensing effect” suggests that this tit-for-tat tendency is deeply wired in us, operating even when we’re not aware of it. And in a world where we’re bombarded by pitches for an endless array of health-boosting products of dubious efficacy, that can be a problem. The key insight underlying the licensing effect, which was first described in 2006 by Uzma Khan, then a professor of marketing at Carnegie Mellon University, and Ravi Dhar of the Yale School of Management, is that our choices are contingent: Since we each have a fairly stable self-concept of how good/bad, healthy/unhealthy or selfish/altruistic we are, when one decision swings too far from this self-concept, we automatically take action to balance it out. © 2015 The New York Times Company

Keyword: Obesity; Attention
Link ID: 21565 - Posted: 10.26.2015

Dyscalculia is like dyslexia — but for those who have trouble with math instead of reading. Not enough people know about it, according to a neuroscientist. "There is a lack of awareness among teachers and educators," said Daniel Ansari, professor and Canada Research Chair in Developmental Cognitive Neuroscience at the University of Western Ontario. Individuals with dyscalculia have trouble with simple calculations. "If I ask you what is 1 + 3, you don't need to calculate. Four will pop into your head; it is stored in your long-term memory," he said. But those with dyscalculia will have to use their hands to count. Scientists have known about dyscalculia since the 1940s, but little research has been done on it, even though it is probably just as common as dyslexia, says Ansari. There is currently no universal test for dyscalculia, but Ansari has come up with screening tests for children in kindergarten. He says it's important to diagnose dyscalculia early on, so individuals can learn to adapt and improve their skills before it's too late. "We don't just need math to be good in school but to function in society," said Ansari. He says research has shown poor math skills can lead to an increased chance of unemployment, imprisonment or mortgage default. ©2015 CBC/Radio-Canada.

Keyword: Attention
Link ID: 21564 - Posted: 10.26.2015

In a study of mice, scientists discovered that a brain region called the thalamus may be critical for filtering out distractions. The study, published in Nature and partially funded by the National Institutes of Health, paves the way to understanding how defects in the thalamus might underlie symptoms seen in patients with autism, attention deficit hyperactivity disorder (ADHD), and schizophrenia. “We are constantly bombarded by information from our surroundings,” said James Gnadt, Ph.D., program director at the NIH’s National Institute of Neurological Disorders and Stroke (NINDS). “This study shows how the circuits of the brain might decide which sensations to pay attention to.” Thirty years ago Dr. Francis Crick proposed that the thalamus “shines a light” on regions of the cortex, which readies them for the task at hand, leaving the rest of the brain’s circuits to idle in darkness. “We typically use a very small percentage of incoming sensory stimuli to guide our behavior, but in many neurological disorders the brain is overloaded,” said Michael Halassa, M.D., Ph.D., the study’s senior author and an assistant professor at New York University’s Langone Medical Center. “It gets a lot of sensory input that is not well-controlled because this filtering function might be broken.” Neuroscientists have long believed that an area at the very front of the brain called the prefrontal cortex (PFC) selects what information to focus on, but how this happens remains unknown. One common theory is that neurons in the PFC do this by sending signals to cells in the sensory cortices located on the outer part of the brain. However, Dr. Halassa’s team discovered that PFC neurons may instead tune the sensitivity of a mouse brain to sights and sounds by sending signals to inhibitory thalamic reticular nucleus (TRN) cells located deep inside the brain.

Keyword: Attention
Link ID: 21545 - Posted: 10.22.2015

Susan Gaidos CHICAGO — Teens like high-tech gadgets so much that they often use them all at once. While doing homework or playing video games, teens may listen to music or watch TV, all the while texting their friends. Some of these multitaskers think they are boosting their ability to attend to multiple activities, but they are in fact more likely impairing their ability to focus, psychologist Mona Moisala of the University of Helsinki reported October 18 at the annual meeting of the Society for Neuroscience. Moisala and colleagues tested 149 adolescents and young adults, ages 13 to 24, who regularly juggle multiple forms of media or play video games daily. Each participant had to focus attention on sentences (some logical, some illogical) under three conditions: without any distractions, while listening to distracting sounds, and while both listening to a sentence and reading another sentence. Using functional MRI to track brain activity, the researchers found that daily gaming had no effect on participants' ability to focus. Those who juggle multiple forms of electronic media, however, had more trouble paying attention. Multitaskers performed worse overall, even when they weren't being distracted. Brain images also revealed a higher level of activity in the multitaskers' right prefrontal cortex, an area of the brain implicated in problem solving and in processing complex thoughts and emotions. "Participants with the highest reported frequency of multimedia use showed the highest levels of brain activation in this area," Moisala said. "In addition, these adolescents did worse on the task." © Society for Science & the Public 2000 - 2015

Keyword: Attention
Link ID: 21529 - Posted: 10.20.2015

by Bethany Brookshire It’s happened to all of us at one time or another: You’re walking through a crowd, and suddenly a face seems incredibly familiar — so much so that you do a double-take. Who is that? How do you know them? You have no idea, but something about their face nags at you. You know you’ve seen it before. The reason you know that face is in part because of your perirhinal cortex. This is an area of the brain that helps us to determine familiarity, or whether we have seen an object before. A new study of brain cells in this area finds that firing these neurons at one frequency makes the brain treat novel images as old hat. But firing these same neurons at another frequency can make the old new again. “Novelty and familiarity are both really important,” says study coauthor Rebecca Burwell, a neuroscientist at Brown University in Providence, R.I. “They are important for learning and memory and decision making.” Finding a cache of food and knowing it is new could be useful for an animal’s future. So is recognizing a familiar place where the pickings were good in the past. But knowing that something is familiar is not quite the same thing as knowing what that thing is. “You’re in a crowd and you see a familiar face, and there’s a feeling,” Burwell explains. “You can’t identify them, you don’t know where you met them, but there’s a sense of familiarity.” It’s different from recalling where you met the person, or even who the person is. This is a sense at the base of memory. And while scientists knew the perirhinal cortex was involved in this sense of familiarity, how that feeling of new or old was coded in the brain wasn’t fully understood. © Society for Science & the Public 2000 - 2015

Keyword: Attention
Link ID: 21511 - Posted: 10.14.2015

By ERICA GOODE Women who suffer from anorexia are often thought of as having an extraordinary degree of self-control, even if that discipline is used self-destructively. But a new study suggests that the extreme dieting characteristic of anorexia may instead be a well-entrenched habit — behavior governed by brain processes that, once set in motion, are inflexible and slow to change. The study’s findings may help explain why the eating disorder, which has the highest mortality rate of any mental illness, is so stubbornly difficult to treat. But they also add to increasing evidence that the brain circuits involved in habitual behavior play a role in disorders where people persist in making self-destructive choices no matter the consequences, like cocaine addiction or compulsive gambling. In the case of anorexia, therapists often feel helpless to interrupt the relentless dieting that anorexic patients pursue. Even when patients say they want to recover, they often continue to eat only low-fat, low-calorie foods. Neither psychiatric medications nor talk therapies that are used successfully for other eating disorders are much help in most cases. And research suggests that 50 percent or more of hospitalized anorexic patients who are discharged at a normal weight will relapse within a year. “The thing about people with anorexia nervosa is that they can’t stop,” said Dr. Joanna E. Steinglass, an associate professor in clinical psychiatry at the New York State Psychiatric Institute at Columbia University Medical Center and a co-author of the new study, which appears in the journal Nature Neuroscience. “They come into treatment saying they want to get better, and they can’t do it,” Dr. Steinglass added. Karin Foerde, a research scientist at the psychiatric institute and Columbia, was the lead author on the study. © 2015 The New York Times Company

Keyword: Apoptosis; Attention
Link ID: 21505 - Posted: 10.13.2015

By KENNETH D. MILLER SOME hominid along the evolutionary path to humans was probably the first animal with the cognitive ability to understand that it would someday die. To be human is to cope with this knowledge. Many have been consoled by the religious promise of life beyond this world, but some have been seduced by the hope that they can escape death in this world. Such hopes, from Ponce de León’s quest to find a fountain of youth to the present vogue for cryogenic preservation, inevitably prove false. In recent times it has become appealing to believe that your dead brain might be preserved sufficiently by freezing so that some future civilization could bring your mind back to life. Assuming that no future scientists will reverse death, the hope is that they could analyze your brain’s structure and use this to recreate a functioning mind, whether in engineered living tissue or in a computer with a robotic body. By functioning, I mean thinking, feeling, talking, seeing, hearing, learning, remembering, acting. Your mind would wake up, much as it wakes up after a night’s sleep, with your own memories, feelings and patterns of thought, and continue on into the world. I am a theoretical neuroscientist. I study models of brain circuits, precisely the sort of models that would be needed to try to reconstruct or emulate a functioning brain from a detailed knowledge of its structure. I don’t in principle see any reason that what I’ve described could not someday, in the very far future, be achieved (though it’s an active field of philosophical debate). But to accomplish this, these future scientists would need to know details of staggering complexity about the brain’s structure, details quite likely far beyond what any method today could preserve in a dead brain. © 2015 The New York Times Company

Keyword: Robotics; Consciousness
Link ID: 21499 - Posted: 10.12.2015

By Erika Hayasaki For 40 years, Joel Dreyer was a respected psychiatrist who oversaw a clinic for troubled children, belonged to an exclusive country club, and doted on his four daughters and nine grandchildren. Then, suddenly, he became a major drug dealer. Why? In the 1980s, psychiatrist Joel Dreyer was a fixture on Detroit’s WXYZ Channel 7. His commercials promoting his treatment center, InnerVisions, which he named after the Stevie Wonder album, sometimes ran up to five times a day. In one ad, Dreyer blocks a bartender from serving a mug of beer to a patron and says, “Don’t let your marriage or your job suffer from alcohol or drugs.” In another, Dreyer, in a navy pinstriped suit with a white pocket square, looks into the camera, his expression concerned and sympathetic. “Don’t you want to talk to someone who will listen?” he asks. “Someone who won’t pass judgment? Someone who cares? Come talk to me.” InnerVisions, which was based in Southfield, a suburb northwest of Detroit, had a staff of 80 physicians, psychologists, and therapists and took up two floors of a high-rise. It had made Dreyer not only a public figure but also wealthy. He maintained a side career as an expert witness. Attorneys called on him because he was smart, charming, and persuasive. Dreyer mostly testified for the defense, and with each high-profile case, his celebrity grew. Between the clinic, trial work, and his private practice, he was earning as much as $450,000 a year. Dreyer loved to be the center of attention. He would sometimes ride to work on a motorcycle in a bejeweled Elvis outfit to entertain his colleagues.

Keyword: Attention; Alzheimers
Link ID: 21478 - Posted: 10.06.2015

Archy de Berker and Sven Bestmann A great deal of excitement has been generated in recent weeks by a review paper examining the literature on the drug modafinil, which concluded that “modafinil may well deserve the title of the first well-validated pharmaceutical ‘nootropic’ [cognitive enhancing] agent”. Coverage in the Guardian, Telegraph, British Medical Journal, and the Independent all called attention to the work, with a press release from Oxford University trumpeting “Review of ‘smart drug’ shows modafinil does enhance cognition”. The paper in question is a well-written summary of the recent literature (although it probably underestimates side effects, as pointed out in the British Medical Journal). A deeper problem is that reviews do not “show” anything. Reviews can be educational and informative, but that’s not the same as using all of the available data to test whether something works or not. Two different scientists can write reviews on the same topic and come to completely different conclusions. You can think of reviews as a watercolour painting of current knowledge. We sometimes forget that this is a far cry from a technical drawing, each element measured, quantified, and bearing a strict resemblance to reality. Scientists, and the public, trying to figure out what works face a tricky problem: there will often be many papers on a given topic, offering a variety of sometimes conflicting conclusions. Fortunately, we have a well-developed toolkit for assessing the state of the current literature and drawing conclusions from it. This procedure is called meta-analysis; it combines the available sources of data (e.g., published studies), and is extensively used to assess the efficacy of medical interventions. Initiatives such as the Cochrane Collaboration use meta-analyses to synthesize available evidence into a consensus on what works and what doesn’t. © 2015 Guardian News and Media Limited
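As a concrete illustration of what a meta-analysis does that a narrative review cannot, here is a minimal sketch of the core fixed-effect calculation: each study's effect size is weighted by the inverse of its variance, so larger, more precise studies pull the pooled estimate harder. The effect sizes and variances below are invented for illustration and are not drawn from the modafinil literature.

```python
# Minimal fixed-effect meta-analysis via inverse-variance weighting.
# All numbers are hypothetical placeholders, for illustration only.

effects = [0.30, 0.10, 0.45, 0.20]    # per-study effect sizes (e.g., SMDs)
variances = [0.04, 0.02, 0.09, 0.03]  # per-study sampling variances

weights = [1.0 / v for v in variances]  # more precise studies weigh more
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = (1.0 / sum(weights)) ** 0.5  # standard error of pooled estimate

print(f"Pooled effect: {pooled:.3f} (SE {pooled_se:.3f})")
print(f"95% CI: [{pooled - 1.96 * pooled_se:.3f}, {pooled + 1.96 * pooled_se:.3f}]")
```

Random-effects models extend this by adding a between-study variance term, which is usually more appropriate when the pooled studies differ in design or population, as drug studies typically do.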

Keyword: Attention; Narcolepsy
Link ID: 21476 - Posted: 10.05.2015

By Kelli Whitlock Burton They say beauty is in the eye of the beholder. But whether the beholder’s opinion is a product of one's genes or one's environment has long been a question for scientists. Although some research suggests that a preference for certain physical traits, such as height or muscular build, may be encoded in our genes, a new study finds it’s our individual life experiences that lead us to find one face more attractive than another. To get some closure on the nature versus nurture debate in human aesthetics, researchers asked 547 pairs of identical twins and 214 pairs of same-gender fraternal twins to view 200 faces and rate them on a scale of one to seven, with one being the least attractive and seven the most attractive. A group of 660 nontwins then completed the same survey. If genes were more involved in facial preference, identical twins would have had similar ratings; if the influence of a familial environment carried more weight, fraternal twins would have also answered similarly. However, most twins’ scores were quite different from one another, suggesting that something else was at play. The researchers suspect that it’s an individual’s life experiences that guide our opinions of attractiveness. The findings, reported today in Current Biology, build on earlier work by the same team that shows the ability to recognize faces is largely a genetic trait. The research is ongoing, and you can participate, too. Just complete the facial preference survey through the researchers’ website at: www.TestMyBrain.org. © 2015 American Association for the Advancement of Science.
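The twin logic described above is often quantified with Falconer's classic formula: because identical (MZ) twins share essentially all their genes and fraternal (DZ) twins about half, heritability can be estimated as twice the difference between the MZ and DZ correlations. A minimal sketch with hypothetical correlations, not the values reported in the Current Biology paper:

```python
# Falconer's formula for a classic twin design.
# The correlations below are hypothetical placeholders.
r_mz = 0.40  # similarity of identical twins' attractiveness ratings
r_dz = 0.35  # similarity of fraternal twins' ratings

h2 = 2 * (r_mz - r_dz)  # heritability (additive genetic variance)
c2 = r_mz - h2          # shared family environment (= 2*r_dz - r_mz)
e2 = 1 - r_mz           # unique experience + measurement error

print(f"h^2 = {h2:.2f}, c^2 = {c2:.2f}, e^2 = {e2:.2f}")
# -> h^2 = 0.10, c^2 = 0.30, e^2 = 0.60
```

With numbers like these, most of the variance lands in the unique-environment term, which is the pattern the researchers report: individual life experience, rather than genes or shared family environment, dominates facial preferences.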

Keyword: Attention; Genes & Behavior
Link ID: 21467 - Posted: 10.03.2015

Are you good at picking someone out of a crowd? Most of us are better at recognising faces than distinguishing between other similar objects, so it’s long been suspected there’s something mysterious about the way the brain processes a face. Now further evidence has emerged that this is a special, highly evolved skill. A study of twins suggests there are genes influencing face recognition abilities that are distinct from the ones affecting intelligence – so it’s not that people who are good with faces just have a better memory, for instance. “The idea is that telling friend from foe was so important to survival that there was very strong pressure to improve that trait,” says Nicholas Shakeshaft of King’s College London. Previous studies using brain scanning have suggested there is a part of the brain dedicated to recognising faces, called the fusiform face area. But others have suggested this region may in fact just be used for discriminating between any familiar objects. Wondering if genetics could shed any light, Shakeshaft’s team tested more than 900 sets of UK twins – including both identical and non-identical pairs – on their face recognition skills. The ability turned out to be highly heritable, with identical twins having more similar abilities than fraternal ones. The same went for intelligence, which had earlier been tested as part of a long-running study. © Copyright Reed Business Information Ltd.

Keyword: Attention; Evolution
Link ID: 21461 - Posted: 09.30.2015

By Judith Berck The 73-year-old widow came to see Dr. David Goodman, an assistant professor in the psychiatry and behavioral sciences department at Johns Hopkins School of Medicine, after her daughter had urged her to “see somebody” for her increasing forgetfulness. She was often losing her pocketbook and keys and had trouble following conversations, and 15 minutes later couldn’t remember much of what was said. But he did not think she had early Alzheimer’s disease. The woman’s daughter and granddaughter had both been given a diagnosis of A.D.H.D. a few years earlier, and Dr. Goodman, who is also the director of a private adult A.D.H.D. clinical and research center outside of Baltimore, asked about her school days as a teenager. “She told me: ‘I would doodle because I couldn’t pay attention to the teacher, and I wouldn’t know what was going on. The teacher would move me to the front of the class,’ ” Dr. Goodman said. After interviewing her extensively and noting patterns of impairment that spanned the decades, Dr. Goodman diagnosed A.D.H.D. He prescribed Vyvanse, a long-acting stimulant of the central nervous system. A few weeks later, the difference was remarkable. “She said: ‘I’m surprised, because I’m not misplacing my keys now, and I can remember things better. My mind isn’t wandering off, and I can stay in a conversation. I can do something until I finish it,’ ” Dr. Goodman said. Once seen as a disorder affecting mainly children and young adults, attention deficit hyperactivity disorder is increasingly understood to last throughout one’s lifetime. © 2015 The New York Times Company

Keyword: ADHD; Alzheimers
Link ID: 21455 - Posted: 09.29.2015

HOW would you punish a murderer? Your answer will depend on how active a certain part of your brain happens to be. Joshua Buckholtz at Harvard University and his colleagues gave 66 volunteers scenarios involving a fictitious criminal called John. Some of his crimes were planned. In others, he was experiencing psychosis or distress – for example, because his daughter’s life was under threat. The volunteers had to decide how responsible John was for each crime and the severity of his punishment on a scale of 0 to 9. Before hearing the stories, some of the volunteers received magnetic stimulation to a brain region involved in decision-making, called the dorsolateral prefrontal cortex (DLPFC), which dampened its activity. The others were given a sham treatment. Inhibiting the DLPFC didn’t affect how responsible the volunteers thought John was for the crimes, or the punishment he should receive when he was not culpable for his actions. But the stimulated group meted out a much less severe punishment than the control group when John had planned his crime (Neuron, doi.org/7rh). “By altering one process in the brain, we can alter our judgements,” says Christian Ruff at the Swiss Federal Institute of Technology in Zurich. In the justice system, the judgment stage to determine guilt is separated from sentencing, says James Tabery at the University of Utah. “It turns out that our brains work in a similar fashion.” © Copyright Reed Business Information Ltd.

Keyword: Emotions; Attention
Link ID: 21440 - Posted: 09.24.2015

By Martin Enserink AMSTERDAM—Is being a woman a disadvantage when you're applying for grant money in the Netherlands? Yes, say the authors of a paper published by the Proceedings of the National Academy of Sciences (PNAS) this week. The study showed that women have a lower chance than men of winning early career grants from the Netherlands Organization for Scientific Research (NWO), the country's main grant agency. NWO, which commissioned the study, accepted the results and announced several changes on Monday to rectify the problem. "NWO will devote more explicit attention to the gender awareness of reviewers in its methods and procedures," a statement said. But several Dutch scientists who have taken a close look at the data say they see no evidence of sexism. The PNAS paper, written by Romy van der Lee and Naomi Ellemers of Leiden University's Institute of Psychology, is an example of a classic statistical trap, says statistician Casper Albers of the University of Groningen, who tore the paper apart in a blog post yesterday. (In Dutch; a shortened translation in English is here.) Albers says he plans to send the piece as a commentary to PNAS as well. Van der Lee and Ellemers analyzed 2823 applications for NWO's Veni grants for young researchers in the years 2010, 2011, and 2012. Overall, women had a success rate of 14.9%, compared with 17.7% for men, they wrote, and that difference was statistically significant. But Albers says the difference evaporates if you look more closely at sex ratios and success rates in NWO's nine scientific disciplines. Those data, which Van der Lee and Ellemers provided in a supplement to their paper, show that women simply apply more often in fields where the chance of success is low. © 2015 American Association for the Advancement of Science
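The trap Albers describes is Simpson's paradox: a difference visible in pooled data can shrink or vanish once the data are split by group. A minimal sketch with invented numbers, not NWO's actual figures, shows how an aggregate gap can appear even when men and women succeed at identical rates within every field:

```python
# Simpson's paradox with invented numbers (not the actual NWO data):
# two fields with identical per-sex success rates, but different
# application mixes, produce an aggregate gap.

# (applicants, awards) per group
fields = {
    "low-success field":  {"women": (200, 20), "men": (100, 10)},   # 10% each
    "high-success field": {"women": (100, 25), "men": (200, 50)},   # 25% each
}

totals = {"women": [0, 0], "men": [0, 0]}
for field, groups in fields.items():
    for sex, (apps, wins) in groups.items():
        totals[sex][0] += apps
        totals[sex][1] += wins
        print(f"{field}, {sex}: {wins / apps:.1%}")

for sex, (apps, wins) in totals.items():
    print(f"overall, {sex}: {wins / apps:.1%}")
# Within each field the rates are equal, yet overall women get 15.0%
# and men 20.0%, because women applied more in the low-success field.
```

The pooled gap in this toy example is driven entirely by the application mix across fields, which is exactly the pattern Albers argues the per-discipline NWO data show.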

Keyword: Sexual Behavior; Attention
Link ID: 21439 - Posted: 09.24.2015

By Simon Makin Most people associate the term “subliminal conditioning” with dystopian sci-fi tales, but a recent study has used the technique to alter responses to pain. The findings suggest that information that does not register consciously teaches our brain more than scientists previously suspected. The results also offer a novel way to think about the placebo effect. Our perception of pain can depend on expectations, which explains placebo pain relief—and placebo's evil twin, the nocebo effect (if we think something will really hurt, it can hurt more than it should). Researchers have studied these expectation effects using conditioning techniques: they train people to associate specific stimuli, such as certain images, with different levels of pain. The subjects' perception of pain can then be reduced or increased by seeing the images during something painful. Most researchers assumed these pain-modifying effects required conscious expectations, but the new study, from a team at Harvard Medical School and the Karolinska Institute in Stockholm, led by Karin Jensen, shows that even subliminal input can modify pain—a more cognitively complex process than most previously found to be susceptible to subliminal effects. The scientists conditioned 47 people to associate two faces with either high or low pain levels from heat applied to their forearm. Some participants saw the faces normally, whereas others were exposed subliminally—the images were flashed so briefly that the participants were not aware of seeing them, as verified by recognition tests. © 2015 Scientific American

Keyword: Pain & Touch; Attention
Link ID: 21438 - Posted: 09.24.2015

William Sutcliffe Most epidemics are the result of a contagious disease. ADHD – Attention Deficit Hyperactivity Disorder – is not contagious, and it may not even be a genuine malady, but it has acquired the characteristics of an epidemic. New data has revealed that UK prescriptions for Ritalin and other similar ADHD medications have more than doubled in the last decade, from 359,100 in 2004 to 922,200 last year. In America, the disorder is now the second most frequent long-term diagnosis made in children, narrowly trailing asthma. It generates pharmaceutical sales worth $9bn (£5.7bn) per year. Yet clinical proof of ADHD as a genuine illness has never been found. Sami Timimi, consultant child psychiatrist at Lincolnshire NHS Trust and visiting professor of child psychiatry, is a vocal critic of the Ritalin-friendly orthodoxy within the NHS. While he is at pains to stress that he is “not saying those who have the diagnosis don’t have any problem”, he is adamant that “there is no robust evidence to demonstrate that what we call ADHD correlates with any known biological or neurological abnormality”. The hyperactivity, inattentiveness and lack of impulse control that are at the heart of an ADHD diagnosis are, according to Timimi, simply “a collection of behaviours”. Any psychiatrist who claims that a behaviour is being caused by ADHD is perpetrating a “philosophical tautology” – he is doing nothing more than telling you that hyperactivity is caused by an alternative name for hyperactivity. There is still no diagnostic test – no marker in the body – that can identify a person with ADHD. The results of more than 40 brain scan studies are described by Timimi as “consistently inconsistent”. No conclusive pattern in brain activity has been found to explain or identify ADHD. © independent.co.uk

Keyword: ADHD
Link ID: 21425 - Posted: 09.21.2015

By AMY HARMON Some neuroscientists believe it may be possible, within a century or so, for our minds to continue to function after death — in a computer or some other kind of simulation. Others say it’s theoretically impossible, or impossibly far off in the future. A lot of pieces have to fall into place before we can even begin to start thinking about testing the idea. But new high-tech efforts to understand the brain are also generating methods that make those pieces seem, if not exactly imminent, then at least a bit more plausible. Here’s a look at how close, and far, we are to some requirements for this version of “mind uploading.” The hope of mind uploading rests on the premise that much of the key information about who we are is stored in the unique pattern of connections between our neurons, the cells that carry electrical and chemical signals through living brains. You wouldn't know it from the outside, but there are more of those connections — individually called synapses, collectively known as the connectome — in a cubic centimeter of the human brain than there are stars in the Milky Way galaxy. The basic blueprint is dictated by our genes, but everything we do and experience alters it, creating a physical record of all the things that make us US — our habits, tastes, memories, and so on. It is exceedingly tricky to get that pattern of connections into a state where it is both safe from decay and verifiable as intact. But in recent months, two sets of scientists said they had devised separate ways to do that for the brains of smaller mammals. If either is scaled up to work for human brains — still a big if — then theoretically your brain could sit on a shelf or in a freezer for centuries while scientists work on the rest of these steps. © 2015 The New York Times Company
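The stars-versus-synapses comparison is easy to sanity-check with round numbers. The figures below are commonly cited order-of-magnitude estimates, assumptions rather than values from the article:

```python
# Back-of-envelope check of the synapses-vs-stars comparison.
# All inputs are rough, commonly cited estimates (assumptions).
synapses_total = 1.5e14    # synapses in an adult human brain
brain_volume_cm3 = 1200    # adult brain volume, ~1,200 cm^3
stars_milky_way = 2e11     # star-count estimates run ~1e11 to 4e11

synapses_per_cm3 = synapses_total / brain_volume_cm3
print(f"~{synapses_per_cm3:.0e} synapses per cubic centimeter")  # ~1e11
# Comparable to galactic star counts, so the article's comparison
# is plausible at order-of-magnitude precision.
```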

Keyword: Learning & Memory; Consciousness
Link ID: 21407 - Posted: 09.14.2015

Neel V. Patel The concept of the insanity defense dates back to ancient Greece and the Roman Empire. The idea has always been the same: Protect individuals from being held accountable for behavior they couldn’t control. Yet there have been more than a few historical and recent instances of a judge or jury issuing a controversial “by reason of…” verdict. What was intended as a human rights effort has become a last-ditch way to save killers (though it didn’t work for James Holmes). The question that hangs in the air at these sorts of proceedings has always been the same: Is there a way to make determinations more scientific and less traditionally judicial? Adam Shniderman, a criminal justice researcher at Texas Christian University, has been studying the role of neuroscience in the court system for several years now. He explains that neurological data and explanations don’t easily translate into the world of lawyers and legal text. Inverse spoke with Shniderman to learn more about how neuroscience is used in today’s insanity defenses, and whether this is likely to change as the technology used to observe the brain gets better and better. Can you give me a quick overview of how the role of neuroscience in the courts has changed over the years? Especially in the last few decades with new advances in technology. Obviously, [neuroscientific evidence] has become more widely used as brain-scanning technology has gotten better. Some of the scanning technology we use now, like functional MRI that measures blood oxygenation as a proxy for neurological activity, is relatively new within the last 20 years or so. The nature of brain scanning has changed, but the knowledge that the brain influences someone’s actions is not new.

Keyword: Consciousness; Schizophrenia
Link ID: 21397 - Posted: 09.11.2015