Chapter 14. Attention and Consciousness
by Penny Sarchet

Children with ADHD are more likely to succeed in cognitive tasks when they are fidgeting. Rather than telling them to stop, is it time to let them squirm in class? The results, from a small study of teens and pre-teens, add to growing evidence that movement may help children with attention-deficit hyperactivity disorder to think.

One of the theories about ADHD is that the brain is somehow under-aroused. Physical movements could help wake it up or maintain alertness, perhaps by stimulating the release of brain-signalling chemicals like dopamine or norepinephrine. This hypothesis would help explain why countries like the US are experiencing an epidemic of ADHD – it might be that a lack of physical activity leads to reduced brain function.

Fidget britches

In the latest study, Julie Schweitzer of the University of California, Davis, and her colleagues asked 44 children with ADHD and 29 kids without to describe an arrangement of arrows. The children with ADHD were more likely to focus on the task and answer correctly if the test coincided with them fidgeting, as tracked by an ankle monitor. Intriguingly, Schweitzer found that it is the vigour of movements, rather than how often children make them, that seems to be related to improvements in test scores. This might mean, for example, that it helps children to swing their legs in longer arcs, but not to swing them faster.

"I think we need to consider that fidgeting is helpful," says Schweitzer. "We need to find ways that children with ADHD can move without being disruptive to others." Dustin Sarver at the University of Mississippi, who recently found a link between fidgeting and improved working memory, agrees. "We should revisit the targets we want for these children, such as improving the work they complete and paying attention, rather than focusing on sitting still."
He suggests that movements that are not disruptive to other schoolchildren, such as squirming, bouncing and leg movements, as opposed to getting up in the middle of lessons, could be encouraged in classrooms. © Copyright Reed Business Information Ltd
By Gretchen Reynolds

Treadmill desks are popular, even aspirational, in many offices today since they can help those of us who are deskbound move more, burn extra calories and generally improve our health. But an interesting new study raises some practical concerns about the effects of walking at your workspace and suggests that there may be unacknowledged downsides to using treadmill desks if you need to type or think at the office.

The drumbeat of scientific evidence about the health benefits of sitting less and moving more during the day continues to intensify. One study presented last month at the 2015 annual meeting of the American College of Sports Medicine in San Diego found that previously sedentary office workers who walked slowly at a treadmill desk for two hours each workday for two months significantly improved their blood pressure and slept better at night.

But as attractive as the desks are for health reasons, they must be integrated into a work setting, so it seems sensible to test their effects on productivity. Surprisingly little research had examined whether treadmill desks affect someone’s ability to get work done. So for the new study, which was published in April in PLOS One, researchers at Brigham Young University in Provo, Utah, recruited 75 healthy young men and women and randomly assigned them to workspaces outfitted with a computer and either a chair or a treadmill desk. The treadmill desk was set to move at a speed of 1.5 miles per hour with zero incline. None of the participants had used a treadmill desk before, so they received a few minutes of instruction and practice. Those assigned a chair were assumed to be familiar with its use. © 2015 The New York Times Company
Mo Costandi According to the old saying, the eyes are windows into the soul, revealing deep emotions that we might otherwise want to hide. Although modern science precludes the existence of the soul, it does suggest that there is a kernel of truth in this saying: it turns out the eyes not only reflect what is happening in the brain but may also influence how we remember things and make decisions. Our eyes are constantly moving, and while some of those movements are under conscious control, many of them occur subconsciously. When we read, for instance, we make a series of very quick eye movements called saccades that fixate rapidly on one word after another. When we enter a room, we make larger sweeping saccades as we gaze around. Then there are the small, involuntary eye movements we make as we walk, to compensate for the movement of our head and stabilise our view of the world. And, of course, our eyes dart around during the ‘rapid eye movement’ (REM) phase of sleep. What is now becoming clear is that some of our eye movements may actually reveal our thought process. Research published last year shows that pupil dilation is linked to the degree of uncertainty during decision-making: if somebody is less sure about their decision, they feel heightened arousal, which causes the pupils to dilate. This change in the eye may also reveal what a decision-maker is about to say: one group of researchers, for example, found that watching for dilation made it possible to predict when a cautious person used to saying ‘no’ was about to make the tricky decision to say ‘yes’. © 2015 Guardian News and Media Limited
By Sandra G. Boodman When B. Paul Turpin was admitted to a Tennessee hospital in January, the biggest concern was whether the 69-year-old endocrinologist would survive. But as he battled a life-threatening infection, Turpin developed terrifying hallucinations, including one in which he was performing on a stage soaked with blood. Doctors tried to quell his delusions with increasingly large doses of sedatives, which only made him more disoriented. Nearly five months later, Turpin’s infection has been routed, but his life is upended. Delirious and too weak to go home after his hospital discharge, he spent months in a rehab center, where he fell twice, once hitting his head. Until recently he did not remember where he lived and believed he had been in a car wreck. “I tell him it’s more like a train wreck,” said his wife, Marylou Turpin. “They kept telling me in the hospital, ‘Everybody does this,’ and that his confusion would disappear,” she said. Instead, her once astute husband has had great difficulty “getting past the scramble.” Turpin’s experience illustrates the consequences of delirium, a sudden disruption of consciousness and cognition marked by vivid hallucinations, delusions and an inability to focus that affects 7 million hospitalized Americans annually. The disorder can occur at any age — it has been seen in preschoolers — but disproportionately affects people older than 65 and is often misdiagnosed as dementia. While delirium and dementia can coexist, they are distinctly different illnesses. Dementia develops gradually and worsens progressively, while delirium occurs suddenly and typically fluctuates during the course of a day. Some patients with delirium are agitated and combative, while others are lethargic and inattentive.
By Arlene Karidis Health-care professionals, educators and patient advocates debate endlessly over attention deficit disorder. Some argue about the cause of the condition, which is associated with inattentiveness and, often, hyperactivity. Many disagree on treatment and parenting techniques. A dwindling group disputes whether it actually exists. Even its name — to be formal, it’s attention-deficit/hyperactivity disorder — has been a source of debate. The label ADHD trivializes the disorder, asserts Russell Barkley, a neuropsychiatrist and professor of psychiatry and pediatrics at the Medical University of South Carolina who has published more than 300 peer-reviewed articles on the condition. “ADHD is not simply about not being able to pay attention. Describing it as such is like calling autism a ‘not looking at people’ problem,” he said, and there is much more to ADHD. Some practitioners and researchers say drugs are by far the most effective treatment. Others argue that long-term drug use addresses symptoms only and does not provide important tools to help people manage their inattentiveness. They say it’s more helpful to focus on behavioral interventions, nutrition, exercise and special accommodations at school. The American Psychiatric Association says there is no doubt that ADHD exists — and it estimates that 5 percent of U.S. children have the condition.
by Helen Thomson

Imagine a world where you think of something and it happens. For instance, what if the moment you realise you want a cup of tea, the kettle starts boiling? That reality is on the cards, now that a brain implant has been developed that can decode a person's intentions. It has already allowed a man paralysed from the neck down to control a robotic arm with unprecedented fluidity.

But the implications go far beyond prosthetics. By placing an implant in the area of the brain responsible for intentions, scientists are investigating whether brain activity can give away future decisions – before a person is even aware of making them. Such a result may even alter our understanding of free will.

Fluid movement

"These are exciting times," says Pedro Lopes, who works at the human-computer interaction lab at the Hasso Plattner Institute in Potsdam, Germany. "These developments give us a glimpse of an exciting future where devices will understand our intentions as a means of adapting to our plans."

The implant was designed for Erik Sorto, who was left unable to move his limbs after a spinal cord injury 12 years ago. The idea was to give him the ability to move a stand-alone robotic arm by recording the activity in his posterior parietal cortex – a part of the brain used in planning movements. "We thought this would allow us to decode brain activity associated with the overall goal of a movement – for example, 'I want to pick up that cup'," Richard Andersen at the California Institute of Technology in Pasadena told delegates at the NeuroGaming Conference in San Francisco earlier this month. © Copyright Reed Business Information Ltd
Alison Abbott Redouan Bshary well remembers the moment he realized that fish were smarter than they are given credit for. It was 1998, and Bshary was a young behavioural ecologist with a dream project: snorkelling in Egypt's Red Sea to observe the behaviour of coral-reef fish. That day, he was watching a grumpy-looking grouper fish as it approached a giant moray eel. As two of the region's top predators, groupers and morays might be expected to compete for their food and even avoid each other — but Bshary saw them team up to hunt. First, the grouper signalled to the eel with its head, and then the two swam side by side, with the eel dipping into crevices, flushing out fish beyond the grouper's reach and getting a chance to feed alongside. Bshary was astonished by the unexpected cooperation; if he hadn't had a snorkel in his mouth, he would have gasped. This underwater observation was the first in a series of surprising discoveries that Bshary has gone on to make about the social behaviour of fish. Not only can they signal to each other and cooperate across species, but they can also cheat, deceive, console or punish one another — even show concern about their personal reputations. “I have always had a lot of respect for fish,” says Bshary. “But one after the other, these behaviours took me by surprise.” His investigations have led him to take a crash course in scuba diving, go beach camping in Egypt and build fake coral reefs in Australia. The work has also destroyed the stereotypical idea that fish are dumb creatures, capable of only the simplest behaviours — and it has presented a challenge to behavioural ecologists in a different field. Scientists who study primates have claimed that human-like behaviours such as cooperation are the sole privilege of animals such as monkeys and apes, and that they helped to drive the evolution of primates' large brains. 
Bshary — quiet, but afraid of neither adventure nor of contesting others' ideas — has given those scientists reason to think again. © 2015 Nature Publishing Group
Anya Kamenetz Are you a pen-clicker? A hair-twirler? A knee-bouncer? Did you ever get in trouble for fidgeting in class? Don't hang your head in shame. All that movement may be helping you think. A new study suggests that for children with attention disorders, hyperactive movements meant better performance on a task that requires concentration. The researchers gave a small group of boys, ages 8 to 12, a sequence of random letters and numbers. Their job: Repeat back the numbers in order, plus the last letter in the bunch. All the while, the kids were sitting in a swiveling chair. For the subjects with ADHD, moving and spinning in the chair were correlated with better performance. For typically developing kids, however, it was the opposite: the more they moved, the worse they did on the task. Dustin Sarver at the University of Mississippi Medical Center is the lead author of this study. ADHD is his field, and he has a theory as to why fidgeting helps these kids. "We think that part of the reason is that when they're moving more they're increasing their alertness." That's right — increasing. The prevailing scientific theory on attention disorders holds that they are caused by chronic underarousal of the brain. That's why stimulants are prescribed as treatment. Sarver believes that slight physical movements "wake up" the nervous system in much the same way that Ritalin does, thus improving cognitive performance. © 2015 NPR
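The task the boys performed above can be sketched in a few lines of code. This is an illustrative reconstruction, not the study's actual stimulus program: the function name is mine, and I am assuming that repeating the numbers "in order" means in ascending numerical order, as in standard letter-number sequencing tests.

```python
def expected_response(sequence):
    """Score key for a letter-number span task like the one described:
    report the digits in ascending order, then the final letter.
    (Assumes "in order" means ascending numerical order.)"""
    digits = [ch for ch in sequence if ch.isdigit()]
    letters = [ch for ch in sequence if ch.isalpha()]
    if not digits or not letters:
        raise ValueError("sequence must contain both digits and letters")
    return "".join(sorted(digits)) + letters[-1]

# Example: for the presented sequence "3K9B2R", the correct answer
# is the digits in ascending order plus the last letter: "239R".
```

The point of such a task is that it cannot be answered by rote echoing: the child must hold the whole sequence in working memory while reordering part of it, which is exactly the kind of load the fidgeting appeared to help with.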
By GREGORY HICKOK IN 1890, the American psychologist William James famously likened our conscious experience to the flow of a stream. “A ‘river’ or a ‘stream’ are the metaphors by which it is most naturally described,” he wrote. “In talking of it hereafter, let’s call it the stream of thought, consciousness, or subjective life.” While there is no disputing the aptness of this metaphor in capturing our subjective experience of the world, recent research has shown that the “stream” of consciousness is, in fact, an illusion. We actually perceive the world in rhythmic pulses rather than as a continuous flow. Some of the first hints of this new understanding came as early as the 1920s, when physiologists discovered brain waves: rhythmic electrical currents measurable on the surface of the scalp by means of electroencephalography. Subsequent research cataloged a spectrum of such rhythms (alpha waves, delta waves and so on) that correlated with various mental states, such as calm alertness and deep sleep. Researchers also found that the properties of these rhythms varied with perceptual or cognitive events. The phase and amplitude of your brain waves, for example, might change if you saw or heard something, or if you increased your concentration on something, or if you shifted your attention. But those early discoveries themselves did not change scientific thinking about the stream-like nature of conscious perception. Instead, brain waves were largely viewed as a tool for indexing mental experience, much like the waves that a ship generates in the water can be used to index the ship’s size and motion (e.g., the bigger the waves, the bigger the ship). Recently, however, scientists have flipped this thinking on its head. We are exploring the possibility that brain rhythms are not merely a reflection of mental activity but a cause of it, helping shape perception, movement, memory and even consciousness itself. 
What this means is that the brain samples the world in rhythmic pulses, perhaps even discrete time chunks, much like the individual frames of a movie. From the brain’s perspective, experience is not continuous but quantized. © 2015 The New York Times Company
Ian Sample, science editor

Brain scans of children who were born prematurely have revealed differences in the connectivity of key regions that may play a role in developmental disorders. Previous studies have already highlighted that children who are born preterm are more at risk of autism and other behavioural conditions, such as the poor attention that is associated with ADHD, or attention deficit hyperactivity disorder. The new findings could help doctors understand why preterm children are so often affected, and work out whether medications or different styles of care could help the children reach their full potential.

Researchers at King’s College London scanned the brains of 66 infants on average 42 weeks after their mothers’ last period before the birth. Forty-seven of the babies were born prematurely, at less than 33 weeks. The other 19 babies were born on average after 40 weeks’ gestation. In their final weeks in the womb, babies’ brains are building connections at an incredible rate, which makes them particularly sensitive to changes in the last trimester. If a baby is born prematurely, the crucial period of brain growth happens in the radically different environment of the neonatal unit.

From the MRI scans, the scientists found that infants born prematurely had increased connectivity in only one part of the brain they tested. A region called the thalamus, a kind of neural relay station, was better connected to a part called the lateral sensory cortex, which handles signals from the mouth, lips and jaw. The result might be explained by pre-term babies breast- or bottle-feeding much earlier, or being given dummies while on supportive breathing machines. © 2015 Guardian News and Media Limited
Paul Oswell “Cool” is a bit of a moving target. Sixty years ago it was James Dean, nonchalantly smoking a cigarette as he sat on a motorbike, glaring down 1950s conformity with brooding disapproval. Five years ago it was Zooey Deschanel holding a cupcake. In a phone interview with Steve Quartz, the co-author of the recently published Cool: How the Brain’s Hidden Quest for Cool Drives Our Economy and Shapes Our World, we skirted around a working definition. Defining cool turns out to be tricky even for someone who has just written an entire book examining the neurological processes behind it. Quartz’s most succinct definition was that cool is “the sweet spot between being innovative and unconventional, but not weird”. Quartz is the director of the Social Cognitive Neuroscience Laboratory at the California Institute of Technology. So when asked to describe what the lab does, he did not deliver a “cool” answer, but rather a precise one: it is, he said, “concerned with all the dimensions of decision making, from simple gambles and risk assessment right up to very complex reasoning and the nature of moral behaviour”. He wrote the book with his colleague Anette Asp, with whom he has long done research on “neuroeconomics” and “neuromarketing”. Those fields use imaging techniques to look at the ways our brains process the emotions and responses we have to brands and products. The results, as Quartz and Asp posit in the book, reflect primal instincts we have around ideas of status. Their technique gives results that are much more accurate about what the kids are into, these days, than traditional marketing focus groups have ever been able to give us. © 2015 Guardian News and Media Limited
by Helen Thomson

Giving people the illusion of teleporting around a room has revealed how the brain constructs our sense of self. The findings may aid treatments for schizophrenia and asomatognosia – a rare condition characterised by a lack of awareness of a part of one's body. As we go about our daily lives, we experience our body as a physical entity with a specific location. For instance, when you sit at a desk you are aware of your body and its rough position with respect to objects around you. These experiences are thought to form a fundamental aspect of self-consciousness.

Arvid Guterstam, a neuroscientist at the Karolinska Institute in Stockholm, Sweden, and his colleagues wondered how the brain produces these experiences. To find out, Guterstam's team had 15 people lie in an fMRI brain scanner while wearing a head-mounted display. This was connected to a camera on a dummy body lying elsewhere in the room, enabling the participants to see the room – and themselves inside the scanner – from the dummy's perspective. A member of the team then stroked the participant's body and the dummy's body at the same time. This induced the out-of-body experience of owning the dummy body and being at its location. The experiment was repeated with the dummy body positioned in different parts of the room, allowing the person to be perceptually teleported between the different locations, says Guterstam. All that was needed to break the illusion was to touch the participant's and the dummy's bodies at different times. © Copyright Reed Business Information Ltd.
Monya Baker

An ambitious effort to replicate 100 research findings in psychology ended last week — and the data look worrying. Results posted online on 24 April, which have not yet been peer-reviewed, suggest that key findings from only 39 of the published studies could be reproduced. But the situation is more nuanced than the top-line numbers suggest (see graphic, 'Reliability test'). Of the 61 non-replicated studies, scientists classed 24 as producing findings at least “moderately similar” to those of the original experiments, even though they did not meet pre-established criteria, such as statistical significance, that would count as a successful replication.

The results should convince everyone that psychology has a replicability problem, says Hal Pashler, a cognitive psychologist at the University of California, San Diego, and an author of one of the papers whose findings were successfully repeated. “A lot of working scientists assume that if it’s published, it’s right,” he says. “This makes it hard to dismiss that there are still a lot of false positives in the literature.”

But Daniele Fanelli, who studies bias and scientific misconduct at Stanford University in California, says the results suggest that the reproducibility of findings in psychology does not necessarily lag behind that in other sciences. There is plenty of room for improvement, he adds, but earlier studies have suggested that reproducibility rates in cancer biology and drug discovery could be even lower. “From my expectations, these are not bad at all,” Fanelli says. “Though I have spoken to psychologists who are quite disappointed.” © 2015 Nature Publishing Group
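The headline figures quoted above are easy to recompute, and doing so makes the "nuance" explicit: the replication rate is 39% under the strict criteria but 63% if the "moderately similar" results are counted as well. A minimal sketch (the variable names are mine; the figures are the ones reported in the article):

```python
total = 100              # studies in the replication project
replicated = 39          # met the pre-established replication criteria
moderately_similar = 24  # non-replications judged "moderately similar"

non_replicated = total - replicated                        # 61 studies
strict_rate = replicated / total                           # 0.39
lenient_rate = (replicated + moderately_similar) / total   # 0.63

print(f"strict: {strict_rate:.0%}, lenient: {lenient_rate:.0%}")
# prints: strict: 39%, lenient: 63%
```

The gap between the two rates is exactly what Pashler and Fanelli are arguing over: how to count results that point the same way as the original but miss the pre-registered threshold.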
By JEFFREY ELY, ALEXANDER FRANKEL and EMIR KAMENICA IMAGINE the following situation. After a grueling day at work, you plop down in front of your TV, ready to relax. Your TiVo has recorded all of the day’s March Madness games. You’ve sequestered yourself away from any news about who won or lost. Which game to watch? Suddenly, your spouse pops in and tells you to stay away from Villanova versus Lafayette, which was a blowout, and to watch Baylor versus Georgia State, a nail-biter. Is this recommendation appreciated? Hardly. Baylor versus Georgia State was exciting because the unexpected happened: It was a back-and-forth affair in which Georgia State, the underdog, clinched the upset only in the final moments. But if you know in advance that it’s a nail-biter, you will expect the unexpected, ruining the surprise. It’s a lesson that the filmmaker M. Night Shyamalan, for one, seems to have missed. Once it’s common knowledge that your movie will have a dramatic, unexpected plot twist at the end, then your movie no longer has a dramatic, unexpected plot twist at the end. To be thrilling, you must occasionally be boring. This is one of several lessons that came out of our recent study of drama-based entertainment using the tools of information economics — the results of which were published in the Journal of Political Economy in February. When we recognize that the capacity to surprise an audience is a scarce resource (“You can’t fool all of the people all of the time”), it becomes natural to use economic theory to optimize that resource.
Amy Coats Those split second decisions, made almost without thinking. When to put your foot on the pedal when you’re at the red light. When to check how those sausages are doing. Remembering to grab your lunch from the fridge seconds before you leave the house. Or – too often – 20 minutes after. And those carefully considered ones. Do I just finish this paragraph before I make a cup of tea? Or do I wait until the boss is clear of the kitchen? Timing, that is our perception and estimation of time, is key in determining how we behave and in the decisions we make. New findings suggest that time in the brain is relative, not absolute. This means that your brain ‘encodes’ your sense of time depending on what happens to you, and not by the second, minute or hour. And this in turn determines how you behave. Alas, you could be forgiven for feeling that the units of time common to everyone worldwide, except perhaps the odd Amazonian tribe, are pretty well ingrained. My partner and I will often make a quick bet on what time it is before we check our phone (all sigh!/rejoice! [delete as appropriate], the dwindling watch-less generation). And we’re both pretty good at getting to within 5 or 10 minutes, even if we haven’t known the exact time all day. He’s normally better at it, perhaps because he’s male? Perhaps it tends to fly/drag for me because I’m having more/less fun? Perhaps that’s another story. In the 2004 reality TV show Shattered, contestants who had been sleep-deprived for over 140 hours went head-to-head to predict when an arbitrary amount of time had passed – in this case, one minute and seven seconds. With the pressure of £100,000 prize money at stake, Dermot O’Leary grimacing nearby, a studio audience rustling in the darkness, and no cues except their ‘inner clock’, contestants were almost unbelievably close. The loser, Jonathan, was 0.4 seconds out, while Jimmy, the winner, was just one tenth of a second out. © 2015 Guardian News and Media Limited
Julian Baggini is that happy thing – a philosopher who recognises that readers go glassy-eyed if presented with high-octane philosophical discourse. And yet, as his latest book, Freedom Regained: The Possibility of Free Will, makes clear, it is in all our interests to consider crucial aspects of what it means to be human. Indeed, in this increasingly complex world, maybe more so than ever.

Freedom is one of the great, emotive political watchwords. The emancipation of slaves and women has inspired political movements on a grand scale. But, latterly, the concept of freedom has defected from the public realm to the personal. How responsible are we as individuals for the actions we take? To what degree are we truly autonomous agents? The argument that environmental circumstances are crucial determinants of our actions – the “Officer Krupke” argument (from the West Side Story song: “Gee, Officer Krupke, we’re very upset/We never had the love that every child ought to get”) – has for some time carried weight, not least in the defence of violent crime. Defective genes are also a common part of the artillery in the argument against the possibility of free choice. Excessive testosterone and low resting heart rates, for example, both statistically bias a person towards violence.

And now neuroscience brings us the unnerving news that while even the most sane, genetically well endowed and law-abiding of us believe we make free choices, the evidence of brain scans suggests otherwise. Neuroscience reveals the seemingly novel fact that “we are not the authors of our thoughts and actions in the way people generally suppose”. I say “seemingly novel”, for it is no news that many of our apparently willed choices have unconscious determinants, which are at variance with our known wishes and desires.
The whole of psychoanalysis is predicated on that principle but, as anyone who can drive a car will attest, routine physical actions often take their source from an internalised history rather than any conscious decision-making. The neural finding that has made waves, however, is that scans indicate the brain’s chemistry consistently determines a decision prior to our consciously making that decision. So when I deliberate over a menu and finally choose a mushroom risotto over a rare steak, my brain has anticipated this before I am aware of my choice. © 2015 Guardian News and Media Limited
By Rebecca Harrington

Kraft Macaroni & Cheese—that favorite food of kids, packaged in the nostalgic blue box—will soon be free of yellow dye. Kraft announced Monday that it will remove artificial food coloring, notably Yellow No. 5 and Yellow No. 6 dyes, from its iconic product by January 2016. Instead, the pasta will maintain its bright yellow color by using natural ingredients: paprika, turmeric and annatto (the latter of which is derived from achiote tree seeds). The company said it decided to pull the dyes in response to growing consumer pressure for more natural foods. But claims that the dyes may be linked to attention-deficit hyperactivity disorder (ADHD) in children have also risen recently, as they did years ago, putting food dyes under sharp focus once again. On its Web site Kraft says synthetic colors are not harmful, and that its motivation for removing them is that consumers want more foods with no artificial colors.

The U.S. Food and Drug Administration maintains artificial food dyes are safe, but some research studies have found the dyes can contribute to hyperactive behavior in children. Food dyes have been controversial since pediatrician Benjamin Feingold published findings in the 1970s that suggested a link between artificial colors and hyperactive behavior, but scientists, consumers and the government have not yet reached a consensus on the extent of this risk or the correct path to address it. After a 2007 study in the U.K. showed that artificial colors and/or the common preservative sodium benzoate increased hyperactivity in children, the European Union started requiring food labels indicating that a product contains any one of six dyes that had been investigated. The label states the product "may have an adverse effect on activity and attention in children." © 2015 Scientific American
by Katie Collins

Sarah-Jayne Blakemore is just as fascinated by the links between neuroscience and education as she is outraged by the pseudoscience that often intrudes upon this territory. Neuroscience in education has really been flourishing in recent years, she says on stage at WIRED Health 2015, but some theories about neuroscience have already infiltrated schools, and not necessarily in a good way. Some products marketed as boosting cognition may well produce improvements in the classroom, but the claims they make about the brain are bogus and promote completely inaccurate science.

Blakemore points specifically to the Brain Gym educational model, which claims to improve memory, concentration and information retention. There are no problems with the exercises themselves, she says, but the claims made about the brain are baseless. For a start, she said, Brain Gym claims that children can push "brain buttons" on their bodies that will stimulate blood flow to the brain. Another physical exercise is claimed to increase and improve connectivity between the two sides of the brain. "This makes no sense -- they are in communication anyway," says Blakemore. Teachers like Brain Gym because it does what it says and results in improvements in the classroom, but it could just as easily be placebo or novelty causing the effects. One thing Blakemore is sure of? "They're nothing to do with brain buttons or coordinating the two brain hemispheres."
Keyword: Development of the Brain
Link ID: 20844 - Posted: 04.25.2015
By Jerry Adler Smithsonian Magazine | In London, Benjamin Franklin once opened a bottle of fortified wine from Virginia and poured out, along with the refreshment, three drowned flies, two of which revived after a few hours and flew away. Ever the visionary, he wondered about the possibility of incarcerating himself in a wine barrel for future resurrection, “to see and observe the state of America a hundred years hence.” Alas, he wrote to a friend in 1773, “we live in an age too early . . . to see such an art brought in our time to its perfection.” If Franklin were alive today he would find a kindred spirit in Ken Hayworth, a neuroscientist who also wants to be around in 100 years but recognizes that, at 43, he’s not likely to make it on his own. Nor does he expect to get there preserved in alcohol or a freezer; despite the claims made by advocates of cryonics, he says, the ability to revivify a frozen body “isn’t really on the horizon.” So Hayworth is hoping for what he considers the next best thing. He wishes to upload his mind—his memories, skills and personality—to a computer that can be programmed to emulate the processes of his brain, making him, or a simulacrum, effectively immortal (as long as someone keeps the power on). Hayworth’s dream, which he is pursuing as president of the Brain Preservation Foundation, is one version of the “technological singularity.” It envisions a future of “substrate-independent minds,” in which human and machine consciousness will merge, transcending biological limits of time, space and memory. “This new substrate won’t be dependent on an oxygen atmosphere,” says Randal Koene, who works on the same problem at his organization, Carboncopies.org. “It can go on a journey of 1,000 years, it can process more information at a higher speed, it can see in the X-ray spectrum if we build it that way.”
By Tara Haelle When it comes to treating attention-deficit hyperactivity disorder (ADHD) a lot of kids are getting the meds they need—but they may be missing out on other treatments. Despite clinical guidelines that urge that behavioral therapy always be used alongside medication, fewer than half of the children with ADHD received therapy as part of treatment in 2009 and 2010, according to the first nationally representative study of ADHD treatment in U.S. children. The findings, published online March 31 in The Journal of Pediatrics, come from data collected during that period on 9,459 children, aged four to 17, with diagnosed ADHD—just before the American Academy of Pediatrics (AAP) issued its clinical practice guidelines on treatments of the condition in 2011. They provide a baseline for comparison when the next report is issued in 2017. Medication alone was the most common treatment for children with ADHD: 74 percent had taken medication in the previous week whereas 44 percent had received behavioral therapy in the past year. Just under a third of children of all ages had received both medication and behavioral therapy, the AAP-recommended treatment for all ages. “It’s not at all surprising that medication is the most common treatment,” says Heidi Feldman, a professor of developmental and behavioral pediatrics at Stanford University School of Medicine who served on the AAP clinical practice guidelines committee. “It works very effectively to reduce the core symptoms of the condition,” she adds, “and stimulants are relatively safe if used properly. The limitation of stimulant medications for ADHD is that studies do not show a long-term functional benefit from medication use.” © 2015 Scientific American
Link ID: 20827 - Posted: 04.21.2015