Chapter 14. Attention and Consciousness
Helen Shen

For most adults, adding small numbers requires little effort, but for some children, it can take all ten fingers and a lot of time. Research published online on 17 August in Nature Neuroscience1 suggests that changes in the hippocampus — a brain area associated with memory formation — could help to explain how children eventually pick up efficient strategies for mathematics, and why some children learn more quickly than others.

Vinod Menon, a developmental cognitive neuroscientist at Stanford University in California, and his colleagues presented single-digit addition problems to 28 children aged 7–9, as well as to 20 adolescents aged 14–17 and 20 young adults. Consistent with previous psychology studies2, the children relied heavily on counting out the sums, whereas adolescents and adults tended to draw on memorized information to calculate the answers.

The researchers saw this developmental change begin to unfold when they tested the same children at two time points, about one year apart. As the children aged, they began to move away from counting on fingers towards memory-based strategies, as measured by their own accounts and by decreased lip and finger movements during the task.

Using functional magnetic resonance imaging (fMRI) to scan the children's brains, the team observed increased activation of the hippocampus between the first and second time point. Neural activation decreased in parts of the prefrontal and parietal cortices known to be involved in counting, suggesting that the same calculations had begun to engage different neural circuits.

© 2014 Nature Publishing Group
Ever wonder why it’s hard to focus after a bad night’s sleep? Using mice and flashes of light, scientists show that just a few nerve cells in the brain may control the switch between internal thoughts and external distractions. The study, partly funded by the National Institutes of Health, may be a breakthrough in understanding how a critical part of the brain, called the thalamic reticular nucleus (TRN), influences consciousness.

“Now we may have a handle on how this tiny part of the brain exerts tremendous control over our thoughts and perceptions,” said Michael Halassa, M.D., Ph.D., assistant professor at New York University’s Langone Medical Center and a lead investigator of the study. “These results may be a gateway into understanding the circuitry that underlies neuropsychiatric disorders.”

The TRN is a thin layer of nerve cells on the surface of the thalamus, a center located deep inside the brain that relays information from the body to the cerebral cortex. The cortex is the outer, multi-folded layer of the brain that controls numerous functions, including one’s thoughts, movements, language, emotions, memories, and visual perceptions. TRN cells are thought to act as switchboard operators that control the flow of information relayed from the thalamus to the cortex.

To understand how the switches may work, Dr. Halassa and his colleagues studied the firing patterns of TRN cells in mice during sleep and arousal, two states with very different information processing needs. The results, published in Cell, suggest that the TRN has many switchboard operators, each dedicated to controlling specific lines of communication. Using this information, the researchers could alter the attention span of mice.
By Piercarlo Valdesolo

In the summer of 2009 I tried to cure homemade sausages in my kitchen. One of the challenges of such a practice is preventing the growth of undesirable molds and of the bacteria that cause diseases such as botulism. My wife was not on board with this plan, skeptical of my ability to safely execute the procedure. And so began many weeks of being peppered with warnings, relevant articles and concerned looks.

When the time came for my first bite, nerves were high. My throat itched. My heart raced. My vision blurred. I had been botulized! Halfway through our walk to the hospital I regained my composure. Of course I had not been instantaneously struck by an incredibly rare disease that, by the way, takes at least 12 hours after consumption to manifest and does not share many symptoms with your garden-variety anxiety attack. My experience had been shaped by my mindset. A decade of learning about the psychological power of expectations could not inoculate me from its effect.

Psychologists know that beliefs about how experiences should affect us can bring about the expected outcomes. Though these “placebo effects” have primarily been studied in the context of pharmaceutical interventions (e.g., patients reporting pain relief after receiving saline they believed to be an analgesic), recent research has shown their strength in a variety of domains. Tell people that their job has exercise benefits and they will lose more weight than their coworkers who had no such belief. Convince people of a correlation between athleticism and visual acuity and they will show better vision after working out. Trick people into believing they are consuming caffeine and their vigilance and cognitive functioning increase. Some evidence shows that such interventions can even mitigate the negative effects of other experiences. For example, consuming placebo caffeine alleviates the cognitive consequences of sleep deprivation.

© 2014 Scientific American
By Nathan Collins

Time zips by when you're having fun and passes slowly when you're not—except when you are depressed, in which case your time-gauging abilities are pretty accurate. Reporting in PLOS ONE, researchers in England and Ireland asked 39 students—18 with mild depression—to estimate the duration of tones lasting between two and 65 seconds and to produce tones of specified lengths of time. Happier students overestimated intervals by 16 percent and produced tones that were short by 13 percent, compared with depressed students' 3 percent underestimation and 8 percent overproduction.

The results suggest that depressive realism, a phenomenon in which depressed people perceive themselves more accurately (and less positively) than typical individuals, may extend to aspects of thought beyond self-perception—in this case, time. The researchers speculate that mindfulness treatments may be effective for depression partly because they help depressed people focus on the moment, rather than its passing.

© 2014 Scientific American
By Gary Stix

A gamma wave is a rapid, electrical oscillation in the brain. A scan of the academic literature shows that gamma waves may be involved with learning, memory and attention—and, when perturbed, may play a part in schizophrenia, epilepsy, Alzheimer’s, autism and ADHD. Quite a list, and one of the reasons that these brainwaves, cycling at 25 to 80 times per second, persist as an object of fascination to neuroscientists. Despite this enduring interest, much remains elusive about how gamma waves are produced by specific molecules within neurons—and what the oscillations do to facilitate communication along the brain’s trillions and trillions of connections.

A group of researchers at the Salk Institute in La Jolla, California, has looked beyond the preeminent brain cell—the neuron—to achieve new insights about gamma waves. At one time, neuroscience textbooks depicted astrocytes as a kind of pit crew for neurons, providing metabolic support and other functions for the brain’s rapid-firing information-processing components. In recent years, that picture has changed as new studies have found that astrocytes, like neurons, also have an alternate identity as information processors. This research demonstrates astrocytes’ ability to spritz chemicals known as neurotransmitters that communicate with other brain cells. Given that both neurons and astrocytes perform some of the same functions, it has been difficult to tease out what specifically astrocytes are up to. Hard evidence for what these nominal cellular support players might contribute to forming memories or focusing attention has been lacking.

© 2014 Scientific American
By DANIEL J. LEVITIN

This month, many Americans will take time off from work to go on vacation, catch up on household projects and simply be with family and friends. And many of us will feel guilty for doing so. We will worry about all of the emails piling up at work, and in many cases continue to compulsively check email during our precious time off. But beware the false break. Make sure you have a real one.

The summer vacation is more than a quaint tradition. Along with family time, mealtime and weekends, it is an important way that we can make the most of our beautiful brains. Every day we’re assaulted with facts, pseudofacts, news feeds and jibber-jabber, coming from all directions. According to a 2011 study, on a typical day, we take in the equivalent of about 174 newspapers’ worth of information, five times as much as we did in 1986. As the world’s 21,274 television stations produce some 85,000 hours of original programming every day (by 2003 figures), we watch an average of five hours of television per day. For every hour of YouTube video you watch, there are 5,999 hours of new video just posted!

If you’re feeling overwhelmed, there’s a reason: The processing capacity of the conscious mind is limited. This is a result of how the brain’s attentional system evolved. Our brains have two dominant modes of attention: the task-positive network and the task-negative network (they’re called networks because they comprise distributed networks of neurons, like electrical circuits within the brain). The task-positive network is active when you’re actively engaged in a task, focused on it, and undistracted; neuroscientists have taken to calling it the central executive. The task-negative network is active when your mind is wandering; this is the daydreaming mode. These two attentional networks operate like a seesaw in the brain: when one is active the other is not.

© 2014 The New York Times Company
By KATHARINE Q. SEELYE

SPARTA, N.J. — When Gail Morris came home late one night after taking her daughter to college, she saw her teenage son, Alex, asleep on the sofa in the family room. Nothing seemed amiss. An unfinished glass of apple juice sat on the table. She tucked him in under a blanket and went to bed. The next morning, he would not wake up. He was stiff and was hardly breathing. Over the next several hours, Ms. Morris was shocked to learn that her son had overdosed on heroin. She was told he would not survive.

He did survive, but barely. He was in a coma for six weeks. He went blind and had no function in his arms or legs. He could not speak or swallow. Hospitalized for 14 months, Alex, who is 6-foot-1, dropped to 90 pounds. One of his doctors said that Alex had come as close to dying as anyone he knew who had not actually died.

Most people who overdose on heroin either die or fully recover. But Alex plunged into a state that was neither dead nor functional. There are no national statistics on how often opioid overdose leads to cases like Alex’s, but doctors say they worry that with the dramatic increase in heroin abuse and overdoses, they will see more such outcomes. “I would expect that we will,” said Dr. Nora Volkow, director of the National Institute on Drug Abuse. “They are starting to report isolated cases like this. And I would not be surprised if you have more intermediate cases with more subtle impairment.”

More than 660,000 Americans used heroin in 2012, the federal government says, double the number of five years earlier. Officials attribute much of the increase to a crackdown on prescription painkillers, prompting many users to turn to heroin, which is cheaper and easier to get than other opioids.

© 2014 The New York Times Company
By William Skaggs

One of the most frustrating and mysterious medical conditions affecting the mind is impaired consciousness, as can occur with brain damage. Patients in a coma or a vegetative or minimally conscious state sometimes spontaneously recover to varying degrees, but in most cases there is little that doctors can do to help. Now a rigorous study by a group at Liège University Hospital Center in Belgium has found that a simple treatment called transcranial direct-current stimulation (tDCS) can temporarily raise awareness in minimally conscious patients.

In tDCS, electrodes are glued to the scalp, and a weak electric current is passed through them to stimulate the underlying brain tissue. Scientists led by neurologist Steven Laureys applied the electric current for 20 minutes to patients' left prefrontal cortex, an area known to be involved in attentiveness and working memory. Afterward, the effects on consciousness were measured by doctors who did not know whether the patient had received real tDCS or a sham treatment, in which the apparatus ran, but no current was delivered.

For patients in a vegetative state, who display no communication or purposeful behavior, the stimulation might have led to improvement in two patients, but no statistically compelling evidence emerged. Yet 13 of 30 patients in a minimally conscious state—defined by occasional moments of low-level awareness—showed measurable gains in their responses to questions and sensory stimuli. Some had only recently been injured, but others had been minimally conscious for months.

© 2014 Scientific American
Ian Sample, science correspondent

The human brain can judge the apparent trustworthiness of a face from a glimpse so fleeting, the person has no idea they have seen it, scientists claim. Researchers in the US found that brain activity changed in response to how trustworthy a face appeared to be when the face in question had not been consciously perceived. Scientists made the surprise discovery during a series of experiments that were designed to shed light on the neural processes that underpin the snap judgments people make about others.

The findings suggest that parts of our brains are doing more complex subconscious processing of the outside world than many researchers thought. Jonathan Freeman at New York University said the results built on previous work that shows "we form spontaneous judgments of other people that can be largely outside awareness."

The study focused on the activity of the amygdala, a small almond-shaped region deep inside the brain. The amygdala is intimately involved with processing strong emotions, such as fear. Its central nucleus sends out the signals responsible for the famous and evolutionarily crucial "fight-or-flight" response.

Prior to the study, Freeman asked a group of volunteers to rate the trustworthiness of a series of faces. People tend to agree when they rank trustworthiness – faces with several key features, such as more furrowed brows and shallower cheekbones, are consistently rated as less trustworthy. Freeman then invited a different group of people to take part in the experiments. Each lay in an MRI scanner while images of faces flashed up on a screen before them. Each trustworthy or untrustworthy face flashed up for a matter of milliseconds. Though their eyes had glimpsed the images, the participants were not aware they had seen the faces.

© 2014 Guardian News and Media Limited
David Robson

It’s not often that you look at your meal to find it staring back at you. But when Diane Duyser picked up her cheese toastie, she was in for a shock. “I went to take a bite out of it, and then I saw this lady looking back at me,” she told the Chicago Tribune. “It scared me at first.” As word got around, it soon began to spark more attention, and eventually a casino paid Duyser $28,000 to exhibit the toasted sandwich.

For many, the woman’s soft, full features and serene expression recall famous depictions of the Virgin Mary. But I’ve always thought the curled hair, parted lips and heavy eyelids evoke a more modern idol. Whichever Madonna you think you can see, she joins good company; Jesus has also been seen in toast, as well as a taco, a pancake and a banana peel, while Buzzfeed recently ran photos of peppers that look like British politicians.

“If someone reports seeing Jesus in a piece of toast, you’d think they must be nuts,” says Kang Lee, at the University of Toronto, Canada. “But it’s very pervasive... We are primed to see faces in every corner of the visual world.” Lee has shown that rather than being a result of divine intervention, these experiences reflect the powerful influence of our imagination over our perception. Indeed, his explanation may mean that you never trust your eyes again.

Pareidolia, as this experience is known, is by no means a recent phenomenon. Leonardo da Vinci described seeing characters in natural markings on stone walls, which he believed could help inspire his artworks. In the 1950s, the Bank of Canada had to withdraw a series of banknotes because a grinning devil leapt from the random curls of the Queen’s hair (although I can’t, for the life of me, see the merest hint of a horn in Her Majesty’s locks). The Viking I spacecraft, meanwhile, appeared to photograph a carved face in the rocky landscape of Mars.

BBC © 2014
By Marek Kohn

“You know how they say that we can only access 20% of our brain?” says the man who offers stressed-out writer Eddie Morra a fateful pill in the 2011 film Limitless. “Well, what this does, it lets you access all of it.” Morra is instantly transformed into a superhuman by the fictitious drug NZT-48. Granted access to all cognitive areas, he learns to play the piano in three days, finishes writing his book in four, and swiftly makes himself a millionaire.

Limitless is what you get when you flatter yourself that your head houses the most complex known object in the universe, and you run away with the notion that it must have powers to match. A number of so-called ‘smart drugs’ or cognitive enhancers have captured attention recently, from stimulants such as modafinil, to amphetamines (often prescribed under the name Adderall) and methylphenidate (also known by its brand name Ritalin). According to widespread news reports, students have begun using these drugs to enhance their performance in school and college, and are continuing to do so in their professional lives.

Yet are these smart drugs all they are cracked up to be? Can they really make all of us more intelligent or learn more? Should we be asking deeper questions about what these pharmaceuticals can and can’t do?

BBC © 2014
By Smitha Mundasad, Health reporter, BBC News

Scientists say a part of the brain, smaller than a pea, triggers the instinctive feeling that something bad is about to happen. Writing in the journal PNAS, they suggest the habenula plays a key role in how humans predict, learn from and respond to nasty experiences. And they question whether hyperactivity in this area is responsible for the pessimism seen in depression. They are now investigating whether the structure is involved in the condition.

Animal studies have shown that the habenula fires up when subjects expect or experience adverse events. But in humans this tiny structure (less than 3mm in diameter) has proved difficult to see on scans. Inventing a technique to pinpoint the area, scientists at University College London put 23 people through MRI scanners to monitor their brain activity.

Participants were shown a range of abstract pictures. A few seconds later, the images were linked to either punishment (painful electric shocks), reward (money) or neutral responses. For some images, a punishment or reward followed each time but for others this varied - leaving people uncertain whether they were going to feel pain or not. When people saw pictures associated with shocks, the habenula lit up. And the more certain they were a picture was going to result in a punishment, the stronger and faster the activity in this area.

The scientists suggest the habenula is involved in helping people learn when it is best to stay away from something and may also signal just how bad a nasty event is likely to be.

BBC © 2014
By KATE MURPHY

One of the biggest complaints in modern society is being overscheduled, overcommitted and overextended. Ask people at a social gathering how they are and the stock answer is “super busy,” “crazy busy” or “insanely busy.” Nobody is just “fine” anymore. When people aren’t super busy at work, they are crazy busy exercising, entertaining or taking their kids to Chinese lessons. Or maybe they are insanely busy playing fantasy football, tracing their genealogy or churning their own butter. And if there is ever a still moment for reflective thought — say, while waiting in line at the grocery store or sitting in traffic — out comes the mobile device.

So it’s worth noting a study published last month in the journal Science, which shows how far people will go to avoid introspection. “We had noted how wedded to our devices we all seem to be and that people seem to find any excuse they can to keep busy,” said Timothy Wilson, a psychology professor at the University of Virginia and lead author of the study. “No one had done a simple study letting people go off on their own and think.” The results surprised him and have created a stir in the psychology and neuroscience communities.

In 11 experiments involving more than 700 people, the majority of participants reported that they found it unpleasant to be alone in a room with their thoughts for just 6 to 15 minutes. Moreover, in one experiment, 64 percent of men and 15 percent of women began self-administering electric shocks when left alone to think. These same people, by the way, had previously said they would pay money to avoid receiving the painful jolt. It didn’t matter if the subjects engaged in the contemplative exercise at home or in the laboratory, or if they were given suggestions of what to think about, like a coming vacation; they just didn’t like being in their own heads.

© 2014 The New York Times Company
By MICHAEL INZLICHT and SUKHVINDER OBHI

I feel your pain. These words are famously associated with Bill Clinton, who as a politician seemed to ooze empathy. A skeptic might wonder, though, whether he truly was personally distressed by the suffering of average Americans. Can people in high positions of power — presidents, bosses, celebrities, even dominant spouses — easily empathize with those beneath them?

Psychological research suggests the answer is no. Studies have repeatedly shown that participants who are in high positions of power (or who are temporarily induced to feel powerful) are less able to adopt the visual, cognitive or emotional perspective of other people, compared to participants who are powerless (or are made to feel so). For example, Michael Kraus, a psychologist now at the University of Illinois at Urbana-Champaign, and two colleagues found that among full-time employees of a public university, those who were higher in social class (as determined by level of education) were less able to accurately identify emotions in photographs of human faces than were co-workers who were lower in social class. (While social class and social power are admittedly not the same, they are strongly related.)

Why does power leave people seemingly coldhearted? Some, like the Princeton psychologist Susan Fiske, have suggested that powerful people don’t attend well to others around them because they don’t need them in order to access important resources; as powerful people, they already have plentiful access to those. We suggest a different, albeit complementary, reason from cognitive neuroscience. On the basis of a study we recently published with the researcher Jeremy Hogeveen, in the Journal of Experimental Psychology: General, we contend that when people experience power, their brains fundamentally change how sensitive they are to the actions of others.

© 2014 The New York Times Company
Posted by Katie Langin

In a battle of wits, could a bird outsmart a kindergartner? Don’t be too quick to say no: One clever young bird solved a problem that has stumped 5-year-old children, according to a new study. The bird—a New Caledonian crow named Kitty—figured out that dropping rocks in one water-filled tube was the key to raising the water level in another, seemingly unconnected tube, giving her access to a floating morsel of meat.

To solve this problem, Kitty needed to decipher a confusing cause-and-effect relationship, basically akin to figuring out that if you flip a switch on the wall, a ceiling light will turn on. This mental ability was once thought to be restricted to humans, but causal reasoning—the ability to understand cause and effect—has now been identified in a handful of animals, from chimpanzees to rats.

Crows are the Einsteins of the bird world, renowned for their ability to make tools and solve complex puzzles. (Watch a video of a New Caledonian crow solving problems.) Their impressive mental capacity was even apparent to the ancient Greeks. In one of Aesop’s fables, a thirsty crow is presented with a dilemma when he cannot reach the water at the bottom of a pitcher. He figures out that the water level rises when he drops pebbles into the pitcher, and many pebbles later he is rewarded with a drink. As it turns out, there’s some truth to this fictional story. A study published earlier this year reported that New Caledonian crows will place rocks in water-filled tubes if they can’t reach a piece of meat that is attached to a floating cork.

© 1996-2013 National Geographic Society.
by Douglas Heaven

Hijacking how neurons of nematode worms are wired is the first step in an approach that could revolutionise our understanding of brains and consciousness.

Call it the first brain hack. The humble nematode worm has had its neural connections hot-wired, changing the way it responds to salt and smells. As well as offering a way to create souped-up organisms, changing neural connectivity could one day allow us to treat brain damage in people by rerouting signals around damaged neurons. What's more, it offers a different approach to probing brain mysteries such as how consciousness arises from wiring patterns – much like exploring the function of an electronic circuit by plugging and unplugging cables.

In our attempts to understand the brain, a lot of attention is given to neurons. A technique known as optogenetics, for example, lets researchers study the function of individual neurons by genetically altering them so they can be turned on and off by a light switch. But looking at the brain's connections is as important as watching the activity of neurons. Higher cognitive functions, such as an awareness of our place in the world, do not spring from a specific area, says Fani Deligianni at University College London. Deligianni and her colleagues are developing imaging techniques to map the brain's connections, as are other groups around the world (see "Start with a worm..."). "From this we can begin to answer some of the big questions about the workings of the brain and consciousness which seem to depend on connectivity," she says.

Tracing how the brain is wired is a great first step, but to find out how this linking pattern produces a particular behaviour we need to be able to see how changing these links affects brain function. This is what a team led by William Schafer at the MRC Laboratory of Molecular Biology in Cambridge, UK, is attempting.

© Copyright Reed Business Information Ltd.
By Neuroskeptic

An entertaining paper just out in Frontiers in Systems Neuroscience offers a panoramic view of the whole of neuroscience: "Enlarging the scope: grasping brain complexity". The paper is remarkable not just for its content but also for its style. Some examples:

"How does the brain work? This nagging question is an habitué from the top ten lists of enduring problems in Science’s grand challenges. Grasp this paradox: how is one human brain – a chef d’oeuvre of complexity honed by Nature – ever to reach such a feast as to understand itself? Where one brain may fail at this notorious philosophical riddle, may be a strong and diversely-skilled army of brains may come closer."

Or:

"It remains an uneasy feeling that so much of Brain Science is built upon the foundation of a pair of neurons, outside the context of their networks, and with two open-ended areas of darkness at either of their extremities that must be thought of as the entire remainder of the organism’s brain (and body)."

And my favorite:

"As humans tend to agree, increased size makes up for smarter brains (disclosure: both authors are human)"

I love it. I’m not sure I understand it, though. The authors, Tognoli and Kelso, begin by framing a fundamental tension between directed information transfer and neural synchrony, pointing out that neurons firing perfectly in synch with each other could not transfer information between themselves.
By Roni Jacobson

Prozac, Paxil, Celexa, Zoloft, Lexapro. These so-called selective serotonin reuptake inhibitors (SSRIs) are among the most widely prescribed drugs in the U.S. Although they are typically used to treat depression and anxiety disorders, they are also prescribed off-label for conditions such as chronic pain, premature ejaculation, bulimia, irritable bowel syndrome, premenstrual syndrome and hot flashes. Even if you have never taken an SSRI, chances are you know someone who has. About one in every 10 American adults is being prescribed one now. For women aged 40 to 59 years old, the proportion increases to one in four.

SSRIs block the body from reabsorbing serotonin, a neurotransmitter mostly found in the brain, spinal cord and digestive tract whose roles include regulation of mood, appetite, sexual function and sleep. Specifically, SSRIs bind to the protein that carries serotonin between nerve cells—called SERT, for serotonin transporter—intercepting it before it can escort the released neurotransmitter back into the cell. This action leaves more active serotonin in the body, a chemical effect that is supposed to spur feelings of happiness and well-being.

But there are hints that SSRIs are doing something other than simply boosting serotonin levels. First, people vary in their response to SSRIs: Studies have shown that the drugs are not very effective for mild to moderate depression, but work well when the disorder is severe. If low serotonin were the only culprit in depression, SSRIs would be more uniformly helpful in alleviating symptoms. Second, it takes weeks after starting an SSRI for depression and anxiety to lift, even though changes in serotonin ought to happen pretty much right away.

© 2014 Scientific American
Sara Reardon

For chimps, nature and nurture appear to contribute equally to intelligence. Smart chimpanzees often have smart offspring, researchers suggest in one of the first analyses of the genetic contribution to intelligence in apes. The findings, published online today in Current Biology1, could shed light on how human intelligence evolved, and might even lead to discoveries of genes associated with mental capacity.

A team led by William Hopkins, a psychologist at Georgia State University in Atlanta, tested the intelligence of 99 chimpanzees aged 9 to 54 years old, most of them descended from the same group of animals housed at the Yerkes National Primate Research Center in Atlanta. The chimps faced cognitive challenges such as remembering where food was hidden in a rotating object, following a human’s gaze and using tools to solve problems. A subsequent statistical analysis revealed a correlation between the animals' performance on these tests and their relatedness to other chimpanzees participating in the study. About half of the difference in performance between individual apes was genetic, the researchers found.

In humans, about 30% of intelligence in children can be explained by genetics; for adults, who are less vulnerable to environmental influences, that figure rises to 70%. Those numbers are comparable to the new estimate of the heritability of intelligence across a wide age range of chimps, says Danielle Posthuma, a behavioural geneticist at VU University in Amsterdam, who was not involved in the research. “This study is much overdue,” says Rasmus Nielsen, a computational biologist at the University of California, Berkeley. “There has been enormous focus on understanding heritability of intelligence in humans, but very little on our closest relatives.”

© 2014 Nature Publishing Group
By Dominic Basulto

It turns out that the human brain may not be as mysterious as it has always seemed to be. Researchers at George Washington University, led by Mohamad Koubeissi, may have found a way to turn human consciousness on and off by targeting a specific region of the brain with electrical currents. For brain researchers, unlocking the mystery of human consciousness has always been viewed as one of the keys for eventually building an artificial brain, and so this could be a big win for the future of brain research.

What the researchers did was deliver a series of high-frequency electrical impulses to the claustrum region of the brain in a woman suffering from epilepsy. Before the electric shocks, the woman was capable of writing and talking. During the electric shocks, the woman faded out of consciousness, and started staring blankly into space, incapable of even the most basic sensory functions. Even her breathing slowed. As soon as the electrical shocks stopped, the woman immediately regained her sensory skills with no memory of the event. The researchers claim that this test case is evidence of being able to turn consciousness on and off.

Granted, there’s a lot still to be done. The George Washington test, for example, has only been successfully performed on one person. And that woman had already had part of her hippocampus removed, so at least one researcher says the whole experiment must be interpreted carefully. There have been plenty of scientific experiments that have been “one and done,” so it remains to be seen whether these results can be replicated again.