Chapter 18. Attention and Higher Cognition
By Jena McGregor

We've all heard the conventional wisdom for better managing our time and organizing our professional and personal lives. Don't try to multitask. Turn the email and Facebook alerts off to help stay focused. Make separate to-do lists for tasks that require a few minutes, a few hours and long-term planning. But what's grounded in real evidence and what's not?

In his new book The Organized Mind, Daniel Levitin — a McGill University professor of psychology and behavioral neuroscience — explores how having a basic understanding of the way the brain works can help us think about organizing our homes, our businesses, our time and even our schools in an age of information overload. We spoke with Levitin about why multitasking never works, what images of good leaders' brains actually look like, and why email and Twitter are so incredibly addictive. The following transcript of our conversation has been edited for length and clarity.

Q. What was your goal in writing this book?

A. Neuroscientists have learned a lot in the last 10 or 15 years about how the brain organizes information, and why we pay attention to some things and forget others. But most of this information hasn't trickled down to the average reader. There are a lot of books about how to get organized and a lot of books about how to be better and more productive at business, but I don't know of one that grounds any of these in the science.
Link ID: 20049 - Posted: 09.09.2014
by Chris Higgins Neuroscientists have pinpointed where imagination hides in the brain and found it to be functionally distinct from related processes such as memory. The team from Brigham Young University (BYU), Utah -- including research proposer, undergraduate student Stefania Ashby -- used functional magnetic resonance imaging (fMRI) to observe brain activity when subjects were remembering specific experiences and putting themselves in novel ones. "I was thinking a lot about planning for my own future and imagining myself in the future, and I started wondering how memory and imagination work together," Ashby said. "I wondered if they were separate or if imagination is just taking past memories and combining them in different ways to form something I've never experienced before." The two processes of remembering and imagining have previously been proposed to be the same cognitive task, and so thought to be carried out by the same areas of the brain. However, the experiments devised by Ashby and her mentor (and coauthor), BYU professor Brock Kirwan, refute these ideas. The studies -- published in the journal Cognitive Neuroscience -- required participants to submit 60 photographs of previous life events and use them to create prompts for the "remember" sections. Participants then completed a questionnaire before entering the MRI scanner, which determined which scenarios were the most novel to them and would therefore force them to rely on imagination. Then, under fMRI testing, the subjects were prompted with various scenarios, and the areas of their brain that became active during each scenario were correlated with each scene's familiarity -- pure memory, or imagination. © Condé Nast UK 2014
by Tom Siegfried René Descartes was a very clever thinker. He proved his own existence, declaring that because he thought, he must exist: “I think, therefore I am.” But the 17th century philosopher-mathematician-scientist committed a serious mental blunder when he decided that the mind doing the thinking was somehow separate from the brain it lived in. Descartes believed that thought was insubstantial, transmitted from the ether to the pineal gland, which played the role of something like a Wi-Fi receiver embedded deep in the brain. Thereafter mind-brain dualism became the prevailing prejudice. Nowadays, though, everybody with a properly working brain realizes that the mind and brain are coexistent. Thought processes and associated cognitive mental activity all reflect the physics and chemistry of cells and molecules inhabiting the brain’s biological tissue. Many people today do not realize, though, that there’s a modern version of Descartes’ mistaken dichotomy. Just as he erroneously believed the mind was distinct from the brain, some scientists have mistakenly conceived of the brain as distinct from the body. Much of the early research in artificial intelligence, for instance, modeled the brain as a computer, seeking to replicate mental life as information processing, converting inputs to outputs by logical rules. But even if such a machine could duplicate the circuitry of the brain, it would be missing essential peripheral input from an attached body. Actual intelligence requires both body and brain, as the neurologist Antonio Damasio pointed out in his 1994 book, Descartes’ Error. “Mental activity, from its simplest aspects to its most sublime, requires both brain and body proper,” Damasio wrote. © Society for Science & the Public 2000 - 2013.
Link ID: 20002 - Posted: 08.27.2014
|By Matthew H. Schneps “There are three types of mathematicians, those who can count and those who can’t.” Bad joke? You bet. But what makes this amusing is that the joke is triggered by our perception of a paradox, a breakdown in mathematical logic that activates regions of the brain located in the right prefrontal cortex. These regions are sensitive to the perception of causality and alert us to situations that are suspect or fishy — possible sources of danger where a situation just doesn’t seem to add up. Many of the famous etchings by the artist M.C. Escher activate a similar response because they depict scenes that violate causality. His famous “Waterfall” shows a water wheel powered by water pouring down from a wooden flume. The water turns the wheel, and is redirected uphill back to the mouth of the flume, where it can once again pour over the wheel, in an endless cycle. The drawing shows us a situation that violates pretty much every law of physics on the books, and our brain perceives this logical oddity as amusing — a visual joke. The trick that makes Escher’s drawings intriguing is a geometric construction psychologists refer to as an “impossible figure,” a line-form suggesting a three-dimensional object that could never exist in our experience. Psychologists, including a team led by Catya von Károlyi of the University of Wisconsin-Eau Claire, have used such figures to study human cognition. When the team asked people to pick out impossible figures from similarly drawn illustrations that did not violate causality, they were surprised to discover that some people were faster at this than others. And most surprising of all, among those who were the fastest were those with dyslexia. © 2014 Scientific American
Helen Shen For most adults, adding small numbers requires little effort, but for some children, it can take all ten fingers and a lot of time. Research published online on 17 August in Nature Neuroscience suggests that changes in the hippocampus — a brain area associated with memory formation — could help to explain how children eventually pick up efficient strategies for mathematics, and why some children learn more quickly than others. Vinod Menon, a developmental cognitive neuroscientist at Stanford University in California, and his colleagues presented single-digit addition problems to 28 children aged 7–9, as well as to 20 adolescents aged 14–17 and 20 young adults. Consistent with previous psychology studies, the children relied heavily on counting out the sums, whereas adolescents and adults tended to draw on memorized information to calculate the answers. The researchers saw this developmental change begin to unfold when they tested the same children at two time points, about one year apart. As the children aged, they began to move away from counting on fingers towards memory-based strategies, as measured by their own accounts and by decreased lip and finger movements during the task. Using functional magnetic resonance imaging (fMRI) to scan the children's brains, the team observed increased activation of the hippocampus between the first and second time point. Neural activation decreased in parts of the prefrontal and parietal cortices known to be involved in counting, suggesting that the same calculations had begun to engage different neural circuits. © 2014 Nature Publishing Group
Ever wonder why it’s hard to focus after a bad night’s sleep? Using mice and flashes of light, scientists show that just a few nerve cells in the brain may control the switch between internal thoughts and external distractions. The study, partly funded by the National Institutes of Health, may be a breakthrough in understanding how a critical part of the brain, called the thalamic reticular nucleus (TRN), influences consciousness. “Now we may have a handle on how this tiny part of the brain exerts tremendous control over our thoughts and perceptions,” said Michael Halassa, M.D., Ph.D., assistant professor at New York University’s Langone Medical Center and a lead investigator of the study. “These results may be a gateway into understanding the circuitry that underlies neuropsychiatric disorders.” The TRN is a thin layer of nerve cells on the surface of the thalamus, a center located deep inside the brain that relays information from the body to the cerebral cortex. The cortex is the outer, multi-folded layer of the brain that controls numerous functions, including one’s thoughts, movements, language, emotions, memories, and visual perceptions. TRN cells are thought to act as switchboard operators that control the flow of information relayed from the thalamus to the cortex. To understand how the switches may work, Dr. Halassa and his colleagues studied the firing patterns of TRN cells in mice during sleep and arousal, two states with very different information processing needs. The results, published in Cell, suggest that the TRN has many switchboard operators, each dedicated to controlling specific lines of communication. Using this information, the researchers could alter the attention span of mice.
|By Piercarlo Valdesolo In the summer of 2009 I tried to cure homemade sausages in my kitchen. One of the hazards of such a practice is the growth of undesirable molds and diseases such as botulism. My wife was not on board with this plan, skeptical of my ability to safely execute the procedure. And so began many weeks of being peppered with warnings, relevant articles and concerned looks. When the time came for my first bite, nerves were high. My throat itched. My heart raced. My vision blurred. I had been botulized! Halfway through our walk to the hospital I regained my composure. Of course I had not been instantaneously struck by an incredibly rare disease that, by the way, takes at least 12 hours after consumption to manifest and does not share many symptoms with your garden variety anxiety attack. My experience had been shaped by my mindset. A decade of learning about the psychological power of expectations could not inoculate me from its effect. Psychologists know that beliefs about how experiences should affect us can bring about the expected outcomes. Though these “placebo effects” have primarily been studied in the context of pharmaceutical interventions (e.g. patients reporting pain relief after receiving saline they believed to be an analgesic), recent research has shown their strength in a variety of domains. Tell people that their job has exercise benefits and they will lose more weight than their coworkers who had no such belief. Convince people of a correlation between athleticism and visual acuity and they will show better vision after working out. Trick people into believing they are consuming caffeine and their vigilance and cognitive functioning increase. Some evidence shows that such interventions can even mitigate the negative effects of other experiences. For example, consuming placebo caffeine alleviates the cognitive consequences of sleep deprivation. © 2014 Scientific American
|By Nathan Collins Time zips by when you're having fun and passes slowly when you're not—except when you are depressed, in which case your time-gauging abilities are pretty accurate. Reporting in PLOS ONE, researchers in England and Ireland asked 39 students—18 with mild depression—to estimate the duration of tones lasting between two and 65 seconds and to produce tones of specified lengths of time. Happier students overestimated intervals by 16 percent and produced tones that were short by 13 percent, compared with depressed students' 3 percent underestimation and 8 percent overproduction. The results suggest that depressive realism, a phenomenon in which depressed people perceive themselves more accurately (and less positively) than typical individuals, may extend to aspects of thought beyond self-perception—in this case, time. They speculate that mindfulness treatments may be effective for depression, partly because they help depressed people focus on the moment, rather than its passing. © 2014 Scientific American
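The percentages in the study above are signed errors of duration judgments. As a minimal sketch of how such a figure is computed (invented data and function name, not the study's code):

```python
# Minimal sketch (invented data and function name, not the study's code):
# signed percent error of a duration judgment. Positive values mean the
# interval was overestimated; negative values mean it was underestimated.

def percent_error(actual_s, estimated_s):
    """Signed percent error of an estimate of a duration in seconds."""
    return 100.0 * (estimated_s - actual_s) / actual_s

# A 60-second tone judged by two hypothetical listeners:
print(percent_error(60, 69.6))  # ~ +16, overestimation
print(percent_error(60, 58.2))  # ~ -3, underestimation
```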
By Gary Stix A gamma wave is a rapid, electrical oscillation in the brain. A scan of the academic literature shows that gamma waves may be involved with learning, memory and attention—and, when perturbed, may play a part in schizophrenia, epilepsy, Alzheimer’s, autism and ADHD. Quite a list, and one of the reasons that these brainwaves, cycling at 25 to 80 times per second, persist as an object of fascination to neuroscientists. Despite lingering interest, much remains elusive when trying to figure out how gamma waves are produced by specific molecules within neurons—and what the oscillations do to facilitate communication along the brain’s trillions and trillions of connections. A group of researchers at the Salk Institute in La Jolla, California, has looked beyond the preeminent brain cell—the neuron—to achieve new insights about gamma waves. At one time, neuroscience textbooks depicted astrocytes as a kind of pit crew for neurons, providing metabolic support and other functions for the brain’s rapid-firing information-processing components. In recent years, that picture has changed as new studies have found that astrocytes, like neurons, also have an alternate identity as information processors. This research demonstrates astrocytes’ ability to spritz chemicals known as neurotransmitters that communicate with other brain cells. Given that both neurons and astrocytes perform some of the same functions, it has been difficult to tease out what specifically astrocytes are up to. Hard evidence for what these nominal cellular support players might contribute in forming memories or focusing attention has been lacking. © 2014 Scientific American
By DANIEL J. LEVITIN THIS month, many Americans will take time off from work to go on vacation, catch up on household projects and simply be with family and friends. And many of us will feel guilty for doing so. We will worry about all of the emails piling up at work, and in many cases continue to compulsively check email during our precious time off. But beware the false break. Make sure you have a real one. The summer vacation is more than a quaint tradition. Along with family time, mealtime and weekends, it is an important way that we can make the most of our beautiful brains. Every day we’re assaulted with facts, pseudofacts, news feeds and jibber-jabber, coming from all directions. According to a 2011 study, on a typical day, we take in the equivalent of about 174 newspapers’ worth of information, five times as much as we did in 1986. As the world’s 21,274 television stations produce some 85,000 hours of original programming every day (by 2003 figures), we watch an average of five hours of television per day. For every hour of YouTube video you watch, there are 5,999 hours of new video just posted! If you’re feeling overwhelmed, there’s a reason: The processing capacity of the conscious mind is limited. This is a result of how the brain’s attentional system evolved. Our brains have two dominant modes of attention: the task-positive network and the task-negative network (they’re called networks because they comprise distributed networks of neurons, like electrical circuits within the brain). The task-positive network is active when you’re actively engaged in a task, focused on it, and undistracted; neuroscientists have taken to calling it the central executive. The task-negative network is active when your mind is wandering; this is the daydreaming mode. These two attentional networks operate like a seesaw in the brain: when one is active the other is not. © 2014 The New York Times Company
By KATHARINE Q. SEELYE SPARTA, N.J. — When Gail Morris came home late one night after taking her daughter to college, she saw her teenage son, Alex, asleep on the sofa in the family room. Nothing seemed amiss. An unfinished glass of apple juice sat on the table. She tucked him in under a blanket and went to bed. The next morning, he would not wake up. He was stiff and was hardly breathing. Over the next several hours, Ms. Morris was shocked to learn that her son had overdosed on heroin. She was told he would not survive. He did survive, but barely. He was in a coma for six weeks. He went blind and had no function in his arms or legs. He could not speak or swallow. Hospitalized for 14 months, Alex, who is 6-foot-1, dropped to 90 pounds. One of his doctors said that Alex had come as close to dying as anyone he knew who had not actually died. Most people who overdose on heroin either die or fully recover. But Alex plunged into a state that was neither dead nor functional. There are no national statistics on how often opioid overdose leads to cases like Alex’s, but doctors say they worry that with the dramatic increase in heroin abuse and overdoses, they will see more such outcomes. “I would expect that we will,” said Dr. Nora Volkow, director of the National Institute on Drug Abuse. “They are starting to report isolated cases like this. And I would not be surprised if you have more intermediate cases with more subtle impairment.” More than 660,000 Americans used heroin in 2012, the federal government says, double the number of five years earlier. Officials attribute much of the increase to a crackdown on prescription painkillers, prompting many users to turn to heroin, which is cheaper and easier to get than other opioids. © 2014 The New York Times Company
|By William Skaggs One of the most frustrating and mysterious medical conditions affecting the mind is impaired consciousness, as can occur with brain damage. Patients in a coma or a vegetative or minimally conscious state sometimes spontaneously recover to varying degrees, but in most cases there is little that doctors can do to help. Now a rigorous study by a group at Liège University Hospital Center in Belgium has found that a simple treatment called transcranial direct-current stimulation (tDCS) can temporarily raise awareness in minimally conscious patients. In tDCS, electrodes are glued to the scalp, and a weak electric current is passed through them to stimulate the underlying brain tissue. Scientists led by neurologist Steven Laureys applied the electric current for 20 minutes to patients' left prefrontal cortex, an area known to be involved in attentiveness and working memory. Afterward, the effects on consciousness were measured by doctors who did not know whether the patient had received real tDCS or a sham treatment, in which the apparatus ran, but no current was delivered. For patients in a vegetative state, who display no communication or purposeful behavior, the stimulation might have led to improvement in two patients, but no statistically compelling evidence emerged. Yet 13 of 30 patients in a minimally conscious state—defined by occasional moments of low-level awareness—showed measurable gains in their responses to questions and sensory stimuli. Some had only recently been injured, but others had been minimally conscious for months. © 2014 Scientific American
Link ID: 19934 - Posted: 08.11.2014
Ian Sample, science correspondent The human brain can judge the apparent trustworthiness of a face from a glimpse so fleeting that the person has no idea they have seen it, scientists claim. Researchers in the US found that brain activity changed in response to how trustworthy a face appeared to be when the face in question had not been consciously perceived. Scientists made the surprise discovery during a series of experiments designed to shed light on the neural processes that underpin the snap judgments people make about others. The findings suggest that parts of our brains are doing more complex subconscious processing of the outside world than many researchers thought. Jonathan Freeman at New York University said the results built on previous work that shows "we form spontaneous judgments of other people that can be largely outside awareness." The study focused on the activity of the amygdala, a small almond-shaped region deep inside the brain. The amygdala is intimately involved with processing strong emotions, such as fear. Its central nucleus sends out the signals responsible for the famous and evolutionarily crucial "fight-or-flight" response. Prior to the study, Freeman asked a group of volunteers to rate the trustworthiness of a series of faces. People tend to agree when they rank trustworthiness – faces with several key features, such as more furrowed brows and shallower cheekbones, are consistently rated as less trustworthy. Freeman then invited a different group of people to take part in the experiments. Each lay in an MRI scanner while images of faces flashed up on a screen before them. Each trustworthy or untrustworthy face flashed up for a matter of milliseconds. Though their eyes had glimpsed the images, the participants were not aware they had seen the faces. © 2014 Guardian News and Media Limited
Link ID: 19924 - Posted: 08.07.2014
David Robson It’s not often that you look at your meal to find it staring back at you. But when Diane Duyser picked up her cheese toastie, she was in for a shock. “I went to take a bite out of it, and then I saw this lady looking back at me,” she told the Chicago Tribune. “It scared me at first.” As word got around, it soon began to spark more attention, and eventually a casino paid Duyser $28,000 to exhibit the toasted sandwich. For many, the woman’s soft, full features and serene expression recalls famous depictions of the Virgin Mary. But I’ve always thought the curled hair, parted lips and heavy eyelids evoke a more modern idol. Whichever Madonna you think you can see, she joins good company; Jesus has also been seen in toast, as well as a taco, a pancake and a banana peel, while Buzzfeed recently ran photos of peppers that look like British politicians. “If someone reports seeing Jesus in a piece of toast, you’d think they must be nuts,” says Kang Lee, at the University of Toronto, Canada. “But it’s very pervasive... We are primed to see faces in every corner of the visual world.” Lee has shown that rather than being a result of divine intervention, these experiences reflect the powerful influence of our imagination over our perception. Indeed, his explanation may mean that you never trust your eyes again. Pareidolia, as this experience is known, is by no means a recent phenomenon. Leonardo da Vinci described seeing characters in natural markings on stone walls, which he believed could help inspire his artworks. In the 1950s, the Bank of Canada had to withdraw a series of banknotes because a grinning devil leapt from the random curls of the Queen’s hair (although I can’t, for the life of me, see the merest hint of a horn in Her Majesty’s locks). The Viking I spacecraft, meanwhile, appeared to photograph a carved face in the rocky landscape of Mars. BBC © 2014
Link ID: 19912 - Posted: 08.02.2014
By Marek Kohn “You know how they say that we can only access 20% of our brain?” says the man who offers stressed-out writer Eddie Morra a fateful pill in the 2011 film Limitless. “Well, what this does, it lets you access all of it.” Morra is instantly transformed into a superhuman by the fictitious drug NZT-48. Granted access to all cognitive areas, he learns to play the piano in three days, finishes writing his book in four, and swiftly makes himself a millionaire. Limitless is what you get when you flatter yourself that your head houses the most complex known object in the universe, and you run away with the notion that it must have powers to match. A number of so-called ‘smart drugs’ or cognitive enhancers have captured attention recently, from stimulants such as modafinil, to amphetamines (often prescribed under the name Adderall) and methylphenidate (also known by its brand name Ritalin). According to widespread news reports, students have begun using these drugs to enhance their performance in school and college, and are continuing to do so in their professional lives. Yet are these smart drugs all they are cracked up to be? Can they really make all of us more intelligent or learn more? Should we be asking deeper questions about what these pharmaceuticals can and can’t do? BBC © 2014
By Smitha Mundasad Health reporter, BBC News Scientists say a part of the brain, smaller than a pea, triggers the instinctive feeling that something bad is about to happen. Writing in the journal PNAS, they suggest the habenula plays a key role in how humans predict, learn from and respond to nasty experiences. And they question whether hyperactivity in this area is responsible for the pessimism seen in depression. They are now investigating whether the structure is involved in the condition. Animal studies have shown that the habenula fires up when subjects expect or experience adverse events. But in humans this tiny structure (less than 3mm in diameter) has proved difficult to see on scans. Inventing a technique to pinpoint the area, scientists at University College London put 23 people through MRI scanners to monitor their brain activity. Participants were shown a range of abstract pictures. A few seconds later, the images were linked to either punishment (painful electric shocks), reward (money) or neutral responses. For some images, a punishment or reward followed each time, but for others this varied - leaving people uncertain whether they were going to feel pain or not. When people saw pictures associated with shocks, the habenula lit up. And the more certain they were that a picture was going to result in a punishment, the stronger and faster the activity in this area. Scientists suggest the habenula is involved in helping people learn when it is best to stay away from something, and may also signal just how bad a nasty event is likely to be. BBC © 2014
By KATE MURPHY ONE of the biggest complaints in modern society is being overscheduled, overcommitted and overextended. Ask people at a social gathering how they are and the stock answer is “super busy,” “crazy busy” or “insanely busy.” Nobody is just “fine” anymore. When people aren’t super busy at work, they are crazy busy exercising, entertaining or taking their kids to Chinese lessons. Or maybe they are insanely busy playing fantasy football, tracing their genealogy or churning their own butter. And if there is ever a still moment for reflective thought — say, while waiting in line at the grocery store or sitting in traffic — out comes the mobile device. So it’s worth noting a study published last month in the journal Science, which shows how far people will go to avoid introspection. “We had noted how wedded to our devices we all seem to be and that people seem to find any excuse they can to keep busy,” said Timothy Wilson, a psychology professor at the University of Virginia and lead author of the study. “No one had done a simple study letting people go off on their own and think.” The results surprised him and have created a stir in the psychology and neuroscience communities. In 11 experiments involving more than 700 people, the majority of participants reported that they found it unpleasant to be alone in a room with their thoughts for just 6 to 15 minutes. Moreover, in one experiment, 64 percent of men and 15 percent of women began self-administering electric shocks when left alone to think. These same people, by the way, had previously said they would pay money to avoid receiving the painful jolt. It didn’t matter if the subjects engaged in the contemplative exercise at home or in the laboratory, or if they were given suggestions of what to think about, like a coming vacation; they just didn’t like being in their own heads. © 2014 The New York Times Company
By MICHAEL INZLICHT and SUKHVINDER OBHI I FEEL your pain. These words are famously associated with Bill Clinton, who as a politician seemed to ooze empathy. A skeptic might wonder, though, whether he truly was personally distressed by the suffering of average Americans. Can people in high positions of power — presidents, bosses, celebrities, even dominant spouses — easily empathize with those beneath them? Psychological research suggests the answer is no. Studies have repeatedly shown that participants who are in high positions of power (or who are temporarily induced to feel powerful) are less able to adopt the visual, cognitive or emotional perspective of other people, compared to participants who are powerless (or are made to feel so). For example, Michael Kraus, a psychologist now at the University of Illinois at Urbana-Champaign, and two colleagues found that among full-time employees of a public university, those who were higher in social class (as determined by level of education) were less able to accurately identify emotions in photographs of human faces than were co-workers who were lower in social class. (While social class and social power are admittedly not the same, they are strongly related.) Why does power leave people seemingly coldhearted? Some, like the Princeton psychologist Susan Fiske, have suggested that powerful people don’t attend well to others around them because they don’t need them in order to access important resources; as powerful people, they already have plentiful access to those. We suggest a different, albeit complementary, reason from cognitive neuroscience. On the basis of a study we recently published with the researcher Jeremy Hogeveen, in the Journal of Experimental Psychology: General, we contend that when people experience power, their brains fundamentally change how sensitive they are to the actions of others. © 2014 The New York Times Company
Posted by Katie Langin In a battle of wits, could a bird outsmart a kindergartner? Don’t be too quick to say no: One clever young bird solved a problem that has stumped 5-year-old children, according to a new study. The bird—a New Caledonian crow named Kitty—figured out that dropping rocks in one water-filled tube was the key to raising the water level in another, seemingly unconnected tube, giving her access to a floating morsel of meat. To solve this problem, Kitty needed to decipher a confusing cause-and-effect relationship, basically akin to figuring out that if you flip a switch on the wall, a ceiling light will turn on. This mental ability was once thought to be restricted to humans, but causal reasoning—the ability to understand cause and effect—has now been identified in a handful of animals, from chimpanzees to rats. Crows are the Einsteins of the bird world, renowned for their ability to make tools and solve complex puzzles. (Watch a video of a New Caledonian crow solving problems.) Their impressive mental capacity was even apparent to the ancient Greeks. In one of Aesop’s fables, a thirsty crow is presented with a dilemma when he cannot reach the water at the bottom of a pitcher. He figures out that the water level rises when he drops pebbles into the pitcher, and many pebbles later he is rewarded with a drink. As it turns out, there’s some truth to this fictional story. A study published earlier this year reported that New Caledonian crows will place rocks in water-filled tubes if they can’t reach a piece of meat that is attached to a floating cork. © 1996-2013 National Geographic Society.
by Douglas Heaven Hijacking how neurons of nematode worms are wired is the first step in an approach that could revolutionise our understanding of brains and consciousness CALL it the first brain hack. The humble nematode worm has had its neural connections hot-wired, changing the way it responds to salt and smells. As well as offering a way to create souped-up organisms, changing neural connectivity could one day allow us to treat brain damage in people by rerouting signals around damaged neurons. What's more, it offers a different approach to probing brain mysteries such as how consciousness arises from wiring patterns – much like exploring the function of an electronic circuit by plugging and unplugging cables. In our attempts to understand the brain, a lot of attention is given to neurons. A technique known as optogenetics, for example, lets researchers study the function of individual neurons by genetically altering them so they can be turned on and off by a light switch. But looking at the brain's connections is as important as watching the activity of neurons. Higher cognitive functions, such as an awareness of our place in the world, do not spring from a specific area, says Fani Deligianni at University College London. Deligianni and her colleagues are developing imaging techniques to map the brain's connections, as are other groups around the world (see "Start with a worm..."). "From this we can begin to answer some of the big questions about the workings of the brain and consciousness which seem to depend on connectivity," she says. Tracing how the brain is wired is a great first step but to find out how this linking pattern produces a particular behaviour we need to be able to see how changing these links affects brain function. This is what a team led by William Schafer at the MRC Laboratory of Molecular Biology in Cambridge, UK, is attempting. © Copyright Reed Business Information Ltd.