Chapter 14. Attention and Consciousness
By James Gallagher, Health editor, BBC News website. Close your eyes and imagine walking along a sandy beach and then gazing over the horizon as the Sun rises. How clear is the image that springs to mind? Most people can readily conjure images inside their head - known as their mind's eye. But this year scientists have described a condition, aphantasia, in which some people are unable to visualise mental images. Niel Kenmuir, from Lancaster, has always had a blind mind's eye. He knew he was different even in childhood. "My stepfather, when I couldn't sleep, told me to count sheep, and he explained what he meant, I tried to do it and I couldn't," he says. "I couldn't see any sheep jumping over fences, there was nothing to count." Our memories are often tied up in images: think back to a wedding or first day at school. As a result, Niel admits, some aspects of his memory are "terrible", but he is very good at remembering facts. And, like others with aphantasia, he struggles to recognise faces. Yet he does not see aphantasia as a disability, but simply a different way of experiencing life. It is impossible to see what someone else is picturing inside their head, so psychologists use the Vividness of Visual Imagery Questionnaire, which asks you to rate different mental images, to test the strength of the mind's eye. The University of Exeter has developed an abridged version that lets you see how your mind compares. © 2015 BBC.
By Elizabeth Kolbert C57BL/6J mice are black, with pink ears and long pink tails. Inbred for the purposes of experimentation, they exhibit a number of infelicitous traits, including a susceptibility to obesity, a taste for morphine, and a tendency to nibble off their cage mates’ hair. They’re also tipplers. Given access to ethanol, C57BL/6J mice routinely suck away until the point that, were they to get behind the wheel of a Stuart Little-size roadster, they’d get pulled over for D.U.I. Not long ago, a team of researchers at Temple University decided to take advantage of C57BL/6Js’ bad habits to test a hunch. They gathered eighty-six mice and placed them in Plexiglas cages, either singly or in groups of three. Then they spiked the water with ethanol and videotaped the results. Half of the test mice were four weeks old, which, in murine terms, qualifies them as adolescents. The other half were twelve-week-old adults. When the researchers watched the videos, they found that the youngsters had, on average, outdrunk their elders. More striking still was the pattern of consumption. Young male C57BL/6Js who were alone drank roughly the same amount as adult males. But adolescent males with cage mates went on a bender; they spent, on average, twice as much time drinking as solo boy mice and about thirty per cent more time than solo girls. The researchers published the results in the journal Developmental Science. In their paper, they noted that it was “not possible” to conduct a similar study on human adolescents, owing to the obvious ethical concerns. But, of course, similar experiments are performed all the time, under far less controlled circumstances. Just ask any college dean. Or ask a teen-ager.
By Simon Worrall, National Geographic How do we know we exist? What is the self? These are some of the questions science writer Anil Ananthaswamy asks in his thought-provoking new book, The Man Who Wasn’t There: Investigations Into the Strange New Science of the Self. The answers, he says, may lie in medical conditions like Cotard’s syndrome, Alzheimer’s or body integrity identity disorder, which causes some people to try and amputate their own limbs. Speaking from Berkeley, California, he explains why Antarctic explorer Ernest Shackleton fell victim to the doppelgänger effect; how neuroscience is rewriting our ideas about identity; and how a song by George Harrison of the Beatles offers a critique of the Western view of the self. You dedicate the book to “those of us who want to let go but wonder, who is letting go and of what?” Explain that statement. We always hear within popular culture that we have to “let go,” as a way of dealing with certain situations in our lives. And in some sense you have to wonder about that statement because the person or thing doing the letting go is also probably what has to be let go. In the book, I am trying to get behind the whole issue of what the self is that has to do the letting go; and what aspects of the self have to be let go of. You start your book with Alzheimer’s. Tell us about the origin of the condition and what it tells us about “the autobiographical self.” Alzheimer’s is a very severe condition, especially during the mid- to late stages, which starts robbing people of their ability to remember anything that’s happening to them. They also start forgetting the people they are close to. © 1996-2015 National Geographic Society
Link ID: 21343 - Posted: 08.27.2015
By NINA STROHMINGER and SHAUN NICHOLS WHEN does the deterioration of your brain rob you of your identity, and when does it not? Alzheimer’s, the neurodegenerative disease that erodes old memories and the ability to form new ones, has a reputation as a ruthless plunderer of selfhood. People with the disease may no longer seem like themselves. Neurodegenerative diseases that target the motor system, like amyotrophic lateral sclerosis, can lead to equally devastating consequences: difficulty moving, walking, speaking and eventually, swallowing and breathing. Yet they do not seem to threaten the fabric of selfhood in quite the same way. Memory, it seems, is central to identity. And indeed, many philosophers and psychologists have supposed as much. This idea is intuitive enough, for what captures our personal trajectory through life better than the vault of our recollections? But maybe this conventional wisdom is wrong. After all, the array of cognitive faculties affected by neurodegenerative diseases is vast: language, emotion, visual processing, personality, intelligence, moral behavior. Perhaps some of these play a role in securing a person’s identity. The challenge in trying to determine what parts of the mind contribute to personal identity is that each neurodegenerative disease can affect many cognitive systems, with the exact constellation of symptoms manifesting differently from one patient to the next. For instance, some Alzheimer’s patients experience only memory loss, whereas others also experience personality change or impaired visual recognition. The only way to tease apart which changes render someone unrecognizable is to compare all such symptoms, across multiple diseases. And that’s just what we did, in a study published this month in Psychological Science. © 2015 The New York Times Company
Link ID: 21331 - Posted: 08.24.2015
Helen Thomson Modafinil is the world’s first safe “smart drug”, researchers at Harvard and Oxford universities have said, after performing a comprehensive review of the drug. They concluded that the drug, which is prescribed for narcolepsy but is increasingly taken without prescription by healthy people, can improve decision-making, problem-solving and possibly even make people think more creatively. While acknowledging that there was limited information available on the effects of long-term use, the reviewers said that the drug appeared safe to take in the short term, with few side effects and no addictive qualities. Modafinil has become increasingly common in universities across Britain and the US. Prescribed in the UK as Provigil, it was licensed in 2002 for use as a treatment for narcolepsy - a brain disorder that can cause a person to suddenly fall asleep at inappropriate times or to experience chronic pervasive sleepiness and fatigue. Used without prescription, and bought through easy-to-find websites, modafinil is what is known as a smart drug - used primarily by people wanting to improve their focus before an exam. A poll of Nature journal readers suggested that one in five have used drugs to improve focus, with 44% stating modafinil as their drug of choice. But despite its increasing popularity, there has been little consensus on the extent of modafinil’s effects in healthy, non-sleep-disordered humans. A new review of 24 of the most recent modafinil studies suggests that the drug has many positive effects in healthy people, including enhancing attention, improving learning and memory and increasing something called “fluid intelligence” - essentially our capacity to solve problems and think creatively. © 2015 Guardian News and Media Limited
By PAUL GLIMCHER and MICHAEL A. LIVERMORE THE United States government recently announced an $18.7 billion settlement of claims against the oil giant BP in connection with the Deepwater Horizon oil rig explosion in April 2010, which dumped millions of barrels of oil into the Gulf of Mexico. Though some of the settlement funds are to compensate the region for economic harm, most will go to environmental restoration in affected states. Is BP getting off easy, or being unfairly penalized? This is not easy to answer. Assigning a monetary value to environmental harm is notoriously tricky. There is, after all, no market for intact ecosystems or endangered species. We don’t reveal how much we value these things in a consumer context, as goods or services for which we will or won’t pay a certain amount. Instead, we value them for their mere existence. And it is not obvious how to put a price tag on that. In an attempt to do so, economists and policy makers often rely on a technique called “contingent valuation,” which amounts to asking individuals survey questions about their willingness to pay to protect natural resources. The values generated by contingent valuation studies are frequently used to inform public policy and litigation. (If the government had gone to trial with BP, it most likely would have relied on such studies to argue for a large judgment against the company.) Contingent valuation has always aroused skepticism. Oil companies, unsurprisingly, have criticized the technique. But many economists have also been skeptical, worrying that hypothetical questions posed to ordinary citizens may not really capture their genuine sense of environmental value. Even the Obama administration seems to discount contingent valuation, choosing to exclude data from this technique in 2014 when issuing a new rule to reduce the number of fish killed by power plants. © 2015 The New York Times Company
By John Danaher Discoveries in neuroscience, and the science of behaviour more generally, pose a challenge to the existence of free will. But this all depends on what is meant by ‘free will’. The term means different things to different people. Philosophers focus on two conditions that seem to be necessary for free will: (i) the alternativism condition, according to which having free will requires the ability to do otherwise; and (ii) the sourcehood condition, according to which having free will requires that you (your ‘self’) be the source of your actions. A scientific and deterministic worldview is often said to threaten the first condition. Does it also threaten the second? That is what Christian List and Peter Menzies’ article “My brain made me do it: The exclusion argument against free will and what’s wrong with it” tries to figure out. As you might guess from the title, the authors think that the scientific worldview, in particular the advances in neuroscience, does not necessarily threaten the sourcehood condition. I discussed their main argument in the previous post. To briefly recap, they critiqued an argument from physicalism against free will. According to this argument, the mental states which constitute the self do not cause our behaviour because they are epiphenomenal: they supervene on the physical brain states that do all the causal work. List and Menzies disputed this by appealing to a difference-making account of causation. This allowed for the possibility of mental states causing behaviour (being the ‘difference makers’) even if they were supervenient upon underlying physical states.
Link ID: 21279 - Posted: 08.10.2015
April Dembosky Developers of a new video game for your brain say theirs is more than just another get-smarter-quick scheme. Akili, a Northern California startup, insists on taking the game through a full battery of clinical trials so it can get approval from the Food and Drug Administration — a process that will take lots of money and several years. So why would a game designer go to all that trouble when there's already a robust market of consumers ready to buy games that claim to make you smarter and improve your memory? Think about all the ads you've heard for brain games. Maybe you've even passed a store selling them. There's one at the mall in downtown San Francisco — just past the cream puff stand and across from Jamba Juice — staffed on my visit by a guy named Dominic Firpo. "I'm a brain coach here at Marbles: The Brain Store," he says. Brain coach? "Sounds better than sales person," Firpo explains. "We have to learn all 200 games in here and become great sales people so we can help enrich peoples' minds." He heads to the "Word and Memory" section of the store and points to one product that says it will improve your focus and reduce stress in just three minutes a day. "We sold out of it within the first month of when we got it," Firpo says. The market for these "brain fitness" games is worth about $1 billion and is expected to grow to $6 billion in the next five years. Game makers appeal to both the young and the older with the common claim that if you exercise your memory, you'll be able to think faster and be less forgetful. Maybe bump up your IQ a few points. "That's absurd," says psychology professor Randall Engle from the Georgia Institute of Technology. © 2015 NPR
By John Danaher Consider the following passage from Ian McEwan’s novel Atonement. It concerns one of the novel’s characters (Briony) as she philosophically reflects on the mystery of human action: She raised one hand and flexed its fingers and wondered, as she had sometimes done before, how this thing, this machine for gripping, this fleshy spider on the end of her arm, came to be hers, entirely at her command. Or did it have some little life of its own? She bent her finger and straightened it. The mystery was in the instant before it moved, the dividing moment between not moving and moving, when her intention took effect. It was like a wave breaking. If she could only find herself at the crest, she thought, she might find the secret of herself, that part of her that was really in charge. Is Briony’s quest forlorn? Will she ever find herself at the crest of the wave? The contemporary scientific understanding of human action seems to cast this into some doubt. A variety of studies in the neuroscience of action paint an increasingly mechanistic and subconscious picture of human behaviour. According to these studies, our behaviour is not the product of our intentions or desires or anything like that. It is the product of our neural networks and systems, a complex soup of electrochemical interactions, oftentimes operating beneath our conscious awareness. In other words, our brains control our actions; our selves (in the philosophically important sense of the word ‘self’) do not. This discovery — that our brains ‘make us do it’ and that ‘we’ don’t — is thought to have a number of significant social implications, particularly for our practices of blame and punishment.
Link ID: 21276 - Posted: 08.08.2015
By Ariana Eunjung Cha Children who suffer an injury to the brain -- even a minor one -- are more likely to experience attention issues, according to a study published Monday in the journal Pediatrics. The effects may not be immediate and could occur long after the incident. Study author Marsh Konigs, a doctoral candidate at VU University Amsterdam, described the impact as "very short lapses in focus, causing children to be slower." Researchers looked at 113 children, ages 6 to 13, who suffered from traumatic brain injuries (TBIs) ranging from a concussion that gave them a headache or caused them to vomit, to losing consciousness for more than 30 minutes, and compared them with a group of 53 children who experienced a trauma that was not head-related. About 18 months after the children's accidents, parents and teachers were asked to rate their attention and other indicators of their health. They found that those with TBI had more lapses in attention and other issues, such as anxiety, a tendency to internalize their problems and slower processing speed. Based on studies of adults who experienced attention issues after suffering from a brain injury, doctors have theorized for years that head injuries in children might be followed by a "secondary attention deficit hyperactivity disorder." This study appears to confirm that association.
by Anil Ananthaswamy Science journalist Anil Ananthaswamy thinks a lot about "self" — not necessarily himself, but the role the brain plays in our notions of self and existence. In his new book, The Man Who Wasn't There, Ananthaswamy examines the ways people think of themselves and how those perceptions can be distorted by brain conditions, such as Alzheimer's disease, Cotard's syndrome and body integrity identity disorder, or BIID, a psychological condition in which a patient perceives that a body part is not his own. Ananthaswamy tells Fresh Air's Terry Gross about a patient with BIID who became so convinced that a healthy leg wasn't his own that he eventually underwent an amputation of the limb. "Within 12 hours, this patient that I saw, he was sitting up and there was no regret. He really seemed fine with having given up his leg," Ananthaswamy says. Ultimately, Ananthaswamy says, our sense of self is a layered one, which pulls information from varying parts of the brain to create a sense of narrative self, bodily self and spiritual self: "What it comes down to is this sense we have of being someone or something to which things are happening. It's there when we wake up in the morning, it kind of disappears when we go to sleep, it reappears in our dreams, and it's also this sense we have of being an entity that spans time." Interview Highlights On how to define "self" When you ask someone, "Who are you?" you're most likely to get a kind of narrative answer, "I am so-and-so, I'm a father, I'm son." They are going to tell you a kind of story they have in their heads about themselves, the story that they tell to themselves and to others, and in some sense that's what can be called the narrative self. ... © 2015 NPR
Link ID: 21235 - Posted: 07.29.2015
By Ariana Eunjung Cha The Defense Advanced Research Projects Agency funds a lot of weird stuff, and in recent years more and more of it has been about the brain. Its signature work in this field is in brain-computer interfaces and goes back several decades to its Biocybernetics program, which sought to enable direct communication between humans and machines. In 2013, DARPA made headlines when it announced that it intended to spend more than $70 million over five years to take its research to the next level by developing an implant that could help restore function or memory in people with neuropsychiatric issues. Less known is DARPA's Narrative Networks (or N2) project which aims to better understand how stories — or narratives — influence human behavior and to develop a set of tools that can help facilitate faster and better communication of information. "Narratives exert a powerful influence on human thoughts, emotions and behavior and can be particularly important in security contexts," DARPA researchers explained in a paper published in the Journal of Neuroscience Methods in April. They added that "in conflict resolution and counterterrorism scenarios, detecting the neural response underlying empathy induced by stories is of critical importance." This is where the work on the Hitchcock movies comes in. Researchers at the Georgia Institute of Technology recruited undergraduates to be hooked up to MRI machines and watch movie clips that were roughly three minutes long. The excerpts all featured a character facing a potential negative outcome and were taken from suspenseful movies, including three Alfred Hitchcock flicks as well as "Alien," "Misery," "Munich" and "Cliffhanger," among others.
By Roni Caryn Rabin “Fat” cartoon characters may lead children to eat more junk food, new research suggests, but there are ways to counter this effect. The findings underscore how cartoon characters, ubiquitous in children’s books, movies, television, video games, fast-food menus and graphic novels, may influence children’s behavior in unforeseen ways, especially when it comes to eating. Researchers first randomly showed 60 eighth graders a svelte jelly-bean-like cartoon character or a similar rotund character and asked them to comment on the images. Then they thanked them and gestured toward bowls of Starburst candies and Hershey’s Kisses, saying, “You can take some candy.” Children who had seen the rotund cartoon character helped themselves to more than double the number of candies as children shown the lean character, taking 3.8 candies on average, compared with 1.7 taken by children shown the lean bean character. (Children in a comparison group shown an image of a coffee mug took 1.5 candies on average.) But activating children’s existing health knowledge can counter these effects, the researchers discovered. In a separate experiment, they showed 167 elementary school children two red Gumby-like cartoon characters, one fat and one thin, and then asked them to “taste test” some cookies. But they also asked the children to “think about things that make you healthy,” such as getting enough sleep versus watching TV, or drinking soda versus milk. Some children were asked the health questions before being given the cookie taste test, while others were asked the questions after the taste test. Remarkably, the children who were asked about healthy habits before doing the taste test ate fewer cookies — even if they had first been exposed to the rotund cartoon character. 
Those who were shown the rotund figure ate 4.2 cookies on average if they were asked about healthy habits after eating the cookies, compared to three cookies if they were asked about healthy habits before doing the taste test. Children who saw the normal weight character and who were asked about healthy habits after the taste test also ate about three cookies. © 2015 The New York Times Company
By Neuroskeptic According to British biochemist Donald R. Forsdyke in a new paper in Biological Theory, the existence of people who seem to be missing most of their brain tissue calls into question some of the “cherished assumptions” of neuroscience. I’m not so sure. Forsdyke discusses the disease called hydrocephalus (‘water on the brain’). Some people who suffer from this condition as children are cured thanks to prompt treatment. Remarkably, in some cases, these post-hydrocephalics turn out to have grossly abnormal brain structure: huge swathes of their brain tissue are missing, replaced by fluid. Even more remarkably, in some cases, these people have normal intelligence and display no obvious symptoms, despite their brains being mostly water. This phenomenon was first noted by a British pediatrician called John Lorber. Lorber never published his observations in a scientific journal, although a documentary was made about them. However, his work was famously discussed in Science in 1980 by Lewin in an article called “Is Your Brain Really Necessary?”. There have been a number of other more recent published cases. Forsdyke argues that such cases pose a problem for mainstream neuroscience. If a post-hydrocephalic brain can store the same amount of information as a normal brain, he says, then “brain size does not scale with information quantity”, therefore, “it would seem timely to look anew at possible ways our brains might store their information.”
Alison Abbott Neuroscientists have identified an area of the brain that might give the human mind its unique abilities, including language. The area lit up in human, but not monkey, brains when they were presented with different types of abstract information. The idea that integrating abstract information drives many of the human brain's unique abilities has been around for decades. But a paper published in Current Biology, which directly compares activity in human and macaque monkey brains as they listen to simple auditory patterns, provides the first physical evidence that a specific area for such integration may exist in humans. Other studies that compare monkeys and humans have revealed differences in the brain’s anatomy, for example, but not differences that could explain where humans’ abstract abilities come from, say neuroscientists. “This gives us a powerful clue about what is special about our minds,” says psychologist Gary Marcus at New York University. “Nothing is more important than understanding how we got to be how we are.” A team of researchers headed by Stanislas Dehaene at the INSERM Cognitive Neuroimaging Unit at Gif-sur-Yvette near Paris looked at changing patterns of activation in the brain as untrained monkeys and human adults listened to a simple sequence of tones, for example three identical tones followed by a different tone (like the famous four-note opening of Beethoven’s fifth symphony: da-da-da-DAH). The researchers played several different sequences with this structure — known as AAAB — and other sequences to the subjects while they lay in a functional magnetic resonance imaging (fMRI) scanner. The fMRI technique picks up changes in blood flow in the brain that correlate with regional brain activity. © 2015 Nature Publishing Group
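The stimulus structure is easy to make concrete. Below is a minimal sketch of how an AAAB sequence might be synthesised as audio; the tone frequencies, durations and gap length are arbitrary choices for illustration, not the values used in the actual experiment.

```python
import numpy as np

def tone(freq_hz, dur_s=0.2, fs=8000):
    """A pure sine tone, dur_s seconds long, sampled at fs Hz."""
    t = np.arange(int(dur_s * fs)) / fs
    return np.sin(2 * np.pi * freq_hz * t)

def sequence(pattern, freqs, gap_s=0.05, fs=8000):
    """Assemble a tone pattern such as 'AAAB', with silent gaps between tones."""
    gap = np.zeros(int(gap_s * fs))
    parts = []
    for symbol in pattern:
        parts += [tone(freqs[symbol], fs=fs), gap]
    return np.concatenate(parts[:-1])  # drop the trailing gap

# Three identical tones followed by a different one: da-da-da-DAH.
aaab = sequence("AAAB", {"A": 440.0, "B": 330.0})
```

Writing `aaab` to a WAV file (for instance with the standard-library `wave` module) gives an audible version of the pattern; swapping the pattern string generates the other sequence types the researchers compared.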
By Gretchen Reynolds A walk in the park may soothe the mind and, in the process, change the workings of our brains in ways that improve our mental health, according to an interesting new study of the physical effects on the brain of visiting nature. Most of us today live in cities and spend far less time outside in green, natural spaces than people did several generations ago. City dwellers also have a higher risk for anxiety, depression and other mental illnesses than people living outside urban centers, studies show. These developments seem to be linked to some extent, according to a growing body of research. Various studies have found that urban dwellers with little access to green spaces have a higher incidence of psychological problems than people living near parks and that city dwellers who visit natural environments have lower levels of stress hormones immediately afterward than people who have not recently been outside. But just how a visit to a park or other green space might alter mood has been unclear. Does experiencing nature actually change our brains in some way that affects our emotional health? That possibility intrigued Gregory Bratman, a graduate student at the Emmett Interdisciplinary Program in Environment and Resources at Stanford University, who has been studying the psychological effects of urban living. In an earlier study published last month, he and his colleagues found that volunteers who walked briefly through a lush, green portion of the Stanford campus were more attentive and happier afterward than volunteers who strolled for the same amount of time near heavy traffic. But that study did not examine the neurological mechanisms that might underlie the effects of being outside in nature. So for the new study, which was published last week in Proceedings of the National Academy of Sciences, Mr. Bratman and his collaborators decided to closely scrutinize what effect a walk might have on a person’s tendency to brood. 
© 2015 The New York Times Company
Jon Hamilton It's almost impossible to ignore a screaming baby. And now scientists think they know why. "Screams occupy their own little patch of the soundscape that doesn't seem to be used for other things," says David Poeppel, a professor of psychology and neuroscience at New York University and director of the Department of Neuroscience at the Max Planck Institute in Frankfurt. And when people hear the unique sound characteristics of a scream — from a baby or anyone else — it triggers fear circuits in the brain, Poeppel and a team of researchers report in Current Biology. The team also found that certain artificial sounds, like alarms, trigger the same circuits. "That's why you want to throw your alarm clock on the floor," Poeppel says. The researchers in Poeppel's lab decided to study screams in part because they are a primal form of communication found in every culture. And there was another reason. "Many of the postdocs in my lab are in the middle of having kids and, of course, screams are very much on their mind," Poeppel says. "So it made perfect sense for them to be obsessed with this topic." The team started by trying to figure out "what makes a scream a scream," Poeppel says. Answering that question required creating a large database of recorded screams — from movies, from the Internet and from volunteers who agreed to step into a sound booth. A careful analysis of these screams found that they're not like any other sound that people make, including other loud, high-pitched vocalizations. The difference is something called the amplitude modulation rate, which is how often the loudness of a sound changes. © 2015 NPR
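An amplitude modulation rate can be estimated directly from a recording: extract the loudness envelope, then find its dominant frequency. The sketch below is a simplified illustration of the idea, not the team's actual analysis pipeline; the synthetic test tone and its 40 Hz modulation are invented for the example.

```python
import numpy as np
from scipy.signal import hilbert

def modulation_rate(signal, fs):
    """Estimate how often loudness changes: the dominant
    frequency (in Hz) of the amplitude envelope."""
    envelope = np.abs(hilbert(signal))   # instantaneous loudness
    envelope -= envelope.mean()          # remove the DC component
    spectrum = np.abs(np.fft.rfft(envelope))
    freqs = np.fft.rfftfreq(len(envelope), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

# A 500 Hz tone whose loudness wobbles 40 times per second.
fs = 8000
t = np.arange(fs) / fs  # one second of samples
wobbly = (1.0 + 0.8 * np.sin(2 * np.pi * 40 * t)) * np.sin(2 * np.pi * 500 * t)
rate = modulation_rate(wobbly, fs)  # close to 40.0
```

Applied to an ordinary spoken sentence, the same estimate would land at a much lower rate than for a scream, which is the contrast the researchers measured.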
That song really is stuck in your head. The experience of hearing tunes in your mind appears to be linked to physical differences in brain structure. The study is the first to look at the neural basis for “involuntary musical imagery” – or “earworms”. They aren’t just a curiosity, says study co-author Lauren Stewart at Goldsmiths, University of London, but could have a biological function. Stewart, a music psychologist, was first inspired to study earworms by a regular feature on the radio station BBC 6Music, in which listeners would write in with songs they had woken up with in their heads. There was a lot of interest from the public in what they are and where they had come from, but there was little research on the topic, she says. Once Stewart and her team started researching earworms, it became clear that some people are affected quite severely: one person even wrote to them saying he had lost his job because of an earworm. To find out what makes some people more susceptible to the phenomenon, the team asked 44 volunteers about how often they got earworms and how they were affected by them. Then they used MRI scans to measure the thickness of volunteers’ cerebral cortices and the volume of their grey matter in various brain areas. People who suffered earworms more frequently had thicker cortices in areas involved in auditory perception and pitch discrimination. © Copyright Reed Business Information Ltd.
By Laura Sanders Everybody knows people who seem to bumble through life with no sense of time — they dither for hours on a “quick” e-mail or expect an hour’s drive to take 20 minutes. These people are always late. But even for them, such minor lapses in timing are actually exceptions. We notice these flaws precisely because they’re out of the ordinary. Humans, like other animals, are quite good at keeping track of passing time. This talent does more than keep office meetings running smoothly. Almost everything our bodies and brains do requires precision clockwork — down to milliseconds. Without a sharp sense of time, people would be reduced to insensate messes, unable to move, talk, remember or learn. “We don’t think about it, but just walking down the street is an exquisitely timed operation,” says neuroscientist Lila Davachi of New York University. Muscles fire and joints steady themselves in a precisely orchestrated time series that masquerades as an unremarkable part of everyday life. A sense of time, Davachi says, is fundamental to how we move, how we act and how we perceive the world. Yet for something that forms the bedrock of nearly everything we do, time perception is incredibly hard to study. “It’s a quagmire,” says cognitive neuroscientist Peter Tse of Dartmouth College. The problem is thorny because there are thousands of possible intricate answers, all depending on what exactly scientists are asking. Their questions have begun to reveal an astonishingly complex conglomerate of neural timekeepers that influence each other. © Society for Science & the Public 2000 - 2015.
Link ID: 21177 - Posted: 07.16.2015
Computers built to mimic the brain can now recognise images and speech, and even create art, and it’s all because they are learning from data we churn out online. Do androids dream of electric squid? (Image: Reservoir Lab at Ghent University) I AM watching it have a very odd dream – psychedelic visions of brain tissue folds, interspersed with chunks of coral reef. The dreamer in question is an artificial intelligence, one that live-streams from a computer on the ground floor of the Technicum building at Ghent University, Belgium. This vision has been conjured up after a viewer in the chat sidebar suggests "brain coral" as a topic. It's a fun distraction – and thousands of people have logged on to watch. But beyond that, the bot is a visual demonstration of a technology that is finally coming of age: neural networks. The bot is called 317070, a name it shares with the Twitter handle of its creator, Ghent graduate student Jonas Degrave. It is based on a neural network that can recognise objects in images, except that Degrave runs it in reverse. Given static noise, it tweaks its output until it creates images that tally with what viewers are requesting online. The bot's live-stream page says it is "hallucinating", although Degrave says "imagining" is a little more accurate. Degrave's experiment plays off recent Google research which aimed to tackle one of the core issues with neural networks: that no one knows how neural networks come up with their answers. The images the network creates to satisfy simple instructions can give us some insights. © Copyright Reed Business Information Ltd
Link ID: 21149 - Posted: 07.09.2015
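Running a recogniser "in reverse" means optimising the input rather than the weights: start from noise and repeatedly nudge the input in the direction that raises the network's score for the requested concept. The toy sketch below uses a tiny hand-built two-layer network purely for illustration; 317070's bot used a large convolutional network and far more elaborate machinery.

```python
import numpy as np

rng = np.random.default_rng(0)

# A stand-in "recogniser": one hidden layer, one output score.
W1 = rng.normal(scale=0.1, size=(64, 16))
W2 = rng.normal(scale=0.1, size=(16, 1))

def score(x):
    """How strongly the network 'recognises' the concept in input x."""
    return float(np.tanh(x @ W1) @ W2)

def grad_score(x):
    """d(score)/dx for score = tanh(x W1) W2, derived by hand."""
    h = np.tanh(x @ W1)
    return ((1.0 - h**2) * W2.ravel()) @ W1.T

# Start from static noise and tweak the *input* to raise the score.
x = rng.normal(size=64)
before = score(x)
for _ in range(200):
    x += 0.1 * grad_score(x)   # gradient ascent on the input, not the weights
after = score(x)               # higher than before: the network now
                               # "sees" the concept more strongly in x
```

With a real image-recognition network the same loop, run over pixels and steered toward a class like "brain coral", produces the dream-like pictures the article describes.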