Chapter 14. Attention and Consciousness
MONKEYS may have a primitive version of the human ability to put ourselves in another's shoes. Intelligent animals such as apes can intuit others' intentions, suggesting they have some theory of mind capability. But only humans can reason that others may hold beliefs different from their own. To study this difference, Rogier Mars of the University of Oxford and colleagues scanned 36 people's brains. Using an algorithm, they created a map of how an area associated with theory of mind is connected to brain regions linked to abilities such as face recognition and interpretation. Next, the researchers scanned 12 macaque brains for a similar pattern of connections. An area involved in facial recognition had a similar pattern, suggesting involvement in abstract thought. That doesn't necessarily mean the structures share a function, Mars says. Theory of mind is probably a spectrum of ways of thinking, he says, and humans got better at it as they evolved. Laurie Santos of Yale University says the structural differences may one day tell us why non-human primates lack the ability to think about others' beliefs. © Copyright Reed Business Information Ltd.
By Keith Payne It was a summer evening when Tony Cornell tried to make the residents of Cambridge, England see a ghost. He got dressed up in a sheet and walked through a public park waving his arms about. Meanwhile his assistants observed the bystanders for any hint that they noticed something strange. No, this wasn’t Candid Camera. Cornell was a researcher interested in the paranormal. The idea was first to get people to notice the spectacle, and then see how they understood what their eyes were telling them. Would they see the apparition as a genuine ghost or as something more mundane, like a bloke in a bed sheet? The plan was foiled when not a single bystander so much as raised an eyebrow. Several cows did notice, however, and they followed Cornell on his ghostly rambles. Was it just a fluke, or did people “not want to see” the besheeted man, as Cornell concluded in his 1959 report? Okay, that stunt was not a very good experiment, but twenty years later the eminent psychologist Ulric Neisser did a better job. He filmed a video of two teams of students passing a basketball back and forth, and superimposed another video of a girl with an umbrella walking right through the center of the screen. When he asked subjects in his study to count the number of times the ball was passed, an astonishing 79 percent failed to notice the girl with the umbrella. In the years since, hundreds of studies have backed up the idea that when attention is occupied with one thing, people often fail to notice other things right before their eyes. When you first learn about these studies they seem deeply strange. Is it really possible that we are constantly failing to notice things right in front of us? Is there some mysterious force screening what we see and what remains hidden? © 2013 Scientific American
Link ID: 18263 - Posted: 06.12.2013
by David Robson NO CREVICE of the human experience is safe. Our deepest fears and desires, our pasts and our futures – all have been revealed, and all in the form of colourful images that look like lava bubbling under the skull. That, at least, is the popular conception of neuroscience – and it's worth big money. The US and the European Union are throwing billions of dollars at two new projects to map the human brain. Yet there is also a growing anxiety that many of neuroscience's findings don't stand up to scrutiny. It's not just sensational headlines reporting a "dark patch" in a psychopath's brain, there are now serious concerns that some of the methods themselves are flawed. The intrepid outsider needs expert guidance through this rocky terrain – and there's no better place to start than Brainwashed by Sally Satel and Scott O. Lilienfeld. Satel, a practising psychiatrist, and Lilienfeld, a clinical psychologist, are terrific sherpas. They are clear-sighted, considered and forgiving of the novice's ignorance. Their first stop is the fMRI scan – a staple of much brain research. Worryingly, the statistical techniques used to construct the images sometimes create a mirage of activity where none should exist. They have a telling example: one research team watching a salmon in an fMRI scanner as images of human faces were flashed at it saw its brain spark into life in certain shots – even though it was dead. © Copyright Reed Business Information Ltd.
Keyword: Brain imaging
Link ID: 18222 - Posted: 06.04.2013
Rebecca J. Rosen What would you draw if somebody told you to draw a neuron? According to a new study, your sketch will depend on how much science education you have, but not in the way you'd expect. In the image above, the top row -- those detailed, labeled, neat renderings -- are the work of undergraduates. The bottom row, with their janky, sparse lines, comes from the leaders of neuroscience research laboratories. That martini-glass-looking thing over there on the left? That's a neuron, as drawn by a professional scientist. The middle row, some intermediary step, shows drawings from postdocs and graduate students. These drawings come from a new study published in the journal Science Education. Its authors, a team at King's College London led by education professor David Hay, found that nearly every single undergraduate student they studied (all but three of 126) faithfully reproduced textbook-style neurons, something akin to a canonical image from an 1899 book detailing the brain, which, the authors say, "has enjoyed an unusually pervasive influence." These drawings are "typified by a multipolar cell body and truncated, feathery dendritic processes around a clearly demarcated nucleus." Many of the drawings were annotated. For the "trainee scientists" -- those in PhD programs or completing a postdoc -- the neurons appeared more like what would be seen in a microscope image. Nuclei were excluded, the number of dendrites was reduced, and orientation was inconsistent -- all characterizing neurons as you would see them "in nature" not in the pages of a textbook. © 2013 by The Atlantic Monthly Group
by Helen Thomson Sean O'Connor is a very rational man. But he also tried, unsuccessfully, to sever his spine, and still feels a need to be paralysed. Sean has body integrity identity disorder (BIID), which causes him to feel that his limbs just don't belong to his body. Sean's legs function correctly and he has full sensation in them, but they feel disconnected from him. "I don't hate my limbs – they just feel wrong," he says. "I'm aware that they are as nature designed them to be, but there is an intense discomfort at being able to feel my legs and move them." The cause of his disorder has yet to be pinpointed, but it almost certainly stems from a problem in the early development of his brain. "My earliest memories of feeling I should be paralysed go back to when I was 4 or 5 years old," says Sean. The first case of BIID was reported in the 18th century, when a French surgeon was held at gunpoint by an Englishman who demanded that one of his legs be removed. The surgeon, against his will, performed the operation. Later, he received a handsome payment from the Englishman, with an accompanying letter of thanks for removing "a limb which put an invincible obstacle to my happiness" (Experimental Brain Research, DOI: 10.1007/s00221-009-2043-7). We now think that there are at least two forms of BIID. In one, people wish that part of their body were paralysed. Another form causes people to want to have a limb removed. BIID doesn't have to affect limbs either – there have been anecdotal accounts of people wishing they were blind or deaf. © Copyright Reed Business Information Ltd.
Link ID: 18215 - Posted: 06.01.2013
By Gary Stix Unraveling the mystery of consciousness remains perhaps the biggest challenge in all neuroscience, so big and amorphous that most brain scientists won’t go near the topic, leaving philosophers to speculate about the a prioris. Even defining what consciousness is quickly devolves into lengthy and often ponderous treatises. The World Science Festival assembled a panel of luminaries who will attempt to make sense of this sprawling theme in the allotted 90 minutes. They included Mélanie Boly, a researcher and physician who has performed studies on minimally conscious patients; Christof Koch, a leading researcher on the neural basis of consciousness; Colin McGinn, known for his work on the philosophy of mind, and Nicholas Schiff, a physician-scientist who specializes in disorders of consciousness. Click below to see these leading lights gathered at NYU’s Skirball Center for the Performing Arts on May 30 to take on whether Homo sapiens is the only conscious species, the question of whether consciousness transcends the physical boundaries of the brain, and an exploration of the biochemical processes that underlie the life of the mind. The session, entitled “The Whispering Mind: The Enduring Conundrum of Consciousness,” is moderated by ABC Nightline co-anchor Terry Moran. © 2013 Scientific American
Link ID: 18212 - Posted: 06.01.2013
By Tina Hesman Saey Genetic factors may exert a tiny influence on how much schooling a person ends up with, a new study suggests. But the main lesson of the research, experts say, should be that attributing cultural and socioeconomic traits to genes is a dicey enterprise. “If there is a policy implication, it’s that there’s even more reason to be skeptical of genetic determinism,” says sociologist Jeremy Freese of Northwestern University in Evanston, Ill. Published May 30 in Science by a group of more than 200 researchers, the study does mark the first time genetic factors have been reproducibly associated with a social trait, says Richard Ebstein, a behavioral geneticist at the National University of Singapore. “It announces to social scientists that some things they’ve been studying that make a difference to health and life success do have a base in genetics.” But even if it does survive further inspection — and many similar links between genes and social characteristics have not — the study accounts for no more than 2 percent of whatever it is that makes one person continue school while someone in similar circumstances chooses to move on to something else. Previous studies comparing twins and family members have suggested that not-yet-identified genetic factors can explain 40 percent of people’s educational attainment; factors such as social groups, economic status and access to education would explain the other 60 percent. That percentage attributed to genetics is similar to the heritability of physical and medical characteristics such as weight and risk of heart disease. That makes a hunt for the genetic factors underlying educational attainment an attractive prospect. © Society for Science & the Public 2000 - 2013
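The "2 percent" and "40 percent" figures above are statements about variance explained (R-squared) in educational attainment. A minimal sketch with made-up numbers, four hypothetical people whose years of schooling are predicted from a genetic score, shows how weak a 2-percent predictor is in practice; nothing below comes from the study itself:

```python
def variance_explained(outcomes, predictions):
    """R-squared: the fraction of variance in an outcome that a predictor
    captures. 1.0 is perfect prediction; 0.0 is no better than guessing
    the mean outcome for everyone."""
    mean = sum(outcomes) / len(outcomes)
    ss_total = sum((y - mean) ** 2 for y in outcomes)
    ss_residual = sum((y - p) ** 2 for y, p in zip(outcomes, predictions))
    return 1 - ss_residual / ss_total

# Hypothetical years of schooling for four people (mean = 14), and a
# predictor that barely nudges its guesses away from that mean:
schooling = [12, 12, 16, 16]
weak_predictions = [13.98, 13.98, 14.02, 14.02]
print(variance_explained(schooling, weak_predictions))  # ~0.02, i.e. 2 percent
```

A predictor explaining 2 percent of the variance guesses almost everyone to be near average, which is why the experts quoted above counsel skepticism about genetic determinism.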
Posted by Gary Marcus A few weeks ago, while staying with my in-laws, my four-month-old son woke up at two-thirty in the morning. He was hungry, and, knowing that he would not be coaxed back to sleep without a bottle, I brought him downstairs to the kitchen, where his crying stopped abruptly. He clearly recognized that he had arrived in an unfamiliar place, and he became fully absorbed in understanding where he was and how he’d gotten there. He was searingly alert; he craned his head and his eyes darted around. The eight minutes or so that it took to warm the bottle, usually a time of intense complaint, passed with hardly a peep. I became convinced that, for the first time, my son was fully, consciously aware of his surroundings. As a scientist, I realize that my experience was subjective. But the leading scientific journal, Science, just published the results of an experiment that endeavored to look objectively at the rudiments of consciousness in infants. This work, conducted by the cognitive psychologists Sid Kouider, Stanislas Dehaene, and Ghislaine Dehaene-Lambertz, is an examination of brain waves in babies between five and fifteen months old, aimed at constructing what the scientists refer to as a “biological signature of consciousness.” The background of this experiment is a theory called the “global workspace” model of consciousness, according to which perceptual awareness involves two stages of neural activity. The first is a purely sensory activation, typically in the back of the brain. The second stage reflects a kind of “ignition,” and is achieved only for stimuli that are consciously perceived. © 2013 Condé Nast.
By ALAN SCHWARZ An analysis published Wednesday by the American Medical Association said children with attention deficit hyperactivity disorder who take stimulant medication do not have a lower risk over all for later substance abuse, contradicting the longstanding and influential message that such medicines tend to deter those with the disorder from abusing other substances. The paper, written by three researchers at the University of California, Los Angeles, examined data from 15 previous studies on the subject and determined that, on average, medications like Adderall and Ritalin had no effect one way or the other on whether children abused alcohol, marijuana, nicotine or cocaine later in life. A 2003 study in the journal Pediatrics had concluded that the introduction of stimulant medication to children with A.D.H.D. reduced the risk of such abuse later in life, a finding that has been repeated by doctors and pharmaceutical companies not only to assuage parents’ fears of medication but also to suggest that the pills would protect their children from later harm. “I always doubted the whole ‘protection’ argument, and I wasn’t the only one, but that message was really out there,” said Liz Jorgensen, an adolescent addiction specialist at Insight Counseling in Ridgefield, Conn. “Hopefully, this message will be heard loud and clear.” The study comes amid growing concern about the persistent rise in A.D.H.D. diagnoses and prescriptions for medication among children. A recent New York Times analysis of data collected by the Centers for Disease Control and Prevention found that 11 percent of all children ages 4 through 17 — 6.4 million over all — had received a diagnosis of A.D.H.D. from a medical professional. The diagnosis rate rose to 19 percent for boys of high school age. © 2013 The New York Times Company
People with higher IQs are slow to detect large background movements because their brains filter out non-essential information, say US researchers. Instead, they are good at detecting small moving objects. The findings come in a study of 53 people given a simple, visual test in Current Biology. The results could help scientists understand what makes a brain more efficient and more intelligent. In the study, individuals watched short video clips of black and white bars moving across a computer screen. Some clips were small and filled only the centre of the screen, while others filled the whole screen. The participants' sole task was to identify in which direction the bars were drifting - to the right or to the left. Participants also took a standardised intelligence test. The results showed that people with higher IQ scores were faster at noticing the movement of the bars when observing the smallest image - but they were slower at detecting movement in the larger images. Michael Melnick of the University of Rochester, who was part of the research team, said the results were very clear. "From previous research, we expected that all participants would be worse at detecting the movement of large images, but high IQ individuals were much, much worse." The authors explain that in most scenarios, background movement is less important than small moving objects in the foreground, for example driving a car, walking down a hall or moving your eyes across the room. BBC © 2013
By Tara Haelle Identification and treatment issues surrounding attention deficit hyperactivity disorder (ADHD) are challenging enough. Now research is shedding light on long-term outcomes for people with ADHD. A recent study in Pediatrics reports that men who had ADHD in childhood are twice as likely to be obese in middle age, even if they no longer exhibit symptoms of ADHD. ADHD is a mental disorder characterized by hyperactivity, impulsivity, inattention and inability to focus. It affects approximately 6.8 percent of U.S. children ages 3 to 17 in any given year, according to a recent report by the CDC. Medications used to treat ADHD, such as Ritalin (methylphenidate) or Adderall (dextroamphetamine and amphetamine), are stimulants that can suppress appetite. However, a couple of recent retrospective studies have pointed to a possible increased risk for obesity among adults diagnosed with ADHD as children. The new 33-year prospective study started with 207 healthy middle-class white boys from New York City between 6 and 12 years old, who had been diagnosed with ADHD. When the cohort reached an average age of 18, another 178 healthy boys without ADHD were recruited for comparison. At the most recent follow-up when the participants were an average age of 41, a total of 222 men remained in the study. A troubling pattern emerged: A comparison of the men’s self-reported height and weight revealed that twice as many men with childhood ADHD were obese as those without childhood ADHD. The average body mass index (BMI) of the men with childhood ADHD was 30.1 and 41.4 percent were obese, whereas those without the condition as kids reported an average BMI of 27.6 and an obesity rate of 21.6 percent. The association held even after the researchers controlled for socioeconomic status, depression, anxiety and substance abuse disorders. © 2013 Scientific American
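The BMI figures quoted in the study follow the standard definition: weight in kilograms divided by the square of height in metres, with 30 as the conventional obesity cutoff. A minimal sketch (the height and weights below are illustrative, not drawn from the study):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

def is_obese(bmi_value: float) -> bool:
    """Conventional clinical cutoff: a BMI of 30 or above counts as obese."""
    return bmi_value >= 30.0

# Illustrative numbers: a man 1.78 m tall crosses the obesity line at about 96 kg.
print(round(bmi(96, 1.78), 1))   # 30.3
print(is_obese(bmi(96, 1.78)))   # True
```

On this scale, the gap the study reports (average BMI 30.1 versus 27.6) places the average man in the childhood-ADHD group just over the obesity threshold and the comparison group well under it.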
By Bruce Bower In its idealized form, science resembles a championship boxing match. Theories square off, each vying for the gold belt engraved with “Truth.” Under the stern eyes of a host of referees, one theory triumphs by best explaining available evidence — at least until the next bout. But in the real world, science sometimes works more like a fashion show. Researchers clothe plausible explanations of experimental findings in glittery statistical suits and gowns. These gussied-up hypotheses charm journal editors and attract media coverage with carefully orchestrated runway struts, never having to battle competitors. Then there’s psychology. Even more than other social scientists — and certainly more than physical scientists — psychologists tend to overlook or dismiss hypotheses that might topple their own, says Klaus Fiedler of the University of Heidelberg in Germany. They explain experimental findings with ambiguous terms that make no testable predictions at all; they build careers on theories that have never bested a competitor in a fair scientific fight. In many cases, no one knows or bothers to check how much common ground one theory shares with others that address the same topic. Problems like these, Fiedler and his colleagues contended last November in Perspectives on Psychological Science, afflict sets of related theories about such psychological phenomena as memory and decision making. In the end, that affects how well these phenomena are understood. © Society for Science & the Public 2000 - 2013
Link ID: 18170 - Posted: 05.20.2013
by Emily Underwood If you are one of the 20% of healthy adults who struggle with basic arithmetic, simple tasks like splitting the dinner bill can be excruciating. Now, a new study suggests that a gentle, painless electrical current applied to the brain can boost math performance for up to 6 months. Researchers don't fully understand how it works, however, and there could be side effects. The idea of using electrical current to alter brain activity is nothing new—electroshock therapy, which induces seizures for therapeutic effect, is probably the best known and most dramatic example. In recent years, however, a slew of studies has shown that much milder electrical stimulation applied to targeted regions of the brain can dramatically accelerate learning in a wide range of tasks, from marksmanship to speech rehabilitation after stroke. In 2010, cognitive neuroscientist Roi Cohen Kadosh of the University of Oxford in the United Kingdom showed that, when combined with training, electrical brain stimulation can make people better at very basic numerical tasks, such as judging which of two quantities is larger. However, it wasn't clear how those basic numerical skills would translate to real-world math ability. © 2010 American Association for the Advancement of Science
Keyword: Learning & Memory
Link ID: 18168 - Posted: 05.18.2013
Linda Carroll TODAY contributor We all get lost or disoriented once in a while, but for Sharon Roseman, being lost is a way of life. A little quirk in her brain makes it impossible to recognize landmarks and find her way around neighborhoods that should have become familiar long ago. “I can literally see my house out the car window, but I have no clue that it’s my house,” Roseman told NBC’s Kristen Dahlgren. Roseman, 64, suffers from developmental topographical disorientation, or DTD, a disorder that had flown under brain researchers’ radar until very recently. DTD was first described as a single case study in a paper published online in 2008 in the journal Neuropsychologia. At the time, it was thought to be extremely rare, says the study’s lead author, Giuseppe Iaria, professor of cognitive neuroscience at the University of Calgary. But since then, Iaria has discovered nearly 1,000 other people with DTD and he thinks there may be a lot more. He currently estimates that about 2 percent of the population may be constantly coping with orientation and navigation problems caused by the disorder. DTD is a profound and disabling deficit. Nothing, not even the layout of a house you’ve lived in for decades, ever becomes familiar. And for Roseman that has made life very trying. When her kids would cry in the night, she would struggle to find her way to them.
Link ID: 18155 - Posted: 05.14.2013
by Helen Thomson "I've been in a crowded elevator with mirrors all around, and a woman will move and I'll go to get out the way and then realise: 'oh that woman is me'." Heather Sellers has prosopagnosia, more commonly known as face blindness. "I can't remember any image of the human face. It's simply not special to me," she says. "I don't process them like I do a car or a dog. It's not a visual problem, it's a perception problem." Heather knew from a young age that something was different about the way she navigated her world, but her condition wasn't diagnosed until she was in her 30s. "I always knew something was wrong – it was impossible for me to trust my perceptions of the world. I was diagnosed as anxious. My parents thought I was crazy." The condition is estimated to affect around 2.5 per cent of the population, and it's common for those who have it not to realise that anything is wrong. "In many ways it's a subtle disorder," says Heather. "It's easy for your brain to compensate because there are so many other things you can use to identify a person: hair colour, gait or certain clothes. But meet that person out of context and it's socially devastating." As a child, she was once separated from her mum at a grocery store. Store staff reunited the pair, but it was confusing for Heather, since she didn't initially recognise her mother. "But I didn't know that I wasn't recognising her." © Copyright Reed Business Information Ltd
Link ID: 18119 - Posted: 05.04.2013
by Lizzie Wade If you were a rat living in a completely virtual world like in the movie The Matrix, could you tell? Maybe not, but scientists studying your brain might be able to. Today, researchers report that certain cells in rat brains work differently when the animals are in virtual reality than when they are in the real world. The neurons in question are known as place cells, which fire in response to specific physical locations in the outside world and reside in the hippocampus, the part of the brain responsible for spatial navigation and memory. As you walk out of your house every day, the same place cell fires each time you reach the shrub that's two steps away from your door. It fires again when you reach the same place on your way back home, even though you are traveling in the opposite direction. Scientists have long suspected that these place cells help the brain generate a map of the world around us. But how do the place cells know when to fire in the first place? Previous research showed that the cells rely on three different kinds of information. First, they analyze "visual cues," or what you see when you look around. Then, there are what researchers call "self-motion cues." These cues come from how your body moves in space and are the reason you can still find your way around a room with the lights out. The final type of information is the "proximal cues," which encompass everything else about the environment you're in. The smell of a bakery on your way to work, the sounds of a street jammed with traffic, and the springy texture of grass in a park are all proximal cues. © 2010 American Association for the Advancement of Science.
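A common textbook way to model the place cells described above is a Gaussian tuning curve: a cell fires fastest when the animal sits at the cell's preferred location, and the rate falls off smoothly with distance. This is a standard simplification for illustration only, not the model used in the study, and the field width and peak rate below are arbitrary:

```python
import math

def place_cell_rate(position_m, preferred_m, field_width_m=0.2, peak_hz=20.0):
    """Toy 1-D place cell: firing rate (spikes/s) as a Gaussian bump
    centred on the cell's preferred location along a track."""
    distance = position_m - preferred_m
    return peak_hz * math.exp(-distance ** 2 / (2 * field_width_m ** 2))

# The cell fires at its peak rate at the preferred spot and is nearly
# silent a couple of metres away:
print(place_cell_rate(0.5, 0.5))         # 20.0
print(place_cell_rate(2.5, 0.5) < 0.01)  # True
```

Because the bump depends only on where the animal is, not on which way it is heading, the model also captures the observation above that the same cell fires at the same shrub on both the outbound and the homeward walk.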
By Scott O. Lilienfeld and Hal Arkowitz A German children's book from 1845 by Heinrich Hoffman featured “Fidgety Philip,” a boy who was so restless he would writhe and tilt wildly in his chair at the dinner table. Once, using the tablecloth as an anchor, he dragged all the dishes onto the floor. Yet it was not until 1902 that a British pediatrician, George Frederic Still, described what we now recognize as attention-deficit hyperactivity disorder (ADHD). Since Still's day, the disorder has gone by a host of names, including organic drivenness, hyperkinetic syndrome, attention-deficit disorder and now ADHD. Despite this lengthy history, the diagnosis and treatment of ADHD in today's children could hardly be more controversial. On his television show in 2004, Phil McGraw (“Dr. Phil”) opined that ADHD is “so overdiagnosed,” and a survey in 2005 by psychologists Jill Norvilitis of the University at Buffalo, S.U.N.Y., and Ping Fang of Capital Normal University in Beijing revealed that in the U.S., 82 percent of teachers and 68 percent of undergraduates agreed that “ADHD is overdiagnosed today.” According to many critics, such overdiagnosis raises the specter of medicalizing largely normal behavior and relying too heavily on pills rather than skills—such as teaching children better ways of coping with stress. Yet although data point to at least some overdiagnosis, at least in boys, the extent of this problem is unclear. In fact, the evidence, with notable exceptions, appears to be stronger for the undertreatment than overtreatment of ADHD. © 2013 Scientific American,
Alison Abbott Thinking about a professor just before you take an intelligence test makes you perform better than if you think about football hooligans. Or does it? An influential theory that certain behaviour can be modified by unconscious cues is under serious attack. A paper published in PLoS ONE last week (ref. 1) reports that nine different experiments failed to replicate this example of ‘intelligence priming’, first described in 1998 (ref. 2) by Ap Dijksterhuis, a social psychologist at Radboud University Nijmegen in the Netherlands, and now included in textbooks. David Shanks, a cognitive psychologist at University College London, UK, and first author of the paper in PLoS ONE, is among sceptical scientists calling for Dijksterhuis to design a detailed experimental protocol to be carried out in different laboratories to pin down the effect. Dijksterhuis has rejected the request, saying that he “stands by the general effect” and blames the failure to replicate on “poor experiments”. An acrimonious e-mail debate on the subject has been dividing psychologists, who are already jittery about other recent exposures of irreproducible results (see Nature 485, 298–300; 2012). “It’s about more than just replicating results from one paper,” says Shanks, who circulated a draft of his study in October; the failed replications call into question the underpinnings of ‘unconscious-thought theory’. © 2013 Nature Publishing Group
By ALAN SCHWARZ FRESNO, Calif. — Lisa Beach endured two months of testing and paperwork before the student health office at her college approved a diagnosis of attention deficit hyperactivity disorder. Then, to get a prescription for Vyvanse, a standard treatment for A.D.H.D., she had to sign a formal contract — promising to submit to drug testing, to see a mental health professional every month and to not share the pills. “As much as it stunk, it’s nice to know, ‘O.K., this is legit,' ” said Ms. Beach, a senior at California State University, Fresno. The rigorous process, she added, has deterred some peers from using the student health office to obtain A.D.H.D. medications, stimulants long abused on college campuses. “I tell them it takes a couple months,” Ms. Beach said, “and they’re like, ‘Oh, never mind.’ ” Fresno State is one of dozens of colleges tightening the rules on the diagnosis of A.D.H.D. and the subsequent prescription of amphetamine-based medications like Vyvanse and Adderall. Some schools are reconsidering how their student health offices handle A.D.H.D., and even if they should at all. Various studies have estimated that as many as 35 percent of college students illicitly take these stimulants to provide jolts of focus and drive during finals and other periods of heavy stress. Many do not know that it is a federal crime to possess the pills without a prescription and that abuse can lead to anxiety, depression and, occasionally, psychosis. Although few experts dispute that stimulant medications can be safe and successful treatments for many people with a proper A.D.H.D. diagnosis, the growing concern about overuse has led some universities, as one student health director put it, “to get out of the A.D.H.D. business.” © 2013 The New York Times Company
By JAMES GORMAN TRONDHEIM, Norway — In 1988, two determined psychology students sat in the office of an internationally renowned neuroscientist in Oslo and explained to him why they had to study with him. Unfortunately, the researcher, Per Oskar Andersen, was hesitant, May-Britt Moser said as she and her husband, Edvard I. Moser, now themselves internationally recognized neuroscientists, recalled the conversation recently. He was researching physiology and they were interested in the intersection of behavior and physiology. But, she said, they wouldn’t take no for an answer. “We sat there for hours. He really couldn’t get us out of his office,” Dr. May-Britt Moser said. “Both of us come from nonacademic families and nonacademic places,” Edvard said. “The places where we grew up, there was no one with any university education, no one to ask. There was no recipe on how to do these things.” “And how to act politely,” May-Britt interjected. “It was just a way to get to the point where we wanted to be. But seen now, when I know the way people normally do it,” he said, smiling at the memory of his younger self, “I’m quite impressed.” So, apparently, was Dr. Andersen. In the end, he yielded to the Mosers’ combination of furious curiosity and unwavering determination and took them on as graduate students. They have impressed more than a few people since. In 2005, they and their colleagues reported the discovery of cells in rats’ brains that function as a kind of built-in navigation system that is at the very heart of how animals know where they are, where they are going and where they have been. They called them grid cells. © 2013 The New York Times Company
Link ID: 18099 - Posted: 04.30.2013