Chapter 18. Attention and Higher Cognition
by Anil Ananthaswamy

Science journalist Anil Ananthaswamy thinks a lot about "self" — not necessarily himself, but the role the brain plays in our notions of self and existence. In his new book, The Man Who Wasn't There, Ananthaswamy examines the ways people think of themselves and how those perceptions can be distorted by brain conditions, such as Alzheimer's disease, Cotard's syndrome and body integrity identity disorder, or BIID, a psychological condition in which a patient perceives that a body part is not his own. Ananthaswamy tells Fresh Air's Terry Gross about a patient with BIID who became so convinced that a healthy leg wasn't his own that he eventually underwent an amputation of the limb. "Within 12 hours, this patient that I saw, he was sitting up and there was no regret. He really seemed fine with having given up his leg," Ananthaswamy says.

Ultimately, Ananthaswamy says, our sense of self is a layered one, which pulls information from varying parts of the brain to create a sense of narrative self, bodily self and spiritual self: "What it comes down to is this sense we have of being someone or something to which things are happening. It's there when we wake up in the morning, it kind of disappears when we go to sleep, it reappears in our dreams, and it's also this sense we have of being an entity that spans time."

Interview Highlights

On how to define "self"

When you ask someone, "Who are you?" you're most likely to get a kind of narrative answer: "I am so-and-so, I'm a father, I'm a son." They are going to tell you a kind of story they have in their heads about themselves, the story that they tell to themselves and to others, and in some sense that's what can be called the narrative self. ... © 2015 NPR
Link ID: 21235 - Posted: 07.29.2015
By Ariana Eunjung Cha The Defense Advanced Research Projects Agency funds a lot of weird stuff, and in recent years more and more of it has been about the brain. Its signature work in this field is in brain-computer interfaces and goes back several decades to its Biocybernetics program, which sought to enable direct communication between humans and machines. In 2013, DARPA made headlines when it announced that it intended to spend more than $70 million over five years to take its research to the next level by developing an implant that could help restore function or memory in people with neuropsychiatric issues. Less known is DARPA's Narrative Networks (or N2) project which aims to better understand how stories — or narratives — influence human behavior and to develop a set of tools that can help facilitate faster and better communication of information. "Narratives exert a powerful influence on human thoughts, emotions and behavior and can be particularly important in security contexts," DARPA researchers explained in a paper published in the Journal of Neuroscience Methods in April. They added that "in conflict resolution and counterterrorism scenarios, detecting the neural response underlying empathy induced by stories is of critical importance." This is where the work on the Hitchcock movies comes in. Researchers at the Georgia Institute of Technology recruited undergraduates to be hooked up to MRI machines and watch movie clips that were roughly three minutes long. The excerpts all featured a character facing a potential negative outcome and were taken from suspenseful movies, including three Alfred Hitchcock flicks as well as "Alien," "Misery," "Munich" and "Cliffhanger," among others.
By Roni Caryn Rabin “Fat” cartoon characters may lead children to eat more junk food, new research suggests, but there are ways to counter this effect. The findings underscore how cartoon characters, ubiquitous in children’s books, movies, television, video games, fast-food menus and graphic novels, may influence children’s behavior in unforeseen ways, especially when it comes to eating. Researchers first randomly showed 60 eighth graders a svelte jelly-bean-like cartoon character or a similar rotund character and asked them to comment on the images. Then they thanked them and gestured toward bowls of Starburst candies and Hershey’s Kisses, saying, “You can take some candy.” Children who had seen the rotund cartoon character helped themselves to more than double the number of candies as children shown the lean character, taking 3.8 candies on average, compared with 1.7 taken by children shown the lean bean character. (Children in a comparison group shown an image of a coffee mug took 1.5 candies on average.) But activating children’s existing health knowledge can counter these effects, the researchers discovered. In a separate experiment, they showed 167 elementary school children two red Gumby-like cartoon characters, one fat and one thin, and then asked them to “taste test” some cookies. But they also asked the children to “think about things that make you healthy,” such as getting enough sleep versus watching TV, or drinking soda versus milk. Some children were asked the health questions before being given the cookie taste test, while others were asked the questions after the taste test. Remarkably, the children who were asked about healthy habits before doing the taste test ate fewer cookies — even if they had first been exposed to the rotund cartoon character. 
Those who were shown the rotund figure ate 4.2 cookies on average if they were asked about healthy habits after eating the cookies, compared to three cookies if they were asked about healthy habits before doing the taste test. Children who saw the normal weight character and who were asked about healthy habits after the taste test also ate about three cookies. © 2015 The New York Times Company
By Neuroskeptic According to British biochemist Donald R. Forsdyke in a new paper in Biological Theory, the existence of people who seem to be missing most of their brain tissue calls into question some of the “cherished assumptions” of neuroscience. I’m not so sure. Forsdyke discusses the disease called hydrocephalus (‘water on the brain’). Some people who suffer from this condition as children are cured thanks to prompt treatment. Remarkably, in some cases, these post-hydrocephalics turn out to have grossly abnormal brain structure: huge swathes of their brain tissue are missing, replaced by fluid. Even more remarkably, in some cases, these people have normal intelligence and display no obvious symptoms, despite their brains being mostly water. This phenomenon was first noted by a British pediatrician called John Lorber. Lorber never published his observations in a scientific journal, although a documentary was made about them. However, his work was famously discussed in Science in 1980 by Lewin in an article called “Is Your Brain Really Necessary?”. There have been a number of other more recent published cases. Forsdyke argues that such cases pose a problem for mainstream neuroscience. If a post-hydrocephalic brain can store the same amount of information as a normal brain, he says, then “brain size does not scale with information quantity”, therefore, “it would seem timely to look anew at possible ways our brains might store their information.”
Alison Abbott Neuroscientists have identified an area of the brain that might give the human mind its unique abilities, including language. The area lit up in human, but not monkey, brains when they were presented with different types of abstract information. The idea that integrating abstract information drives many of the human brain's unique abilities has been around for decades. But a paper published in Current Biology, which directly compares activity in human and macaque monkey brains as they listen to simple auditory patterns, provides the first physical evidence that a specific area for such integration may exist in humans. Other studies that compare monkeys and humans have revealed differences in the brain’s anatomy, for example, but not differences that could explain where humans’ abstract abilities come from, say neuroscientists. “This gives us a powerful clue about what is special about our minds,” says psychologist Gary Marcus at New York University. “Nothing is more important than understanding how we got to be how we are.” A team of researchers headed by Stanislas Dehaene at the INSERM Cognitive Neuroimaging Unit at Gif-sur-Yvette near Paris looked at changing patterns of activation in the brain as untrained monkeys and human adults listened to a simple sequence of tones, for example three identical tones followed by a different tone (like the famous four-note opening of Beethoven’s Fifth Symphony: da-da-da-DAH). The researchers played several different sequences with this structure — known as AAAB — and other sequences to the subjects while they lay in a functional magnetic resonance imaging (fMRI) scanner. The fMRI technique picks up changes in blood flow in the brain that correlate with regional brain activity. © 2015 Nature Publishing Group,
By Gretchen Reynolds A walk in the park may soothe the mind and, in the process, change the workings of our brains in ways that improve our mental health, according to an interesting new study of the physical effects on the brain of visiting nature. Most of us today live in cities and spend far less time outside in green, natural spaces than people did several generations ago. City dwellers also have a higher risk for anxiety, depression and other mental illnesses than people living outside urban centers, studies show. These developments seem to be linked to some extent, according to a growing body of research. Various studies have found that urban dwellers with little access to green spaces have a higher incidence of psychological problems than people living near parks and that city dwellers who visit natural environments have lower levels of stress hormones immediately afterward than people who have not recently been outside. But just how a visit to a park or other green space might alter mood has been unclear. Does experiencing nature actually change our brains in some way that affects our emotional health? That possibility intrigued Gregory Bratman, a graduate student at the Emmett Interdisciplinary Program in Environment and Resources at Stanford University, who has been studying the psychological effects of urban living. In an earlier study published last month, he and his colleagues found that volunteers who walked briefly through a lush, green portion of the Stanford campus were more attentive and happier afterward than volunteers who strolled for the same amount of time near heavy traffic. But that study did not examine the neurological mechanisms that might underlie the effects of being outside in nature. So for the new study, which was published last week in Proceedings of the National Academy of Sciences, Mr. Bratman and his collaborators decided to closely scrutinize what effect a walk might have on a person’s tendency to brood. 
© 2015 The New York Times Company
Jon Hamilton It's almost impossible to ignore a screaming baby. (Click here if you doubt that.) And now scientists think they know why. "Screams occupy their own little patch of the soundscape that doesn't seem to be used for other things," says David Poeppel, a professor of psychology and neuroscience at New York University and director of the Department of Neuroscience at the Max Planck Institute in Frankfurt. And when people hear the unique sound characteristics of a scream — from a baby or anyone else — it triggers fear circuits in the brain, Poeppel and a team of researchers report in Current Biology. The team also found that certain artificial sounds, like alarms, trigger the same circuits. "That's why you want to throw your alarm clock on the floor," Poeppel says. The researchers in Poeppel's lab decided to study screams in part because they are a primal form of communication found in every culture. And there was another reason. "Many of the postdocs in my lab are in the middle of having kids and, of course, screams are very much on their mind," Poeppel says. "So it made perfect sense for them to be obsessed with this topic." The team started by trying to figure out "what makes a scream a scream," Poeppel says. Answering that question required creating a large database of recorded screams — from movies, from the Internet and from volunteers who agreed to step into a sound booth. A careful analysis of these screams found that they're not like any other sound that people make, including other loud, high-pitched vocalizations. The difference is something called the amplitude modulation rate, which is how often the loudness of a sound changes. © 2015 NPR
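The quantity the article describes, the amplitude modulation rate, can be approximated from a recording's loudness envelope. Here is a minimal sketch in Python using only NumPy; the 10-millisecond frame length, the short-time RMS envelope, and the synthetic modulated tone are illustrative assumptions for this example, not the study's actual analysis pipeline.

```python
import numpy as np

def modulation_rate(signal, sample_rate, frame=0.01):
    """Estimate how often loudness changes per second (in Hz).

    Illustrative approach: compute a coarse loudness envelope from
    short-frame RMS values, then take the dominant fluctuation
    frequency of that envelope.
    """
    hop = max(1, int(frame * sample_rate))
    n = len(signal) // hop
    # Short-time RMS gives the loudness envelope, one value per frame.
    env = np.sqrt(np.mean(signal[:n * hop].reshape(n, hop) ** 2, axis=1))
    env = env - env.mean()  # remove the DC component
    # Dominant frequency of the envelope = cycles of loudness change/sec.
    spectrum = np.abs(np.fft.rfft(env))
    freqs = np.fft.rfftfreq(n, d=frame)
    return freqs[np.argmax(spectrum[1:]) + 1]

# Example: a 500 Hz tone whose loudness wobbles 30 times per second.
# (Screams, per the study, modulate far faster than ordinary speech.)
sr = 16000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 500 * t) * (1 + 0.8 * np.sin(2 * np.pi * 30 * t))
print(round(modulation_rate(tone, sr)))  # → 30
```

Normal speech modulates slowly (a few Hz); the researchers found screams fluctuate much faster, which is the acoustic "roughness" that sets them apart.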
That song really is stuck in your head. The experience of hearing tunes in your mind appears to be linked to physical differences in brain structure. The study is the first to look at the neural basis for “involuntary musical imagery” – or “earworms”. They aren’t just a curiosity, says study co-author Lauren Stewart at Goldsmiths, University of London, but could have a biological function. Stewart, a music psychologist, was first inspired to study earworms by a regular feature on the radio station BBC 6Music, in which listeners would write in with songs they had woken up with in their heads. There was a lot of interest from the public in what earworms are and where they come from, but there was little research on the topic, she says. Once Stewart and her team started researching earworms, it became clear that some people are affected quite severely: one person even wrote to them saying he had lost his job because of an earworm. To find out what makes some people more susceptible to the phenomenon, the team asked 44 volunteers about how often they got earworms and how they were affected by them. Then they used MRI scans to measure the thickness of volunteers’ cerebral cortices and the volume of their grey matter in various brain areas.

Brain differences

People who suffered earworms more frequently had thicker cortices in areas involved in auditory perception and pitch discrimination. © Copyright Reed Business Information Ltd.
By Laura Sanders Everybody knows people who seem to bumble through life with no sense of time — they dither for hours on a “quick” e-mail or expect an hour’s drive to take 20 minutes. These people are always late. But even for them, such minor lapses in timing are actually exceptions. We notice these flaws precisely because they’re out of the ordinary. Humans, like other animals, are quite good at keeping track of passing time. This talent does more than keep office meetings running smoothly. Almost everything our bodies and brains do requires precision clockwork — down to milliseconds. Without a sharp sense of time, people would be reduced to insensate messes, unable to move, talk, remember or learn. “We don’t think about it, but just walking down the street is an exquisitely timed operation,” says neuroscientist Lila Davachi of New York University. Muscles fire and joints steady themselves in a precisely orchestrated time series that masquerades as an unremarkable part of everyday life. A sense of time, Davachi says, is fundamental to how we move, how we act and how we perceive the world. Yet for something that forms the bedrock of nearly everything we do, time perception is incredibly hard to study. “It’s a quagmire,” says cognitive neuroscientist Peter Tse of Dartmouth College. The problem is thorny because there are thousands of possible intricate answers, all depending on what exactly scientists are asking. Their questions have begun to reveal an astonishingly complex conglomerate of neural timekeepers that influence each other. © Society for Science & the Public 2000 - 2015.
Link ID: 21177 - Posted: 07.16.2015
Computers built to mimic the brain can now recognise images, speech and even create art, and it’s all because they are learning from data we churn out online Do androids dream of electric squid? (Image: Reservoir Lab at Ghent University) I AM watching it have a very odd dream – psychedelic visions of brain tissue folds, interspersed with chunks of coral reef. The dreamer in question is an artificial intelligence, one that live-streams from a computer on the ground floor of the Technicum building in Ghent University, Belgium. This vision has been conjured up after a viewer in the chat sidebar suggests "brain coral" as a topic. It's a fun distraction – and thousands of people have logged on to watch. But beyond that, the bot is a visual demonstration of a technology that is finally coming of age: neural networks. The bot is called 317070, a name it shares with the Twitter handle of its creator, Ghent graduate student Jonas Degrave. It is based on a neural network that can recognise objects in images, except that Degrave runs it in reverse. Given static noise, it tweaks its output until it creates images that tally with what viewers are requesting online. The bot's live-stream page says it is "hallucinating", although Degrave says "imagining" is a little more accurate. Degrave's experiment plays off recent Google research which aimed to tackle one of the core issues with neural networks: that no one knows how neural networks come up with their answers. The images the network creates to satisfy simple instructions can give us some insights. © Copyright Reed Business Information Ltd
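The "running in reverse" that Degrave describes is broadly similar to a technique often called activation maximization: start from noise and nudge the input by gradient ascent until the network's score for a requested class climbs. A toy sketch follows; the tiny two-layer random-weight "classifier" is an invented stand-in for illustration only (the actual bot uses a deep convolutional network trained on real images), and all names here are assumptions.

```python
import numpy as np

# Toy stand-in for a trained image classifier: one hidden layer,
# fixed random weights. 64 "pixels" in, 10 class scores out.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(64, 16)) * 0.1   # pixels -> hidden units
W2 = rng.normal(size=(16, 10)) * 0.1   # hidden units -> class scores

def forward(x):
    h = np.tanh(x @ W1)
    return h @ W2, h

def class_score_grad(x, target):
    """Gradient of the target class score with respect to the input."""
    scores, h = forward(x)
    dh = W2[:, target] * (1 - h ** 2)   # backprop through tanh
    return scores[target], dh @ W1.T

# Start from static noise and push the input uphill, so the network
# "sees" the requested class more and more strongly -- the essence of
# running a recognition network in reverse.
x = rng.normal(size=64) * 0.01
target = 3
before = forward(x)[0][target]
for _ in range(200):
    _, grad = class_score_grad(x, target)
    x += 0.1 * grad
after = forward(x)[0][target]
print(after > before)  # the target score climbs as the input is tweaked
```

In a real system the same loop runs over image pixels with a deep network's gradients (plus regularizers to keep the image natural-looking), which is what produces the dream-like visuals the article describes.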
Link ID: 21149 - Posted: 07.09.2015
CONCORD, N.H. — Can an algorithm pass for an author? Can a robot rock the house? A series of contests at Dartmouth College is about to find out. Dartmouth is seeking artificial intelligence algorithms that create "human-quality" short stories, sonnets and dance music sets that will be pitted against human-produced literature, poetry and music selections. The judges won't know which is which. The goal is to determine whether people can distinguish between the two, and whether they might even prefer the computer-generated creativity. "Historically, often when we have advances in artificial intelligence, people will always say, 'Well, a computer couldn't paint a sunset,' or 'a computer couldn't write a beautiful love sonnet,' but could they? That's the question," said Dan Rockmore, director of the Neukom Institute for Computational Science at Dartmouth. Rockmore, a mathematics and computer science professor, spun off the idea for the contests from his experience riding a stationary bike. He started thinking about how the music being played during his spin class helped him pedal at the right pace, and he was surprised when the instructor told him he selected the songs without the help of computer software. "I left there thinking, 'I wonder if I could write a program that did that, or somebody could?'" he said. "Because that is a creative act — a good spin instructor is a total artist. It sort of opened my mind to thinking about whether a computer or algorithm could produce something that was indistinguishable from or even perhaps preferred over what the human does." The competitions are variations of the "Turing Test," named for British computer scientist Alan Turing, who in 1950 proposed an experiment to determine if a computer could have humanlike intelligence. 
The classic Turing test involves an intelligent computer program that can fool a person carrying on a conversation with it, and there have been many such competitions over the years, said Manuela Veloso, professor of computer science and robotics at Carnegie Mellon University and past president of the Association for the Advancement of Artificial Intelligence. © 2015 The New York Times Company
Link ID: 21135 - Posted: 07.06.2015
Jon Hamilton If you run into an old friend at the train station, your brain will probably form a memory of the experience. And that memory will forever link the person you saw with the place where you saw them. For the first time, researchers have been able to see that sort of link being created in people's brains, according to a study published Wednesday in the journal Neuron. The process involves neurons in one area of the brain that change their behavior as soon as someone associates a particular person with a specific place. "This type of study helps us understand the neural code that serves memory," says Itzhak Fried, an author of the paper and head of the Cognitive Neurophysiology Laboratory at UCLA. It also could help explain how diseases like Alzheimer's make it harder for people to form new memories, Fried says. The research is an extension of work that began more than a decade ago. That's when scientists discovered special neurons in the medial temporal lobe that respond only to a specific place, or a particular person, like the actress Jennifer Aniston. The experiment used a fake photo of actor Clint Eastwood and Pisa's leaning tower to test how the brain links person and place. More recently, researchers realized that some of these special neurons would respond to two people, but only if the people were connected somehow. For example, "a neuron that was responding to Jennifer Aniston was also responding to pictures of Lisa Kudrow," [another actress on the TV series Friends], says Matias Ison of the University of Leicester in the U.K. © 2015 NPR
Carl Zimmer Certain people, researchers have discovered, can’t summon up mental images — it’s as if their mind’s eye is blind. This month in the journal Cortex, the condition received a name: aphantasia, based on the Greek word phantasia, which Aristotle used to describe the power that presents visual imagery to our minds. I find research like this irresistible. It coaxes me to think about ways to experience life that are radically different from my own, and it offers clues to how the mind works. And in this instance, I played a small part in the discovery. In 2005, a 65-year-old retired building inspector paid a visit to the neurologist Adam Zeman at the University of Exeter Medical School. After a minor surgical procedure, the man — whom Dr. Zeman and his colleagues refer to as MX — suddenly realized he could no longer conjure images in his mind. Dr. Zeman couldn’t find any description of such a condition in medical literature. But he found MX’s case intriguing. For decades, scientists had debated how the mind’s eye works, and how much we rely on it to store memories and to make plans for the future. MX agreed to a series of examinations. He proved to have a good memory for a man of his age, and he performed well on problem-solving tests. His only unusual mental feature was an inability to see mental images. Dr. Zeman and his colleagues then scanned MX’s brain as he performed certain tasks. First, MX looked at faces of famous people and named them. The scientists found that certain regions of his brain became active, the same ones that become active in other people who look at faces. © 2015 The New York Times Company
Link ID: 21085 - Posted: 06.23.2015
by Bob Holmes Lions might be one of the biggest threats to hyenas, but that doesn't stop the smaller animals teaming up to steal from the big cats. Nora Lewin from Michigan State University in East Lansing and her colleagues observed the mobbing behaviour at the Masai Mara National Reserve in Kenya. Hyenas were also spotted banding together to keep lions away from their dens. The mobbing involves a surprising degree of cooperation and communication. Male lions, which actively pursue and kill hyenas, are much more of a danger than females, who usually just make threats. This could be why the hyenas in the video above are confronting females. The team suggests the hyenas can identify their opponent's age and sex before deciding as a group whether or not to mob it. Lewin and her colleagues are now investigating how the hyenas communicate to make a group decision. The findings were reported on 13 June at the annual meeting of the Animal Behavior Society in Anchorage, Alaska. © Copyright Reed Business Information Ltd.
Maanvi Singh Teenagers aren't exactly known for their responsible decision making. But some young people are especially prone to making rash, risky decisions about sex, drugs and alcohol. Individual differences in the brain's working memory — which allows people to draw on and use information to make decisions — could help explain why some adolescents are especially impulsive when it comes to sex, according to a study published Wednesday in Child Development. "Working memory is the ability to keep different things in mind when you're making decisions or problem solving," explains Atika Khurana, an assistant professor of counseling psychology at the University of Oregon who led the study. Khurana and her colleagues rounded up 360 adolescents, ages 12 to 15, and assessed their working memory using a series of tests. For example, the researchers told the participants a string of random numbers and asked them to repeat what they heard in reverse order. "We basically tested their ability to keep information in mind while making decisions," Khurana says. The researchers then tracked all the participants for two years, and asked about the teens' sexual activity. And through another series of tests and surveys, the researcher tried to gauge how likely each teen was to act without thinking, to make rash decisions and take risks. There was a correlation between weaker working memory and the likelihood that a teen would have sex — including unprotected sex — at a younger age. And they were more likely to act without much deliberation. That trend held true even after the researchers accounted for the teenagers' age, socioeconomic status and gender. © 2015 NPR
The structure of the living cell is defined by the difference between what’s inside and what’s not. Biologists have taken great pains over the years to document the minute workings of the openings in cell membranes that allow hydrogen, sodium, calcium and other ions to make their way inside across the barrier that envelops the cell and its contents. Five scholars of the brain have built upon these observations to suggest that these activities may provide a foundation for a badly needed theory to understand consciousness and some of the cognitive processes that underlie it. They contend that when animal cells open and close themselves to the outside world, these actions can be construed as more than just responses to external stimuli. In fact, they constitute the basis for perception, cognition and movement in the animal kingdom—and may underlie consciousness itself. The five authors are NYU neurology professor Oliver Sacks; Antonio Damasio and Gil B. Carvalho from the University of Southern California; Norman D. Cook from the faculty of Kansai University in Osaka, Japan; and Harry T. Hunt from Brock University in Ontario. They have framed their ideas in the form of an open letter to Christof Koch, president of the Allen Institute for Brain Science, a Scientific American MIND columnist (Consciousness Redux) and a member of Scientific American’s board of advisers.
Link ID: 21065 - Posted: 06.17.2015
By Jessica Schmerler Approximately one in 68 children is identified with some form of autism, from extremely mild to severe, according to the U.S. Centers for Disease Control. On average, diagnosis does not occur until after age four, yet all evidence indicates that early intervention is the best way to maximize the treatment impact. Various tests that look for signs of autism in infants have not been conclusive but a new exercise could improve early diagnosis, and also help reduce worry among parents that they did not intervene as soon as possible. The two most widely used tests to measure symptoms, the Autism Observation Scale for Infants (AOSI) and the Autism Diagnostic Observation Schedule (ADOS), cannot be used before the ages of 12 or 16 months respectively. The AOSI measures precursors to symptoms, such as a baby’s response to name, eye contact, social reciprocity, and imitation. The ADOS measures the characteristics and severity of autism symptoms such as social affectation and repetitive and restrictive behaviors. Now a group of scientists at the Babylab at Birkbeck, University of London think they have identified a marker that can predict symptom development more accurately and at an earlier age: enhanced visual attention. Experts have long recognized that certain individuals with autism have superior visual skills, such as increased visual memory or artistic talent. Perhaps the most well known example is Temple Grandin, a high-functioning woman with autism who wrote, “I used to become very frustrated when a verbal thinker could not understand something I was trying to express because he or she couldn’t see the picture that was crystal clear to me.” © 2015 Scientific American
by Penny Sarchet Children with ADHD are more likely to succeed in cognitive tasks when they are fidgeting. Rather than telling them to stop, is it time to let them squirm in class? The results, from a small study of teens and pre-teens, add to growing evidence that movement may help children with attention-deficit hyperactivity disorder to think. One of the theories about ADHD is that the brain is somehow under-aroused. Physical movements could help wake it up or maintain alertness, perhaps by stimulating the release of brain-signalling chemicals like dopamine or norepinephrine. This hypothesis would help explain why countries like the US are experiencing an epidemic of ADHD – it might be that a lack of physical activity leads to reduced brain function.

Fidget britches

In the latest study, Julie Schweitzer of the University of California, Davis, and her colleagues asked 44 children with ADHD and 29 kids without to describe an arrangement of arrows. The children with ADHD were more likely to focus on the task and answer correctly if the test coincided with them fidgeting, as tracked by an ankle monitor. Intriguingly, Schweitzer found that it is the vigour of movements, rather than how often children make them, that seems to be related to improvements in test scores. This might mean, for example, that it helps children to swing their legs in longer arcs, but not to swing them faster. "I think we need to consider that fidgeting is helpful," says Schweitzer. "We need to find ways that children with ADHD can move without being disruptive to others." Dustin Sarver at the University of Mississippi, who recently found a link between fidgeting and improved working memory, agrees. "We should revisit the targets we want for these children, such as improving the work they complete and paying attention, rather than focusing on sitting still." 
He suggests that movements that are not disruptive to other schoolchildren, such as squirming, bouncing and leg movements, as opposed to getting up in the middle of lessons, could be encouraged in classrooms. © Copyright Reed Business Information Ltd
By Gretchen Reynolds Treadmill desks are popular, even aspirational, in many offices today since they can help those of us who are deskbound move more, burn extra calories and generally improve our health. But an interesting new study raises some practical concerns about the effects of walking at your workspace and suggests that there may be unacknowledged downsides to using treadmill desks if you need to type or think at the office. The drumbeat of scientific evidence about the health benefits of sitting less and moving more during the day continues to intensify. One study presented last month at the 2015 annual meeting of the American College of Sports Medicine in San Diego found that previously sedentary office workers who walked slowly at a treadmill desk for two hours each workday for two months significantly improved their blood pressure and slept better at night. But as attractive as the desks are for health reasons, they must fit into a work setting, so it seems sensible to test their effects on productivity. Yet surprisingly little research had examined whether treadmill desks affect someone’s ability to get work done. So for the new study, which was published in April in PLOS One, researchers at Brigham Young University in Provo, Utah, recruited 75 healthy young men and women and randomly assigned them to workspaces outfitted with a computer and either a chair or a treadmill desk. The treadmill desk was set to move at a speed of 1.5 miles per hour with zero incline. None of the participants had used a treadmill desk before, so they received a few minutes of instruction and practice. Those assigned a chair were assumed to be familiar with its use. © 2015 The New York Times Company
Mo Costandi According to the old saying, the eyes are windows into the soul, revealing deep emotions that we might otherwise want to hide. Although modern science precludes the existence of the soul, it does suggest that there is a kernel of truth in this saying: it turns out the eyes not only reflect what is happening in the brain but may also influence how we remember things and make decisions. Our eyes are constantly moving, and while some of those movements are under conscious control, many of them occur subconsciously. When we read, for instance, we make a series of very quick eye movements called saccades that fixate rapidly on one word after another. When we enter a room, we make larger sweeping saccades as we gaze around. Then there are the small, involuntary eye movements we make as we walk, to compensate for the movement of our head and stabilise our view of the world. And, of course, our eyes dart around during the ‘rapid eye movement’ (REM) phase of sleep. What is now becoming clear is that some of our eye movements may actually reveal our thought process. Research published last year shows that pupil dilation is linked to the degree of uncertainty during decision-making: if somebody is less sure about their decision, they feel heightened arousal, which causes the pupils to dilate. This change in the eye may also reveal what a decision-maker is about to say: one group of researchers, for example, found that watching for dilation made it possible to predict when a cautious person used to saying ‘no’ was about to make the tricky decision to say ‘yes’. © 2015 Guardian News and Media Limited