Chapter 16.
By David Grimm Place a housecat next to its direct ancestor, the Near Eastern wildcat, and it may take you a minute to spot the difference. They’re about the same size and shape, and, well, they both look like cats. But the wildcat is fierce and feral, whereas the housecat, thanks to nearly 10,000 years of domestication, is tame and adaptable enough to have become the world’s most popular pet. Now scientists have begun to pinpoint the genetic changes that drove this remarkable transformation. The findings, based on the first high-quality sequence of the cat genome, could shed light on how other creatures, even humans, become tame.

“This is the closest thing to a smoking gun we’ve ever had,” says Greger Larson, an evolutionary biologist at the University of Oxford in the United Kingdom who has studied the domestication of pigs, dogs, and other animals. “We’re much closer to understanding the nitty-gritty of domestication than we were a decade ago.”

Cats first entered human society about 9500 years ago, not long after people first took up farming in the Middle East. Drawn to rodents that had invaded grain stores, wildcats slunk out of the deserts and into villages. There, many scientists suspect, they mostly domesticated themselves, with the friendliest ones able to take advantage of human table scraps and protection. Over thousands of years, cats shrank slightly in size, acquired a panoply of coat colors and patterns, and (largely) shed the antisocial tendencies of their past.

Domestic animals from cows to dogs have undergone similar transformations, yet scientists know relatively little about the genes involved. Researchers led by Michael Montague, a postdoc at the Washington University School of Medicine in St. Louis, have now pinpointed some of them. The scientists started with the genome of a domestic cat—a female Abyssinian—that had been published in draft form in 2007, then filled in missing sequences and identified genes. They compared the resulting genome with those of cows, tigers, dogs, and humans. © 2014 American Association for the Advancement of Science.
By Meeri Kim Patients suffering from pagophagia compulsively crave and chomp on ice, even scraping buildup off freezer walls for a fix. The disorder appears to be caused by an iron deficiency, and supplements of the mineral tend to ease the cravings. But what is it about ice that makes it so irresistible? A new study proposes that, like a strong cup of coffee, ice may give those with insufficient iron a much-needed mental boost. Fatigue is the most common symptom of iron-deficiency anemia, which occurs when the body can’t produce enough oxygen-carrying hemoglobin because of low iron. “I had a friend who was suffering from iron-deficiency anemia who was just crunching through massive amounts of ice a day,” said study author Melissa Hunt, a clinical psychologist at the University of Pennsylvania. “She said: ‘It’s like a cup of coffee. I don’t feel awake until I have a cup of ice in my hand.’ ” Hunt and her colleagues had both anemic and healthy subjects complete a standardized, 22-minute attention test commonly used to diagnose attention deficit hyperactivity disorder. Just before the test, participants were given either a cup of ice or lukewarm water to consume. Iron-deficient subjects who had sipped on water performed far more sluggishly on the test than controls, as expected. But those who ate ice beforehand did just as well as their healthy counterparts. For healthy subjects, having a cup of ice instead of water appeared to make no difference in test performance. “It’s not like craving a dessert. It’s more like needing a cup of coffee or that cigarette,” Hunt said.
Link ID: 20296 - Posted: 11.10.2014
By James Gallagher Health editor, BBC News website The brain has specialist neurons for each of the five taste categories - salty, bitter, sour, sweet and umami - US scientists have discovered. The study, published in the journal Nature, should settle years of debate on how the brain perceives taste. The Columbia University team showed the separate taste sensors on the tongue had a matching partner in the brain. The scientists hope the findings could be used to help reverse the loss of taste sensation in the elderly. It is a myth that you taste sweet only on the tip of the tongue. Each of the roughly 8,000 taste buds scattered over the tongue is capable of sensing the full suite of tastes. But specialised cells within the taste bud are tuned to either salty, bitter, sour, sweet or umami tastes. When they detect the signal, a message is sent to the brain. How the brain deals with that information, however, has been a matter of debate. A team at Columbia University engineered mice so that their taste neurons would fluoresce when they were activated. They then trained their endoscopes on the neurons deep in the base of the brain. The animals were fed chemicals to trigger either a salty, bitter, sour, sweet or umami response on the tongue and the researchers monitored the change in the brain. They found a "hard wired" connection between tongue and brain. Prof Charles Zuker told the BBC News website: "The cells were beautifully tuned to discrete individual taste qualities, so you have a very nice match between the nature of the cells in your tongue and the quality they represent [in the brain]." It scotches the alternative idea that brain cells respond to multiple tastes. BBC © 2014
Keyword: Chemical Senses (Smell & Taste)
Link ID: 20295 - Posted: 11.10.2014
Stem cells can be used to heal the damage in the brain caused by Parkinson's disease, according to scientists in Sweden. They said their study on rats heralded a "huge breakthrough" towards developing effective treatments. There is no cure for the disease, but medication and brain stimulation can alleviate symptoms. Parkinson's UK said there were many questions still to be answered before human trials could proceed. The disease is caused by the loss of nerve cells in the brain that produce the chemical dopamine, which helps to control mood and movement. To simulate Parkinson's, Lund University researchers killed dopamine-producing neurons on one side of the rats' brains. They then converted human embryonic stem cells into neurons that produced dopamine. These were injected into the rats' brains, and the researchers found evidence that the damage was reversed. There have been no human clinical trials of stem-cell-derived neurons, but the researchers said they could be ready for testing by 2017. Malin Parmar, associate professor of developmental and regenerative neurobiology, said: "It's a huge breakthrough in the field [and] a stepping stone towards clinical trials." A similar method has been tried in a limited number of patients. It involved taking brain tissue from multiple aborted foetuses to heal the brain. Clinical trials were abandoned after mixed results, but about a third of the patients had foetal brain cells that functioned for 25 years. BBC © 2014
Link ID: 20292 - Posted: 11.08.2014
By Tracy Jarrett Autism advocates on Friday applauded Jerry Seinfeld's disclosure that he may be autistic, while warning against making him the poster boy for a disorder that is no laughing matter. “I think, on a very drawn-out scale, I think I’m on the spectrum,” Seinfeld told NBC Nightly News’ Brian Williams. "Basic social engagement is really a struggle. I'm very literal, when people talk to me and they use expressions, sometimes I don't know what they're saying," he said. "But I don't see it as dysfunctional, I just think of it as an alternate mindset." Seinfeld's revelation sends a positive message that the autism community is much larger and more diverse than people often understand, Ari Ne’eman, president of the Autistic Self Advocacy Network, told NBC News. Ne’eman, who is autistic himself, says there is still a tremendous amount of stigma surrounding autism that hinders the opportunities available to those with the disorder. “Think about what this does for a closeted autistic person who goes into the workplace knowing that their co-workers have just seen somebody they know, respect, and have a positive opinion of, like Jerry Seinfeld, identify in this way — it’s a valuable and important step in building a greater tolerance for autism,” Ne’eman said. Liz Feld, president of Autism Speaks, agreed, pointing out that “there are many people on the autism spectrum who can relate to Jerry’s heartfelt comments about his own experiences.”
Link ID: 20289 - Posted: 11.08.2014
by Penny Sarchet It's frustrating when your smartphone loses its signal in the middle of a call or when downloading a webpage. But for a bat, a sudden loss of its sonar signal means missing an insect meal in mid-flight. Now there's evidence to suggest that bats are sneakily using sonar jamming techniques to make their fellow hunters miss their tasty targets. Like other bats, the Mexican free-tailed bat uses echolocation to pinpoint prey insects in the dark. But when many bats hunt in the same space, they can interfere with each other's echoes, making detection more difficult. Jamming happens when a sound disrupts a bat's ability to extract location information from the echoes returning from its prey, explains Aaron Corcoran of Johns Hopkins University in Baltimore, Maryland. Previous research has shown that Mexican free-tailed bats can get around this jamming by switching to higher pitches. Using different sound frequencies to map the hunting grounds around them allows many bats to hunt in the same space. In these studies, jamming of each other's signals was seemingly inadvertent – a simple consequence of two bats attempting to echolocate in close proximity. But Corcoran has found evidence of sneakier goings-on: a second type of sonar jamming in these bats – intentional sabotage of a fellow bat. "In this study, the jamming is on purpose and the jamming signal has been designed by evolution to maximally disrupt the other bat's echolocation," he says. © Copyright Reed Business Information Ltd.
Link ID: 20288 - Posted: 11.08.2014
By Dwayne Godwin and Jorge Cham © 2014 Scientific American
by Helen Thomson A MAN with the delusional belief that an impostor has taken his wife's place is helping shed light on how we recognise loved ones. Capgras syndrome is a rare condition in which a person insists that someone they are close to – most commonly a spouse – has been replaced by an impostor. Sometimes they even believe that a much-loved pet has been replaced by a lookalike. Anecdotal evidence suggests that people with Capgras only misidentify the people they are closest to. Chris Fiacconi at Western University in London, Ontario, Canada, and his team wanted to explore this. They performed recognition tests and brain scans on two male volunteers with dementia – one who had Capgras, and one who didn't – and compared the results with those of 10 healthy men of a similar age. For months, the man with Capgras believed that his wife had been replaced by an impostor and was resistant to any counterargument, often asking his son why he was so convinced that the woman was his mother. First the team tested whether or not the volunteers could recognise celebrities they would have been familiar with throughout their lifetime, such as Marilyn Monroe. Volunteers were presented with celebrities' names, voices or pictures, and asked if they recognised them and, if so, how much information they could recall about that person. The man with Capgras was more likely to misidentify the celebrities by face or voice compared with the volunteer without Capgras, or the 10 healthy men. None of the volunteers had problems identifying celebrities by name (Frontiers in Human Neuroscience, doi.org/wrw). © Copyright Reed Business Information Ltd.
Kate Baggaley Much of the increase in autism diagnoses in recent decades may be tied to changes in how the condition is reported. Sixty percent of the increase in autism cases in Denmark can be explained by these changes, scientists report November 3 in JAMA Pediatrics. The researchers followed all 677,915 people born in Denmark from 1980 through 1991, monitoring them from birth through the end of 2011. Among children born in this period, diagnoses increased fivefold, until 1 percent of children born in the early 1990s were diagnosed with autism by age 20. During these decades, Denmark experienced two changes in the way autism is reported. In 1994, the criteria physicians rely on to diagnose autism were updated in both the International Classification of Diseases manual used by Denmark and in its American counterpart, the Diagnostic and Statistical Manual of Mental Disorders. Then in 1995, the Danish Psychiatric Register began reporting diagnoses where doctors had only outpatient contact with children, in addition to cases where autism was diagnosed after children had been kept overnight. The researchers estimated Danish children’s likelihood of being diagnosed with autism before and after the two reporting changes. These changes accounted for 60 percent of the increase in diagnoses. © Society for Science & the Public 2000 - 2014.
Link ID: 20280 - Posted: 11.05.2014
By Amy Robinson Whether you’re walking, talking or contemplating the universe, a minimum of tens of billions of synapses are firing at any given second within your brain. “The weak link in understanding ourselves is really about understanding how our brains generate our minds and how our minds generate our selves,” says MIT neuroscientist Ed Boyden. One cubic millimeter in the brain contains over 100,000 neurons connected through a billion synapses computing on a millisecond timescale. To understand how information flows within these circuits, we first need a “brain parts” list of neurons and glia. But such a list is not enough. We’ll also need to chart how cells are connected and to monitor their activity over time both electrically and chemically. Researchers can do this at small scale thanks to a technology developed in the 1970s called patch clamping. Bringing a tiny glass needle very near to a neuron living within a brain allows researchers to perform microsurgery on single neurons, piercing the cell membrane to do things like record the millivolt electrical impulses flowing through it. Patch clamping also facilitates measurement of proteins contained within the cell, revealing characteristic molecules and contributing to our understanding of why one neuron may behave differently than another. Neuroscientists can even inject glowing dyes in order to see the shape of cells. Patch clamping is a technique that has been used in neuroscience for 40 years. Why now does it make an appearance as a novel neuroscience technology? In a word: robots. © 2014 Scientific American
Keyword: Brain imaging
Link ID: 20279 - Posted: 11.05.2014
By Kelly Servick Using data from old clinical trials, two groups of researchers have found a better way to predict how amyotrophic lateral sclerosis (ALS) progresses in different patients. The winning algorithms—designed by non-ALS experts—outperformed the judgments of a group of ALS clinicians given the same data. The advances could make it easier to test whether new drugs can slow the fatal neurodegenerative disease. The new work was inspired by the so-called ALS Prediction Prize, a joint effort by the ALS-focused nonprofit Prize4Life and Dialogue for Reverse Engineering Assessments and Methods, a computational biology project whose sponsors include IBM, Columbia University, and the New York Academy of Sciences. Announced in 2012, the $50,000 award was designed to bring in experts from outside the ALS field to tackle the notoriously unpredictable illness. Liuxia Wang, a data analyst at the marketing company Sentrana in Washington, D.C., was used to helping companies make business decisions based on big data sets, such as information about consumer choices, but says she “didn’t know too much about this life science thing” until she got an unusual query from a client. One of the senior managers she worked with revealed that her son had just been diagnosed with ALS and wondered if Sentrana’s analytics could apply to patient data, too. When Wang set out to investigate, she found the ALS Prediction Prize. The next step, she said, was to learn something about ALS. The disease destroys the neurons that control muscle movement, causing gradual paralysis and eventually killing about half of patients within 3 years of diagnosis. But the speed of its progression varies widely. About 10% of patients live a decade or more after being diagnosed. That makes it hard for doctors to answer patients’ questions about the future, and it’s a big problem for testing new ALS treatments. © 2014 American Association for the Advancement of Science.
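The article doesn't describe the winning algorithms themselves, but the underlying task – turning a patient's early clinical measurements into a predicted trajectory – can be sketched with a toy least-squares fit. All patient numbers below are invented for illustration; real entries used far richer clinical-trial data.

```python
# Toy illustration of the ALS progression-prediction task: fit a
# straight line to a patient's first months of ALSFRS-R scores
# (a standard functional rating scale, maximum 48) and extrapolate.
# All numbers are hypothetical.

def fit_slope(times, scores):
    """Ordinary least-squares slope and intercept."""
    n = len(times)
    mean_t = sum(times) / n
    mean_s = sum(scores) / n
    cov = sum((t - mean_t) * (s - mean_s) for t, s in zip(times, scores))
    var = sum((t - mean_t) ** 2 for t in times)
    slope = cov / var
    return slope, mean_s - slope * mean_t

# Hypothetical ALSFRS-R scores recorded at months 0-3 for one patient.
months = [0, 1, 2, 3]
scores = [40, 38, 37, 35]

slope, intercept = fit_slope(months, scores)
predicted_month_12 = intercept + slope * 12
print(f"decline: {slope:.2f} points/month, "
      f"predicted score at month 12: {predicted_month_12:.1f}")
```

A competitive model would combine many more features (lab values, vital signs, demographics) across thousands of patients, but the core idea is the same: early measurements in, predicted rate of decline out.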
Keyword: ALS-Lou Gehrig's Disease
Link ID: 20278 - Posted: 11.04.2014
James Gorman Here is something to keep arachnophobes up at night. The inside of a spider is under pressure, like the air in a balloon, because spiders move by pushing fluid through valves. They are hydraulic. This works well for the spiders, but less so for those who want to study what goes on in the brain of a jumping spider, an aristocrat of arachnids that, according to Ronald R. Hoy, a professor of neurobiology and behavior at Cornell University, is one of the smartest of all invertebrates. If you insert an electrode into the spider’s brain, what’s inside might squirt out, and while that is not the kind of thing that most people want to think about, it is something that the researchers at Cornell had to consider. Dr. Hoy and his colleagues wanted to study jumping spiders because they are very different from most of their kind. They do not wait in a sticky web for lunch to fall into a trap. They search out prey, stalk it and pounce. “They’ve essentially become cats,” Dr. Hoy said. And they do all this with a brain the size of a poppy seed and a visual system that is completely different from that of a mammal: two big eyes dedicated to high-resolution vision and six smaller eyes that pick up motion. Dr. Hoy gathered four graduate students in various disciplines to solve the problem of recording activity in a jumping spider’s brain when it spots something interesting — a feat nobody had accomplished before. In the end, they not only managed to record from the brain, but discovered that one neuron seemed to be integrating the information from the spider’s two independent sets of eyes, a computation that might be expected to involve a network of brain cells. © 2014 The New York Times Company
By Sandra Upson Jan Scheuermann is not your average experimental subject. Diagnosed with spinocerebellar degeneration, she is only able to move her head and neck. The paralysis, which began creeping over her muscles in 1996, has been devastating in many ways. Yet two years ago she seized an opportunity to turn her personal liability into an extraordinary asset for neuroscience. In 2012 Scheuermann elected to undergo brain surgery to implant two arrays of electrodes on her motor cortex, a band of tissue on the surface of the brain. She did so as a volunteer in a multi-year study at the University of Pittsburgh to develop a better brain-computer interface. When she visits the lab, researchers hook up her brain to a robotic arm and hand, which she practices moving using her thoughts alone. The goal is to eventually allow other paralyzed individuals to regain function by wiring up their brains directly to a computer or prosthetic limb. The electrodes in her head record the firing patterns of about 150 of her neurons. Specific patterns of neuronal activity encode her desire to perform different movements, such as swinging the arm to the left or clasping the fingers around a cup. Two thick cables relay the data from her neurons to a computer, where software can identify Scheuermann’s intentions. The computer can then issue appropriate commands to move the robotic limb. On a typical workday, Jan Scheuermann arrives at the university around 9:15 am. Using her chin, she maneuvers her electric wheelchair into a research lab headed by neuroscientist Andrew Schwartz and settles in for a day of work. Scientific American Mind spoke to Scheuermann to learn more about her experience as a self-proclaimed “guinea pig extraordinaire.” © 2014 Scientific American
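The lab's actual decoding software isn't specified in the article, but the classic population-vector idea behind such interfaces – each neuron "votes" for its preferred movement direction in proportion to how much its firing rate rises above baseline – can be sketched as follows. The tuning directions and firing rates here are invented for illustration.

```python
# Minimal population-vector decoder: each neuron has a "preferred
# direction"; its firing rate above baseline votes for that direction.
# Summing the votes gives the intended 2-D movement vector.
# All tuning data are invented.
import math

# (preferred direction in radians, baseline rate in Hz) per neuron
neurons = [(0.0, 10.0), (math.pi / 2, 10.0), (math.pi, 10.0),
           (3 * math.pi / 2, 10.0)]

def decode(rates):
    """Return (vx, vy) intended velocity from observed firing rates."""
    vx = vy = 0.0
    for (pref, baseline), rate in zip(neurons, rates):
        weight = rate - baseline          # modulation above baseline
        vx += weight * math.cos(pref)
        vy += weight * math.sin(pref)
    return vx, vy

# The neuron preferring 0 rad fires hardest, so the decoded
# motion points to the right.
vx, vy = decode([25.0, 12.0, 5.0, 12.0])
print(f"decoded velocity: ({vx:.1f}, {vy:.1f})")
```

A real interface calibrates such tuning curves from the recorded neurons each session and updates the estimate many times per second to drive the robotic arm smoothly.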
Link ID: 20276 - Posted: 11.04.2014
Colin Barras It's the sweetest relief… until it's not. Scratching an itch only gives temporary respite before making it worse – we now know why. Millions of people experience chronic itching at some point, as a result of conditions ranging from eczema to kidney failure to cancer. The condition can have a serious impact on quality of life. On the face of it, the body appears to have a coping mechanism: scratching an itch until it hurts can bring instant relief. But when the pain wears off the itch is often more unbearable than before – which means we scratch even harder, sometimes to the point of causing painful skin damage. "People keep scratching even though they might end up bleeding," says Zhou-Feng Chen at the Washington University School of Medicine in St Louis, Missouri, who has now worked out why this happens. His team's work in mice suggests it comes down to an unfortunate bit of neural crosstalk. We know that the neurotransmitter serotonin helps control pain, and that pain – from the heavy scratching – helps soothe an itch, so Chen's team set out to explore whether serotonin is also involved in the itching process. They began by genetically engineering mice to produce no serotonin. Normally, mice injected with a chemical that irritates their skin will scratch up a storm, but the engineered mice seemed to have almost no urge to scratch. Genetically normal mice given a treatment to prevent serotonin leaving the brain also avoided scratching after being injected with the chemical, indicating that the urge to scratch begins when serotonin from the brain reaches the irritated spot. © Copyright Reed Business Information Ltd.
Keyword: Pain & Touch
Link ID: 20270 - Posted: 11.03.2014
By James Gallagher Health editor, BBC News website Weight loss surgery can dramatically reduce the odds of developing type 2 diabetes, according to a major study. Doctors followed nearly 5,000 people as part of a trial to assess the health impact of the procedure. The results, published in the Lancet Diabetes and Endocrinology journal, showed an 80% reduction in type 2 diabetes in those having surgery. The UK NHS is considering offering the procedure to tens of thousands of people to prevent diabetes. Obesity and type 2 diabetes are closely tied - the bigger someone is, the greater the risk of the condition. The inability to control blood sugar levels can result in blindness, amputations and nerve damage. Around a tenth of NHS budgets are spent on managing the condition.

Surgery

The study followed 2,167 obese adults who had weight loss - known as bariatric - surgery. They were compared to 2,167 fellow obese people who continued as they were. There were 38 cases of diabetes after surgery compared with 177 in people left as they were - a reduction of nearly 80%. Around 3% of morbidly obese people develop type 2 each year, however, surgery reduced the figure to around 0.5%, which is the background figure for the whole population. Bariatric surgery, also known as weight loss surgery, is used as a last resort to treat people who are dangerously obese and carrying an excessive amount of body fat. This type of surgery is available on the NHS only to treat people with potentially life-threatening obesity when other treatments have not worked. Around 8,000 people a year currently receive the treatment. The two most common types of weight loss surgery are:

- Gastric band, where a band is used to reduce the size of the stomach so a smaller amount of food is required to make someone feel full
- Gastric bypass, where the digestive system is re-routed past most of the stomach so less food is digested to make someone feel full

BBC © 2014
Link ID: 20269 - Posted: 11.03.2014
by Aviva Rutkin IF DINNER is missing some zing, a spoon studded with electrodes could help. It creates tastes on your tongue with a pulse of electricity. The utensil may add some extra flavour for people who shouldn't eat certain foods. Different frequencies and magnitudes of current through the electrodes can create the impression of saltiness, sourness or bitterness. The spoon was developed by Nimesha Ranasinghe at New York University Abu Dhabi in the United Arab Emirates and his team, who have also developed a water bottle with similar hardware on the mouthpiece. Both devices use various coloured lights, like blue for salty, in an attempt to augment the perceived intensity of the flavour. "Taste is not only taste. It's a multisensory sensation, so we need smell, colour, previous experiences, texture," says Ranasinghe. "I am trying to integrate different aspects of these sensations." By boosting the flavour of plain foods, he says a tool like this could be useful for people with diabetes or heart issues who have been ordered to cut down on salt and sugar. To see how well the electric utensils could fool diners, 30 people tried them out in a taste test with plain water and porridge. The spoon and bottle were judged 40 to 83 per cent successful at recreating the tastes, depending on which one they were aiming for. Bitter was the hardest sensation to get right. Some testers also said they were distracted by the metallic taste of the electrodes – a pitfall the researchers will work on next. © Copyright Reed Business Information Ltd.
Keyword: Chemical Senses (Smell & Taste)
Link ID: 20268 - Posted: 11.03.2014
by Helen Thomson Scared of the dark? Terrified of heights? Spiders make you scream? For the first time, a person's lifelong phobia has been completely abolished overnight. Unfortunately, it required removing a tiny bit of the man's brain, so for now, most people will have to find another way to dispel their fears. The phobia was abolished by accident. A 44-year-old business man started having seizures out of the blue. Brain scans showed he had an abnormality in his left amygdala – an area in the temporal lobe involved in emotional reactions, among other things. Further tests showed the cause was sarcoidosis, a rare condition that causes damage to the lungs, skin and, occasionally, the brain. Doctors decided it was necessary to remove the man's damaged left amygdala. The surgery went well, but soon after the man noticed a strange turn of events. Not only did he have a peculiar "stomach-lurching" aversion to music – which was particularly noticeable when he heard the song accompanying a certain TV advert – but he also discovered he was no longer afraid of spiders. While his aversion to music waned over time, his arachnophobia never returned. Before the surgery he would throw tennis balls at spiders, or use hairspray to immobilise them before vacuuming them up. Now he is able to touch and observe the little critters at close distance and says he actually finds them fascinating. He hasn't noticed any changes to other kinds of fears or anxieties. For example, he is just as anxious about public speaking now as he was prior to surgery. © Copyright Reed Business Information Ltd.
Link ID: 20265 - Posted: 11.01.2014
By Rachel Feltman Sometimes the process of scientific discovery can be a real headache. In a recent Danish study, scientists were thrilled to give painful migraines to 86 percent of their study subjects. Migraines are a particularly painful mystery for researchers to solve: More than 10 percent of people worldwide are affected by these intense headaches, but no one has been able to pinpoint a specific cause. What makes these headaches, which can cause incapacitating pain and nausea, different from all other headaches? That's why scientists had to make their patients suffer -- researchers keep trying to trigger migraines using different mechanisms. The more successful they are, the more likely it is that the mechanism being tested is actually a common cause of migraines. And once we know what the common causes are, we can try to develop better treatments that target them. In this case the 86 percent "success" rate, which the researchers say is much higher than results with other triggers, was owed to increases of a naturally occurring substance called cyclic AMP, or cAMP. Our bodies use cAMP to dilate blood vessels, so an increase of it can increase the flow of blood. To see if cAMP might cause migraines, the researchers dosed their subjects with cilostazol, a drug that increases cAMP concentrations in the body.
Keyword: Pain & Touch
Link ID: 20264 - Posted: 11.01.2014
by Dan Jones The way your brain reacts to a single disgusting image can be used to predict whether you lean to the left or the right politically. A number of studies have probed the emotions of people along the political spectrum, and found that disgust in particular is tightly linked to political orientation. People who are highly sensitive to disgusting images – of bodily waste, gore or animal remains – are more likely to sit on the political right and show concern for what they see as bodily and spiritual purity, so tend to oppose abortion and gay marriage, for example. A team led by Read Montague, a neuroscientist at Virginia Tech in Roanoke, recruited 83 volunteers and performed fMRI brain scans on them as they looked at a series of 80 images that were either pleasant, disgusting, threatening or neutral. Participants then rated the images for their emotional impact and completed a series of questionnaires that assessed whether they were liberal, moderate or conservative. The brain-imaging results were then fed to a learning algorithm which compared the whole-brain responses of liberals and conservatives when looking at disgusting images versus neutral ones. For both political groups, the algorithm was able to pick out distinct patterns of brain activity triggered by the disgusting images. And even though liberals and conservatives consciously reported similar emotional reactions to the images, the specific brain regions involved and their patterns of activation differed consistently between the two groups – so much so that they represented a neural signature of political leaning, the team concludes. © Copyright Reed Business Information Ltd
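The article doesn't detail the team's learning algorithm, but the general approach – learning a group-level "neural signature" from brain-response vectors and then classifying new subjects against it – can be illustrated with a toy nearest-centroid classifier. The feature vectors here are fabricated three-number stand-ins for whole-brain fMRI responses, and the group labels are placeholders.

```python
# Toy version of the analysis: compute a per-group "signature"
# (centroid) from each group's brain-response vectors, then assign
# a held-out subject to the nearest signature. The real study used
# whole-brain fMRI responses; all numbers here are invented.

def centroid(vectors):
    """Element-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def distance_sq(a, b):
    """Squared Euclidean distance between two vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def classify(subject, signatures):
    """Return the label whose centroid is closest to the subject."""
    return min(signatures, key=lambda label: distance_sq(subject, signatures[label]))

# Fabricated responses to disgusting-minus-neutral images.
training = {
    "group_a": [[0.9, 0.1, 0.4], [1.1, 0.2, 0.5], [1.0, 0.0, 0.6]],
    "group_b": [[0.2, 0.8, 0.1], [0.1, 1.0, 0.2], [0.3, 0.9, 0.0]],
}
signatures = {label: centroid(vs) for label, vs in training.items()}

held_out = [1.0, 0.1, 0.5]
print(classify(held_out, signatures))  # closer to group_a's signature
```

The study's point survives the simplification: if the two groups' response patterns differ consistently, a classifier trained on them can predict group membership for subjects it has never seen.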
Link ID: 20263 - Posted: 11.01.2014
Linda Carroll TODAY contributor For years Larry Hester lived in darkness, his sight stolen by a disease that destroyed the photoreceptor cells in his retinas. But last week, through the help of a “bionic eye,” Hester got a chance to once again glimpse a bit of the world around him. Hester is the seventh patient to receive an FDA-approved device that translates video signals into data the optic nerve can process. The images Hester and others “see” will be far from full sight, but experts hope it will be enough to give a little more autonomy to those who had previously been completely blind. Hester’s doctors at Duke University Eye Center believe that as time goes on the 66-year-old tire salesman from Raleigh, N.C., will be able to “see” more and more. After only five days, there has been remarkable progress. “I hope that [after some practice] he will be able to do things he can’t do today: maybe walk around a little more independently, see doorways or the straight line of a curb. We don’t expect him to be able to make out figures on TV. But we hope he’ll be more visually connected,” said Dr. Paul Hahn, an assistant professor of ophthalmology at the university in Durham.