Links for Keyword: Robotics

Links 141 - 160 of 268

By KATHERINE BOUTON Imagine, Michael Chorost proposes, that four police officers on a drug raid are connected mentally in a way that allows them to sense what their colleagues are seeing and feeling. Tony Vittorio, the captain, is in the center room of the three-room drug den. He can sense that his partner Wilson, in the room on his left, is not feeling danger or arousal and thus has encountered no one. But suddenly Vittorio feels a distant thump on his chest. Sarsen, in the room on the right, has been hit with something, possibly a bullet fired from a gun with a silencer. Vittorio glimpses a flickering image of a metallic barrel pointed at Sarsen, who is projecting overwhelming shock and alarm. By deducing how far Sarsen might have gone into the room and where the gunman is likely to be standing, Vittorio fires shots into the wall that will, at the very least, distract the gunman and allow Sarsen to shoot back. Sarsen is saved; the gunman is dead. That scene, from his new book, “World Wide Mind,” is an example of what Mr. Chorost sees as “the coming integration of humanity, machines, and the Internet.” The prediction is conceptually feasible, he tells us, something that technology does not yet permit but that breaks no known physical laws. © 2011 The New York Times Company

Related chapters from BN: Chapter 9: Hearing, Balance, Taste, and Smell; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 6: Hearing, Balance, Taste, and Smell; Chapter 7: Vision: From Eye to Brain
Link ID: 15004 - Posted: 02.15.2011

By Emily Singer Most robotic arms now in use by amputees are of limited practicality; they have only two to three degrees of freedom, allowing the user to make a single movement at a time. And they are controlled with conscious effort, meaning the user can do little else while moving the limb. A new generation of much more sophisticated and lifelike prosthetic arms, sponsored by the Department of Defense's Defense Advanced Research Projects Agency (DARPA), may be available within the next five to 10 years. Two different prototypes that move with the dexterity of a natural limb and can theoretically be controlled just as intuitively--with electrical signals recorded directly from the brain--are now beginning human tests. Initial results of one of these studies--the first tests of a paralyzed human controlling a robotic arm with multiple degrees of freedom--will be presented at the Society for Neuroscience conference in November. The new designs have about 20 degrees of independent motion, a significant leap over existing prostheses, and they can be operated via a variety of interfaces. One device, developed by DEKA Research and Development, can be consciously controlled using a system of levers in a shoe. © 2010 MIT Technology Review

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 14639 - Posted: 11.08.2010

by Debora MacKenzie Groups in Germany and the US have been testing electronic implants aimed at restoring vision to people with retinal dystrophy. The condition is hereditary or age-related, and causes degeneration of the photoreceptors – light-sensitive cells in the retina – leading to blindness. It affects 15 million people worldwide. Eberhart Zrenner and colleagues at the University of Tübingen in Germany have developed a microchip carrying 1500 photosensitive diodes that slides into the retina where the photoreceptors would normally be. The diodes respond to light, and when connected to an outside power source through a wire into the eye, can stimulate the nearby nerves that normally pass signals to the brain, mimicking healthy photoreceptors. The team reports that their first three volunteers could all locate bright objects. One could recognise normal objects and read large words. Nerves in the eye normally adapt to visual input and stop transmitting signals after a short time. Tiny movements of the eye overcome this by constantly projecting the image back and forth between neighbouring nerve cells so that each has time to recover and resume transmitting signals. Because the implant is inside the eye, this mechanism worked normally in the trials. Another device being tested sends images from a head-mounted camera to ocular nerves, but as the image forms outside the eye the tiny movements cannot maintain it and patients must rapidly shake their head instead. © Copyright Reed Business Information Ltd.

Related chapters from BN: Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 7: Vision: From Eye to Brain
Link ID: 14627 - Posted: 11.04.2010

by David Hambling Imagine a bionic arm that plugs directly into the nervous system, so that the brain can control its motion, and the owner can feel pressure and heat through their robotic hand. This prospect has come a step closer with the development of photonic sensors that could improve connections between nerves and prosthetic limbs. Existing neural interfaces are electronic, using metal components that may be rejected by the body. Now Marc Christensen at Southern Methodist University in Dallas, Texas, and colleagues are building sensors to pick up nerve signals using light instead. They employ optical fibres and polymers that are less likely than metal to trigger an immune response, and which will not corrode. The sensors are currently in the prototype stage and too big to put in the body, but smaller versions should work in biological tissue, according to the team. The sensors are based on spherical shells of a polymer that changes shape in an electric field. The shells are coupled with an optical fibre, which sends a beam of light travelling around inside them. The way that the light travels around the inside of the sphere is called a "whispering gallery mode", named after the Whispering Gallery in St Paul's Cathedral, London, where sound travels further than usual because it reflects along a concave wall. © Copyright Reed Business Information Ltd.

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 14565 - Posted: 10.19.2010

Katharine Sanderson Artificial electronic skins that can detect the gentlest of touches have been developed by two independent US research groups. The skins could eventually be used in prosthetics, or in touch-sensitive robotic devices. Both systems detect pressure changes of less than a kilopascal, the same as everyday pressures felt by our fingers when typing or picking up a pen. This sensitivity is better than previous systems, which detected pressures of tens of kilopascals or more, or could not detect static pressures, so that once an object was resting on the skin, the device could not sense that it was still there. The devices, both reported in Nature Materials today, work in different ways [1,2]. Chemist Zhenan Bao at Stanford University, California, and her colleagues used the elastic polymer polydimethylsiloxane (PDMS) [1]. Bao took a piece of PDMS measuring six centimetres square with pyramid-shaped chunks cut out of it at regular intervals. When the PDMS is squashed, the pyramid-shaped holes that were previously filled with air become filled with PDMS, changing the device's capacitance, or its ability to hold an electric charge. [Figure: optical image of a fully fabricated e-skin device with nanowire active-matrix circuitry; each dark square represents a single pixel. The use of pressure-sensitive rubber makes this artificial skin flexible. Credit: Ali Javey and Kuniharu Takei.] To make it easier to detect the changes in capacitance, Bao stuck the PDMS capacitor onto an organic transistor, which can read out the differences as a change in current. The team used a grid of transistors to track pressure changes at different points across the material. © 2010 Nature Publishing Group
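A toy model can make the capacitance mechanism above concrete: squeezing the PDMS layer replaces air in the pyramidal voids with polymer, which raises the effective permittivity of the dielectric and therefore the pixel's capacitance. The Python sketch below is a minimal illustration under assumed geometry, material constants, and a linear compression law; it is not the device physics reported in the Nature Materials papers.

```python
# Toy model of one capacitive pressure pixel inspired by the PDMS e-skin
# described above. All numbers and the linear compression law are
# illustrative assumptions, not values from the published work.

EPS0 = 8.854e-12          # vacuum permittivity, F/m
EPS_PDMS = 2.7            # approximate relative permittivity of PDMS
PIXEL_AREA = (1e-3) ** 2  # assume a 1 mm x 1 mm pixel
GAP = 10e-6               # assume a 10 micrometre dielectric layer
VOID_FRACTION_REST = 0.4  # assumed air fraction from the pyramidal cutouts
FULL_SCALE_PRESSURE = 1e3 # assume voids close fully at about 1 kPa


def capacitance(pressure_pa: float) -> float:
    """Capacitance of one pixel as the air voids are squeezed shut."""
    squeeze = min(max(pressure_pa / FULL_SCALE_PRESSURE, 0.0), 1.0)
    void = VOID_FRACTION_REST * (1.0 - squeeze)       # air left in the layer
    eps_eff = void * 1.0 + (1.0 - void) * EPS_PDMS    # simple parallel mixing
    return EPS0 * eps_eff * PIXEL_AREA / GAP


if __name__ == "__main__":
    rest = capacitance(0.0)
    for p in (0.0, 250.0, 500.0, 1000.0):             # pressures in pascals
        c = capacitance(p)
        print(f"{p:6.0f} Pa -> {c * 1e15:8.2f} fF "
              f"({100 * (c - rest) / rest:+5.1f} % vs rest)")
```

In the device the article describes, the organic transistor under each pixel then converts this capacitance shift into a current change, and the grid of transistors supplies the spatial pressure map.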

Related chapters from BN: Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 14448 - Posted: 09.13.2010

By Alyssa Danigelis The thoughts are there, but there is no way to express them. For "locked in" patients, many with Lou Gehrig's disease, the only way to communicate tends to be through blinking in code. But now, words can be read directly from patients' minds by attaching microelectrode grids to the surface of the brain and learning which signals mean which words, a development that could ultimately help such patients talk again. "They're perfectly aware. They just can't get signals out of their brain to control their facial expressions. They're the patients we'd like to help first," said University of Utah's Bradley Greger, an assistant professor of bioengineering who, with neurosurgery professor Paul House, M.D., published the study in the October issue of the Journal of Neural Engineering. Some severely epileptic patients have the seizure-stricken parts of the brain removed. This standard procedure requires cutting the skull open and putting large, button-sized electrodes on the brain to determine just what needs removal. The electrodes are then taken off the brain. The University of Utah team worked with an epileptic patient who let them crowd much smaller micro-electrocorticography devices onto his brain prior to surgery. © 2010 Discovery Communications, LLC.

Related chapters from BN: Chapter 11: Motor Control and Plasticity; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Related chapters from MM: Chapter 5: The Sensorimotor System; Chapter 2: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Link ID: 14442 - Posted: 09.11.2010

By CLAUDIA DREIFUS About four years ago, John Donoghue’s son, Jacob, then 18, took his father aside and declared, “Dad, I now understand what you do — you’re ‘The Matrix’!” Dr. Donoghue, 61, is a professor of engineering and neuroscience at Brown University, studying how human brain signals could combine with modern electronics to help paralyzed people gain greater control over their environments. He’s designed a machine, the BrainGate, that uses thought to move objects. We spoke for two hours in his Brown University offices in Providence, R.I., and then again by telephone. An edited version of the two conversations follows: Q. WHAT EXACTLY IS BRAINGATE? A. It’s a way for people who’ve been paralyzed by strokes, spinal cord injuries or A.L.S. to connect their brains to the outside world. The system uses a tiny sensor that’s been implanted into the part of a person’s brain that generates movement commands. This sensor picks up brain signals, transmits them to a plug attached to the person’s scalp. The signals then go to a computer which is programmed to translate them into simple actions. Q. WHY MOVE THE SIGNALS OUT OF THE BODY? A. Because for many paralyzed people, there’s been a break between their brain and the rest of their nervous system. Their brains may be fully functional, but their thoughts don’t go anywhere. What BrainGate does is bypass the broken connection. Free of the body, the signal is directed to machines that will turn thoughts into action. Copyright 2010 The New York Times Company

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 14319 - Posted: 08.03.2010

by Laurie Rich, Jane Bosveld, Andrew Grant, Amy Barth The brain is a castle on a hill. Encased in bone and protected by a special layer of cells, it is shielded from infections and injuries—but also from many pharmaceuticals and even from the body’s own immune defenses. As a result, brain problems are tough to diagnose and to treat. To meet this challenge, researchers are exploring unconventional therapies, from electrodes to laser-light stimulation to mind-bending drugs. Some of these radical experiments may never pan out. But, as frequently happens in medicine, a few of today’s improbable approaches may evolve into tomorrow’s miraculous cures. 1. Man Meets Machine In a sense, cyborgs already walk among us: Nearly 200,000 deaf or near-deaf people have cochlear implants, electronic sound-processing machines that stimulate the auditory nerve and link into the brain. But even by the fanciful science fiction definition, the age of cyborgs is just around the corner. In the last decade, researchers have become increasingly skilled at detecting and interpreting brain signals. Technologies that allow people to use their thoughts to control machines—computers, speaking devices, or prosthetic limbs—are already being tested and could soon be available for widespread applications.

Related chapters from BN: Chapter 11: Motor Control and Plasticity; Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 5: The Sensorimotor System; Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 14223 - Posted: 07.03.2010

By SANDRA BLAKESLEE On Thursday, the 12-pound, 32-inch monkey made a 200-pound, 5-foot humanoid robot walk on a treadmill using only her brain activity. She was in North Carolina, and the robot was in Japan. It was the first time that brain signals had been used to make a robot walk, said Dr. Miguel A. L. Nicolelis, a neuroscientist at Duke University whose laboratory designed and carried out the experiment. In 2003, Dr. Nicolelis’s team proved that monkeys could use their thoughts alone to control a robotic arm for reaching and grasping. These experiments, Dr. Nicolelis said, are the first steps toward a brain machine interface that might permit paralyzed people to walk by directing devices with their thoughts. Electrodes in the person’s brain would send signals to a device worn on the hip, like a cell phone or pager, that would relay those signals to a pair of braces, a kind of external skeleton, worn on the legs. “When that person thinks about walking,” he said, “walking happens.” Richard A. Andersen, an expert on such systems at the California Institute of Technology in Pasadena who was not involved in the experiment, said that it was “an important advance to achieve locomotion with a brain machine interface.” Copyright 2008 The New York Times Company

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 11210 - Posted: 06.24.2010

Atlanta — Working from their university labs in two different corners of the world, U.S. and Australian researchers have created what they call a new class of creative beings, “the semi-living artist” – a picture-drawing robot in Perth, Australia, whose movements are controlled by the brain signals of cultured rat cells in Atlanta. Gripping three colored markers positioned above a white canvas, the robotic drawing arm operates based on the neural activity of a few thousand rat neurons placed in a special petri dish that keeps the cells alive. The dish, a Multi-Electrode Array (MEA), is instrumented with 60 two-way electrodes for communication between the neurons and external electronics. The neural signals are recorded and sent to a computer that translates neural activity into robotic movement. ©2003 Georgia Institute of Technology
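The pipeline described above, multi-electrode recordings translated by a computer into robot movement, can be illustrated with a small sketch that bins spike counts from the 60 channels and projects them onto pen displacements. The binning, the random projection weights, and the gain are arbitrary assumptions made for the sketch; the actual Georgia Tech/Perth software is not described in the excerpt.

```python
# Illustrative mapping from multi-electrode-array activity to pen motion,
# in the spirit of the "semi-living artist" described above. All constants
# and the linear projection are assumptions for this sketch.

import random

N_ELECTRODES = 60          # the MEA in the article has 60 electrodes
random.seed(0)

# Fixed, arbitrary projection from 60 channels to (dx, dy) in millimetres.
WEIGHTS_X = [random.uniform(-1, 1) for _ in range(N_ELECTRODES)]
WEIGHTS_Y = [random.uniform(-1, 1) for _ in range(N_ELECTRODES)]
GAIN_MM_PER_SPIKE = 0.05   # assumed scaling


def pen_step(spike_counts: list[int]) -> tuple[float, float]:
    """Map one time bin of spike counts to a small pen displacement."""
    dx = GAIN_MM_PER_SPIKE * sum(w * c for w, c in zip(WEIGHTS_X, spike_counts))
    dy = GAIN_MM_PER_SPIKE * sum(w * c for w, c in zip(WEIGHTS_Y, spike_counts))
    return dx, dy


if __name__ == "__main__":
    # Fake one time bin of activity: a few spikes on each channel.
    bin_counts = [random.randint(0, 5) for _ in range(N_ELECTRODES)]
    print("pen moves by (mm):", pen_step(bin_counts))
```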

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 4011 - Posted: 06.24.2010

By Jennifer Viegas, Discovery News — Genetically engineered fruit flies have been made to jump, beat their wings and fly on human command, according to a new study published in the journal Cell. The flies are the first creatures that humans have remotely controlled by genetic means rather than with implanted electrodes. Someday, a related nerve stimulation process may restore nerve circuits in people with neurological diseases or injuries, such as the spinal cord trauma of the late actor and activist Christopher Reeve. Manipulation of behavior in insects and animals, even humans, has been possible for the past 50 years or so. Most of the studies, however, involved invasive electrical stimulation of specific parts of the brain. Surgeon Wilder Penfield, for example, electrically stimulated the cortexes of neurosurgery patients, who later said that the electricity affected their thinking and memory. Monkeys undergoing brain stimulation also have been tricked into thinking that something was vibrating their hands. "Attempts to manipulate behavior in an active and predictive way have been a focus of the laboratory for several years," explained Gero Miesenböck, who co-authored the Cell paper with Susana Lima, and is an associate professor of cell biology at the Yale University School of Medicine. Copyright © 2005 Discovery Communications Inc.

Related chapters from BN: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 2: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 5: The Sensorimotor System
Link ID: 7211 - Posted: 06.24.2010

By Larry Greenemeier Having proved in 2004 that plugging a sensor into the human brain's motor cortex could turn the thoughts of paralysis victims into action, a team of Brown University scientists now has the green light from the U.S. Food and Drug Administration (FDA) and the Massachusetts General Hospital (MGH) institutional review board to expand its efforts developing technology that reconnects the brain to lifeless limbs. Brown's BrainGate Neural Interface System—conceived in 2000 with the help of a $4.25-million U.S. Defense Department grant—includes a baby aspirin–size brain sensor containing 100 electrodes, each thinner than a human hair, that connects to the surface of the motor cortex (the part of the brain that enables voluntary movement), registers electrical signals from nearby neurons, and transmits them through gold wires to a set of computers, processors and monitors. (ScientificAmerican.com in 2006 wrote about one patient's experience using BrainGate during its first phase of trials.) The researchers designed BrainGate to assist those suffering from spinal cord injuries, muscular dystrophy, brain stem stroke, amyotrophic lateral sclerosis (ALS, or Lou Gehrig's Disease), and other motor neuron diseases. During the initial testing five years ago, patients suffering from paralysis demonstrated their ability to use brain signals sent from their motor cortex to control external devices such as computer screen cursors and robotic arms just by thinking about them. "The signals may have been disconnected from the (participant's) limb, but they were still there," says Leigh Hochberg, a Brown associate professor of engineering and a vascular and critical care neurologist at MGH who is helping lead the research. © 1996-2009 Scientific American Inc

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 12945 - Posted: 06.24.2010

Ewen Callaway Look Mum, no hands! Two monkeys have managed to use brain power to control a robotic arm to feed themselves. The feat marks the first time a brain-controlled prosthetic limb has been wielded to perform a practical task. Previous demonstrations in monkeys and humans have tapped into the brain to control computer cursors and virtual worlds, and even to clench a robot hand. But complicated physical activities like eating are "a completely different ball game", says Andrew Schwartz, a neurological engineer at the University of Pittsburgh, who led the new research. Tests with humans are being prepared in numerous labs, but experts caution that brain-controlled robotic limbs are far from freeing paraplegics from their wheelchairs or giving amputees their limbs back. Wired for action: Most people who become paralysed or lose limbs retain the mental dexterity to perform physical actions. And by tapping into a region of the brain responsible for movement – the motor cortex – researchers can decode a person's intentions and translate them into action with a prosthetic. This had been done mostly with monkeys and in virtual worlds or with simple movements, such as reaching out a hand. But two years ago, an American team hacked into the brain of a patient with no control over his arms to direct a computer cursor and a simple robotic arm. © Copyright Reed Business Information Ltd

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 11673 - Posted: 06.24.2010

Celeste Biever A virtual child controlled by artificially intelligent software has passed a cognitive test regarded as a major milestone in human development. It could lead to smarter computer games able to predict human players' state of mind. Children typically master the "false belief test" at age 4 or 5. It tests their ability to realise that the beliefs of others can differ from their own, and from reality. The creators of the new character – which they called Eddie – say passing the test shows it can reason about the beliefs of others, using a rudimentary "theory of mind". "Today's [video game] characters have no genuine autonomy or mental picture of who you are," researcher Selmer Bringsjord of Rensselaer Polytechnic Institute in Troy, New York, told New Scientist. He aims to change that with future games and virtual worlds populated by genuinely intelligent computer characters able to predict and understand players' actions and motives. Bringsjord's colleague Andrew Shilliday adds that their work will have applications outside of gaming. For example, a search engine able to reason about the beliefs of a user might better understand that user's search queries. © Copyright Reed Business Information Ltd.
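The false-belief test mentioned above has a compact logical core: answer from the other agent's (possibly outdated) belief rather than from the true state of the world. The sketch below encodes a Sally-Anne-style scenario to show what passing requires; it is only an illustration of the test's logic, not the reasoning engine behind Eddie.

```python
# Minimal Sally-Anne-style false-belief scenario, illustrating what the
# test described above demands of a character: track what each agent has
# seen, and answer from the agent's belief, not from reality.

def run_scenario() -> str:
    world = {"marble": "basket"}          # true state of the world
    beliefs = {"sally": {"marble": "basket"}, "anne": {"marble": "basket"}}
    present = {"sally", "anne"}

    # Sally leaves the room, then Anne moves the marble.
    present.discard("sally")
    world["marble"] = "box"
    for agent in present:                 # only witnesses update their belief
        beliefs[agent]["marble"] = "box"

    # Where will Sally look for the marble when she returns?
    return beliefs["sally"]["marble"]


if __name__ == "__main__":
    answer = run_scenario()
    print("Sally will look in the:", answer)            # -> basket
    assert answer == "basket", "a correct answer uses Sally's belief"
```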

Related chapters from BN: Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 4: Development of the Brain
Link ID: 11423 - Posted: 06.24.2010

By Malcolm Ritter ALBANY, N.Y. – To somebody peeking into this little room, I'm just a middle-aged guy wearing a polka-dotted blue shower cap with a bundle of wires sticking out the top, relaxing in a recliner while staring at a computer screen. But in my mind's eye, I'm a teenager sitting bolt upright on the black piano bench of my boyhood home, expertly pounding out the stirring opening chords of Chopin's Military Polonaise. Not that I've ever actually played that well. But there's a little red box motoring across that computer screen, and I'm hoping my fantasy will change my brain waves just enough to make it rise and hit a target. Some people have learned to hit such targets better than 90 percent of the time. During this, my first of 12 training sessions, I succeed 58 percent of the time. But my targets are so big that I could have reached 50 percent by random chance alone. Bottom line: Over the past half-hour, I've displayed just a bit more mental prowess than you'd expect from a bowl of Froot Loops. © Copyright 2005 Union-Tribune Publishing Co.

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 7291 - Posted: 06.24.2010

Some 200,000 people live with partial or nearly total permanent paralysis in the United States, with spinal cord injuries adding 11,000 new cases each year. Most research aimed at recovering motor function has focused on repairing damaged nerve fibers, which has succeeded in restoring limited movement in animal experiments. But regenerating nerves and restoring complex motor behavior in humans are far more difficult, prompting researchers to explore alternatives to spinal cord rehabilitation. One promising approach involves circumventing neuronal damage by establishing connections between healthy areas of the brain and virtual devices, called brain–machine interfaces (BMIs), programmed to transform neural impulses into signals that can control a robotic device. While experiments have shown that animals using these artificial actuators can learn to adjust their brain activity to move robot arms, many issues remain unresolved, including what type of brain signal would provide the most appropriate inputs to program these machines. Link: http://www.plos.org/downloads/plbi-01-02-carmena.pdf
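The transformation the synopsis describes, neural impulses into signals that can control a robotic device, is typically learned as a mapping from recorded firing rates to movement variables. The sketch below fits a simple linear decoder by least squares on simulated, cosine-tuned neurons; the tuning model, the dimensions, and the training scheme are assumptions made for illustration, not the methods of the linked PLoS Biology paper.

```python
# Minimal linear brain-machine-interface decoder in the spirit of the
# approach described above: learn a linear map from neural firing rates
# to 2-D hand velocity. The cosine-tuning simulation and all sizes are
# assumptions made for this sketch.

import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_samples = 64, 2000

# Simulate training data: each neuron is cosine-tuned to movement direction.
velocity = rng.normal(size=(n_samples, 2))                  # true (vx, vy)
preferred = rng.normal(size=(2, n_neurons))
preferred /= np.linalg.norm(preferred, axis=0)              # unit preferred directions
rates = 10 + 5 * velocity @ preferred + rng.normal(scale=2, size=(n_samples, n_neurons))

# Fit the decoder: velocity is approximated by [rates, 1] @ W (least squares).
X = np.hstack([rates, np.ones((n_samples, 1))])
W, *_ = np.linalg.lstsq(X, velocity, rcond=None)

# Decode a held-out sample and compare with the true velocity.
v_true = rng.normal(size=(1, 2))
r_new = 10 + 5 * v_true @ preferred + rng.normal(scale=2, size=(1, n_neurons))
v_hat = np.hstack([r_new, np.ones((1, 1))]) @ W
print("true velocity:", np.round(v_true, 2), " decoded:", np.round(v_hat, 2))
```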

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 4375 - Posted: 06.24.2010

Promise and peril in a marriage of brains and silicon By Nell Boyce Except for those odd little backpacks, the rats seem no creepier than usual. They climb trees, run through pipes, and scamper across tables. But they aren't following the usual rodent urges. These rats are moving under remote control, reacting to commands radioed to three thin electrodes in their brains. The signals tell them which way to turn–and encourage them by delivering electrical jolts to their pleasure centers. It is a tour de force with unsettling implications, and not just for rats. "It was kind of amazing to see," says researcher Sanjiv Talwar of the State University of New York Downstate Medical Center, Brooklyn. "We didn't imagine that it would be that accurate." The success, reported last week in Nature, conjures up visions of roborat search-and-rescue squads. It may also advance a long-sought goal in humans: linking the brains of people paralyzed by disease or injury to robots that could act for them. To be really useful, such devices would have to give sensory feedback to the brains of their users. That's what Talwar and his colleagues achieved with the rats, steering them left or right with impulses that made them feel as if someone were touching their whiskers. The feat is just the latest in a series of demonstrations suggesting that brains could meld with machines faster than you might think. Monkeys have moved robot arms with signals from their brains. Neural implants have also given a few severely disabled patients control over a computer cursor and delivered "sound" right to the brains of some deaf people. Yet it isn't just the paranoid who worry that such technologies could be used for brain enhancement rather than therapy, or that the mating of mind and machine could turn people into something akin to roborats. © 2002 U.S.News & World Report Inc. All rights reserved.
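The guidance scheme in the rat study boils down to a cue-and-reward loop: a pulse to a whisker-sensation electrode cues a turn, and a pulse to the pleasure-center electrode rewards compliance. The sketch below writes that loop out explicitly; the channel names, timing, and compliance check are assumptions made for illustration, not the published stimulation protocol.

```python
# Illustrative control loop for the remote-guided rats described above:
# a whisker-cue pulse suggests a turn, and a reward pulse reinforces the
# turn once it happens. Function names, timing, and the compliance check
# are assumptions for this sketch.

import time


def stimulate(channel: str) -> None:
    """Stand-in for the radio link driving one implanted electrode."""
    print(f"pulse -> {channel}")


def heading_change() -> float:
    """Stand-in for telemetry; positive = turned right (degrees)."""
    return 25.0  # pretend the rat complied


def steer(direction: str, threshold_deg: float = 15.0) -> None:
    cue = "right_whisker_cue" if direction == "right" else "left_whisker_cue"
    stimulate(cue)                        # tell the rat which way to turn
    time.sleep(0.5)                       # give it a moment to respond
    turned = heading_change()
    complied = turned > threshold_deg if direction == "right" else turned < -threshold_deg
    if complied:
        stimulate("mfb_reward")           # reward the turn via the pleasure center


if __name__ == "__main__":
    steer("right")
```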

Related chapters from BN: Chapter 11: Motor Control and Plasticity; Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Related chapters from MM: Chapter 5: The Sensorimotor System; Chapter 3: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Link ID: 2012 - Posted: 06.24.2010

NEW BRUNSWICK/PISCATAWAY, N.J. – Bionic limb replacements that look and work exactly like the real thing will likely remain a Hollywood fantasy, but fast advances in human-to-machine communication and miniaturization could bring the technology close within a decade. That is the outlook of Rutgers biomedical engineer and inventor William Craelius, whose Dextra artificial hand is the first to let a person use existing nerve pathways to control individual computer-driven mechanical fingers. Craelius published an overview of bionics entitled "The Bionic Man: Restoring Mobility" in the international journal Science on Feb. 8.

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 1502 - Posted: 06.24.2010

THE possibility of operating a machine using thought control has long fascinated researchers. It would be the ultimate video-game controller, for one thing. On a more practical level, it would help disabled and paralysed people use computers, artificial limbs, motorised wheelchairs or robots. New developments in brain-to-machine interfaces show that such possibilities are getting closer. For many years it has been possible for people to manipulate relatively simple devices—such as a computer’s on-screen cursor—by thinking about moving them. One way is by implanting electrodes into the brain to measure the electrical activity associated with certain movements. Another uses electroencephalography (EEG), which detects the same activity using electrodes placed on the scalp. In both cases, a computer learns to associate particular brain signals with intended actions. The trouble is that non-invasive methods, which obviously have far broader appeal, are less precise than using implanted electrodes, which produce a clearer signal. Recent advances in sensors and signal processing, however, have helped close the gap, making the EEG-based approach more accurate and easier to learn how to use. In one of the latest studies, José Contreras-Vidal and his colleagues at the University of Maryland were able to obtain enough EEG data from volunteer button pushers to reconstruct the associated hand motions in three dimensions. For their study, reported in the Journal of Neuroscience, the researchers put something that looks like a swimming cap containing 34 EEG sensors on the heads of five people. The volunteers were asked to press eight buttons randomly as their brain’s electrical signals were recorded, along with their hand movements. When the volunteers were then asked to think about pressing one of the eight buttons, the resulting EEG data could be compared with the data produced during actual button-pushing, and the computer could determine which button they had in mind. © The Economist Newspaper Limited 2010.
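The comparison step described above, matching EEG recorded while a volunteer merely thinks about a button press against EEG recorded during real presses, is at heart a template-matching classification. The sketch below runs a nearest-centroid version on simulated 34-channel feature vectors; the simulated features and the classifier choice are assumptions for illustration, not the analysis reported in the Journal of Neuroscience.

```python
# Illustrative nearest-centroid matching of "imagined press" EEG features
# against templates built from real button presses, in the spirit of the
# study described above. The simulated features, the 34-channel / 8-button
# setup, and the classifier are assumptions for this sketch only.

import numpy as np

rng = np.random.default_rng(1)
n_channels, n_buttons, trials_per_button = 34, 8, 40

# Simulate training data: each button has its own mean EEG feature pattern.
button_patterns = rng.normal(size=(n_buttons, n_channels))
train_X = np.vstack([
    button_patterns[b] + rng.normal(scale=1.0, size=(trials_per_button, n_channels))
    for b in range(n_buttons)
])
train_y = np.repeat(np.arange(n_buttons), trials_per_button)

# Build one template (centroid) per button from the real-press recordings.
templates = np.stack([train_X[train_y == b].mean(axis=0) for b in range(n_buttons)])

# Classify an "imagined press" trial as the closest template.
true_button = 5
imagined = button_patterns[true_button] + rng.normal(scale=1.0, size=n_channels)
guess = int(np.argmin(np.linalg.norm(templates - imagined, axis=1)))
print(f"intended button: {true_button}, decoded button: {guess}")
```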

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 13856 - Posted: 06.24.2010

By Katie Moisse Our bodies are wired to move, and damaged wiring is often impossible to repair. Strokes and spinal cord injuries can quickly disconnect the parts of the brain that initiate movement from the nerves and muscles that execute it, and neurodegenerative disorders such as Parkinson's disease and amyotrophic lateral sclerosis (ALS) draw the process out to the same effect. Scientists have been looking for a way to bypass damaged nerves by directly connecting the brain to an assistive device—like a robotic limb—through brain-computer interface (BCI) technology. Now, researchers have demonstrated the ability to nonintrusively record neural signals outside the skull and decode them into information that could be used to move a prosthetic. Past efforts at a BCI to animate an artificial limb involved electrodes inserted directly into the brain. The surgery required to implant the probes and the possibility that implants might not stay in place made this approach risky. The alternative—recording neural signals from outside the brain—has its own set of challenges. "It has been thought for quite some time that it wasn't possible to extract information about human movement using electroencephalography," or EEG, says neuroscientist and electrical engineer Jose Contreras-Vidal. In trying to record the brain's electrical activity off the scalp, he adds, "people assumed that the signal-to-noise ratio and the information content of these signals were limited." Evidently, that is not the case. In the March issue of The Journal of Neuroscience, Contreras-Vidal and his team from the bioengineering and kinesiology departments at the University of Maryland, College Park, show that the noisy brain waves recorded using noninvasive EEG can be mathematically decoded into meaningful information about complex human movements. © 2010 Scientific American
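Reconstructing continuous hand kinematics from scalp EEG, as in the study above, is commonly posed as a linear regression from a short window of recent EEG samples onto the current velocity. The sketch below fits such a lagged linear decoder on simulated signals and checks the decoded-versus-true correlation; the channel count, lag window, and the simulation itself are assumptions, not the paper's exact algorithm.

```python
# Illustrative lagged linear decoder: reconstruct hand velocity from a
# short window of past EEG samples, in the spirit of the noninvasive
# decoding described above. Channel count, lag window, and the simulated
# signals are assumptions made for this sketch.

import numpy as np

rng = np.random.default_rng(2)
n_channels, n_lags, T = 34, 10, 5000

# Simulate data: scalp EEG is a noisy mixture of the hand velocity signal.
velocity = np.cumsum(rng.normal(size=(T, 2)), axis=0) * 0.01      # slowly varying vx, vy
mixing = rng.normal(size=(2, n_channels))
eeg = velocity @ mixing + rng.normal(scale=0.5, size=(T, n_channels))

# Design matrix: concatenate the last n_lags EEG samples at every time step.
X = np.array([eeg[t - n_lags:t].ravel() for t in range(n_lags, T)])
y = velocity[n_lags:]

# Train on the first half, test on the second half.
half = X.shape[0] // 2
W, *_ = np.linalg.lstsq(X[:half], y[:half], rcond=None)
pred = X[half:] @ W
r = [np.corrcoef(pred[:, k], y[half:, k])[0, 1] for k in range(2)]
print("correlation between decoded and true velocity (vx, vy):",
      [round(v, 2) for v in r])
```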

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 13831 - Posted: 06.24.2010