Links for Keyword: Robotics
By Larry Greenemeier Having proved in 2004 that plugging a sensor into the human brain's motor cortex could turn the thoughts of paralysis victims into action, a team of Brown University scientists now has the green light from the U.S. Food and Drug Administration (FDA) and the Massachusetts General Hospital (MGH) institutional review board to expand its efforts to develop technology that reconnects the brain to lifeless limbs. Brown's BrainGate Neural Interface System—conceived in 2000 with the help of a $4.25-million U.S. Defense Department grant—includes a baby aspirin–size brain sensor containing 100 electrodes, each thinner than a human hair, that connects to the surface of the motor cortex (the part of the brain that enables voluntary movement), registers electrical signals from nearby neurons, and transmits them through gold wires to a set of computers, processors and monitors. (ScientificAmerican.com in 2006 wrote about one patient's experience using BrainGate during its first phase of trials.) The researchers designed BrainGate to assist those suffering from spinal cord injuries, muscular dystrophy, brain stem stroke, amyotrophic lateral sclerosis (ALS, or Lou Gehrig's disease), and other motor neuron diseases. During the initial testing five years ago, patients suffering from paralysis demonstrated their ability to use brain signals sent from their motor cortex to control external devices such as computer screen cursors and robotic arms just by thinking about them. "The signals may have been disconnected from the (participant's) limb, but they were still there," says Leigh Hochberg, a Brown associate professor of engineering and a vascular and critical care neurologist at MGH who is helping lead the research. © 1996-2009 Scientific American Inc
Ewen Callaway Look Mum, no hands! Two monkeys have managed to use brain power to control a robotic arm to feed themselves. The feat marks the first time a brain-controlled prosthetic limb has been wielded to perform a practical task. Previous demonstrations in monkeys and humans have tapped into the brain to control computer cursors and virtual worlds, and even to clench a robot hand. But complicated physical activities like eating are "a completely different ball game", says Andrew Schwartz, a neurological engineer at the University of Pittsburgh, who led the new research. Tests with humans are being prepared in numerous labs, but experts caution that brain-controlled robotic limbs are far from freeing paraplegics from their wheelchairs or giving amputees their limbs back. Wired for action: Most people who become paralysed or lose limbs retain the mental dexterity to perform physical actions. And by tapping into a region of the brain responsible for movement – the motor cortex – researchers can decode a person's intentions and translate them into action with a prosthetic. This had been done mostly with monkeys and in virtual worlds or with simple movements, such as reaching out a hand. But two years ago, an American team hacked into the brain of a patient with no control over his arms to direct a computer cursor and a simple robotic arm. © Copyright Reed Business Information Ltd
Celeste Biever A virtual child controlled by artificially intelligent software has passed a cognitive test regarded as a major milestone in human development. It could lead to smarter computer games able to predict human players' state of mind. Children typically master the "false belief test" at age 4 or 5. It tests their ability to realise that the beliefs of others can differ from their own, and from reality. The creators of the new character – which they called Eddie – say passing the test shows it can reason about the beliefs of others, using a rudimentary "theory of mind". "Today's [video game] characters have no genuine autonomy or mental picture of who you are," researcher Selmer Bringsjord of Rensselaer Polytechnic Institute in Troy, New York, told New Scientist. He aims to change that with future games and virtual worlds populated by genuinely intelligent computer characters able to predict and understand players' actions and motives. Bringsjord's colleague Andrew Shilliday adds that their work will have applications outside of gaming. For example, search engines able to reason about the beliefs of a user might allow them to better understand their search queries. © Copyright Reed Business Information Ltd.
Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 11423 - Posted: 06.24.2010
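The reasoning behind the test Eddie passed can be captured in a few lines. Below is a minimal Python sketch of the classic "Sally-Anne" version of the false belief test; it is entirely hypothetical and unrelated to Bringsjord's actual system, but it shows the core requirement: predicting behaviour from an agent's (possibly stale) belief rather than from reality.

```python
# Sketch of the "Sally-Anne" false belief scenario: a reasoner passes if it
# predicts where an agent will look based on the agent's belief, not on the
# true state of the world.

def run_false_belief_test():
    # World state: the marble starts in the basket.
    world = {"marble": "basket"}
    # Sally observes the initial placement, so her belief matches the world.
    sally_belief = {"marble": world["marble"]}

    # Sally leaves; Anne moves the marble. The world changes, but Sally's
    # belief is NOT updated, because she did not observe the move.
    world["marble"] = "box"

    # Where will Sally look? An agent acts on its belief, not on reality.
    predicted_search = sally_belief["marble"]
    return predicted_search, world["marble"]

search, actual = run_false_belief_test()
# A reasoner with a theory of mind predicts the stale belief ("basket")
# even though the marble is really in the "box".
print(search, actual)
```

A system without belief modelling collapses `sally_belief` into `world` and fails, answering "box"; separating the two maps is the rudimentary "theory of mind" the article describes.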
By Malcolm Ritter ALBANY, N.Y. – To somebody peeking into this little room, I'm just a middle-aged guy wearing a polka-dotted blue shower cap with a bundle of wires sticking out the top, relaxing in a recliner while staring at a computer screen. But in my mind's eye, I'm a teenager sitting bolt upright on the black piano bench of my boyhood home, expertly pounding out the stirring opening chords of Chopin's Military Polonaise. Not that I've ever actually played that well. But there's a little red box motoring across that computer screen, and I'm hoping my fantasy will change my brain waves just enough to make it rise and hit a target. Some people have learned to hit such targets better than 90 percent of the time. During this, my first of 12 training sessions, I succeed 58 percent of the time. But my targets are so big that I could have reached 50 percent by random chance alone. Bottom line: Over the past half-hour, I've displayed just a bit more mental prowess than you'd expect from a bowl of Froot Loops. © Copyright 2005 Union-Tribune Publishing Co.
Some 200,000 people live with partial or nearly total permanent paralysis in the United States, with spinal cord injuries adding 11,000 new cases each year. Most research aimed at recovering motor function has focused on repairing damaged nerve fibers, which has succeeded in restoring limited movement in animal experiments. But regenerating nerves and restoring complex motor behavior in humans are far more difficult, prompting researchers to explore alternatives to spinal cord rehabilitation. One promising approach involves circumventing neuronal damage by establishing connections between healthy areas of the brain and virtual devices, called brain–machine interfaces (BMIs), programmed to transform neural impulses into signals that can control a robotic device. While experiments have shown that animals using these artificial actuators can learn to adjust their brain activity to move robot arms, many issues remain unresolved, including what type of brain signal would provide the most appropriate inputs to program these machines. Link: http://www.plos.org/downloads/plbi-01-02-carmena.pdf
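The linked paper leaves open which brain signal best drives a BMI, but one classic answer to how neural impulses become a control signal is population-vector decoding: each motor-cortex neuron fires most for a "preferred direction", and the intended movement is estimated as the firing-rate-weighted sum of those preferred directions. The sketch below uses made-up tuning and firing rates purely for illustration; it is not the algorithm from the cited study.

```python
import math

# Toy population-vector decoder: combine each neuron's preferred direction,
# weighted by its firing rate, into one decoded movement angle.

def decode_direction(rates, preferred_angles):
    """Return the decoded movement angle (radians) from neuron firing rates."""
    x = sum(r * math.cos(a) for r, a in zip(rates, preferred_angles))
    y = sum(r * math.sin(a) for r, a in zip(rates, preferred_angles))
    return math.atan2(y, x)

# Hypothetical population: four neurons tuned to 0, 90, 180 and 270 degrees.
prefs = [0.0, math.pi / 2, math.pi, 3 * math.pi / 2]

# When the neuron tuned to 90 degrees fires hardest, the decoder should
# recover an intended "up" movement.
rates = [5.0, 40.0, 5.0, 5.0]
angle = decode_direction(rates, prefs)
print(round(math.degrees(angle), 1))  # ~90.0
```

A real BMI fits the tuning from recorded data and updates the decoded vector many times per second to drive the robot arm continuously.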
Promise and peril in a marriage of brains and silicon By Nell Boyce Except for those odd little backpacks, the rats seem no creepier than usual. They climb trees, run through pipes, and scamper across tables. But they aren't following the usual rodent urges. These rats are moving under remote control, reacting to commands radioed to three thin electrodes in their brains. The signals tell them which way to turn–and encourage them by delivering electrical jolts to their pleasure centers. It is a tour de force with unsettling implications, and not just for rats. "It was kind of amazing to see," says researcher Sanjiv Talwar of the State University of New York Downstate Medical Center, Brooklyn. "We didn't imagine that it would be that accurate." The success, reported last week in Nature, conjures up visions of roborat search-and-rescue squads. It may also advance a long-sought goal in humans: linking the brains of people paralyzed by disease or injury to robots that could act for them. To be really useful, such devices would have to give sensory feedback to the brains of their users. That's what Talwar and his colleagues achieved with the rats, steering them left or right with impulses that made them feel as if someone were touching their whiskers. The feat is just the latest in a series of demonstrations suggesting that brains could meld with machines faster than you might think. Monkeys have moved robot arms with signals from their brains. Neural implants have also given a few severely disabled patients control over a computer cursor and delivered "sound" right to the brains of some deaf people. Yet it isn't just the paranoid who worry that such technologies could be used for brain enhancement rather than therapy, or that the mating of mind and machine could turn people into something akin to roborats. © 2002 U.S.News & World Report Inc. All rights reserved.
Related chapters from BP7e: Chapter 11: Motor Control and Plasticity; Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Related chapters from MM:Chapter 5: The Sensorimotor System; Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Link ID: 2012 - Posted: 06.24.2010
NEW BRUNSWICK/PISCATAWAY, N.J. – Bionic limb replacements that look and work exactly like the real thing will likely remain a Hollywood fantasy, but fast advances in human-to-machine communication and miniaturization could bring the technology close within a decade. That is the outlook of Rutgers biomedical engineer and inventor William Craelius, whose Dextra artificial hand is the first to let a person use existing nerve pathways to control individual computer-driven mechanical fingers. Craelius published an overview of bionics entitled "The Bionic Man - Restoring Mobility," in the international journal "Science," on Feb. 8.
THE possibility of operating a machine using thought control has long fascinated researchers. It would be the ultimate video-game controller, for one thing. On a more practical level, it would help disabled and paralysed people use computers, artificial limbs, motorised wheelchairs or robots. New developments in brain-to-machine interfaces show that such possibilities are getting closer. For many years it has been possible for people to manipulate relatively simple devices—such as a computer’s on-screen cursor—by thinking about moving them. One way is by implanting electrodes into the brain to measure the electrical activity associated with certain movements. Another uses electroencephalography (EEG), which detects the same activity using electrodes placed on the scalp. In both cases, a computer learns to associate particular brain signals with intended actions. The trouble is that non-invasive methods, which obviously have far broader appeal, are less precise than using implanted electrodes, which produce a clearer signal. Recent advances in sensors and signal processing, however, have helped close the gap, making the EEG-based approach more accurate and easier to learn how to use. In one of the latest studies, José Contreras-Vidal and his colleagues at the University of Maryland were able to obtain enough EEG data from volunteer button pushers to reconstruct the associated hand motions in three dimensions. For their study, reported in the Journal of Neuroscience, the researchers put something that looks like a swimming cap containing 34 EEG sensors on the heads of five people. The volunteers were asked to press eight buttons randomly as their brain’s electrical signals were recorded, along with their hand movements. When the volunteers were then asked to think about pressing one of the eight buttons, the resulting EEG data could be compared with the data produced during actual button-pushing, and the computer could determine which button they had in mind. 
© The Economist Newspaper Limited 2010.
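One simple way to see how recorded button-press data can classify an imagined press is template matching: average the EEG epochs recorded during real presses of each button into a per-button template, then assign a new epoch to the nearest template. The sketch below uses invented numbers and is only an illustration of the idea, not the Maryland group's actual method.

```python
# Toy template-matching classifier: pick the button whose averaged training
# epoch is closest (least squared error) to a newly recorded epoch.

def nearest_template(epoch, templates):
    """Return the label of the template closest to the given epoch."""
    def sq_err(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda button: sq_err(epoch, templates[button]))

# Hypothetical averaged EEG traces (microvolts over time) for two buttons,
# built during the real button-pushing phase.
templates = {
    "button_1": [0.0, 1.0, 3.0, 1.0, 0.0],
    "button_2": [0.0, -1.0, -3.0, -1.0, 0.0],
}

# A noisy new epoch, recorded while merely thinking about a press,
# that resembles button_1's signature.
epoch = [0.1, 0.9, 2.8, 1.2, -0.1]
print(nearest_template(epoch, templates))  # button_1
```

The study's three-dimensional hand-motion reconstruction goes further, regressing continuous kinematics from the EEG, but the compare-against-training-data principle is the same.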
By Katie Moisse Our bodies are wired to move, and damaged wiring is often impossible to repair. Strokes and spinal cord injuries can quickly disconnect the parts of the brain that initiate movement from the nerves and muscles that execute it, and neurodegenerative disorders such as Parkinson's disease and amyotrophic lateral sclerosis (ALS) draw the process out to the same effect. Scientists have been looking for a way to bypass damaged nerves by directly connecting the brain to an assistive device—like a robotic limb—through brain-computer interface (BCI) technology. Now, researchers have demonstrated the ability to noninvasively record neural signals outside the skull and decode them into information that could be used to move a prosthetic. Past efforts at a BCI to animate an artificial limb involved electrodes inserted directly into the brain. The surgery required to implant the probes and the possibility that implants might not stay in place made this approach risky. The alternative—recording neural signals from outside the brain—has its own set of challenges. "It has been thought for quite some time that it wasn't possible to extract information about human movement using electroencephalography," or EEG, says neuroscientist and electrical engineer Jose Contreras-Vidal. In trying to record the brain's electrical activity off the scalp, he adds, "people assumed that the signal-to-noise ratio and the information content of these signals were limited." Evidently, that is not the case. In the March issue of The Journal of Neuroscience, Contreras-Vidal and his team from the bioengineering and kinesiology departments at the University of Maryland, College Park, show that the noisy brain waves recorded using noninvasive EEG can be mathematically decoded into meaningful information about complex human movements. © 2010 Scientific American
by David Kushner On the quarter-mile walk between his office at the École Polytechnique Fédérale de Lausanne in Switzerland and the nerve center of his research across campus, Henry Markram gets a brisk reminder of the rapidly narrowing gap between human and machine. At one point he passes a museumlike display filled with the relics of old supercomputers, a memorial to their technological limitations. At the end of his trip he confronts his IBM Blue Gene/P—shiny, black, and sloped on one side like a sports car. That new supercomputer is the centerpiece of the Blue Brain Project, tasked with simulating every aspect of the workings of a living brain. Markram, the 47-year-old founder and codirector of the Brain Mind Institute at the EPFL, is the project’s leader and cheerleader. A South African neuroscientist, he received his doctorate from the Weizmann Institute of Science in Israel and studied as a Fulbright Scholar at the National Institutes of Health. For the past 15 years he and his team have been collecting data on the neocortex, the part of the brain that lets us think, speak, and remember. The plan is to use the data from these studies to create a comprehensive, three-dimensional simulation of a mammalian brain. Such a digital re-creation that matches all the behaviors and structures of a biological brain would provide an unprecedented opportunity to study the fundamental nature of cognition and of disorders such as depression and schizophrenia. Until recently there was no computer powerful enough to take all our knowledge of the brain and apply it to a model. Blue Gene has changed that. It contains four monolithic, refrigerator-size machines, each of which processes data at a peak speed of 56 teraflops (teraflops being one trillion floating-point operations per second).
Related chapters from BP7e: Chapter 1: Biological Psychology: Scope and Outlook
Related chapters from MM:Chapter 1: An Introduction to Brain and Behavior
Link ID: 13746 - Posted: 06.24.2010
People who believe that the mind can be replicated on a computer tend to explain the mind in terms of a computer. When theorizing about the mind, especially to outsiders but also to one another, defenders of artificial intelligence (AI) often rely on computational concepts. They regularly describe the mind and brain as the “software and hardware” of thinking, the mind as a “pattern” and the brain as a “substrate,” senses as “inputs” and behaviors as “outputs,” neurons as “processing units” and synapses as “circuitry,” to give just a few common examples. Those who employ this analogy tend to do so with casual presumption. They rarely justify it by reference to the actual workings of computers, and they misuse and abuse terms that have clear and established definitions in computer science—established not merely because they are well understood, but because they in fact are products of human engineering. An examination of what this usage means and whether it is correct reveals a great deal about the history and present state of artificial intelligence research. And it highlights the aspirations of some of the luminaries of AI—researchers, writers, and advocates for whom the metaphor of mind-as-machine is dogma rather than discipline. Conceptions of the Computer Before any useful discussion about artificial intelligence can proceed, it is important to first clarify some basic concepts. When the mind is compared to a computer, just what is it being compared to? How does a computer work? Ari N. Schulman, "Why Minds Are Not Like Computers," The New Atlantis, Number 23, Winter 2009, pp. 46-68.
Related chapters from BP7e: Chapter 1: Biological Psychology: Scope and Outlook
Related chapters from MM:Chapter 1: An Introduction to Brain and Behavior
Link ID: 13333 - Posted: 06.24.2010
By SANDRA BLAKESLEE Learning to move a computer cursor or robotic arm with nothing but thoughts can be no different from learning how to play tennis or ride a bicycle, according to a new study of how brains and machines interact. The research, which was carried out in monkeys but is expected to apply to humans, involves a fundamental redesign of brain-machine experiments. In previous studies, the computer interfaces that translate thoughts into movements are given a new set of instructions each day — akin to waking up each morning with a new arm that you have to figure out how to use all over again. In the new experiments, monkeys learned how to move a computer cursor with their thoughts using just one set of instructions and an unusually small number of brain cells that deliver instructions for performing movements the same way each day. “This is the first demonstration that the brain can form a motor memory to control a disembodied device in a way that mirrors how it controls its own body,” said Jose M. Carmena, an assistant professor of computer and cognitive science at the University of California, Berkeley, who led the research. The experiments were described Monday in the journal PLoS Biology. The results are very “dramatic and surprising,” said Eberhard E. Fetz, an expert in brain-machine-interface technology at the University of Washington, who was not involved in the research. “It goes to show the brain is smarter than we thought.” Copyright 2009 The New York Times Company
By Andy Greenberg, Forbes.com The Force, it seems, is not so strong with this one. In the virtual world of a game called Neuroboy, I'm staring out over a lagoon at an exact digital replica of a Star Wars X-wing spaceship submerged in blue water. My task: to lift that virtual object out of its murky depths using not my mouse or keyboard, but instead — à la Luke Skywalker — my thoughts. Back in the real world, I'm wearing a Bluetooth headset that touches my forehead with a single metal sensor. The more I relax, according to the headset's manufacturer, Neurosky, the more that small metal point will pick up my brain's alpha waves, triggering the ship to rise. I relax. The spaceship doesn't budge. I close my eyes to slits and let myself slip into a half-trance daze. The ship wiggles ever so slightly, and I respond with a bit of hopeful excitement that immediately sends it sinking back into the water. After a minute or so of frustration, I retreat to my keyboard to switch the game from "lift" to "pull" mode. Suddenly, and without a single Jedi mind trick, the X-wing leaps toward me and fills the entire screen, only coming to rest when I pull off the headset to break the sensor's connection with my forehead. Either my mental powers aren't yet ready for Neurosky's gadgetry, or vice versa. Ready or not, telekinesis gadgets like Neurosky's so-called "Mindset" are coming to market, along with those built by competitors like OCZ Technology and Emotiv Systems. Neurosky plans to announce Thursday at the Game Developers Conference in San Francisco that it's partnering with Toshiba to release the $199 US consumer headsets this summer, along with a software platform for third-party developers to create games and other applications for the device. © CBC 2009
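The mechanic the article describes, alpha power rising with relaxation, can be sketched with a crude spectral estimate: measure power in the roughly 8-12 Hz alpha band and treat strong alpha relative to higher frequencies as "relaxed". Everything below (the signal, sampling rate, and comparison) is synthetic for illustration; it is not NeuroSky's implementation.

```python
import math

# Crude band-power estimate via a direct DFT over the bins that fall
# inside [lo_hz, hi_hz].

def band_power(signal, sample_rate, lo_hz, hi_hz):
    """Estimate signal power in the given frequency band."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * sample_rate / n
        if lo_hz <= freq <= hi_hz:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            power += (re * re + im * im) / n
    return power

# One second of synthetic "relaxed" EEG: a dominant 10 Hz alpha rhythm.
rate = 128
alpha_signal = [math.sin(2 * math.pi * 10 * t / rate) for t in range(rate)]

# A headset-style rule: call the user relaxed when alpha power dominates
# a higher-frequency band.
relaxed = band_power(alpha_signal, rate, 8, 12) > band_power(alpha_signal, rate, 20, 40)
print(relaxed)  # True
```

A real device would smooth this estimate over successive windows and map it to a continuous control value (the rising spaceship) rather than a single boolean.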
By Steve Hamm When Lloyd Watts was growing up in Kingston, Ont., in the 1970s he had a knack for listening to songs by Billy Joel and Elton John and plunking out the melodies on the family piano. But he wondered, wouldn't it be great to have a machine that could "listen" to songs and immediately transcribe them into musical notation? Watts never built the gizmo, but his decades-long quest to engineer such a machine has finally resulted in one of the first commercial technologies based on the biology of the brain. Microchips designed by Audience, the Silicon Valley company Watts launched, are now being used by mobile handset makers in Asia to improve dramatically the quality of conversations in noisy places. Even a truck passing right by someone using the technology won't be heard at the other end of the phone line. The chip is modeled on functions of the inner ear and part of the cerebral cortex. "We have reverse-engineered this piece of the brain," declares Watts. The 47-year-old neuroscientist is on the leading edge of what some believe will be a fundamental shift in the way certain types of computers are designed. Today's computers are essentially really fast abacuses. They're good at math but can't process complex streams of information in real time, as humans do. Now, thanks to advances in our understanding of biology, scientists believe they can model a new generation of computers on how the brain actually works—the microscopic chemical interactions and electrical impulses that translate sensations into knowledge and knowledge into decisions and actions. It's a successor to the old ideas about artificial intelligence, and a handful of companies have initiatives under way, among them IBM (IBM) and Numenta, a Silicon Valley startup. Copyright 2000-2008 by The McGraw-Hill Companies Inc.
Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 12280 - Posted: 06.24.2010
By Larry Greenemeier As Kevin Warwick gently squeezed his hand into a fist one day in 2002, a robotic hand came to life 3,400 miles away and mimicked the gesture. The University of Reading cybernetics professor had successfully wired the nerves of his forearm to a computer in New York City's Columbia University and networked them to a robotic system back in his Reading, England, lab. "My body was effectively extended over the Internet," Warwick says. It's a far cry from his vision of transforming humanity into a race of half-machine cyborgs able to commune with the digital world—there is no spoon, Neo—but such an evolution is necessary, says 54-year-old Warwick. Those who don't avail themselves of subcutaneous microchips and other implanted technology, he predicts, will be at a serious disadvantage in tomorrow's world, because they won't be able to communicate with the "superintelligent machines" sure to be occupying the highest rungs of society, as he explains in a 2003 documentary, Building Gods, which is circulating online. Something of a self-promoter, Warwick, or "Captain Cyborg" as a U.K. newspaper once dubbed him, has appeared on Late Night with Conan O'Brien and other shows on the TV talk circuit to tout his work. In his 2004 book, I, Cyborg, he describes his research as "the extraordinary story of my adventure as the first human entering into a cyber world." © 1996-2008 Scientific American Inc
Related chapters from BP7e: Chapter 11: Motor Control and Plasticity; Chapter 1: Biological Psychology: Scope and Outlook
Related chapters from MM:Chapter 5: The Sensorimotor System; Chapter 1: An Introduction to Brain and Behavior
Link ID: 11394 - Posted: 06.24.2010
— Japanese researchers say they have found a way to let people stroll through the virtual world of Second Life using their own imagination, in a development that could help paralysis patients. Previous studies have shown people can move computer cursors through brain waves, but the Japanese team says it is the first to apply the technology to an Internet virtual world. The technology "would enable people suffering paralysis to communicate with others or do business through chatting and shopping in a virtual world," said Junichi Ushiba, associate professor at Keio University's rehabilitation centre. Second Life is an increasingly popular virtual world in which people — and animals — are represented by animated avatars and can do everything from social activities to shopping. Ushiba said Second Life could motivate patients with severe paralysis, who are often too depressed to undergo rehabilitation. "If they can see with their own eyes their characters moving around, it could reinvigorate their brain activity and restore some functions," he said. Under the technology, a person wearing head gear embedded with electrodes, which analyse brain waves in the cerebral motor cortex, would be able to move a Second Life character forward by thinking he or she is walking. © 2007 Discovery Communications
Rachel Konrad, Associated Press — A convincing twin of Darth Vader stalks the beige cubicles of a Silicon Valley office, complete with ominous black mask, cape and light saber. But this is no chintzy Halloween costume. It's a prototype, years in the making, of a toy that incorporates brain wave-reading technology. Behind the mask is a sensor that touches the user's forehead and reads the brain's electrical signals, then sends them to a wireless receiver inside the saber, which lights up when the user is concentrating. The player maintains focus by channeling thoughts on any fixed mental image, or thinking specifically about keeping the light sword on. When the mind wanders, the wand goes dark. Engineers at NeuroSky Inc. have big plans for brain wave-reading toys and video games. They say the simple Darth Vader game — a relatively crude biofeedback device cloaked in gimmicky garb — portends the coming of more sophisticated devices that could revolutionize the way people play. Technology from NeuroSky and other startups could make video games more mentally stimulating and realistic. It could even enable players to control video game characters or avatars in virtual worlds with nothing but their thoughts. Adding biofeedback to "Tiger Woods PGA Tour," for instance, could mean that only those players who muster Zen-like concentration could nail a putt. In the popular action game "Grand Theft Auto," players who become nervous or frightened would have worse aim than those who remain relaxed and focused. © 2007 Discovery Communications Inc.
Related chapters from BP7e: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 5: The Sensorimotor System
Link ID: 10245 - Posted: 06.24.2010
Heidi Ledford Surgeons have managed to give an amputee not only a prosthetic arm that moves as directed by her thoughts, but also the feeling of touch — albeit in the wrong part of her body. When Claudia Mitchell presses an area on her chest, where surgeons re-wired the nerves that used to run to her hand, it feels to her as if her fingers are being touched. The technique opens the door to additional technologies that could one day relay signals from the prosthesis back to the 'fingers' on the chest, allowing an amputee to get sensory information such as touch and temperature from their artificial limb. Mitchell's success story was revealed in a press conference last year, but now the details have been published: they are reported this week in The Lancet. Mitchell was only 24 years old when a motorcycle accident robbed her of her left arm. She got a prosthesis five months later, but wore it infrequently and then only for cosmetic reasons. It just wasn't useful enough to make the discomfort worthwhile, she said. ©2007 Nature Publishing Group
Related chapters from BP7e: Chapter 8: General Principles of Sensory Processing, Touch, and Pain; Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System; Chapter 5: The Sensorimotor System
Link ID: 9911 - Posted: 06.24.2010
Tracy Staedter, Discovery News — Part human and part robot, an interface made of lab-grown nerve tissue and a microchip could one day help patients control artificial limbs with their thoughts. The tissue interface could help make prosthesis motion, now awkward and cumbersome, smoother. It could also replace lost sensory feedback, such as that from touch, to reproduce natural limb responses. "We want to hook a nerve system up to a device and create bidirectional communication," said Douglas Smith, professor of neurosurgery at the University of Pennsylvania and director of its Center for Brain Injury and Repair. And since "a nerve doesn't like to be poked and prodded by things it's unfamiliar with," sticking an electronic device directly into the tissue isn't an option. In this case, a hybrid device is the way to go. Smith and his team published their results in this month's issue of Neurosurgery. At the heart of the research is a technique Smith and his team are pioneering for growing, or rather stretching, nerve tissue to a particular length. © 2007 Discovery Communications Inc.
Tracy Staedter, Discovery News — Forget the remote control — scientists are learning how to let you control a robot with signals straight from your brain. Eventually, the technique could lead to semi-autonomous robots able to assist disabled people or perform routine tasks in the home. "We're using a well-known, well-characterized response that occurs in the brain to control a physical device in the world," said research leader Rajesh Rao, an associate professor of computer science and engineering at the University of Washington in Seattle. That brain "response" is the same one triggered whenever you lose and then find your car keys, said Rao. As you scan tables, desks and bureaus for lost keys, you're focused on a mental image of them. When you finally spot your keys, your brain's reaction is known to scientists as a P300 response ("P" is for positive and 300 for the number of milliseconds it takes your neurons to produce the reaction). The P300 response is strong and distinctive, and therefore can be picked up by external sensors. And because it is such a well-characterized response, it can be used reliably again and again. Rao and his team wrote software that allows a computer to recognize the P300 response, and use it to guide a robot. © 2007 Discovery Communications Inc
Related chapters from BP7e: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Related chapters from MM:Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Link ID: 9831 - Posted: 06.24.2010
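The P300 scheme Rao's group describes can be sketched simply: average the EEG epochs that follow each candidate stimulus, then pick the candidate whose averaged trace shows the strongest positive deflection around 300 ms after the stimulus. The numbers and window below are synthetic illustrations, not the Washington team's actual parameters.

```python
# Toy P300 detector: epoch averaging plus a peak search in the
# post-stimulus window where the P300 component appears.

def average_epochs(epochs):
    """Average several same-length EEG epochs, sample by sample."""
    n = len(epochs)
    return [sum(samples) / n for samples in zip(*epochs)]

def p300_score(avg_epoch, sample_rate, window_ms=(250, 400)):
    """Peak amplitude inside the window where a P300 would appear."""
    lo = int(window_ms[0] * sample_rate / 1000)
    hi = int(window_ms[1] * sample_rate / 1000)
    return max(avg_epoch[lo:hi + 1])

rate = 100  # 100 Hz: one sample per 10 ms, so 50 samples cover 500 ms

# Hypothetical epochs following a distractor stimulus: flat, near-zero EEG.
flat = [[0.1] * 50, [-0.1] * 50]

# Hypothetical epochs following the attended target: a positive bump
# peaking around 300 ms (samples 28-32).
target = [[4.0 if 28 <= i <= 32 else 0.0 for i in range(50)] for _ in range(2)]

scores = {"distractor": p300_score(average_epochs(flat), rate),
          "target": p300_score(average_epochs(target), rate)}
chosen = max(scores, key=scores.get)
print(chosen)  # target
```

Averaging is what makes the scheme reliable "again and again": the random background EEG cancels out across repetitions while the stimulus-locked P300 survives.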