Links for Keyword: Robotics



Links 41 - 60 of 200

By David Pogue Okay, great: we can control our phones with speech recognition and our television sets with gesture recognition. But those technologies don't work in all situations for all people. So I say, forget about those crude beginnings; what we really want is thought recognition. As I found out during research for a recent NOVA episode, brain-computer interface (BCI) technology has not advanced very far just yet. For example, I tried to make a toy helicopter fly by thinking “up” as I wore a $300 commercial EEG headset. It barely worked. Such “mind-reading” caps are quick to put on and noninvasive. They listen, through your scalp, for the incredibly weak remnants of electrical signals from your brain activity. But they're lousy at figuring out where in your brain they originated. Furthermore, the headset software didn't even know that I was thinking “up.” I could just as easily have thought “goofy” or “shoelace” or “pickle”—whatever I had thought about during the 15-second training session. There are other noninvasive brain scanners—magnetoencephalography, positron-emission tomography, near-infrared spectroscopy and so on—but each has its own trade-offs. Of course, you can implant sensors inside someone's skull for the best readings of all; immobilized patients have successfully manipulated computer cursors and robotic arms using this approach. Still, when it comes to controlling everyday electronics, brain surgery might be a tough sell. © 2012 Scientific American,
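The calibration the article describes can be sketched in a few lines: record a short window of EEG while the user holds one chosen thought, reduce it to band-power features, and later fire the command whenever a new epoch resembles that stored profile. A minimal illustration follows; the sample rate, frequency band, and distance threshold are assumptions, not the headset vendor's actual pipeline.

```python
import numpy as np
from scipy.signal import welch

FS = 256  # sample rate in Hz (assumed)

def band_power(epoch, fs=FS, band=(8, 30)):
    """Mean power in a frequency band for each EEG channel."""
    freqs, psd = welch(epoch, fs=fs, axis=-1, nperseg=fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[..., mask].mean(axis=-1)

def calibrate(epochs):
    """Average band-power profile over a 15-second 'think up' session."""
    return np.mean([band_power(e) for e in epochs], axis=0)

def is_command(epoch, profile, threshold=0.5):
    """Fire the 'up' command when a new epoch is close to the profile."""
    distance = np.linalg.norm(band_power(epoch) - profile) / np.linalg.norm(profile)
    return distance < threshold
```

A matcher like this recognizes whatever mental state happened to be present during the training session, which is exactly why the software cannot know the thought was “up” rather than “pickle.”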

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Related chapters from MM:Chapter 5: The Sensorimotor System; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Link ID: 17518 - Posted: 11.21.2012

By Meghan Rosen Michael McAlpine’s shiny circuit doesn’t look like something you would stick in your mouth. It’s dashed with gold, has a coiled antenna and is glued to a stiff rectangle. But the antenna flexes, and the rectangle is actually silk, its stiffness melting away under water. And if you paste the device on your tooth, it could keep you healthy. The electronic gizmo is designed to detect dangerous bacteria and send out warning signals, alerting its bearer to microbes slipping past the lips. Recently, McAlpine, of Princeton University, and his colleagues spotted a single E. coli bacterium skittering across the surface of the gadget’s sensor. The sensor also picked out ulcer-causing H. pylori amid the molecular medley of human saliva, the team reported earlier this year in Nature Communications. At about the size of a standard postage stamp, the dental device is still too big to fit comfortably in a human mouth. “We had to use a cow tooth,” McAlpine says, describing test experiments. But his team plans to shrink the gadget so it can nestle against human enamel. McAlpine is convinced that one day, perhaps five to 10 years from now, everyone will wear some sort of electronic device. “It’s not just teeth,” he says. “People are going to be bionic.” McAlpine belongs to a growing pack of tech-savvy scientists figuring out how to merge the rigid, brittle materials of conventional electronics with the soft, curving surfaces of human tissues. Their goal: To create products that have the high performance of silicon wafers — the crystalline material used in computer chips — while still moving with the body. © Society for Science & the Public 2000 - 2012

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM:Chapter 5: The Sensorimotor System; Chapter 15: Language and Our Divided Brain
Link ID: 17455 - Posted: 11.05.2012

By Miguel A. L. Nicolelis In 2014 billions of viewers worldwide may remember the opening game of the World Cup in Brazil for more than just the goals scored by the Brazilian national team and the red cards given to its adversary. On that day my laboratory at Duke University, which specializes in developing technologies that allow electrical signals from the brain to control robotic limbs, plans to mark a milestone in overcoming paralysis. If we succeed in meeting still formidable challenges, the first ceremonial kick of the World Cup game may be made by a paralyzed teenager, who, flanked by the two contending soccer teams, will saunter onto the pitch clad in a robotic body suit. This suit—or exoskeleton, as we call it—will envelop the teenager's legs. His or her first steps onto the field will be controlled by motor signals originating in the kicker's brain and transmitted wirelessly to a computer unit the size of a laptop in a backpack carried by our patient. This computer will be responsible for translating electrical brain signals into digital motor commands so that the exoskeleton can first stabilize the kicker's body weight and then induce the robotic legs to begin the back-and-forth coordinated movements of a walk over the manicured grass. Then, on approaching the ball, the kicker will visualize placing a foot in contact with it. Three hundred milliseconds later brain signals will instruct the exoskeleton's robotic foot to hook under the leather sphere, Brazilian style, and boot it aloft. This scientific demonstration of a radically new technology, undertaken with collaborators in Europe and Brazil, will convey to a global audience of billions that brain control of machines has moved from lab demos and futuristic speculation to a new era in which tools capable of bringing mobility to patients incapacitated by injury or disease may become a reality. © 2012 Scientific American

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 17220 - Posted: 08.30.2012

Helen Shen Automated assistance may soon be available to neuroscientists tackling the brain’s complex circuitry, according to research presented last week at the Aspen Brain Forum in Colorado. Robots that can find and simultaneously record the activity of dozens of neurons in live animals could help researchers to reveal how connected cells interpret signals from one another and transmit information across brain areas — a task that would be impossible using single-neuron studies. The robots are designed to perform whole-cell patch-clamping, a difficult but powerful method that allows neuroscientists to access neurons' internal electrical workings, says Edward Boyden of the Massachusetts Institute of Technology in Cambridge, who is leading the work. Manually performing the method on live animals requires extensive training to perfect and, as a result, only a handful of neurophysiologists use the technique, says Boyden, who presented at the conference. He is developing the automated tool with Craig Forest at the Georgia Institute of Technology in Atlanta and others. “We think that it helps democratize procedures that require a lot of skill,” he says. In May, the group described how a basic version of the robot can record electrical currents in single neurons in the brains of anaesthetized mice [1]. The robot finds its target on the basis of characteristic changes in the electrical environment near neurons. Then the device seals itself against the cell’s membrane and nicks a tiny hole through it to access the neuron's contents. On 24 August, Boyden presented results showing that a more advanced version of the robot could be used to identify and probe four neurons at once — and he says he wants to push the design further, perhaps to tap as many as 100 neurons at a time. © 2012 Nature Publishing Group
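The target-finding step relies on a simple, robust signal: as the pipette tip approaches a cell, its measured electrical resistance rises. A minimal sketch of that hunt loop, with the rig calls stubbed out and the step size and resistance threshold as illustrative values (the published algorithm's numbers differ):

```python
import random

def move_pipette_down(microns):
    """Stub for the motorized manipulator (hypothetical rig call)."""

def measure_resistance_mohm():
    """Stub: pipette resistance in megaohms; replace with amplifier readout."""
    return 6.0 + random.gauss(0.0, 0.02)

def apply_suction():
    """Stub: brief negative pressure to seal onto the membrane."""

def hunt_neuron(step_um=2.0, jump_mohm=0.25, confirm_steps=2, max_steps=500):
    """Descend until resistance rises and stays up, signalling a cell.

    A sustained increase over consecutive steps (not one noisy reading)
    is the cue that the tip is pressing against a neuron's membrane.
    """
    baseline = measure_resistance_mohm()
    hits = 0
    for _ in range(max_steps):
        move_pipette_down(step_um)
        r = measure_resistance_mohm()
        hits = hits + 1 if r - baseline > jump_mohm else 0
        if hits >= confirm_steps:
            apply_suction()  # form the seal, then break in and record
            return True
        baseline = min(baseline, r)  # only track downward drift
    return False
```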

Related chapters from BP7e: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 5: The Sensorimotor System
Link ID: 17215 - Posted: 08.29.2012

Analysis by Jesse Emspak The phrase "use your brainpower" may soon become literal. Engineers at MIT have developed a tiny prototype fuel cell that creates electricity from the body's natural sugars. The fuel cell could be used to power brain implants for treating epilepsy, Parkinson's disease and paralysis. Currently, devices implanted in the body are typically powered by lithium-ion batteries, but those have a limited lifetime and need to be replaced. Opening up the body to replace a battery is not something doctors like to do, and doing it in the brain is even less desirable. The researchers, led by Rahul Sarpeshkar, an associate professor of electrical engineering and computer science, built the fuel cell using a platinum catalyst at one end and a layer of carbon nanotubes at the other. It rests on a silicon chip, allowing it to be connected to electronics that would be used in brain implants. As glucose passes over the platinum, electrons and hydrogen ions are stripped off as it is oxidized. That's what makes the current. At the other end of the cell, oxygen mixes with the hydrogen to make water when it hits the layer of single-walled carbon nanotubes. The cell produces up to 180 microwatts, enough to power a brain implant that might send signals to bypass a damaged region, or stimulate part of the brain (a treatment used in disorders such as Parkinson's). © 2012 Discovery Communications, LLC.
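The electrochemistry being described is the standard abiotic glucose fuel cell. Assuming the usual two-electron oxidation of glucose to gluconolactone at the platinum anode, the half-reactions are:

```latex
\begin{aligned}
\text{anode:}   &\quad \mathrm{C_6H_{12}O_6 \longrightarrow C_6H_{10}O_6 + 2\,H^+ + 2\,e^-}\\
\text{cathode:} &\quad \mathrm{\tfrac{1}{2}\,O_2 + 2\,H^+ + 2\,e^- \longrightarrow H_2O}
\end{aligned}
```

The electrons travel through the external circuit (the implant's load) while the hydrogen ions diffuse to the nanotube cathode; that electron flow is the current the article refers to.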

Related chapters from BP7e: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM:Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 16982 - Posted: 06.28.2012

By Rachel Ehrenberg Directing a robotic arm with her thoughts, a paralyzed woman named Cathy can pick up a bottle of coffee and sip it through a straw, a simple task that she hasn’t done on her own for nearly 15 years. The technology that brought about the feat is a brain-computer interface system: A computer decodes signals from a tiny chip implanted in the woman’s brain, translating her thoughts into actions that are carried out by the robot arm. The seemingly mundane task of bringing a drink to one’s mouth is the first published demonstration that severely paralyzed people can conduct directed movements in three-dimensional space using a brain-controlled robotic device. This latest application of the system, called BrainGate, is described in the May 17 Nature. “Much has been demonstrated in terms of laboratory work and monkeys, but this is the first time showing something that’s going to be useful for patients,” says neuroscientist Andrew Jackson, of Newcastle University in England. A commentary by Jackson on the new developments appears in the same issue of Nature. There’s still a lot of work to do before BrainGate can be used outside a lab. In the current design, the tiny sensor that sits in the patient’s brain is attached to a mini fridge–sized computer via ungainly wires. So making the system wireless is one goal. The researchers hope that within a decade the BrainGate system will be available and affordable for people who are paralyzed or have prosthetic limbs. Eventually, similar technology might restore function to a natural limb that no longer works. © Society for Science & the Public 2000 - 2012
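Decoders in this family turn binned spike counts from the implanted chip into a movement command at each time step (the Nature paper describes a Kalman-filter velocity decoder; the linear regression below is a simpler member of the same family). A sketch with simulated data standing in for real recordings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated training data: 100 channels of binned spike counts (roughly
# the scale of the implanted array) and the intended 3-D hand velocity.
T, channels = 2000, 100
velocity = rng.standard_normal((T, 3))
tuning = rng.standard_normal((3, channels))
spikes = np.clip(velocity @ tuning + rng.standard_normal((T, channels)), 0, None)

# Fit a linear decoder by ridge regression: velocity ≈ spikes @ W.
lam = 1.0
X, Y = spikes, velocity
W = np.linalg.solve(X.T @ X + lam * np.eye(channels), X.T @ Y)

# At run time, each new bin of spike counts becomes a velocity command.
new_bin = spikes[-1]
decoded_velocity = new_bin @ W  # 3-vector steering the robot arm
```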

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 16807 - Posted: 05.17.2012

by Greg Miller Spinal cord injuries cause paralysis because they sever crucial communication links between the brain and the muscles that move limbs. A new study with monkeys demonstrates a way to re-establish those connections. By implanting electrodes in a movement control center in the brain and wiring them up to electrodes attached to muscles in the arm, researchers restored movement to monkeys with a temporarily paralyzed hand. The work is the latest promising development in the burgeoning field of neuroprosthetics. In recent years, scientists have taken many steps toward creating prosthetics to help paralyzed people interact more with the world around them. They've developed methods to decode signals from electrodes implanted in the brain so that a paralyzed person can control a cursor on a computer screen or manipulate a robotic arm with their thoughts alone. Such brain implants are still experimental, and only a handful of people have received them. Several hundred patients have received a different kind of neural prosthetic that uses residual shoulder movement or nerve activity to stimulate arm muscles, allowing them to grasp objects with their hands. The new study combines these two approaches. Neuroscientist Lee Miller of the Northwestern University Feinberg School of Medicine in Chicago, Illinois, and colleagues implanted electrode grids into the primary motor cortex of two monkeys. This brain region issues commands that move muscles throughout the body, and the researchers positioned the electrodes in the part of the primary motor cortex that controls the hand, enabling them to record the electrical activity of about 100 neurons there. © 2010 American Association for the Advancement of Science.
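The loop here has two stages: decode intended grasp from the roughly 100 recorded motor-cortex neurons, then translate that intent into electrical stimulation of the arm muscles. A minimal sketch of one pass through such a loop; the decoder weights and the clipped linear stimulation map are stand-in assumptions (real functional electrical stimulation uses calibrated per-muscle recruitment curves):

```python
import numpy as np

def decode_grasp_intent(spike_counts, weights):
    """Linear readout of intended grasp force from motor-cortex activity."""
    return float(np.dot(spike_counts, weights))

def stimulation_current_ma(intent, gain=2.0, max_ma=10.0):
    """Map decoded intent onto a muscle stimulation amplitude (mA)."""
    return float(np.clip(gain * max(intent, 0.0), 0.0, max_ma))

# One pass through the loop with made-up numbers:
weights = np.full(100, 0.02)            # hypothetical decoder weights
spike_counts = np.random.poisson(3.0, 100)
intent = decode_grasp_intent(spike_counts, weights)
print(stimulation_current_ma(intent))   # current delivered to hand muscles
```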

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 16675 - Posted: 04.19.2012

By DAVID EWING DUNCAN SAN DIEGO — Already surrounded by machines that allow him, painstakingly, to communicate, the physicist Stephen Hawking last summer donned what looked like a rakish black headband that held a feather-light device the size of a small matchbox. Called the iBrain, this simple-looking contraption is part of an experiment that aims to allow Dr. Hawking — long paralyzed by amyotrophic lateral sclerosis, or Lou Gehrig’s disease — to communicate by merely thinking. The iBrain is part of a new generation of portable neural devices and algorithms intended to monitor and diagnose conditions like sleep apnea, depression and autism. Invented by a team led by Philip Low, a 32-year-old neuroscientist who is chief executive of NeuroVigil, a company based in San Diego, the iBrain is gaining attention as a possible alternative to expensive sleep labs that use rubber and plastic caps riddled with dozens of electrodes and usually require a patient to stay overnight. “The iBrain can collect data in real time in a person’s own bed, or when they’re watching TV, or doing just about anything,” Dr. Low said. The device uses a single channel to pick up waves of electrical brain signals, which change with different activities and thoughts, or with the pathologies that accompany brain disorders. © 2012 The New York Times Company

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 16600 - Posted: 04.04.2012

David T. Blake After training, animals and humans can make their thoughts interact directly with computers. A study provides evidence that the corticostriatal system of the brain is essential for this learning process. Brain–machine interfaces have a rich history in the sci-fi genre: in The Matrix films, human brains are plugged into a computer-based simulation that then becomes their 'reality'. But using our thoughts to directly control computers or other devices is not just in the realm of fantasy. Monkeys can learn to use visual cues to instruct a brain–machine interface to move a robotic arm or a computer cursor [1, 2]. And electrode arrays were implanted into the brain of a paralysed man in 2006, enabling him to control an artificial arm, to move a cursor on a computer screen and even to open e-mail [3]. Over time, an individual learns to improve their control over the brain–machine interface by modifying the activity of their brain, but how this happens is not well understood. In an article published on Nature's website today, Koralek et al. [4] report that the corticostriatal system of the brain is involved in learning mental actions and skills that do not involve physical movement, such as those required for control of brain–machine interfaces. The corticostriatal system has a unique pattern of connectivity that enables sensory inputs to be associated with appropriate motor or cognitive responses [5]. It consists of a cortical component, the primary motor cortex, that exerts control over muscles, and a striatal component, the basal ganglia, that receives direct inputs from the motor cortex. The basal ganglia are involved in a wide range of learning conditions and are crucial to the motor deficits observed in Parkinson's and Huntington's diseases. Both corticostriatal components have a role in the learning and execution of physical skills requiring movement. © 2012 Nature Publishing Group

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 16469 - Posted: 03.05.2012

M. Mitchell Waldrop It wasn't quite the lynching that Henry Markram had expected. But the barrage of sceptical comments from his fellow neuroscientists — “It's crap,” said one — definitely made the day feel like a tribunal. Officially, the Swiss Academy of Sciences meeting in Bern on 20 January was an overview of large-scale computer modelling in neuroscience. Unofficially, it was neuroscientists' first real chance to get answers about Markram's controversial proposal for the Human Brain Project (HBP) — an effort to build a supercomputer simulation that integrates everything known about the human brain, from the structures of ion channels in neural cell membranes up to mechanisms behind conscious decision-making. Markram, a South-African-born brain electrophysiologist who joined the Swiss Federal Institute of Technology in Lausanne (EPFL) a decade ago, may soon see his ambition fulfilled. The project is one of six finalists vying to win €1 billion (US$1.3 billion) as one of the European Union's two new decade-long Flagship initiatives. “Brain researchers are generating 60,000 papers per year,” said Markram as he explained the concept in Bern. “They're all beautiful, fantastic studies — but all focused on their one little corner: this molecule, this brain region, this function, this map.” The HBP would integrate these discoveries, he said, and create models to explore how neural circuits are organized, and how they give rise to behaviour and cognition — among the deepest mysteries in neuroscience. Ultimately, said Markram, the HBP would even help researchers to grapple with disorders such as Alzheimer's disease. “If we don't have an integrated view, we won't understand these diseases,” he declared. © 2012 Nature Publishing Group

Related chapters from BP7e: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM:Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 16416 - Posted: 02.23.2012

by Anil Ananthaswamy GOVERNMENT spooks want cyborg insects to snoop on their enemies. Biologists want to tap into the nervous systems of insects to understand how they fly. A probe that can be implanted into moths to control their flight could help satisfy both parties. One day, it could even help rehabilitate people who have had strokes. The US Defense Advanced Research Projects Agency (DARPA) has been running a programme to develop machine-insect interfaces for years, but electrodes implanted to stimulate the brains or wing muscles of insects were not precise enough. Now Joel Voldman of the Massachusetts Institute of Technology and colleagues have designed a unique, flexible neural probe that can be attached directly to an insect's ventral nerve cord (VNC), which, along with the brain, makes up the central nervous system in insects. Another reason previous attempts were not entirely successful is that the impedance of the electrodes did not match that of the insect's tissue. This probe is made of a polyimide polymer coated with gold and carbon nanotubes, and its impedance is much closer to that of nerve tissue. One end of the probe is a ring that clamps around the VNC. The inside of the ring has five electrodes which stimulate distinct nerve bundles within the VNC. Attached to the probe is a wireless stimulator, which contains a radio receiver, as well as a battery and a device to generate electrical pulses. The team implanted the device in the abdomen of a tobacco hawkmoth (Manduca sexta). As it weighs less than half a gram, it is easy for the moth to carry. "Their wingspan is the width of your hand," says Voldman. "These are big guys." © Copyright Reed Business Information Ltd

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 16359 - Posted: 02.09.2012

By Amber Dance A fighter pilot heads back to base after a long mission, feeling spent. A warning light flashes on the control panel. Has she noticed? If so, is she focused enough to fix the problem? Thanks to current advances in electroencephalographic (EEG) brain-wave detection technology, military commanders may not have to guess the answers to these questions much longer. They could soon be monitoring her mental state via helmet sensors, looking for signs she is concentrating on her flying and reacting to the warning light. This is possible because of two key advances that made EEG technology wireless and mobile, says Scott Makeig, director of the University of California, San Diego's Swartz Center for Computational Neuroscience (SCCN) in La Jolla, Calif. EEG used to require users to sit motionless, weighted down by heavy wires. Movement interfered with the signals, so that even an eyebrow twitch could garble the brain impulses. Modern technology lightened the load and wirelessly linked the sensors and the computers that collect the data. In addition, Makeig and others developed better algorithms—in particular, independent component analysis. By reading signals from several electrodes, they can infer where, within the skull, a particular impulse originated. This is akin to listening to a single speaker's voice in a crowded room. In so doing, they are also able to filter out movements—not just eyebrow twitches, but also the muscle flexing needed to walk, talk or fly a plane. © 2012 Scientific American,
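Independent component analysis, named in the article, recovers statistically independent sources from the mixed signals the electrodes actually record, which is what lets artifact components such as blinks or muscle activity be identified and dropped. A toy demonstration with scikit-learn, using synthetic sources in place of real EEG:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)

# Three independent "sources": a slow rhythm, a fast rhythm, and a
# spiky artifact channel standing in for eyebrow twitches.
sources = np.column_stack([
    np.sin(2 * np.pi * 2 * t),
    np.sign(np.sin(2 * np.pi * 11 * t)),
    rng.laplace(size=t.size),
])

# Each electrode records a different mixture of all three sources.
mixing = rng.standard_normal((3, 3))
electrodes = sources @ mixing.T

# ICA unmixes the recordings back into independent components; the
# artifact component can then be identified and removed.
components = FastICA(n_components=3, random_state=0).fit_transform(electrodes)
```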

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 16303 - Posted: 01.28.2012

by Jason Daley Over recent months, in José del R. Millán’s computer science lab in Switzerland, a little round robot, similar to a Roomba with a laptop mounted on it, bumped its way through an office space filled with furniture and people. Nothing special, except the robot was being controlled from a clinic more than 60 miles away—and not with a joystick or keyboard, but with the brain waves of a paralyzed patient. The robot’s journey was an experiment in shared control, a type of brain-machine interface that merges conscious thought and algorithms to give disabled patients finer mental control over devices that help them communicate or retrieve objects. If the user experiences a mental misfire, Millán’s software can step in to help. Instead of crashing down the stairs, for instance, the robot would recalculate to find the door. Such technology is a potential life changer for the tens of thousands of people suffering from locked-in syndrome, a type of paralysis that leaves patients with only the ability to blink. The condition is usually incurable, but Millán’s research could make it more bearable, allowing patients to engage the world through a robotic proxy. “The last 10 years have been like a proof of concept,” says Justin Sanchez, director of the Neuroprosthetics Research Group at the University of Miami, who is also studying shared control. “But the research is moving fast. Now there is a big push to get these devices to people who need them for everyday life.” © 2012, Kalmbach Publishing Co.
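Shared control, as described, arbitrates between the decoded brain command and the robot's own obstacle-avoidance behavior, leaning on software when the human signal is unreliable. One common arbitration rule is confidence-weighted blending with an outright override near obstacles; a minimal sketch, with all thresholds illustrative rather than Millán's actual parameters:

```python
def shared_control(user_cmd, auto_cmd, confidence, obstacle_dist_m):
    """Blend a decoded user command with the robot's autonomous command.

    user_cmd / auto_cmd: (forward_speed, turn_rate) tuples.
    confidence: decoder confidence in [0, 1].
    """
    if obstacle_dist_m < 0.3:           # imminent collision: software wins
        return auto_cmd
    w = max(0.0, min(1.0, confidence))  # clamp, then confidence-weighted mix
    return tuple(w * u + (1 - w) * a for u, a in zip(user_cmd, auto_cmd))

# A shaky mental command near a wall defers mostly to the robot:
print(shared_control((0.5, 0.0), (0.1, 0.8), confidence=0.2, obstacle_dist_m=0.6))
```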

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 16214 - Posted: 01.05.2012

Scientists are getting closer to the dream of creating computer systems that can replicate the brain. Researchers at the Massachusetts Institute of Technology have designed a computer chip that mimics how the brain's neurons adapt in response to new information. Such chips could eventually enable communication between artificially created body parts and the brain. They could also pave the way for artificial intelligence devices. There are about 100 billion neurons in the brain, each of which forms synapses - the connections between neurons that allow information to flow - with many other neurons. This process is known as plasticity and is believed to underpin many brain functions, such as learning and memory. The MIT team, led by research scientist Chi-Sang Poon, has been able to design a computer chip that can simulate the activity of a single brain synapse. Activity in the synapses relies on so-called ion channels, which control the flow of charged atoms such as sodium, potassium and calcium. The 'brain chip' has about 400 transistors and is wired up to replicate the circuitry of the brain. BBC © 2011
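What the chip's analog circuitry emulates can be written as a small conductance-based synapse model: a gating variable that jumps with each presynaptic spike, decays exponentially, and scales the current flowing into the postsynaptic cell. A numerical sketch of that model (Euler integration; the time constant and conductance are generic textbook-style values, not the chip's parameters):

```python
import numpy as np

dt, tau = 0.1, 5.0                       # time step and decay constant (ms)
g_max, e_rev, v_post = 1.0, 0.0, -65.0   # nS, mV, mV (illustrative values)

spikes = np.zeros(1000)                  # presynaptic spike train, 0.1 ms bins
spikes[[100, 300, 320, 700]] = 1.0

s = 0.0
current = np.empty_like(spikes)
for i, spk in enumerate(spikes):
    s += spk - dt * s / tau              # gate: jump on spike, decay after
    current[i] = g_max * s * (v_post - e_rev)  # postsynaptic current
```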

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 16053 - Posted: 11.19.2011

Melissae Fellet, reporter A paralysed man has high-fived his girlfriend using a robotic arm controlled only by his thoughts (see video above). Tim Hemmes, who was paralysed in a motorcycle accident seven years ago, is the first participant in a clinical trial testing a brain implant that directs movement of an external device. Neurosurgeons at the University of Pittsburgh School of Medicine in Pennsylvania implanted a grid of electrodes, about the size of a large postage stamp, on top of Hemmes's brain over an area of neurons that fire when he imagines moving his right arm. They threaded wires from the implant underneath the skin of his neck and pulled the ends out of his body near his chest. The team then connected the implant to a computer that converts specific brainwaves into particular actions. As shown in this video, Hemmes first practices controlling a dot on a TV screen with his mind. The dot moves right when he imagines bending his elbow. Thinking about wiggling his thumb makes the dot slide left. With practice, Hemmes learned to move the cursor just by visualizing the motion, rather than concentrating on specific arm movements, says neurosurgeon Elizabeth Tyler-Kabara of the University of Pittsburgh in Pennsylvania, who implanted the electrodes. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 15910 - Posted: 10.13.2011

by Ferris Jabr Monkeys have feelings too. In a mind-meld between monkey and computer, rhesus macaques have learned to "feel" the texture of virtual objects without physically touching a thing. In the future, prosthetic limbs modelled on similar technology could return a sense of touch to people with amputations. Using two-way communication between brain and machine, the monkeys manoeuvred a cursor with their minds and identified virtual objects by texture, based on electrical feedback from the computer. Miguel Nicolelis of Duke University Medical Center in Durham, North Carolina, and his colleagues implanted electrodes into the brains of two monkeys. The electrodes recorded activity in the motor cortex and somatosensory cortex (SSC) – brain areas that orchestrate voluntary movement and sense of touch. Electrical activity from the motor cortex was sent to a computer, which translated the neural chatter into instructions that moved a cursor on screen. The monkeys learned what patterns of thought reliably changed the cursor's position. The team then assigned a unique texture to each of three identical circles on the screen. When the cursor hovered over each circle, the computer zapped the monkeys' SSCs with the same electrical impulses that occurred when they touched each texture in real life. Finally, the team taught the monkeys to associate a particular texture with a reward. © Copyright Reed Business Information Ltd.
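The two-way loop can be summarized as: decode motor-cortex activity into cursor motion, and whenever the cursor sits over a virtual object, play that object's texture back into somatosensory cortex as a distinct microstimulation pattern. A schematic single tick of such a loop; the decoder, object layout, and patterns are all hypothetical stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)

# Each of three visually identical circles carries its own
# microstimulation pattern, i.e. its artificial "texture".
TEXTURES = {"circle_a": [1, 0, 1, 1], "circle_b": [1, 1, 0, 0], "circle_c": None}
POSITIONS = {"circle_a": (0.2, 0.5), "circle_b": (0.5, 0.5), "circle_c": (0.8, 0.5)}

def decode_step(motor_activity, weights):
    """Stand-in motor decoder: neural activity -> small cursor displacement."""
    return motor_activity @ weights  # (2,) displacement

def over_object(cursor, radius=0.1):
    return next((n for n, p in POSITIONS.items()
                 if np.hypot(cursor[0] - p[0], cursor[1] - p[1]) < radius), None)

# One tick of the loop: move the cursor, then stimulate SSC if it sits
# on an object that has a texture pattern assigned.
weights = rng.standard_normal((16, 2)) * 0.01
cursor = np.array([0.5, 0.5])
cursor += decode_step(rng.poisson(3.0, 16), weights)
hit = over_object(cursor)
if hit and TEXTURES[hit]:
    print("stimulate SSC with", TEXTURES[hit])  # hardware call in a real rig
```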

Related chapters from BP7e: Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 15881 - Posted: 10.06.2011

by Linda Geddes AN ARTIFICIAL cerebellum has restored lost brain function in rats, bringing the prospect of cyborg-style brain implants a step closer to reality. Such implants could eventually be used to replace areas of brain tissue damaged by stroke and other conditions, or even to enhance healthy brain function and restore learning processes that decline with age. Cochlear implants and prosthetic limbs have already proved that it is possible to wire electrical devices into the brain and make sense of them, but such devices involve only one-way communication, either from the device to the brain or vice versa. Now Matti Mintz of Tel Aviv University in Israel and his colleagues have created a synthetic cerebellum which can receive sensory inputs from the brainstem - a region that acts as a conduit for neuronal information from the rest of the body. Their device can interpret these inputs, and send a signal to a different region of the brainstem that prompts motor neurons to execute the appropriate movement. "It's proof of concept that we can record information from the brain, analyse it in a way similar to the biological network, and return it to the brain," says Mintz, who presented the work this month at the Strategies for Engineered Negligible Senescence meeting in Cambridge, UK. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 15850 - Posted: 09.29.2011

By PAGAN KENNEDY “Fingers!” Gerwin Schalk sputtered, waving his hands around in the air. “Fingers are made to pick up a hammer.” He prodded the table, mimicking the way we poke at computer keyboards. “It’s totally ridiculous,” he said. I was visiting Schalk, a 40-year-old computer engineer, at his bunkerlike office in the Wadsworth Center, a public-health lab outside Albany that handles many of New York State’s rabies tests. It so happens that his lab is also pioneering a new way to control our computers — with thoughts instead of fingers. Schalk studies people at the Albany Medical Center who have become, not by choice, some of the world’s first cyborgs. One volunteer was a young man in his 20s who suffers from a severe form of epilepsy. He had been outfitted with a temporary device, a postcard-size patch of electrodes that sits on the brain’s cortex, known as an electrocorticographic (ECoG) implant. Surgeons use these implants to home in on the damaged tissue that causes seizures. Schalk took advantage of the implant to see if the patient could control the actions in a video game called Galaga using only his thoughts. In the videotape of this experiment, you see a young man wearing a turban of bandages with wires running from his head to a computer in a cart. “Pew, pew,” the ship on the computer screen whines, as it decimates buglike creatures. The patient flicks the spaceship back and forth by imagining that he is moving his tongue. This creates a pulse in his brain that travels through the wires into a computer. Thus, a thought becomes a software command. © 2011 The New York Times Company

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 15807 - Posted: 09.17.2011

by Sara Reardon They're not quite psychic yet, but machines are getting better at reading your mind. Researchers have invented a new, noninvasive method for recording patterns of brain activity and using them to steer a robot. Scientists hope the technology will give "locked in" patients—those too disabled to communicate with the outside world—the ability to interact with others and even give the illusion of being physically present, or "telepresent," with friends and family. Previous brain-machine interface systems have made it possible for people to control robots, cursors, or prosthetics with conscious thought, but they often take a lot of effort and concentration, says José del R. Millán, a biomedical engineer at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland, who develops brain-machine interface systems that don't need to be implanted into the brain. Millán's goal is to make control as easy as driving a car on a highway. A partially autonomous robot would allow a user to stop concentrating on tasks that he or she would normally do subconsciously, such as following a person or avoiding running into walls. But if the robot encounters an unexpected event and needs to make a split-second decision, the user's thoughts can override the robot's artificial intelligence. To test their technology, Millán and colleagues created a telepresent robot by modifying a commercially available bot called Robotino. The robot looks a bit like a platform on three wheels, and it can avoid obstacles on its own using infrared sensors. On top of the robot, the researchers placed a laptop running Skype, a voice and video Internet chat system, over a wireless Internet connection. © 2010 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 15778 - Posted: 09.08.2011

By Laura Sanders In a fast-moving car, the brain can hit the brakes faster than the foot. By relying on brain waves that signal the intent to jam on the brakes, a new technology could shave critical milliseconds off the reaction time, researchers report online July 28 in the Journal of Neural Engineering. The work adds to a growing trend in car technology that assists drivers. Though it may eventually lead to improvements in emergency braking, the new brain signal technology isn’t ready for the road. “As a basic science study, I was quite impressed with it,” says cognitive neuroscientist Raja Parasuraman of George Mason University in Fairfax, Va. “I just think a lot more needs to be done.” In the study, computer scientist Stefan Haufe of the Berlin Institute of Technology in Germany and his colleagues measured brain wave changes while participants drove in a car simulator. The participants drove around 60 miles per hour, following a lead car on a curvy road with heavy oncoming traffic. Every so often the lead car would slam on its brakes, so that the participant would have to either do the same or crash. For most drivers, the lag between the lead car stopping and themselves slamming the brakes was around 700 milliseconds. Particular neural signatures were evident during this lag time, and they could be early indicators that the drivers wanted to brake. “Our approach was to obtain the intention of the driver faster than he could actually act,” Haufe says. “That’s what the neural signature is good for.” © Society for Science & the Public 2000 - 2011
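Detecting the intention described here amounts to classifying short EEG windows as braking-related or not, faster than the roughly 700-millisecond behavioral lag. A toy sketch of that detection step: train a linear classifier on labeled epochs, then score each incoming window. The channel count, window length, and features are illustrative assumptions, not Haufe's actual pipeline:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

FS = 200  # Hz (assumed sampling rate)

def epoch_features(epoch):
    """Simple per-channel features: mean amplitude and peak-to-peak range."""
    return np.concatenate([epoch.mean(axis=1), np.ptp(epoch, axis=1)])

# epochs: (n_trials, n_channels, n_samples) EEG windows; label 1 means the
# window immediately preceded an emergency brake (simulated here).
rng = np.random.default_rng(0)
epochs = rng.standard_normal((400, 8, FS // 2))   # 500 ms windows, 8 channels
labels = rng.integers(0, 2, 400)
epochs[labels == 1, :, -20:] -= 1.0               # crude braking "signature"

X = np.array([epoch_features(e) for e in epochs])
clf = LogisticRegression(max_iter=1000).fit(X, labels)

# In the simulator, each new window would be scored like this:
p_brake = clf.predict_proba(epoch_features(epochs[0])[None, :])[0, 1]
```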

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity; Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 5: The Sensorimotor System; Chapter 14: Attention and Consciousness
Link ID: 15636 - Posted: 07.30.2011