Links for Keyword: Robotics



Links 41 - 60 of 203

By James Gallagher Health and science reporter, BBC News Unrivalled control of a robotic arm has been achieved using a paralysed woman's thoughts, a US study says. Jan Scheuermann, who is 53 and paralysed from the neck down, was able to deftly grasp and move a variety of objects, much as she would with a normal arm. Brain implants were used to control the robotic arm in the study, reported in the Lancet medical journal. Experts in the field said it was an "unprecedented performance" and a "remarkable achievement". Jan was diagnosed with spinocerebellar degeneration 13 years ago and progressively lost control of her body. She is now unable to move her arms or legs. She was implanted with two sensors - each four millimetres by four millimetres - in the motor cortex of her brain. A hundred tiny needles on each sensor pick up the electrical activity from about 200 individual brain cells. "The way that neurons communicate with each other is by how fast they fire pulses, it's a little bit akin to listening to a Geiger counter click, and it's that property that we lock onto," said Professor Andrew Schwartz from the University of Pittsburgh. The pulses of electricity in the brain are then translated into commands to move the arm, which bends at the elbow and wrist and can grasp objects. BBC © 2012
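
Schwartz's Geiger-counter analogy describes rate coding, the signal that decoders of this kind lock onto. As a rough illustration of the idea only (a toy population-vector decoder, not the Pittsburgh team's actual algorithm; every number below is invented), the rate-based voting scheme can be sketched in a few lines of Python:

```python
import numpy as np

# Toy population-vector decoder (illustrative only). Each recorded cell is
# assumed to fire fastest for one "preferred" movement direction; the cells'
# rates then vote on the intended direction.

rng = np.random.default_rng(0)
n_cells = 200                                   # ~200 cells, as in the article
preferred = rng.uniform(0, 2 * np.pi, n_cells)  # each cell's preferred direction
base, gain = 10.0, 8.0                          # baseline firing rate, tuning depth

def firing_rates(true_direction):
    """Simulated spike counts: cosine tuning plus Poisson noise."""
    rates = base + gain * np.cos(true_direction - preferred)
    return rng.poisson(rates)

def decode(counts):
    """Each cell votes for its preferred direction, weighted by extra firing."""
    w = counts - base
    return np.arctan2(np.sum(w * np.sin(preferred)),
                      np.sum(w * np.cos(preferred)))

true_dir = np.pi / 4
print(f"decoded {decode(firing_rates(true_dir)):.2f} rad, true {true_dir:.2f} rad")
```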

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 17611 - Posted: 12.17.2012

By Laura Sanders A new computer simulation of the brain can count, remember and gamble. And the system, called Spaun, performs these tasks in a way that’s eerily similar to how people do. Short for Semantic Pointer Architecture Unified Network, Spaun is a crude approximation of the human brain. But scientists hope that the program and efforts like it could be a proving ground to test ideas about the brain. Several groups of scientists have been racing to construct a realistic model of the human brain, or at least parts of it. What distinguishes Spaun from other attempts is that the model actually does something, says computational neuroscientist Christian Machens of the Champalimaud Centre for the Unknown in Lisbon, Portugal. At the end of an intense computational session, Spaun spits out instructions for a behavior, such as how to reproduce a number it’s been shown. “And of course, that’s why the brain is interesting,” Machens says. “That’s what makes it different from a plant.” Like a digital Frankenstein’s monster, Spaun was cobbled together from bits and pieces of knowledge gleaned from years of basic brain research. The behavior of 2.5 million nerve cells in parts of the brain important for vision, memory, reasoning and other tasks forms the basis of the new system, says Chris Eliasmith of the University of Waterloo in Canada, coauthor of the study, which appears in the Nov. 30 Science. © Society for Science & the Public 2000 - 2012
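
Spaun's 2.5 million cells are simulated spiking neurons. The sketch below shows just the basic building block, a leaky integrate-and-fire unit with generic textbook constants; Spaun itself is assembled with Eliasmith's Neural Engineering Framework, which layers far more structure on top of units like this.

```python
# Toy leaky integrate-and-fire neuron: the voltage leaks toward rest,
# integrates its input, and emits a spike whenever it crosses threshold.

dt, t_end = 1e-4, 0.1                  # 0.1 ms steps, 100 ms of simulated time
tau = 0.02                             # membrane time constant (20 ms)
v_rest, v_thresh, v_reset = 0.0, 1.0, 0.0
input_current = 1.2                    # constant drive, in threshold units

v, spike_times = v_rest, []
for step in range(int(t_end / dt)):
    v += dt / tau * (v_rest - v + input_current)   # leak plus input
    if v >= v_thresh:                              # threshold crossing
        spike_times.append(step * dt)
        v = v_reset
print(f"{len(spike_times)} spikes in {t_end * 1000:.0f} ms")
```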

Related chapters from BP7e: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 17557 - Posted: 12.01.2012

David Perlman With the ultimate goal of helping paralyzed patients achieve a degree of independence, Stanford brain researchers report they have taken a promising step forward in efforts to link nerve centers in the human brain with computers controlled only by a person's thoughts. In their latest development, the Stanford scientists have successfully enabled a pair of rhesus monkeys to move a virtual cursor across a computer screen merely by thinking about their response to human commands. The monkeys' ability to manipulate a cursor without using a mouse is based on a powerful new algorithm, a mathematical computing program devised by Vikash Gilja, a Stanford electrical engineer and computer scientist. Four years ago, neurosurgeons at Brown University and Massachusetts General Hospital had demonstrated a simpler version of an algorithm that enabled completely paralyzed humans with implanted sensors in their brains to command a cursor to move erratically toward targets on a computer screen. But with Gilja's algorithm, called ReFit, the monkeys showed they could aim their virtual cursor, a moving dot of light, at another bright light on a computer screen, and hold it steadily there for 15 seconds - far more precisely than the humans four years ago. With the new algorithm, they were able to perform their thinking tasks faster and more accurately as they sat comfortably in a chair facing the computer. The development is "a big step toward clinically useful brain-machine technology that has faster, smoother, and more natural movements" than anything before it, said James Gnadt of the National Institute of Neurological Disorders and Stroke. © 2012 Hearst Communications Inc.
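
ReFit belongs to the family of Kalman-filter decoders, which alternate between predicting the cursor's state and correcting that prediction with each new bin of firing rates. Here is that predict/update cycle with toy matrices; the study's fitted parameters, and its trick of refitting the filter from the monkey's own decoded movements, are not reproduced:

```python
import numpy as np

# One predict/update cycle of a Kalman-filter decoder (toy values throughout).
# State x = [position, velocity]; observation y = firing rates of 3 fake units.

A = np.array([[1.0, 0.05],            # position += velocity * dt
              [0.0, 0.95]])           # velocity decays slightly
W = np.diag([1e-4, 1e-2])             # process noise
C = np.array([[0.5, 2.0],             # how each unit's rate reflects the state
              [0.1, -1.5],
              [0.3, 0.8]])
Q = np.eye(3) * 0.5                   # observation (spiking) noise

def kalman_step(x, P, y):
    x_pred = A @ x                    # predict the next state...
    P_pred = A @ P @ A.T + W          # ...and its uncertainty
    K = P_pred @ C.T @ np.linalg.inv(C @ P_pred @ C.T + Q)
    x_new = x_pred + K @ (y - C @ x_pred)   # correct with the observed rates
    P_new = (np.eye(2) - K @ C) @ P_pred
    return x_new, P_new

x, P = np.zeros(2), np.eye(2)
y = np.array([1.0, -0.4, 0.6])        # one time bin of (fake) firing rates
x, P = kalman_step(x, P, y)
print("decoded [position, velocity]:", x)
```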

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 17537 - Posted: 11.26.2012

By David Pogue Okay, great: we can control our phones with speech recognition and our television sets with gesture recognition. But those technologies don't work in all situations for all people. So I say, forget about those crude beginnings; what we really want is thought recognition. As I found out during research for a recent NOVA episode, it appears that brain-computer interface (BCI) technology has not advanced very far just yet. For example, I tried to make a toy helicopter fly by thinking “up” as I wore a $300 commercial EEG headset. It barely worked. Such “mind-reading” caps are quick to put on and noninvasive. They listen, through your scalp, for the incredibly weak remnants of electrical signals from your brain activity. But they're lousy at figuring out where in your brain they originated. Furthermore, the headset software didn't even know that I was thinking “up.” I could just as easily have thought “goofy” or “shoelace” or “pickle”—whatever I had thought about during the 15-second training session. There are other noninvasive brain scanners—magnetoencephalography, positron-emission tomography, near-infrared spectroscopy and so on—but each also has its trade-offs. Of course, you can implant sensors inside someone's skull for the best readings of all; immobilized patients have successfully manipulated computer cursors and robotic arms using this approach. Still, when it comes to controlling everyday electronics, brain surgery might be a tough sell. © 2012 Scientific American,
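
Consumer headsets of the kind Pogue wore typically reduce the scalp signal to band-power features and compare them with a short calibration recording, which is why the software cannot know whether the wearer thought "up" or "pickle". A minimal sketch of that feature-plus-threshold approach, with invented numbers rather than any vendor's actual pipeline:

```python
import numpy as np

# Band-power thresholding on one EEG channel (entirely illustrative).

fs = 256  # samples per second

def band_power(signal, fs, lo, hi):
    """Average power in a frequency band, from the channel's FFT."""
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal) ** 2
    return psd[(freqs >= lo) & (freqs < hi)].sum()

rng = np.random.default_rng(1)
calibration = rng.normal(size=15 * fs)   # the 15-second "training" session
live = rng.normal(size=fs)               # one second of live signal

# Compare live beta-band power (13-30 Hz, often tied to concentration)
# against the calibration baseline; exceeding it counts as the command.
baseline = band_power(calibration, fs, 13, 30)
command = "up" if band_power(live, fs, 13, 30) > 1.2 * baseline else "idle"
print(command)
```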

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Related chapters from MM: Chapter 5: The Sensorimotor System; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Link ID: 17518 - Posted: 11.21.2012

By Meghan Rosen Michael McAlpine’s shiny circuit doesn’t look like something you would stick in your mouth. It’s dashed with gold, has a coiled antenna and is glued to a stiff rectangle. But the antenna flexes, and the rectangle is actually silk, its stiffness melting away under water. And if you paste the device on your tooth, it could keep you healthy. The electronic gizmo is designed to detect dangerous bacteria and send out warning signals, alerting its bearer to microbes slipping past the lips. Recently, McAlpine, of Princeton University, and his colleagues spotted a single E. coli bacterium skittering across the surface of the gadget’s sensor. The sensor also picked out ulcer-causing H. pylori amid the molecular medley of human saliva, the team reported earlier this year in Nature Communications. At about the size of a standard postage stamp, the dental device is still too big to fit comfortably in a human mouth. “We had to use a cow tooth,” McAlpine says, describing test experiments. But his team plans to shrink the gadget so it can nestle against human enamel. McAlpine is convinced that one day, perhaps five to 10 years from now, everyone will wear some sort of electronic device. “It’s not just teeth,” he says. “People are going to be bionic.” McAlpine belongs to a growing pack of tech-savvy scientists figuring out how to merge the rigid, brittle materials of conventional electronics with the soft, curving surfaces of human tissues. Their goal: To create products that have the high performance of silicon wafers — the crystalline material used in computer chips — while still moving with the body. © Society for Science & the Public 2000 - 2012

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 5: The Sensorimotor System; Chapter 15: Language and Our Divided Brain
Link ID: 17455 - Posted: 11.05.2012

By Miguel A. L. Nicolelis In 2014 billions of viewers worldwide may remember the opening game of the World Cup in Brazil for more than just the goals scored by the Brazilian national team and the red cards given to its adversary. On that day my laboratory at Duke University, which specializes in developing technologies that allow electrical signals from the brain to control robotic limbs, plans to mark a milestone in overcoming paralysis. If we succeed in meeting still formidable challenges, the first ceremonial kick of the World Cup game may be made by a paralyzed teenager, who, flanked by the two contending soccer teams, will saunter onto the pitch clad in a robotic body suit. This suit—or exoskeleton, as we call it—will envelop the teenager's legs. His or her first steps onto the field will be controlled by motor signals originating in the kicker's brain and transmitted wirelessly to a computer unit the size of a laptop in a backpack carried by our patient. This computer will be responsible for translating electrical brain signals into digital motor commands so that the exoskeleton can first stabilize the kicker's body weight and then induce the robotic legs to begin the back-and-forth coordinated movements of a walk over the manicured grass. Then, on approaching the ball, the kicker will visualize placing a foot in contact with it. Three hundred milliseconds later brain signals will instruct the exoskeleton's robotic foot to hook under the leather sphere, Brazilian style, and boot it aloft. This scientific demonstration of a radically new technology, undertaken with collaborators in Europe and Brazil, will convey to a global audience of billions that brain control of machines has moved from lab demos and futuristic speculation to a new era in which tools capable of bringing mobility to patients incapacitated by injury or disease may become a reality. © 2012 Scientific American

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 17220 - Posted: 08.30.2012

Helen Shen Automated assistance may soon be available to neuroscientists tackling the brain’s complex circuitry, according to research presented last week at the Aspen Brain Forum in Colorado. Robots that can find and simultaneously record the activity of dozens of neurons in live animals could help researchers to reveal how connected cells interpret signals from one another and transmit information across brain areas — a task that would be impossible using single-neuron studies. The robots are designed to perform whole-cell patch-clamping, a difficult but powerful method that allows neuroscientists to access neurons' internal electrical workings, says Edward Boyden of the Massachusetts Institute of Technology in Cambridge, who is leading the work. Manually performing the method on live animals requires extensive training to perfect and, as a result, only a handful of neurophysiologists use the technique, says Boyden, who presented at the conference. He is developing the automated tool with Craig Forest at the Georgia Institute of Technology in Atlanta and others. “We think that it helps democratize procedures that require a lot of skill,” he says. In May, the group described how a basic version of the robot can record electrical currents in single neurons in the brains of anaesthetized mice [1]. The robot finds its target on the basis of characteristic changes in the electrical environment near neurons. Then, the device nicks the cell’s membrane and seals itself around the tiny hole to access the neuron's contents. On 24 August, Boyden presented results showing that a more advanced version of the robot could be used to identify and probe four neurons at once — and he says he wants to push the design further, perhaps to tap as many as 100 neurons at a time. © 2012 Nature Publishing Group
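
In the published work, the "characteristic changes in the electrical environment" are sustained jumps in pipette resistance as the tip presses against a cell membrane. A hedged sketch of that detection loop, with invented thresholds and simulated hardware standing in for the real rig:

```python
import random

# Invented neuron-hunting loop: descend in small steps, watch pipette
# resistance, and stop when a jump above baseline persists for several reads.

def hunt_for_neuron(read_resistance, step_down, max_steps=500,
                    jump_fraction=0.15, confirm_steps=3):
    baseline = read_resistance()
    hits = 0
    for _ in range(max_steps):
        step_down(2)                      # descend 2 micrometres
        r = read_resistance()
        if r > baseline * (1 + jump_fraction):
            hits += 1                     # resistance jump: possibly a neuron
            if hits >= confirm_steps:     # require the jump to persist
                return True               # stop here and attempt a seal
        else:
            hits, baseline = 0, r         # otherwise track slow drift
    return False

# Fake hardware: resistance jumps once the tip passes 200 micrometres deep.
depth = {"z": 0}
def step_down(um): depth["z"] += um
def read_resistance():
    return 5.0 * (1.35 if depth["z"] > 200 else 1.0) + random.gauss(0, 0.05)

print("neuron found:", hunt_for_neuron(read_resistance, step_down))
```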

Related chapters from BP7e: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 5: The Sensorimotor System
Link ID: 17215 - Posted: 08.29.2012

Analysis by Jesse Emspak The phrase "use your brainpower" may soon become literal. Engineers at MIT have developed a tiny prototype fuel cell that creates electricity from the body's natural sugars. The fuel cell could be used to power brain implants for treating epilepsy, Parkinson's disease and paralysis. Currently, devices implanted in the body are typically powered by lithium-ion batteries, but they have a limited lifetime and need to be replaced. Opening up the body to replace a battery is not something doctors like to do, but doing it in the brain is even less desirable. The researchers, led by Rahul Sarpeshkar, an associate professor of electrical engineering and computer science, built the fuel cell using a platinum catalyst at one end and a layer of carbon nanotubes at the other. It rests on a silicon chip, allowing it to be connected to electronics that would be used in brain implants. As glucose passes over the platinum, electrons and hydrogen ions are stripped off as it is oxidized. That's what makes the current. At the other end of the cell, oxygen mixes with the hydrogen to make water when it hits the layer of single-walled carbon nanotubes. The cell produces up to 180 microwatts, enough to power a brain implant that might send signals to bypass a damaged region, or stimulate part of the brain (a treatment used in disorders such as Parkinson's). © 2012 Discovery Communications, LLC.

Related chapters from BP7e: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 16982 - Posted: 06.28.2012

By Rachel Ehrenberg Directing a robotic arm with her thoughts, a paralyzed woman named Cathy can pick up a bottle of coffee and sip it through a straw, a simple task that she hasn’t done on her own for nearly 15 years. The technology that brought about the feat is a brain-computer interface system: A computer decodes signals from a tiny chip implanted in the woman’s brain, translating her thoughts into actions that are carried out by the robot arm. The seemingly mundane task of bringing a drink to one’s mouth is the first published demonstration that severely paralyzed people can conduct directed movements in three-dimensional space using a brain-controlled robotic device. This latest application of the system, called BrainGate, is described in the May 17 Nature. “Much has been demonstrated in terms of laboratory work and monkeys, but this is the first time showing something that’s going to be useful for patients,” says neuroscientist Andrew Jackson, of Newcastle University in England. A commentary by Jackson on the new developments appears in the same issue of Nature. There’s still a lot of work to do before BrainGate can be used outside a lab. In the current design, the tiny sensor that sits in the patient’s brain is attached to a mini fridge–sized computer via ungainly wires. So making the system wireless is one goal. The researchers hope that within a decade the BrainGate system will be available and affordable for people who are paralyzed or have prosthetic limbs. Eventually, similar technology might restore function to a natural limb that no longer works. © Society for Science & the Public 2000 - 2012

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16807 - Posted: 05.17.2012

by Greg Miller Spinal cord injuries cause paralysis because they sever crucial communication links between the brain and the muscles that move limbs. A new study with monkeys demonstrates a way to re-establish those connections. By implanting electrodes in a movement control center in the brain and wiring them up to electrodes attached to muscles in the arm, researchers restored movement to monkeys with a temporarily paralyzed hand. The work is the latest promising development in the burgeoning field of neuroprosthetics. In recent years, scientists have taken many steps toward creating prosthetics to help paralyzed people interact more with the world around them. They've developed methods to decode signals from electrodes implanted in the brain so that a paralyzed person can control a cursor on a computer screen or manipulate a robotic arm with their thoughts alone. Such brain implants are still experimental, and only a handful of people have received them. Several hundred patients have received a different kind of neural prosthetic that uses residual shoulder movement or nerve activity to stimulate arm muscles, allowing them to grasp objects with their hands. The new study combines these two approaches. Neuroscientist Lee Miller of the Northwestern University Feinberg School of Medicine in Chicago, Illinois, and colleagues implanted electrode grids into the primary motor cortex of two monkeys. This brain region issues commands that move muscles throughout the body, and the researchers positioned the electrodes in the part of the primary motor cortex that controls the hand, enabling them to record the electrical activity of about 100 neurons there. © 2010 American Association for the Advancement of Science.
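
The system's two halves, a cortical decoder and a muscle stimulator, meet in a mapping from decoded intent to stimulation strength. The sketch below is a deliberately simplified stand-in for that bridge, a linear readout feeding a linear current map with invented weights, not the Northwestern group's actual decoder:

```python
# Invented stand-in for the cortex-to-muscle bridge: a linear readout of
# recorded units gives a 0..1 grasp level, which sets stimulation current.

def decode_grasp_intent(rates, weights, bias=0.0):
    """Linear readout of motor-cortex spike counts into a 0..1 grasp level."""
    drive = sum(r * w for r, w in zip(rates, weights)) + bias
    return min(max(drive, 0.0), 1.0)

def stimulation_current(grasp_level, i_threshold=5.0, i_max=20.0):
    """Map grasp level onto stimulation current (mA), threshold to maximum."""
    return i_threshold + grasp_level * (i_max - i_threshold)

rates = [12, 40, 3, 25]                 # spike counts from 4 of ~100 units
weights = [0.01, 0.02, -0.005, 0.015]   # fitted per-unit weights (invented)
level = decode_grasp_intent(rates, weights)
print(f"grasp level {level:.2f} -> stimulate at {stimulation_current(level):.1f} mA")
```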

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16675 - Posted: 04.19.2012

By DAVID EWING DUNCAN SAN DIEGO — Already surrounded by machines that allow him, painstakingly, to communicate, the physicist Stephen Hawking last summer donned what looked like a rakish black headband that held a feather-light device the size of a small matchbox. Called the iBrain, this simple-looking contraption is part of an experiment that aims to allow Dr. Hawking — long paralyzed by amyotrophic lateral sclerosis, or Lou Gehrig’s disease — to communicate by merely thinking. The iBrain is part of a new generation of portable neural devices and algorithms intended to monitor and diagnose conditions like sleep apnea, depression and autism. Invented by a team led by Philip Low, a 32-year-old neuroscientist who is chief executive of NeuroVigil, a company based in San Diego, the iBrain is gaining attention as a possible alternative to expensive sleep labs that use rubber and plastic caps riddled with dozens of electrodes and usually require a patient to stay overnight. “The iBrain can collect data in real time in a person’s own bed, or when they’re watching TV, or doing just about anything,” Dr. Low said. The device uses a single channel to pick up waves of electrical brain signals, which change with different activities and thoughts, or with the pathologies that accompany brain disorders. © 2012 The New York Times Company

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16600 - Posted: 04.04.2012

David T. Blake After training, animals and humans can make their thoughts interact directly with computers. A study provides evidence that the corticostriatal system of the brain is essential for this learning process. Brain–machine interfaces have a rich history in the sci-fi genre: in The Matrix films, human brains are plugged into a computer-based simulation that then becomes their 'reality'. But using our thoughts to directly control computers or other devices is not just in the realm of fantasy. Monkeys can learn to use visual cues to instruct a brain–machine interface to move a robotic arm or a computer cursor [1, 2]. And electrode arrays were implanted into the brain of a paralysed man in 2006, enabling him to control an artificial arm, to move a cursor on a computer screen and even to open e-mail [3]. Over time, an individual learns to improve their control over the brain–machine interface by modifying the activity of their brain, but how this happens is not well understood. In an article published on Nature's website today, Koralek et al. [4] report that the corticostriatal system of the brain is involved in learning mental actions and skills that do not involve physical movement, such as those required for control of brain–machine interfaces. The corticostriatal system has a unique pattern of connectivity that enables sensory inputs to be associated with appropriate motor or cognitive responses [5]. It consists of a cortical component, the primary motor cortex, that exerts control over muscles, and a striatal component, the basal ganglia, that receives direct inputs from the motor cortex. The basal ganglia are involved in a wide range of learning conditions and are crucial to the motor deficits observed in Parkinson's and Huntington's diseases. Both corticostriatal components have a role in the learning and execution of physical skills requiring movement. © 2012 Nature Publishing Group

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16469 - Posted: 03.05.2012

M. Mitchell Waldrop It wasn't quite the lynching that Henry Markram had expected. But the barrage of sceptical comments from his fellow neuroscientists — “It's crap,” said one — definitely made the day feel like a tribunal. Officially, the Swiss Academy of Sciences meeting in Bern on 20 January was an overview of large-scale computer modelling in neuroscience. Unofficially, it was neuroscientists' first real chance to get answers about Markram's controversial proposal for the Human Brain Project (HBP) — an effort to build a supercomputer simulation that integrates everything known about the human brain, from the structures of ion channels in neural cell membranes up to mechanisms behind conscious decision-making. Markram, a South-African-born brain electrophysiologist who joined the Swiss Federal Institute of Technology in Lausanne (EPFL) a decade ago, may soon see his ambition fulfilled. The project is one of six finalists vying to win €1 billion (US$1.3 billion) as one of the European Union's two new decade-long Flagship initiatives. “Brain researchers are generating 60,000 papers per year,” said Markram as he explained the concept in Bern. “They're all beautiful, fantastic studies — but all focused on their one little corner: this molecule, this brain region, this function, this map.” The HBP would integrate these discoveries, he said, and create models to explore how neural circuits are organized, and how they give rise to behaviour and cognition — among the deepest mysteries in neuroscience. Ultimately, said Markram, the HBP would even help researchers to grapple with disorders such as Alzheimer's disease. “If we don't have an integrated view, we won't understand these diseases,” he declared. © 2012 Nature Publishing Group

Related chapters from BP7e: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 16416 - Posted: 02.23.2012

by Anil Ananthaswamy Government spooks want cyborg insects to snoop on their enemies. Biologists want to tap into the nervous systems of insects to understand how they fly. A probe that can be implanted into moths to control their flight could help satisfy both parties. One day, it could even help rehabilitate people who have had strokes. The US Defense Advanced Research Projects Agency (DARPA) has been running a programme to develop machine-insect interfaces for years, but electrodes implanted to stimulate the brains or wing muscles of insects were not precise enough. Now Joel Voldman of the Massachusetts Institute of Technology and colleagues have designed a unique, flexible neural probe that can be attached directly to an insect's ventral nerve cord (VNC), which, along with the brain, makes up the central nervous system in insects. Another reason previous attempts were not entirely successful was that the impedance of the electrodes did not match that of the insect's tissue. This probe is made of a polyimide polymer coated with gold and carbon nanotubes, and its impedance is much closer to that of nerve tissue. One end of the probe is a ring that clamps around the VNC. The inside of the ring has five electrodes which stimulate distinct nerve bundles within the VNC. Attached to the probe is a wireless stimulator, which contains a radio receiver, as well as a battery and a device to generate electrical pulses. The team implanted the device in the abdomen of a tobacco hawkmoth (Manduca sexta). As it weighs less than half a gram, it is easy for the moth to carry. "Their wingspan is the width of your hand," says Voldman. "These are big guys." © Copyright Reed Business Information Ltd

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16359 - Posted: 02.09.2012

By Amber Dance A fighter pilot heads back to base after a long mission, feeling spent. A warning light flashes on the control panel. Has she noticed? If so, is she focused enough to fix the problem? Thanks to current advances in electroencephalographic (EEG) brain-wave detection technology, military commanders may not have to guess the answers to these questions much longer. They could soon be monitoring her mental state via helmet sensors, looking for signs she is concentrating on her flying and reacting to the warning light. This is possible because of two key advances that made EEG technology wireless and mobile, says Scott Makeig, director of the University of California, San Diego's Swartz Center for Computational Neuroscience (SCCN) in La Jolla, Calif. EEG used to require users to sit motionless, weighted down by heavy wires. Movement interfered with the signals, so that even an eyebrow twitch could garble the brain impulses. Modern technology lightened the load and wirelessly linked the sensors and the computers that collect the data. In addition, Makeig and others developed better algorithms—in particular, independent component analysis. By reading signals from several electrodes, they can infer where, within the skull, a particular impulse originated. This is akin to listening to a single speaker's voice in a crowded room. In so doing, they are also able to filter out movements—not just eyebrow twitches, but also the muscle flexing needed to walk, talk or fly a plane. © 2012 Scientific American,
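
Independent component analysis treats each electrode as recording its own mixture of underlying sources and solves for the unmixed sources, which is what lets the software tell an eyebrow twitch from a brain rhythm. A toy demonstration on synthetic data, using scikit-learn's FastICA:

```python
import numpy as np
from sklearn.decomposition import FastICA

# Three electrodes record different mixtures of two sources (a brain rhythm
# and eye blinks); ICA recovers the unmixed sources. Toy data throughout.

rng = np.random.default_rng(2)
t = np.linspace(0, 2, 512)
brain = np.sin(2 * np.pi * 10 * t)                  # a 10 Hz "brain" rhythm
blink = (rng.random(t.size) > 0.98).astype(float)   # sparse "eye blink" spikes

mixing = np.array([[1.0, 0.4],       # each row: one electrode's blend
                   [0.7, 1.1],
                   [0.3, 0.9]])
electrodes = np.column_stack([brain, blink]) @ mixing.T
electrodes += rng.normal(scale=0.05, size=electrodes.shape)

ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(electrodes)   # (512, 2): one column per source
print(sources.shape)
```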

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16303 - Posted: 01.28.2012

by Jason Daley Over recent months, in José del R. Millán’s computer science lab in Switzerland, a little round robot, similar to a Roomba with a laptop mounted on it, bumped its way through an office space filled with furniture and people. Nothing special, except the robot was being controlled from a clinic more than 60 miles away—and not with a joystick or keyboard, but with the brain waves of a paralyzed patient. The robot’s journey was an experiment in shared control, a type of brain-machine interface that merges conscious thought and algorithms to give disabled patients finer mental control over devices that help them communicate or retrieve objects. If the user experiences a mental misfire, Millán’s software can step in to help. Instead of crashing down the stairs, for instance, the robot would recalculate to find the door. Such technology is a potential life changer for the tens of thousands of people suffering from locked-in syndrome, a type of paralysis that leaves patients with only the ability to blink. The condition is usually incurable, but Millán’s research could make it more bearable, allowing patients to engage the world through a robotic proxy. “The last 10 years have been like a proof of concept,” says Justin Sanchez, director of the Neuroprosthetics Research Group at the University of Miami, who is also studying shared control. “But the research is moving fast. Now there is a big push to get these devices to people who need them for everyday life.” © 2012, Kalmbach Publishing Co.
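
A common way to implement shared control of this kind is to blend the user's decoded command with an autonomous safety command, shifting weight toward the machine as obstacles close in. A minimal sketch of such a blending rule (the weighting scheme and numbers are illustrative, not Millán's software):

```python
import numpy as np

# Blend the user's decoded command with an avoidance command, weighting the
# machine more heavily as obstacles get close. The weighting rule is invented.

def shared_control(user_cmd, avoid_cmd, obstacle_dist, danger_dist=0.5):
    """Both commands are 2D velocity vectors; distances are in metres."""
    # alpha -> 1 far from obstacles (user in charge);
    # alpha -> 0 inside danger_dist (machine takes over)
    alpha = np.clip(obstacle_dist / danger_dist, 0.0, 1.0)
    return alpha * np.asarray(user_cmd) + (1 - alpha) * np.asarray(avoid_cmd)

# The user "thinks" forward; the planner steers right around a chair 0.2 m away.
print(shared_control(user_cmd=[1.0, 0.0], avoid_cmd=[0.0, 1.0], obstacle_dist=0.2))
```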

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16214 - Posted: 01.05.2012

Scientists are getting closer to the dream of creating computer systems that can replicate the brain. Researchers at the Massachusetts Institute of Technology have designed a computer chip that mimics how the brain's neurons adapt in response to new information. Such chips could eventually enable communication between artificially created body parts and the brain. They could also pave the way for artificial intelligence devices. There are about 100 billion neurons in the brain, each of which forms synapses - the connections between neurons that allow information to flow - with many other neurons. This process is known as plasticity and is believed to underpin many brain functions, such as learning and memory. The MIT team, led by research scientist Chi-Sang Poon, has been able to design a computer chip that can simulate the activity of a single brain synapse. Activity in the synapses relies on so-called ion channels, which control the flow of charged atoms such as sodium, potassium and calcium. The 'brain chip' has about 400 transistors and is wired up to replicate the circuitry of the brain. BBC © 2011
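
The chip's roughly 400 transistors emulate in analog circuitry the ion-channel dynamics that a digital simulation steps through explicitly. For a sense of what is being emulated, here is a toy digital model of one synapse, in which a presynaptic spike opens channels whose conductance then decays (generic constants, not the chip's parameters):

```python
# Toy digital model of one synapse: a presynaptic spike opens a burst of
# channels, the open-channel conductance decays exponentially, and current
# flows toward the synapse's reversal potential. Generic constants only.

dt = 1e-4                        # 0.1 ms time step
tau_syn = 5e-3                   # conductance decay time constant (5 ms)
e_syn, v_post = 0.0, -65e-3      # excitatory reversal and resting potentials (V)
g = 0.0                          # synaptic conductance (arbitrary units)

spike_steps = {100, 120, 300}    # presynaptic spikes at 10, 12 and 30 ms
for step in range(int(0.05 / dt)):
    if step in spike_steps:
        g += 1.0                 # each spike opens a pulse of channels
    g -= dt / tau_syn * g        # open channels close exponentially
    i_syn = g * (e_syn - v_post) # current toward the reversal potential
print(f"conductance after 50 ms: {g:.4f}")
```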

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16053 - Posted: 11.19.2011

Melissae Fellet, reporter A paralysed man has high-fived his girlfriend using a robotic arm controlled only by his thoughts. Tim Hemmes, who was paralysed in a motorcycle accident seven years ago, is the first participant in a clinical trial testing a brain implant that directs movement of an external device. Neurosurgeons at the University of Pittsburgh School of Medicine in Pennsylvania implanted a grid of electrodes, about the size of a large postage stamp, on top of Hemmes's brain over an area of neurons that fire when he imagines moving his right arm. They threaded wires from the implant underneath the skin of his neck and pulled the ends out of his body near his chest. The team then connected the implant to a computer that converts specific brainwaves into particular actions. Hemmes first practised controlling a dot on a TV screen with his mind. The dot moved right when he imagined bending his elbow. Thinking about wiggling his thumb made the dot slide left. With practice, Hemmes learned to move the cursor just by visualizing the motion, rather than concentrating on specific arm movements, says neurosurgeon Elizabeth Tyler-Kabara of the University of Pittsburgh in Pennsylvania, who implanted the electrodes. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 15910 - Posted: 10.13.2011

by Ferris Jabr Monkeys have feelings too. In a mind-meld between monkey and computer, rhesus macaques have learned to "feel" the texture of virtual objects without physically touching a thing. In the future, prosthetic limbs modelled on similar technology could return a sense of touch to people with amputations. Using two-way communication between brain and machine, the monkeys manoeuvred a cursor with their minds and identified virtual objects by texture, based on electrical feedback from the computer. Miguel Nicolelis of Duke University Medical Center in Durham, North Carolina, and his colleagues implanted electrodes into the brains of two monkeys. The electrodes recorded activity in the motor cortex and somatosensory cortex (SSC) – brain areas that orchestrate voluntary movement and sense of touch. Electrical activity from the motor cortex was sent to a computer, which translated the neural chatter into instructions that moved a cursor on screen. The monkeys learned what patterns of thought reliably changed the cursor's position. The team then assigned a unique texture to each of three identical circles on the screen. When the cursor hovered over each circle, the computer zapped the monkeys' SSCs with the same electrical impulses that occurred when they touched each texture in real life. Finally, the team taught the monkeys to associate a particular texture with a reward. © Copyright Reed Business Information Ltd.
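
The feedback half of this brain-machine-brain loop amounts to a lookup from virtual texture to a microstimulation pattern. A deliberately simple sketch of that mapping, with texture names and frequencies invented for illustration (the study's actual stimulation patterns were richer):

```python
# Texture-to-stimulation lookup: while the cursor sits on a virtual object,
# its assigned texture selects a pulse frequency for the stimulating electrode.

TEXTURES = {           # texture name -> stimulation pulse frequency (Hz)
    "smooth": 0,       # no stimulation
    "fine":   50,
    "coarse": 100,
}

def feedback_pulses(texture, duration_s=0.1):
    """Pulse times (s) to deliver while the cursor is over the object."""
    hz = TEXTURES[texture]
    return [i / hz for i in range(int(hz * duration_s))] if hz else []

print(feedback_pulses("coarse")[:5])   # first pulses of the 100 Hz pattern
```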

Related chapters from BP7e: Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 15881 - Posted: 10.06.2011

by Linda Geddes An artificial cerebellum has restored lost brain function in rats, bringing the prospect of cyborg-style brain implants a step closer to reality. Such implants could eventually be used to replace areas of brain tissue damaged by stroke and other conditions, or even to enhance healthy brain function and restore learning processes that decline with age. Cochlear implants and prosthetic limbs have already proved that it is possible to wire electrical devices into the brain and make sense of them, but such devices involve only one-way communication, either from the device to the brain or vice versa. Now Matti Mintz of Tel Aviv University in Israel and his colleagues have created a synthetic cerebellum which can receive sensory inputs from the brainstem - a region that acts as a conduit for neuronal information from the rest of the body. Their device can interpret these inputs, and send a signal to a different region of the brainstem that prompts motor neurons to execute the appropriate movement. "It's proof of concept that we can record information from the brain, analyse it in a way similar to the biological network, and return it to the brain," says Mintz, who presented the work this month at the Strategies for Engineered Negligible Senescence meeting in Cambridge, UK. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 15850 - Posted: 09.29.2011