Links for Keyword: Robotics



Links 121 - 140 of 276

Analysis by Jesse Emspak The phrase "use your brainpower" may soon become literal. Engineers at MIT have developed a tiny prototype fuel cell that creates electricity from the body's natural sugars. The fuel cell could be used to power brain implants for treating epilepsy, Parkinson's disease and paralysis. Currently, devices implanted in the body are typically powered by lithium-ion batteries, but these have a limited lifetime and need to be replaced. Opening up the body to replace a battery is not something doctors like to do, and doing it in the brain is even less desirable. The researchers, led by Rahul Sarpeshkar, an associate professor of electrical engineering and computer science, built the fuel cell using a platinum catalyst at one end and a layer of carbon nanotubes at the other. It rests on a silicon chip, allowing it to be connected to the electronics that would be used in brain implants. As glucose passes over the platinum, it is oxidized, stripping off electrons and hydrogen ions; that is what generates the current. At the other end of the cell, oxygen mixes with the hydrogen to make water when it hits the layer of single-walled carbon nanotubes. The cell produces up to 180 microwatts, enough to power a brain implant that might send signals to bypass damaged regions, or stimulate part of the brain (a treatment used in disorders such as Parkinson's). © 2012 Discovery Communications, LLC.
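For readers who want the electrode chemistry spelled out, the display below gives the simplified two-electron scheme usually written for abiotic, platinum-catalysed glucose fuel cells, in which glucose is oxidised to gluconolactone at the anode. The article does not state the exact reaction products, so treat this as an illustrative textbook scheme rather than the MIT team's published chemistry.

    % Simplified abiotic glucose fuel cell reactions (illustrative assumption)
    \begin{align*}
    \text{Anode (Pt):}\qquad & \mathrm{C_6H_{12}O_6 \;\longrightarrow\; C_6H_{10}O_6 + 2\,H^+ + 2\,e^-}\\
    \text{Cathode (nanotubes):}\qquad & \mathrm{\tfrac{1}{2}\,O_2 + 2\,H^+ + 2\,e^- \;\longrightarrow\; H_2O}\\
    \text{Overall:}\qquad & \mathrm{C_6H_{12}O_6 + \tfrac{1}{2}\,O_2 \;\longrightarrow\; C_6H_{10}O_6 + H_2O}
    \end{align*}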

Related chapters from BN: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM: Chapter 2: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 16982 - Posted: 06.28.2012

By Rachel Ehrenberg Directing a robotic arm with her thoughts, a paralyzed woman named Cathy can pick up a bottle of coffee and sip it through a straw, a simple task that she hasn’t done on her own for nearly 15 years. The technology that brought about the feat is a brain-computer interface system: A computer decodes signals from a tiny chip implanted in the woman’s brain, translating her thoughts into actions that are carried out by the robot arm. The seemingly mundane task of bringing a drink to one’s mouth is the first published demonstration that severely paralyzed people can conduct directed movements in three-dimensional space using a brain-controlled robotic device. This latest application of the system, called BrainGate, is described in the May 17 Nature. “Much has been demonstrated in terms of laboratory work and monkeys, but this is the first time showing something that’s going to be useful for patients,” says neuroscientist Andrew Jackson, of Newcastle University in England. A commentary by Jackson on the new developments appears in the same issue of Nature. There’s still a lot of work to do before BrainGate can be used outside a lab. In the current design, the tiny sensor that sits in the patient’s brain is attached to a mini fridge–sized computer via ungainly wires. So making the system wireless is one goal. The researchers hope that within a decade the BrainGate system will be available and affordable for people who are paralyzed or have prosthetic limbs. Eventually, similar technology might restore function to a natural limb that no longer works. © Society for Science & the Public 2000 - 2012
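The decoding step described here, turning recorded neural activity into a three-dimensional velocity command for the robot arm, is often illustrated with a simple linear readout: binned firing rates multiplied by a weight matrix fitted during a calibration session. The Python sketch below shows that general idea on fake data; the unit count and weights are assumptions for illustration, and it is not the actual BrainGate decoder, which the article does not detail.

    import numpy as np

    # Minimal sketch of a linear neural decoder (illustrative assumption, not the BrainGate algorithm).
    rng = np.random.default_rng(0)
    n_units = 96                                     # assumed size of the recording array
    weights = rng.normal(size=(3, n_units)) * 0.01   # would normally be fit to calibration data

    def decode_velocity(firing_rates, weights):
        """Map a vector of binned firing rates to a 3-D arm velocity command."""
        return weights @ firing_rates                # (vx, vy, vz) in arbitrary units

    firing_rates = rng.poisson(lam=20, size=n_units)  # fake spike counts for one time bin
    print(decode_velocity(firing_rates, weights))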

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16807 - Posted: 05.17.2012

by Greg Miller Spinal cord injuries cause paralysis because they sever crucial communication links between the brain and the muscles that move limbs. A new study with monkeys demonstrates a way to re-establish those connections. By implanting electrodes in a movement control center in the brain and wiring them up to electrodes attached to muscles in the arm, researchers restored movement to monkeys with a temporarily paralyzed hand. The work is the latest promising development in the burgeoning field of neuroprosthetics. In recent years, scientists have taken many steps toward creating prosthetics to help paralyzed people interact more with the world around them. They've developed methods to decode signals from electrodes implanted in the brain so that a paralyzed person can control a cursor on a computer screen or manipulate a robotic arm with their thoughts alone. Such brain implants are still experimental, and only a handful of people have received them. Several hundred patients have received a different kind of neural prosthetic that uses residual shoulder movement or nerve activity to stimulate arm muscles, allowing them to grasp objects with their hands. The new study combines these two approaches. Neuroscientist Lee Miller of the Northwestern University Feinberg School of Medicine in Chicago, Illinois, and colleagues implanted electrode grids into the primary motor cortex of two monkeys. This brain region issues commands that move muscles throughout the body, and the researchers positioned the electrodes in the part of the primary motor cortex that controls the hand, enabling them to record the electrical activity of about 100 neurons there. © 2010 American Association for the Advancement of Science.
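Because the system routes decoded motor-cortex activity to stimulating electrodes on the arm muscles, the central computation is a mapping from recorded firing rates to stimulation strength. The fragment below is a hypothetical sketch of such a mapping, a linear readout of grasp intent clipped to a safe current ceiling; the gain, current limit and weights are invented for illustration and are not the parameters used by Miller's group.

    import numpy as np

    # Hypothetical sketch: decoded grasp intent -> functional electrical stimulation amplitude.
    def grasp_intent(firing_rates, readout_weights):
        """Scalar 'grasp drive' decoded from ~100 motor cortex units (linear readout)."""
        return float(readout_weights @ firing_rates)

    def stimulation_current_mA(intent, gain=0.5, max_mA=20.0):
        """Convert decoded intent to a stimulation amplitude, clipped to an assumed safe ceiling."""
        return float(np.clip(gain * intent, 0.0, max_mA))

    rng = np.random.default_rng(1)
    rates = rng.poisson(lam=15, size=100)        # fake firing rates for 100 recorded units
    weights = rng.normal(0, 0.05, size=100)      # made-up readout weights
    print(stimulation_current_mA(grasp_intent(rates, weights)))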

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16675 - Posted: 04.19.2012

By DAVID EWING DUNCAN SAN DIEGO — Already surrounded by machines that allow him, painstakingly, to communicate, the physicist Stephen Hawking last summer donned what looked like a rakish black headband that held a feather-light device the size of a small matchbox. Called the iBrain, this simple-looking contraption is part of an experiment that aims to allow Dr. Hawking — long paralyzed by amyotrophic lateral sclerosis, or Lou Gehrig’s disease — to communicate by merely thinking. The iBrain is part of a new generation of portable neural devices and algorithms intended to monitor and diagnose conditions like sleep apnea, depression and autism. Invented by a team led by Philip Low, a 32-year-old neuroscientist who is chief executive of NeuroVigil, a company based in San Diego, the iBrain is gaining attention as a possible alternative to expensive sleep labs that use rubber and plastic caps riddled with dozens of electrodes and usually require a patient to stay overnight. “The iBrain can collect data in real time in a person’s own bed, or when they’re watching TV, or doing just about anything,” Dr. Low said. The device uses a single channel to pick up waves of electrical brain signals, which change with different activities and thoughts, or with the pathologies that accompany brain disorders. © 2012 The New York Times Company
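Because the iBrain records a single EEG channel and the diagnostically useful information lies in how that signal's frequency content changes over time (with sleep stages, for instance), a natural first analysis step is a spectrogram. The snippet below sketches that step on synthetic data; the sampling rate and window settings are assumed values, and this is not NeuroVigil's proprietary algorithm, which the article does not describe.

    import numpy as np
    from scipy.signal import spectrogram

    # Illustrative time-frequency analysis of a single-channel EEG recording (synthetic data).
    fs = 256                                         # assumed sampling rate in Hz
    rng = np.random.default_rng(0)
    t = np.arange(0, 60, 1 / fs)                     # one minute of data
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)  # 10 Hz rhythm + noise

    # 4-second windows with 50% overlap; typical choices, not NeuroVigil's actual parameters.
    freqs, times, power = spectrogram(eeg, fs=fs, nperseg=4 * fs, noverlap=2 * fs)
    print(power.shape)                               # (frequencies, time windows)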

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16600 - Posted: 04.04.2012

David T. Blake After training, animals and humans can make their thoughts interact directly with computers. A study provides evidence that the corticostriatal system of the brain is essential for this learning process. Brain–machine interfaces have a rich history in the sci-fi genre: in The Matrix films, human brains are plugged into a computer-based simulation that then becomes their 'reality'. But using our thoughts to directly control computers or other devices is not just in the realm of fantasy. Monkeys can learn to use visual cues to instruct a brain–machine interface to move a robotic arm or a computer cursor [1, 2]. And electrode arrays were implanted into the brain of a paralysed man in 2006, enabling him to control an artificial arm, to move a cursor on a computer screen and even to open e-mail [3]. Over time, an individual learns to improve their control over the brain–machine interface by modifying the activity of their brain, but how this happens is not well understood. In an article published on Nature's website today, Koralek et al. [4] report that the corticostriatal system of the brain is involved in learning mental actions and skills that do not involve physical movement, such as those required for control of brain–machine interfaces. The corticostriatal system has a unique pattern of connectivity that enables sensory inputs to be associated with appropriate motor or cognitive responses [5]. It consists of a cortical component, the primary motor cortex, that exerts control over muscles, and a striatal component, the basal ganglia, that receives direct inputs from the motor cortex. The basal ganglia are involved in a wide range of learning conditions and are crucial to the motor deficits observed in Parkinson's and Huntington's diseases. Both corticostriatal components have a role in the learning and execution of physical skills requiring movement. © 2012 Nature Publishing Group

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16469 - Posted: 03.05.2012

M. Mitchell Waldrop It wasn't quite the lynching that Henry Markram had expected. But the barrage of sceptical comments from his fellow neuroscientists — “It's crap,” said one — definitely made the day feel like a tribunal. Officially, the Swiss Academy of Sciences meeting in Bern on 20 January was an overview of large-scale computer modelling in neuroscience. Unofficially, it was neuroscientists' first real chance to get answers about Markram's controversial proposal for the Human Brain Project (HBP) — an effort to build a supercomputer simulation that integrates everything known about the human brain, from the structures of ion channels in neural cell membranes up to mechanisms behind conscious decision-making. Markram, a South-African-born brain electrophysiologist who joined the Swiss Federal Institute of Technology in Lausanne (EPFL) a decade ago, may soon see his ambition fulfilled. The project is one of six finalists vying to win €1 billion (US$1.3 billion) as one of the European Union's two new decade-long Flagship initiatives. “Brain researchers are generating 60,000 papers per year,” said Markram as he explained the concept in Bern. “They're all beautiful, fantastic studies — but all focused on their one little corner: this molecule, this brain region, this function, this map.” The HBP would integrate these discoveries, he said, and create models to explore how neural circuits are organized, and how they give rise to behaviour and cognition — among the deepest mysteries in neuroscience. Ultimately, said Markram, the HBP would even help researchers to grapple with disorders such as Alzheimer's disease. “If we don't have an integrated view, we won't understand these diseases,” he declared. © 2012 Nature Publishing Group

Related chapters from BN: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 2: Functional Neuroanatomy: The Cells and Structure of the Nervous System
Related chapters from MM: Chapter 2: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 1: Cells and Structures: The Anatomy of the Nervous System
Link ID: 16416 - Posted: 02.23.2012

by Anil Ananthaswamy GOVERNMENT spooks want cyborg insects to snoop on their enemies. Biologists want to tap into the nervous systems of insects to understand how they fly. A probe that can be implanted into moths to control their flight could help satisfy both parties. One day, it could even help rehabilitate people who have had strokes. The US Defense Advanced Research Projects Agency (DARPA) has been running a programme to develop machine-insect interfaces for years, but the electrodes implanted to stimulate the brains or wing muscles of insects were not precise enough. Now Joel Voldman of the Massachusetts Institute of Technology and colleagues have designed a unique, flexible neural probe that can be attached directly to an insect's ventral nerve cord (VNC), which, along with the brain, makes up the central nervous system in insects. Another reason previous attempts were not entirely successful is that the impedance of the electrodes did not match that of the insect's tissue. This probe is made of a polyimide polymer coated with gold and carbon nanotubes, and its impedance is much closer to that of nerve tissue. One end of the probe is a ring that clamps around the VNC. The inside of the ring has five electrodes which stimulate distinct nerve bundles within the VNC. Attached to the probe is a wireless stimulator, which contains a radio receiver, as well as a battery and a device to generate electrical pulses. The team implanted the device in the abdomen of a tobacco hawkmoth (Manduca sexta). As it weighs less than half a gram, it is easy for the moth to carry. "Their wingspan is the width of your hand," says Voldman. "These are big guys." © Copyright Reed Business Information Ltd

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16359 - Posted: 02.09.2012

By Amber Dance A fighter pilot heads back to base after a long mission, feeling spent. A warning light flashes on the control panel. Has she noticed? If so, is she focused enough to fix the problem? Thanks to current advances in electroencephalographic (EEG) brain-wave detection technology, military commanders may not have to guess the answers to these questions much longer. They could soon be monitoring her mental state via helmet sensors, looking for signs she is concentrating on her flying and reacting to the warning light. This is possible because of two key advances that made EEG technology wireless and mobile, says Scott Makeig, director of the University of California, San Diego's Swartz Center for Computational Neuroscience (SCCN) in La Jolla, Calif. EEG used to require users to sit motionless, weighted down by heavy wires. Movement interfered with the signals, so that even an eyebrow twitch could garble the brain impulses. Modern technology lightened the load and wirelessly linked the sensors and the computers that collect the data. In addition, Makeig and others developed better algorithms—in particular, independent component analysis. By reading signals from several electrodes, they can infer where, within the skull, a particular impulse originated. This is akin to listening to a single speaker's voice in a crowded room. In so doing, they are also able to filter out movements—not just eyebrow twitches, but also the muscle flexing needed to walk, talk or fly a plane. © 2012 Scientific American
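Independent component analysis, the algorithm mentioned above, unmixes the signals recorded at many electrodes into statistically independent sources, which is what lets researchers separate brain activity from eye and muscle artifacts. The toy example below shows the idea on synthetic data using scikit-learn's FastICA; it is only an illustration of the principle, not the SCCN processing pipeline.

    import numpy as np
    from sklearn.decomposition import FastICA

    # Toy illustration of ICA-based source separation (not the SCCN/EEGLAB pipeline).
    rng = np.random.default_rng(0)
    n_samples = 2000
    t = np.linspace(0, 8, n_samples)

    # Two hidden "sources": a brain-like 10 Hz oscillation and a sparse, spiky artifact.
    sources = np.c_[np.sin(2 * np.pi * 10 * t),
                    (rng.random(n_samples) > 0.99).astype(float)]

    # Each of 4 simulated electrodes records a different mixture of the two sources, plus noise.
    mixing = rng.normal(size=(4, 2))
    electrodes = sources @ mixing.T + 0.05 * rng.normal(size=(n_samples, 4))

    ica = FastICA(n_components=2, random_state=0)
    recovered = ica.fit_transform(electrodes)   # columns approximate the original sources
    print(recovered.shape)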

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16303 - Posted: 01.28.2012

by Jason Daley Over recent months, in José del R. Millán’s computer science lab in Switzerland, a little round robot, similar to a Roomba with a laptop mounted on it, bumped its way through an office space filled with furniture and people. Nothing special, except the robot was being controlled from a clinic more than 60 miles away—and not with a joystick or keyboard, but with the brain waves of a paralyzed patient. The robot’s journey was an experiment in shared control, a type of brain-machine interface that merges conscious thought and algorithms to give disabled patients finer mental control over devices that help them communicate or retrieve objects. If the user experiences a mental misfire, Millán’s software can step in to help. Instead of crashing down the stairs, for instance, the robot would recalculate to find the door. Such technology is a potential life changer for the tens of thousands of people suffering from locked-in syndrome, a type of paralysis that leaves patients with only the ability to blink. The condition is usually incurable, but Millán’s research could make it more bearable, allowing patients to engage the world through a robotic proxy. “The last 10 years have been like a proof of concept,” says Justin Sanchez, director of the Neuroprosthetics Research Group at the University of Miami, who is also studying shared control. “But the research is moving fast. Now there is a big push to get these devices to people who need them for everyday life.” © 2012, Kalmbach Publishing Co.
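The shared-control idea described here, in which the user's decoded intent steers the robot while software corrects commands that would cause a collision, can be sketched as a simple blending rule. The toy function below is an assumed illustration of that principle (the thresholds and command format are invented), not Millán's actual controller.

    # Toy sketch of shared control for a telepresence robot (illustrative, not Millán's controller).
    # user_cmd: turning command decoded from EEG, in [-1, 1] (negative = left, positive = right).
    # obstacle_left / obstacle_right: distances in metres from the robot's proximity sensors.

    def shared_control(user_cmd, obstacle_left, obstacle_right, safe_dist=0.5):
        """Follow the user's intent, but let the robot override when a collision is imminent."""
        if obstacle_left < safe_dist and user_cmd < 0:
            return max(user_cmd, 0.0)      # refuse to steer further into the left obstacle
        if obstacle_right < safe_dist and user_cmd > 0:
            return min(user_cmd, 0.0)      # refuse to steer further into the right obstacle
        return user_cmd                    # otherwise the user's command passes through unchanged

    print(shared_control(-0.8, obstacle_left=0.3, obstacle_right=2.0))  # -> 0.0 (robot overrides)
    print(shared_control(0.4, obstacle_left=0.3, obstacle_right=2.0))   # -> 0.4 (user in control)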

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16214 - Posted: 01.05.2012

Scientists are getting closer to the dream of creating computer systems that can replicate the brain. Researchers at the Massachusetts Institute of Technology have designed a computer chip that mimics how the brain's neurons adapt in response to new information. Such chips could eventually enable communication between artificially created body parts and the brain, and could also pave the way for artificial intelligence devices. There are about 100 billion neurons in the brain, each of which forms synapses - the connections between neurons that allow information to flow - with many other neurons. The ability of those connections to strengthen or weaken over time is known as plasticity, and it is believed to underpin many brain functions, such as learning and memory. The MIT team, led by research scientist Chi-Sang Poon, has been able to design a computer chip that can simulate the activity of a single brain synapse. Activity in synapses relies on so-called ion channels, which control the flow of charged atoms such as sodium, potassium and calcium. The 'brain chip' has about 400 transistors and is wired up to replicate the circuitry of the brain. BBC © 2011
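The chip reproduces in analog circuitry the ion-channel dynamics that let a synapse pass current and change its strength. In software the same behaviour is often captured with a decaying synaptic conductance plus a simple plasticity rule; the sketch below is a generic textbook-style model for illustration, with made-up time constants, not the equations implemented on the MIT chip.

    import numpy as np

    # Generic software sketch of a plastic synapse (illustrative; not the MIT chip's equations).
    # The conductance g decays exponentially and jumps by the weight w on each presynaptic
    # spike; a toy Hebbian rule nudges w upward when pre- and postsynaptic spikes coincide.

    dt, tau_g = 0.001, 0.010          # 1 ms time step, 10 ms conductance decay (assumed values)
    w, g, lr = 0.5, 0.0, 0.05

    pre_spikes  = np.random.rand(1000) < 0.02   # 2% chance of a presynaptic spike per step
    post_spikes = np.random.rand(1000) < 0.02

    for pre, post in zip(pre_spikes, post_spikes):
        g += w if pre else 0.0
        g -= dt / tau_g * g           # exponential decay of the synaptic conductance
        if pre and post:              # crude Hebbian coincidence rule
            w = min(w + lr, 1.0)

    print(f"final weight: {w:.2f}")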

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16053 - Posted: 11.19.2011

Melissae Fellet, reporter A paralysed man has high-fived his girlfriend using a robotic arm controlled only by his thoughts (see video above). Tim Hemmes, who was paralysed in a motorcycle accident seven years ago, is the first participant in a clinical trial testing a brain implant that directs movement of an external device. Neurosurgeons at the University of Pittsburgh School of Medicine in Pennsylvania implanted a grid of electrodes, about the size of a large postage stamp, on top of Hemmes's brain over an area of neurons that fire when he imagines moving his right arm. They threaded wires from the implant underneath the skin of his neck and pulled the ends out of his body near his chest. The team then connected the implant to a computer that converts specific brainwaves into particular actions. As shown in this video, Hemmes first practices controlling a dot on a TV screen with his mind. The dot moves right when he imagines bending his elbow. Thinking about wiggling his thumb makes the dot slide left. With practice, Hemmes learned to move the cursor just by visualizing the motion, rather than concentrating on specific arm movements, says neurosurgeon Elizabeth Tyler-Kabara of the University of Pittsburgh in Pennsylvania, who implanted the electrodes. © Copyright Reed Business Information Ltd.

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 15910 - Posted: 10.13.2011

by Ferris Jabr Monkeys have feelings too. In a mind-meld between monkey and computer, rhesus macaques have learned to "feel" the texture of virtual objects without physically touching a thing. In the future, prosthetic limbs modelled on similar technology could return a sense of touch to people with amputations. Using two-way communication between brain and machine, the monkeys manoeuvred a cursor with their minds and identified virtual objects by texture, based on electrical feedback from the computer. Miguel Nicolelis of Duke University Medical Center in Durham, North Carolina, and his colleagues implanted electrodes into the brains of two monkeys. The electrodes recorded activity in the motor cortex and somatosensory cortex (SSC) – brain areas that orchestrate voluntary movement and sense of touch. Electrical activity from the motor cortex was sent to a computer, which translated the neural chatter into instructions that moved a cursor on screen. The monkeys learned what patterns of thought reliably changed the cursor's position. The team then assigned a unique texture to each of three identical circles on the screen. When the cursor hovered over each circle, the computer zapped the monkeys' SSCs with the same electrical impulses that occurred when they touched each texture in real life. Finally, the team taught the monkeys to associate a particular texture with a reward. © Copyright Reed Business Information Ltd.

Related chapters from BN: Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 15881 - Posted: 10.06.2011

by Linda Geddes AN ARTIFICIAL cerebellum has restored lost brain function in rats, bringing the prospect of cyborg-style brain implants a step closer to reality. Such implants could eventually be used to replace areas of brain tissue damaged by stroke and other conditions, or even to enhance healthy brain function and restore learning processes that decline with age. Cochlear implants and prosthetic limbs have already proved that it is possible to wire electrical devices into the brain and make sense of them, but such devices involve only one-way communication, either from the device to the brain or vice versa. Now Matti Mintz of Tel Aviv University in Israel and his colleagues have created a synthetic cerebellum which can receive sensory inputs from the brainstem - a region that acts as a conduit for neuronal information from the rest of the body. Their device can interpret these inputs, and send a signal to a different region of the brainstem that prompts motor neurons to execute the appropriate movement. "It's proof of concept that we can record information from the brain, analyse it in a way similar to the biological network, and return it to the brain," says Mintz, who presented the work this month at the Strategies for Engineered Negligible Senescence meeting in Cambridge, UK. © Copyright Reed Business Information Ltd.

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 15850 - Posted: 09.29.2011

By PAGAN KENNEDY “Fingers!” Gerwin Schalk sputtered, waving his hands around in the air. “Fingers are made to pick up a hammer.” He prodded the table, mimicking the way we poke at computer keyboards. “It’s totally ridiculous,” he said. I was visiting Schalk, a 40-year-old computer engineer, at his bunkerlike office in the Wadsworth Center, a public-health lab outside Albany that handles many of New York State’s rabies tests. It so happens that his lab is also pioneering a new way to control our computers — with thoughts instead of fingers. Schalk studies people at the Albany Medical Center who have become, not by choice, some of the world’s first cyborgs. One volunteer was a young man in his 20s who suffers from a severe form of epilepsy. He had been outfitted with a temporary device, a postcard-size patch of electrodes that sits on the brain’s cortex, known as an electrocorticographic (ECoG) implant. Surgeons use these implants to home in on the damaged tissue that causes seizures. Schalk took advantage of the implant to see if the patient could control the actions in a video game called Galaga using only his thoughts. In the videotape of this experiment, you see a young man wearing a turban of bandages with wires running from his head to a computer in a cart. “Pew, pew,” the ship on the computer screen whines, as it decimates buglike creatures. The patient flicks the spaceship back and forth by imagining that he is moving his tongue. This creates a pulse in his brain that travels through the wires into a computer. Thus, a thought becomes a software command. © 2011 The New York Times Company

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 15807 - Posted: 09.17.2011

by Sara Reardon They're not quite psychic yet, but machines are getting better at reading your mind. Researchers have invented a new, noninvasive method for recording patterns of brain activity and using them to steer a robot. Scientists hope the technology will give "locked in" patients—those too disabled to communicate with the outside world—the ability to interact with others and even give the illusion of being physically present, or "telepresent," with friends and family. Previous brain-machine interface systems have made it possible for people to control robots, cursors, or prosthetics with conscious thought, but they often take a lot of effort and concentration, says José del R. Millán, a biomedical engineer at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland, who develops brain-machine interface systems that don't need to be implanted into the brain. Millán's goal is to make control as easy as driving a car on a highway. A partially autonomous robot would allow a user to stop concentrating on tasks that he or she would normally do subconsciously, such as following a person or avoiding running into walls. But if the robot encounters an unexpected event and needs to make a split-second decision, the user's thoughts can override the robot's artificial intelligence. To test their technology, Millán and colleagues created a telepresent robot by modifying a commercially available bot called Robotino. The robot looks a bit like a platform on three wheels, and it can avoid obstacles on its own using infrared sensors. On top of the robot, the researchers placed a laptop running Skype, a voice and video Internet chat system, over a wireless Internet connection. © 2010 American Association for the Advancement of Science

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 15778 - Posted: 09.08.2011

By Laura Sanders In a fast-moving car, the brain can hit the brakes faster than the foot. By relying on brain waves that signal the intent to jam on the brakes, a new technology could shave critical milliseconds off the reaction time, researchers report online July 28 in the Journal of Neural Engineering. The work adds to a growing trend in car technology that assists drivers. Though it may eventually lead to improvements in emergency braking, the new brain signal technology isn’t ready for the road. “As a basic science study, I was quite impressed with it,” says cognitive neuroscientist Raja Parasuraman of George Mason University in Fairfax, Va. “I just think a lot more needs to be done.” In the study, computer scientist Stefan Haufe of the Berlin Institute of Technology in Germany and his colleagues measured brain wave changes while participants drove in a car simulator. The participants drove at around 60 miles per hour, following a lead car on a curvy road with heavy oncoming traffic. Every so often the lead car would slam on its brakes, so that the participant would have to either do the same or crash. For most drivers, the lag between the lead car braking and their own braking was around 700 milliseconds. Particular neural signatures were evident during this lag time, and they could be early indicators that the drivers wanted to brake. “Our approach was to obtain the intention of the driver faster than he could actually act,” Haufe says. “That’s what the neural signature is good for.” © Society for Science & the Public 2000 - 2011
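To see why a few hundred milliseconds matter at highway speed, a quick back-of-the-envelope calculation helps. The 200-millisecond head start used below is an assumed figure for illustration; the summary above gives only the roughly 700-millisecond behavioural lag.

    # Back-of-the-envelope: distance travelled during the braking reaction time at 60 mph.
    # The 200 ms "time saved" figure is an assumed illustration, not a result from the study.

    speed_mph = 60
    speed_m_per_s = speed_mph * 1609.34 / 3600     # ~26.8 m/s

    behavioural_lag_s = 0.700                      # lag reported in the summary
    assumed_time_saved_s = 0.200                   # hypothetical earlier detection via EEG

    print(f"distance covered in {behavioural_lag_s*1000:.0f} ms: "
          f"{speed_m_per_s * behavioural_lag_s:.1f} m")
    print(f"distance saved by reacting {assumed_time_saved_s*1000:.0f} ms earlier: "
          f"{speed_m_per_s * assumed_time_saved_s:.1f} m")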

Related chapters from BN: Chapter 11: Motor Control and Plasticity; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 5: The Sensorimotor System; Chapter 14: Attention and Higher Cognition
Link ID: 15636 - Posted: 07.30.2011

by Adam Piore On a cold, blustery afternoon the week before Halloween, an assortment of spiritual mediums, animal communicators, and astrologists have set up tables in the concourse beneath the Empire State Plaza in Albany, New York. The cavernous hall of shops that connects the buildings in this 98-acre complex is a popular venue for autumnal events: Oktoberfest, the Maple Harvest Festival, and today’s “Mystic Fair.” Traffic is heavy as bureaucrats with ID badges dangling from their necks stroll by during their lunch breaks. Next to the Albany Paranormal Research Society table, a middle-aged woman is solemnly explaining the workings of an electromagnetic sensor that can, she asserts, detect the presence of ghosts. Nearby, a “clairvoyant” ushers a government worker in a suit into her canvas tent. A line has formed at the table of a popular tarot card reader. Amid all the bustle and transparent hustles, few of the dabblers at the Mystic Fair are aware that there is a genuine mind reader in the building, sitting in an office several floors below the concourse. This mind reader is not able to pluck a childhood memory or the name of a loved one out of your head, at least not yet. But give him time. He is applying hard science to an aspiration that was once relegated to clairvoyants, and unlike his predecessors, he can point to some hard results. © 2011, Kalmbach Publishing Co.

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 15584 - Posted: 07.21.2011

by Duncan Graham-Rowe The latest brain-computer interfaces meet smart home technology and virtual gaming TWO friends meet in a bar in the online environment Second Life to chat about their latest tweets and favourite TV shows. Nothing unusual in that - except that both of them have Lou Gehrig's disease, otherwise known as amyotrophic lateral sclerosis (ALS), and it has left them so severely paralysed that they can only move their eyes. These Second Lifers are just two of more than 50 severely disabled people who have been trying out a sophisticated new brain-computer interface (BCI). Second Life has been controlled using BCIs before, but only to a very rudimentary level. The new interface, developed by medical engineering company G.Tec of Schiedlberg, Austria, lets users freely explore Second Life's virtual world and control their avatar within it. It can be used to give people control over their real-world environment too: opening and closing doors, controlling the TV, lights, thermostat and intercom, answering the phone, or even publishing Twitter posts. The system was developed as part of a pan-European project called Smart Homes for All, and is the first time the latest BCI technology has been combined with smart-home technology and online gaming. It uses electroencephalograph (EEG) caps to pick up brain signals, which it translates into commands that are relayed to controllers in the building, or to navigate and communicate within Second Life and Twitter. © Copyright Reed Business Information Ltd.

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 15535 - Posted: 07.07.2011

Canadian and U.S. researchers have been able to predict what hand movement a person is going to make by reading a scan of their brain. The scientists at the University of Western Ontario and the University of Oregon scanned the brains of nine volunteers at the Robarts Research Institute in London, Ont. They found they were able to distinguish somewhat accurately among plans to make three hand movements that were only slightly different from one another. Jody Culham and Jason Gallivan at the University of Western Ontario were the two lead authors of the study. "We're showing that you can decode little subtle differences in finger movements based on the goal of the movement," said Gallivan, a Ph.D. student in neuroscience at the University of Western Ontario and the lead author of the study, which was published in the Journal of Neuroscience this week. Previously, scientists had only been able to make similar predictions for animals with electrodes inserted in their brains. Functional magnetic resonance imaging, or fMRI, is far less intrusive, said Culham, a psychology professor at the University of Western Ontario who is Gallivan's supervisor and co-author. That made it possible to do such an experiment in humans. While the new discovery may bring to mind Minority Report, the 2002 movie starring Tom Cruise in which criminals are caught before they commit their crimes, Gallivan said that type of scenario is a long way off. © CBC 2011
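The decoding here is essentially a pattern-classification problem: given the fMRI activity pattern recorded while a movement is being planned, predict which of the three hand movements will follow. The snippet below sketches that setup with a standard linear classifier and cross-validation on fake data; the trial and voxel counts are invented, and this is an illustration of the general multivoxel approach, not the authors' analysis pipeline.

    import numpy as np
    from sklearn.svm import LinearSVC
    from sklearn.model_selection import cross_val_score

    # Illustrative multivoxel pattern analysis (MVPA) setup; fake data, not the study's pipeline.
    rng = np.random.default_rng(0)
    n_trials, n_voxels = 90, 200                        # assumed numbers of trials and voxels

    labels = np.repeat([0, 1, 2], n_trials // 3)        # three planned hand movements
    patterns = rng.normal(size=(n_trials, n_voxels))
    patterns[np.arange(n_trials), labels] += 1.0        # weak class-specific signal

    clf = LinearSVC(max_iter=5000)
    scores = cross_val_score(clf, patterns, labels, cv=5)
    print(f"cross-validated accuracy: {scores.mean():.2f} (chance = 0.33)")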

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 15517 - Posted: 07.02.2011

By Alyssa Danigelis A stroke is like a meteorite impact. There’s a central “core of death” surrounded by silenced neural networks. So far, no one has figured out a way to turn those neurons back on. But by adding adult stem cells to a “brain in a dish” composed of rat neurons, researchers at the University of Florida could find a way to reboot the brain -- essentially waking up quiet circuits and regenerating the core. “We take normal neurons, simulate a stroke event, and implant adult stem cells,” said Thomas DeMarse, a research scientist at the University of Florida who is working on the transplant model with assistant professor of biomedical engineering Brandi Ormerod and PhD student Crystal Stephens. The brain in the dish, or as the scientists prefer to call it, the “biologically relevant neural model,” is a computer chip with an array of 60 microelectrodes that measure the action potentials of neurons grown on top. The microelectrode array, or MEA, records the brain cell signals so the scientists can analyze them. “The beauty of the MEA is that it doesn’t just tell you the activity of one neuron, it tells you the activity of hundreds at the same time,” DeMarse said. Using MEAs is not new -- DeMarse used one in 2004 to show that brain cells could be used to control a flight simulator -- but adding adult stem cells to the mix in vitro, that is, in an experiment outside the brain, is the new part. © 2011 Discovery Communications, LLC.
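A microelectrode array like this one records a continuous voltage trace on each of its 60 channels, and a basic processing step is detecting when a trace crosses a noise-based threshold, i.e. when a nearby neuron fires. The sketch below shows that threshold-crossing step on synthetic data; the sampling rate, noise level and threshold are assumed values, and it is not the Florida group's analysis code.

    import numpy as np

    # Generic sketch of threshold-based spike detection on a 60-channel microelectrode array.
    rng = np.random.default_rng(0)
    fs = 25_000                                   # assumed sampling rate (25 kHz per channel)
    n_channels, n_samples = 60, fs                # one second of data
    traces = rng.normal(0, 10e-6, size=(n_channels, n_samples))   # ~10 µV of noise
    traces[7, 5_000] -= 80e-6                     # plant one fake spike on channel 7

    def detect_spikes(trace, k=5.0):
        """Indices where the trace dips k standard deviations below its mean (negative spikes)."""
        threshold = trace.mean() - k * trace.std()
        return np.flatnonzero(trace < threshold)

    for ch in range(n_channels):
        idx = detect_spikes(traces[ch])
        if idx.size:
            print(f"channel {ch}: spike(s) at sample(s) {idx.tolist()}")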

Related chapters from BN: Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 4: Development of the Brain
Link ID: 15511 - Posted: 06.30.2011