Links for Keyword: Robotics
By Amber Dance A fighter pilot heads back to base after a long mission, feeling spent. A warning light flashes on the control panel. Has she noticed? If so, is she focused enough to fix the problem? Thanks to current advances in electroencephalographic (EEG) brain-wave detection technology, military commanders may not have to guess the answers to these questions much longer. They could soon be monitoring her mental state via helmet sensors, looking for signs she is concentrating on her flying and reacting to the warning light. This is possible because of two key advances that made EEG technology wireless and mobile, says Scott Makeig, director of the University of California, San Diego's Swartz Center for Computational Neuroscience (SCCN) in La Jolla, Calif. EEG used to require users to sit motionless, weighted down by heavy wires. Movement interfered with the signals, so that even an eyebrow twitch could garble the brain impulses. Modern technology lightened the load and wirelessly linked the sensors and the computers that collect the data. In addition, Makeig and others developed better algorithms—in particular, independent component analysis. By reading signals from several electrodes, they can infer where, within the skull, a particular impulse originated. This is akin to listening to a single speaker's voice in a crowded room. In so doing, they are also able to filter out movements—not just eyebrow twitches, but also the muscle flexing needed to walk, talk or fly a plane. © 2012 Scientific American
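The separation step Makeig describes rests on a linear mixing model: each electrode records a weighted sum of underlying brain and muscle sources, and independent component analysis estimates the weights that undo the mixing. Below is a minimal numpy sketch of that model. Real ICA infers the unmixing matrix blindly from the statistics of the data; here it is assumed known purely to illustrate how a blink artifact lands in its own component and can be zeroed out. All signals and matrices are invented for illustration.

```python
import numpy as np

# Simulated sources: a "brain" rhythm and an eye-blink artifact.
t = np.linspace(0, 1, 500)
brain = np.sin(2 * np.pi * 10 * t)               # 10 Hz alpha-like rhythm
blink = (np.abs(t - 0.5) < 0.05).astype(float)   # brief blink transient
sources = np.vstack([brain, blink])

# Each scalp electrode records a weighted mixture of the sources.
mixing = np.array([[0.8, 0.6],
                   [0.3, 0.9]])
electrodes = mixing @ sources

# ICA would estimate the unmixing matrix blindly; we use the known
# inverse here to show that unmixing is a linear operation.
unmixing = np.linalg.inv(mixing)
recovered = unmixing @ electrodes

# The artifact now lives in its own component and can be dropped
# before projecting the data back to the electrodes.
cleaned = recovered.copy()
cleaned[1, :] = 0.0
denoised_electrodes = mixing @ cleaned
```

The practical payoff is the last step: once the blink occupies its own component, removing it leaves the brain rhythm on every electrode intact.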
by Jason Daley Over recent months, in José del R. Millán’s computer science lab in Switzerland, a little round robot, similar to a Roomba with a laptop mounted on it, bumped its way through an office space filled with furniture and people. Nothing special, except the robot was being controlled from a clinic more than 60 miles away—and not with a joystick or keyboard, but with the brain waves of a paralyzed patient. The robot’s journey was an experiment in shared control, a type of brain-machine interface that merges conscious thought and algorithms to give disabled patients finer mental control over devices that help them communicate or retrieve objects. If the user experiences a mental misfire, Millán’s software can step in to help. Instead of crashing down the stairs, for instance, the robot would recalculate to find the door. Such technology is a potential life changer for the tens of thousands of people suffering from locked-in syndrome, a type of paralysis that leaves patients with only the ability to blink. The condition is usually incurable, but Millán’s research could make it more bearable, allowing patients to engage the world through a robotic proxy. “The last 10 years have been like a proof of concept,” says Justin Sanchez, director of the Neuroprosthetics Research Group at the University of Miami, who is also studying shared control. “But the research is moving fast. Now there is a big push to get these devices to people who need them for everyday life.” © 2012, Kalmbach Publishing Co.
Scientists are getting closer to the dream of creating computer systems that can replicate the brain. Researchers at the Massachusetts Institute of Technology have designed a computer chip that mimics how the brain's neurons adapt in response to new information. Such chips could eventually enable communication between artificially created body parts and the brain. They could also pave the way for artificial intelligence devices. There are about 100 billion neurons in the brain, each of which forms synapses - the connections between neurons that allow information to flow - with many other neurons. These connections continually strengthen or weaken in response to activity, a process known as plasticity that is believed to underpin many brain functions, such as learning and memory. The MIT team, led by research scientist Chi-Sang Poon, has been able to design a computer chip that can simulate the activity of a single brain synapse. Activity in the synapses relies on so-called ion channels which control the flow of charged atoms such as sodium, potassium and calcium. The 'brain chip' has about 400 transistors and is wired up to replicate the circuitry of the brain. BBC © 2011
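The plasticity the chip emulates is often modelled with a spike-timing-dependent rule: a synapse strengthens when the presynaptic neuron fires just before the postsynaptic one, and weakens in the reverse order. Here is a minimal sketch of such a pair-based rule; the article does not say which rule the MIT chip implements, and the amplitudes and time constant below are illustrative.

```python
import math

def stdp_weight_change(dt_ms, a_plus=0.1, a_minus=0.12, tau_ms=20.0):
    """Pair-based STDP: dt_ms = t_post - t_pre.

    Pre-before-post (dt > 0) strengthens the synapse;
    post-before-pre (dt < 0) weakens it. Closer spike pairings
    produce larger changes. All constants are illustrative.
    """
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau_ms)
    elif dt_ms < 0:
        return -a_minus * math.exp(dt_ms / tau_ms)
    return 0.0

# Apply the rule to a synaptic weight over a few spike pairings.
w = 0.5
for dt in [10, 10, -10]:
    w += stdp_weight_change(dt)
```

The exponential fall-off is what makes the rule sensitive to timing: pairings tens of milliseconds apart barely move the weight, while near-coincident ones move it strongly.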
Melissae Fellet, reporter A paralysed man has high-fived his girlfriend using a robotic arm controlled only by his thoughts. Tim Hemmes, who was paralysed in a motorcycle accident seven years ago, is the first participant in a clinical trial testing a brain implant that directs movement of an external device. Neurosurgeons at the University of Pittsburgh School of Medicine in Pennsylvania implanted a grid of electrodes, about the size of a large postage stamp, on top of Hemmes's brain over an area of neurons that fire when he imagines moving his right arm. They threaded wires from the implant underneath the skin of his neck and pulled the ends out of his body near his chest. The team then connected the implant to a computer that converts specific brainwaves into particular actions. In a video of the trial, Hemmes first practices controlling a dot on a TV screen with his mind. The dot moves right when he imagines bending his elbow. Thinking about wiggling his thumb makes the dot slide left. With practice, Hemmes learned to move the cursor just by visualizing the motion, rather than concentrating on specific arm movements, says neurosurgeon Elizabeth Tyler-Kabara of the University of Pittsburgh, who implanted the electrodes. © Copyright Reed Business Information Ltd.
by Ferris Jabr Monkeys have feelings too. In a mind-meld between monkey and computer, rhesus macaques have learned to "feel" the texture of virtual objects without physically touching a thing. In the future, prosthetic limbs modelled on similar technology could return a sense of touch to people with amputations. Using two-way communication between brain and machine, the monkeys manoeuvred a cursor with their minds and identified virtual objects by texture, based on electrical feedback from the computer. Miguel Nicolelis of Duke University Medical Center in Durham, North Carolina, and his colleagues implanted electrodes into the brains of two monkeys. The electrodes recorded activity in the motor cortex and somatosensory cortex (SSC) – brain areas that orchestrate voluntary movement and sense of touch. Electrical activity from the motor cortex was sent to a computer, which translated the neural chatter into instructions that moved a cursor on screen. The monkeys learned what patterns of thought reliably changed the cursor's position. The team then assigned a unique texture to each of three identical circles on the screen. When the cursor hovered over each circle, the computer zapped the monkeys' SSCs with the same electrical impulses that occurred when they touched each texture in real life. Finally, the team taught the monkeys to associate a particular texture with a reward. © Copyright Reed Business Information Ltd.
Related chapters from BP7e: Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 15881 - Posted: 10.06.2011
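The two-way loop Nicolelis's team built has a decode half (motor cortex activity moves the cursor) and a feedback half (cursor location determines the stimulation delivered to SSC). The feedback half can be sketched as a lookup from screen position to a texture-specific stimulation pattern. The circle positions and pulse frequencies below are invented stand-ins for the real microstimulation parameters.

```python
# Map each on-screen circle to a distinct "texture" stimulation pattern.
# Frequencies are illustrative stand-ins for real microstimulation.
TEXTURE_PULSES_HZ = {"circle_a": 50, "circle_b": 100, "circle_c": 200}

# Hypothetical screen positions of the three visually identical circles.
CIRCLES = {"circle_a": (100, 100), "circle_b": (300, 100), "circle_c": (200, 300)}
RADIUS = 40

def feedback_for_cursor(x, y):
    """Return the stimulation frequency for whichever circle the
    cursor is over, or None when it is over empty screen."""
    for name, (cx, cy) in CIRCLES.items():
        if (x - cx) ** 2 + (y - cy) ** 2 <= RADIUS ** 2:
            return TEXTURE_PULSES_HZ[name]
    return None
```

Because the circles look identical, the stimulation pattern is the only cue distinguishing them, which is what lets the monkeys "feel" texture through the interface.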
by Linda Geddes AN ARTIFICIAL cerebellum has restored lost brain function in rats, bringing the prospect of cyborg-style brain implants a step closer to reality. Such implants could eventually be used to replace areas of brain tissue damaged by stroke and other conditions, or even to enhance healthy brain function and restore learning processes that decline with age. Cochlear implants and prosthetic limbs have already proved that it is possible to wire electrical devices into the brain and make sense of them, but such devices involve only one-way communication, either from the device to the brain or vice versa. Now Matti Mintz of Tel Aviv University in Israel and his colleagues have created a synthetic cerebellum which can receive sensory inputs from the brainstem - a region that acts as a conduit for neuronal information from the rest of the body. Their device can interpret these inputs, and send a signal to a different region of the brainstem that prompts motor neurons to execute the appropriate movement. "It's proof of concept that we can record information from the brain, analyse it in a way similar to the biological network, and return it to the brain," says Mintz, who presented the work this month at the Strategies for Engineered Negligible Senescence meeting in Cambridge, UK. © Copyright Reed Business Information Ltd.
By PAGAN KENNEDY “Fingers!” Gerwin Schalk sputtered, waving his hands around in the air. “Fingers are made to pick up a hammer.” He prodded the table, mimicking the way we poke at computer keyboards. “It’s totally ridiculous,” he said. I was visiting Schalk, a 40-year-old computer engineer, at his bunkerlike office in the Wadsworth Center, a public-health lab outside Albany that handles many of New York State’s rabies tests. It so happens that his lab is also pioneering a new way to control our computers — with thoughts instead of fingers. Schalk studies people at the Albany Medical Center who have become, not by choice, some of the world’s first cyborgs. One volunteer was a young man in his 20s who suffers from a severe form of epilepsy. He had been outfitted with a temporary device, a postcard-size patch of electrodes that sits on the brain’s cortex, known as an electrocorticographic (ECoG) implant. Surgeons use these implants to home in on the damaged tissue that causes seizures. Schalk took advantage of the implant to see if the patient could control the actions in a video game called Galaga using only his thoughts. In the videotape of this experiment, you see a young man wearing a turban of bandages with wires running from his head to a computer in a cart. “Pew, pew,” the ship on the computer screen whines, as it decimates buglike creatures. The patient flicks the spaceship back and forth by imagining that he is moving his tongue. This creates a pulse in his brain that travels through the wires into a computer. Thus, a thought becomes a software command. © 2011 The New York Times Company
by Sara Reardon They're not quite psychic yet, but machines are getting better at reading your mind. Researchers have invented a new, noninvasive method for recording patterns of brain activity and using them to steer a robot. Scientists hope the technology will give "locked in" patients—those too disabled to communicate with the outside world—the ability to interact with others and even give the illusion of being physically present, or "telepresent," with friends and family. Previous brain-machine interface systems have made it possible for people to control robots, cursors, or prosthetics with conscious thought, but they often take a lot of effort and concentration, says José del R. Millán, a biomedical engineer at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland, who develops brain-machine interface systems that don't need to be implanted into the brain. Millán's goal is to make control as easy as driving a car on a highway. A partially autonomous robot would allow a user to stop concentrating on tasks that he or she would normally do subconsciously, such as following a person or avoiding running into walls. But if the robot encounters an unexpected event and needs to make a split-second decision, the user's thoughts can override the robot's artificial intelligence. To test their technology, Millán and colleagues created a telepresent robot by modifying a commercially available bot called Robotino. The robot looks a bit like a platform on three wheels, and it can avoid obstacles on its own using infrared sensors. On top of the robot, the researchers placed a laptop running Skype, a voice and video Internet chat system, over a wireless Internet connection. © 2010 American Association for the Advancement of Science
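Millán's shared-control idea reduces to a simple arbitration rule: the robot follows its own sensor-driven command by default, and a decoded user command takes over only when the decoder is confident enough. A hedged sketch follows; the function name, command strings and confidence threshold are illustrative, not details of the EPFL system.

```python
def shared_control(user_cmd, user_confidence, sensor_cmd, threshold=0.7):
    """Arbitrate between decoded user intent and onboard autonomy.

    The robot's own obstacle avoidance (sensor_cmd) runs by default,
    so the user need not concentrate continuously; a sufficiently
    confident decoded command overrides it for split-second decisions.
    The threshold is an illustrative tuning parameter.
    """
    if user_cmd is not None and user_confidence >= threshold:
        return user_cmd
    return sensor_cmd

# Routine driving: no user input, the infrared sensors steer.
routine = shared_control(None, 0.0, "veer_left")
# Unexpected event: a confident decoded thought takes over.
override = shared_control("turn_right", 0.9, "veer_left")
```

The design choice mirrors the article's car-on-a-highway analogy: autonomy handles the subconscious parts of driving, while high-confidence intent handles the exceptions.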
By Laura Sanders In a fast-moving car, the brain can hit the brakes faster than the foot. By relying on brain waves that signal the intent to jam on the brakes, a new technology could shave critical milliseconds off the reaction time, researchers report online July 28 in the Journal of Neural Engineering. The work adds to a growing trend in car technology that assists drivers. Though it may eventually lead to improvements in emergency braking, the new brain signal technology isn’t ready for the road. “As a basic science study, I was quite impressed with it,” says cognitive neuroscientist Raja Parasuraman of George Mason University in Fairfax, Va. “I just think a lot more needs to be done.” In the study, computer scientist Stefan Haufe of the Berlin Institute of Technology in Germany and his colleagues measured brain wave changes while participants drove in a car simulator. The participants drove around 60 miles per hour, following a lead car on a curvy road with heavy oncoming traffic. Every so often the lead car would slam on its brakes, so that the participant would have to either do the same or crash. For most drivers, the lag between the lead car stopping and themselves slamming the brakes was around 700 milliseconds. Particular neural signatures were evident during this lag time, and they could be early indicators that the drivers wanted to brake. “Our approach was to obtain the intention of the driver faster than he could actually act,” Haufe says. “That’s what the neural signature is good for.” © Society for Science & the Public 2000 - 2011
Related chapters from BP7e: Chapter 11: Motor Control and Plasticity; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 5: The Sensorimotor System; Chapter 14: Attention and Consciousness
Link ID: 15636 - Posted: 07.30.2011
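The study's core claim, that a neural signature of braking intent appears well before the foot moves, can be illustrated with a toy detector: simulate an epoch containing an early negative deflection, flag the first threshold crossing, and compare it with the roughly 700-millisecond behavioral lag. The signal shape, threshold and timings below are simulated, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
dt_ms = 10                       # one sample every 10 ms
t = np.arange(0, 1000, dt_ms)    # 1 s epoch after the lead car brakes

# Simulated EEG: noise plus a negative deflection peaking ~300 ms,
# standing in for the braking-intent signature.
eeg = 0.2 * rng.standard_normal(t.size)
eeg -= 3.0 * np.exp(-((t - 300) ** 2) / (2 * 50.0 ** 2))

def detect_intent(signal, times, threshold=-1.5):
    """Return the first time the signal crosses the threshold, else None."""
    below = np.nonzero(signal < threshold)[0]
    return times[below[0]] if below.size else None

detected_ms = detect_intent(eeg, t)
behavioral_rt_ms = 700           # typical pedal lag reported in the study
saved_ms = behavioral_rt_ms - detected_ms
```

At around 60 mph the car covers roughly 27 metres per second, so every 100 ms shaved off the reaction time is nearly 3 metres of stopping distance, which is why the early signature matters.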
by Adam Piore On a cold, blustery afternoon the week before Halloween, an assortment of spiritual mediums, animal communicators, and astrologists have set up tables in the concourse beneath the Empire State Plaza in Albany, New York. The cavernous hall of shops that connects the buildings in this 98-acre complex is a popular venue for autumnal events: Oktoberfest, the Maple Harvest Festival, and today’s “Mystic Fair.” Traffic is heavy as bureaucrats with ID badges dangling from their necks stroll by during their lunch breaks. Next to the Albany Paranormal Research Society table, a middle-aged woman is solemnly explaining the workings of an electromagnetic sensor that can, she asserts, detect the presence of ghosts. Nearby, a “clairvoyant” ushers a government worker in a suit into her canvas tent. A line has formed at the table of a popular tarot card reader. Amid all the bustle and transparent hustles, few of the dabblers at the Mystic Fair are aware that there is a genuine mind reader in the building, sitting in an office several floors below the concourse. This mind reader is not able to pluck a childhood memory or the name of a loved one out of your head, at least not yet. But give him time. He is applying hard science to an aspiration that was once relegated to clairvoyants, and unlike his predecessors, he can point to some hard results. © 2011, Kalmbach Publishing Co.
by Duncan Graham-Rowe The latest brain-computer interfaces meet smart home technology and virtual gaming TWO friends meet in a bar in the online environment Second Life to chat about their latest tweets and favourite TV shows. Nothing unusual in that - except that both of them have Lou Gehrig's disease, otherwise known as amyotrophic lateral sclerosis (ALS), and it has left them so severely paralysed that they can only move their eyes. These Second Lifers are just two of more than 50 severely disabled people who have been trying out a sophisticated new brain-computer interface (BCI). Second Life has been controlled using BCIs before, but only to a very rudimentary level. The new interface, developed by medical engineering company G.Tec of Schiedlberg, Austria, lets users freely explore Second Life's virtual world and control their avatar within it. It can be used to give people control over their real-world environment too: opening and closing doors, controlling the TV, lights, thermostat and intercom, answering the phone, or even publishing Twitter posts. The system was developed as part of a pan-European project called Smart Homes for All, and is the first time the latest BCI technology has been combined with smart-home technology and online gaming. It uses electroencephalograph (EEG) caps to pick up brain signals, which it translates into commands that are relayed to controllers in the building, or to navigate and communicate within Second Life and Twitter. © Copyright Reed Business Information Ltd.
Canadian and U.S. researchers have been able to predict what hand movement a person is going to make by reading a scan of their brain. The scientists at the University of Western Ontario and the University of Oregon scanned the brains of nine volunteers at the Robarts Research Institute in London, Ont. They found they were able to distinguish somewhat accurately among plans to make three hand movements that were only slightly different from one another. Jody Culham and Jason Gallivan at the University of Western Ontario were the two lead authors of the study. "We're showing that you can decode little subtle differences in finger movements based on the goal of the movement," said Gallivan, a Ph.D. student in neuroscience at the University of Western Ontario and the lead author of a study published in the Journal of Neuroscience this week. Previously, scientists had only been able to make similar predictions for animals with electrodes inserted in their brains. Functional magnetic resonance imaging, or fMRI, is far less intrusive, said Culham, a psychology professor at the University of Western Ontario who is Gallivan's supervisor and co-author. That made it possible to do such an experiment in humans. While the new discovery may bring to mind Minority Report, the 2002 movie starring Tom Cruise in which criminals are caught before they commit their crimes, Gallivan said that type of scenario is a long way off. © CBC 2011
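Telling apart which of three movement plans a pattern of voxel activity represents is a standard multi-class classification problem. Below is a minimal nearest-centroid sketch on simulated voxel patterns; the movement labels, voxel count and noise level are invented, and the study's actual classifier may well differ.

```python
import numpy as np

rng = np.random.default_rng(42)
n_voxels, n_trials = 50, 30

# Each planned movement evokes its own (simulated) voxel pattern.
prototypes = {m: rng.standard_normal(n_voxels)
              for m in ("grip_top", "grip_bottom", "reach")}

def simulate_trials(movement, n):
    """Prototype pattern plus trial-to-trial noise."""
    return prototypes[movement] + 0.5 * rng.standard_normal((n, n_voxels))

# Train a nearest-centroid decoder on labelled trials...
train = {m: simulate_trials(m, n_trials) for m in prototypes}
centroids = {m: x.mean(axis=0) for m, x in train.items()}

def decode(pattern):
    """Assign a pattern to the movement with the closest centroid."""
    return min(centroids, key=lambda m: np.linalg.norm(pattern - centroids[m]))

# ...and test it on fresh trials of a known movement plan.
test_trials = simulate_trials("reach", 20)
accuracy = np.mean([decode(x) == "reach" for x in test_trials])
```

The decoder succeeds only to the extent that the three plans evoke reliably different spatial patterns, which is exactly what the "somewhat accurate" discrimination in the study demonstrates for real fMRI data.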
By Alyssa Danigelis A stroke is like a meteorite impact. There’s a central “core of death” surrounded by silenced neural networks. So far, no one has figured out a way to turn those neurons back on. But by adding adult stem cells to a "brain in a dish" composed of rat neurons, researchers at the University of Florida could find a way to reboot the brain -- essentially waking up quiet circuits and regenerating the core. “We take normal neurons, simulate a stroke event, and implant adult stem cells,” said Thomas DeMarse, a research scientist at the University of Florida who is working on the transplant model with assistant professor of biomedical engineering Brandi Ormerod and PhD student Crystal Stephens. The brain in the dish, or as the scientists prefer to call it, the “biologically relevant neural model,” is a computer chip with an array of 60 microelectrodes that measure the action potential of neurons grown on top. The microelectrode array, or MEA, records the brain cell signals so the scientists can analyze them. “The beauty of the MEA is that it doesn’t just tell you the activity of one neuron, it tells you the activity of hundreds at the same time,” DeMarse said. Using MEAs is not new -- DeMarse used one in 2004 to show that brain cells could be used to control a flight simulator -- but adding adult stem cells to the mix in vitro, that is, in an experiment outside the brain, is the new part. © 2011 Discovery Communications, LLC.
Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 15511 - Posted: 06.30.2011
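An MEA's raw output is just 60 voltage traces, and the first analysis step is usually detecting spikes as threshold crossings. Here is a sketch on simulated data using the common median-based noise estimate from the spike-sorting literature; the channel numbers, amplitudes and threshold multiplier are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
n_channels, n_samples = 60, 5000

# Simulated MEA recording: background noise on every electrode...
data = 0.1 * rng.standard_normal((n_channels, n_samples))

# ...plus large negative spikes injected on one channel.
spike_times = [500, 1500, 2500, 3500]
data[12, spike_times] = -2.0

def count_spikes(trace, k=5.0):
    """Count threshold crossings at k robust standard deviations.

    The median-based noise estimate is a standard spike-detection
    heuristic; k is an illustrative setting.
    """
    sigma = np.median(np.abs(trace)) / 0.6745
    below = trace < -k * sigma
    # Count only downward crossings, not every sample below threshold.
    crossings = below & ~np.roll(below, 1)
    crossings[0] = below[0]
    return int(crossings.sum())

counts = [count_spikes(data[ch]) for ch in range(n_channels)]
```

Running this per electrode is what turns "the activity of hundreds at the same time" into numbers: a spike count (or spike train) per channel that can then be compared before and after a simulated stroke.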
By Susan Gaidos Video games can be mesmerizing, even for a rhesus monkey. Which may explain, in part, why 6-year-old Jasper has been sitting transfixed at a computer screen in a Washington University lab for nearly an hour, his gaze trained on a small red ball. A more interesting reason for Jasper’s quiet demeanor is that he is hurling the ball at a moving target using just his thoughts. Jasper is not the only monkey to control objects with his mind. At the University of Pittsburgh, a pair of macaques manipulated a thought-controlled synthetic arm to grab and eat marshmallows. The monkeys then worked the arm to turn a doorknob — no muscle power required. In another case, a monkey in North Carolina transmitted its thoughts halfway around the world to set a Japanese robot in motion. Now it’s time to let humans give it a serious try. In a series of clinical trials, scientists are preparing to take thought-controlled technologies, known as brain-computer interfaces, to those who might benefit most. The trials are a major step in realizing what many scientists say is an ambitious, but fully obtainable, goal — to restore mobility and independence to people who have lost the use of their muscles through brain or spinal cord injury. Over the next few years, paralyzed patients will attempt to learn how to maneuver virtual hands and robotic arms to reach, push, grasp or eat. As the trials progress, researchers hope to train users to perform increasingly complex movements. © Society for Science & the Public 2000 - 2011
A woman's cat ears perk up as she passes a young man in a park, only to flatten as she brushes off the encounter. A team of Japanese inventors have come up with a new device that blends the country's fascination with cuteness and its penchant for experimental high-tech -- brainwave-controlled cat ears. The fluffy headwear reads users' brain activity, meaning the ears perk up when they concentrate and then flop down again to lay flat against the head when users enter a relaxed state of mind, say its developers. The gizmo is called "Necomimi" -- a play on the Japanese words for cat and ear, but the first two syllables are also short for "neuro communication", says Neurowear, the inventor team whose brainchild it is. "We were exploring new ways of communicating and we thought it would be interesting to use brainwaves," said Neurowear's Kana Nakano. "Because the sensors must be attached to the head, we tried to come up with something cute and catchy." A promotional video shows a young woman's cat ears perk up as she bites into a doughnut and again when she passes a young man in a park, only to flatten as she apparently brushes off the missed encounter, relaxes and smiles. © 2011 Discovery Communications, LLC.
By EMMA G. FITZSIMMONS CHICAGO — Martin Mireles says his mother was not happy with his tongue piercing: It didn’t fit his image as a former church youth leader. But as Mr. Mireles told her, it was for research. Paralyzed from a spinal cord injury since he was shot in the neck almost two decades ago, he was recently fitted with a magnetic stud that allows him to steer his wheelchair with his tongue. Now he is helping researchers at the Northwestern University School of Medicine here in a clinical trial of the technology, being financed with almost $1 million in federal stimulus funds. Mr. Mireles, 37, tested the equipment one recent afternoon by guiding a wheelchair through an obstacle course lined with trash cans. Mouth closed, he shifted the magnet to travel forward and backward, left and right. The study was one of about 200 projects selected from more than 20,000 applicants. “There was a ‘wow’ factor here,” said Naomi Kleitman, a program director at the National Institutes of Health and an expert on spinal cord injury research. “This is kind of a cool idea. The question is: Will it work well enough not to just be cool, but to be practical too?” A quarter-million Americans have severe spinal cord injuries, and experts estimate that there are about 10,000 new injuries each year. Millions more have some form of paralysis from an array of conditions, including stroke, multiple sclerosis and cerebral palsy. © 2011 The New York Times Company
by Miguel Nicolelis ANSWER quickly: what links the internet, the stock market, democratic elections, a perfect soccer play, the big bang theory, the frescoes of the Sistine Chapel and the iPad? Most people guess that the only possible link is they are all created by humans. While this is technically correct, it doesn't credit the true creator of such macro structures and exquisite tools: the human brain. As well as the almost infinite catalogue of artificial tools and beliefs that rule most of our lives, our cherished social, political, and economic systems also blossom as by-products of the incessant electrochemical storms brewed by the brain circuits formed by billions of interconnected cellular elements. These neurons make up an organic structure so majestic and mysterious that its only true rival in complexity and power is the cosmos that hosts us all. For the past 200 years or so, neuroscientists have been obsessed with understanding how the roots of all our glory and disgrace, as individuals and as a species, emerge from waves of neuronal electrical activity that propagate through a neural ocean. Just how do they morph into what is conventionally known as thinking, the main currency of our primate brains? In the early 19th century, Franz Joseph Gall in Germany and Thomas Young in Britain pioneered the modern age of neuroscience with opposing theories of how the brain worked. Gall's phrenology proposed that brain functions were localised in particular spatial territories of the human cortex, the most superficial part of the nervous system, just beneath the skull. Gall and his disciples made a living by claiming to ascertain the key personality traits of his patients by palpating the bumps on their heads. © Copyright Reed Business Information Ltd.
Helen Thomson, biomedical news editor A paralysed woman was still able to accurately control a computer cursor with her thoughts 1000 days after having a tiny electronic device implanted in her brain, say the researchers who devised the system. The achievement demonstrates the longevity of brain-machine implants. The woman, for whom the researchers use the pseudonym S3, had a brainstem stroke in the mid-1990s that caused tetraplegia - paralysis of all four limbs and the vocal cords. In 2005, researchers from Brown University in Providence, Rhode Island, the Providence VA Medical Center and Massachusetts General Hospital in Boston implanted a tiny silicon electrode array the size of a small aspirin into S3's brain to help her communicate better with the outside world. The electrode array is part of the team's BrainGate system, which includes a combination of hardware and software that directly senses the electrical signals produced by neurons in the brain which control the planning of movement. The electrode decodes these signals to allow people with paralysis to control external devices such as computers, wheelchairs and bionic limbs. In a study just published, the researchers say that in 2008 - 1000 days after implantation - S3 proved the durability of the device by performing two different "point-and-click" tasks by thinking about moving a cursor with her hand. © Copyright Reed Business Information Ltd.
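BrainGate-style point-and-click control rests on a decoder that maps a vector of firing rates to an intended cursor velocity. Below is a minimal linear-decoder sketch fit by least squares on simulated data; the published system uses a more sophisticated decoder (a Kalman filter, in some reports), and every number here is invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n_neurons = 96          # channel count typical of such arrays
n_samples = 200

# Assume, for illustration, that firing rates are a linear function
# of intended cursor velocity plus a baseline rate.
true_tuning = rng.standard_normal((n_neurons, 2))
baseline = 10.0 + rng.random(n_neurons)

velocities = rng.standard_normal((n_samples, 2))      # intended (vx, vy)
rates = velocities @ true_tuning.T + baseline         # observed firing rates

# Fit a linear decoder by least squares: rates -> velocity.
X = np.hstack([rates, np.ones((n_samples, 1))])       # intercept column
W, *_ = np.linalg.lstsq(X, velocities, rcond=None)

def decode_velocity(rate_vector):
    """Map one vector of firing rates to a decoded (vx, vy)."""
    return np.append(rate_vector, 1.0) @ W
```

Point-and-click then needs only one more ingredient on top of this: a second decoded state (the "click"), with the velocity decoder steering the cursor between clicks.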
By Rachel Ehrenberg Nerve cell tendrils readily thread their way through tiny semiconductor tubes, researchers find, forming a crisscrossed network like vines twining towards the sun. The discovery that offshoots from nascent mouse nerve cells explore the specially designed tubes could lead to tricks for studying nervous system diseases or testing the effects of potential drugs. Such a system may even bring researchers closer to brain-computer interfaces that seamlessly integrate artificial limbs or other prosthetic devices. “This is quite innovative and interesting,” says nanomaterials expert Nicholas Kotov of the University of Michigan in Ann Arbor. “There is a great need for interfaces between electronic and neuronal tissues.” To lay the groundwork for a nerve-electronic hybrid, graduate student Minrui Yu of the University of Wisconsin–Madison and his colleagues created tubes of layered silicon and germanium, materials that could insulate electric signals sent by a nerve cell. The tubes were various sizes and shapes and big enough for a nerve cell’s extensions to crawl through but too small for the cell’s main body to get inside. When the team seeded areas outside the tubes with mouse nerve cells the cells went exploring, sending their threadlike projections into the tubes and even following the curves of helical tunnels, the researchers report in an upcoming ACS Nano. © Society for Science & the Public 2000 - 2011
Related chapters from BP7e: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Related chapters from MM: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Link ID: 15115 - Posted: 03.19.2011
Analysis by Amy Dusto For people with spinal injuries or other conditions that impair use of the arms or vocal cords -- or for the curious who just think it's cool -- the intendiX spells words based on brain waves. The system, a skullcap paired with a computer interface, is being developed by the Austrian company Guger Technologies. It was demonstrated at CeBIT, an annual worldwide digital industry event, held this year in Hanover, Germany, from March 1 to 5. To pick up brain activity, the skullcap is covered in electroencephalographic (EEG) electrodes. Unfortunately, this early model requires that the user apply gel between his or her head and the EEG electrodes for them to function properly (though a dry version is forthcoming). The wearer stares at a computer screen, which flashes highlights over different rows in a matrix of letters and symbols set up like a keyboard on the screen. When the wearer pays attention to the desired letter for a few seconds, the program can determine which one he or she intended to pick. According to Guger Technologies, most people become competent thought-communicators after 10 minutes of training on the system and are able to spell out five to 10 characters a minute. Designed for use by the severely handicapped in the home or with caregivers, intendiX can do more than just write out a text message. The user can also make it read the message out loud in digitized prose, print the text, or send it in email or via another electronic messaging system -- intendiX is Bluetooth-ready. The only ability needed to use the system, besides a few seconds of concentration, is eyesight. © 2011 Discovery Communications, LLC.
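The selection scheme described above is the classic row/column P300 speller: each row and column of the matrix flashes repeatedly, flashes containing the attended letter evoke a larger brain response, and the intersection of the best-scoring row and column yields the letter. A simulated sketch follows; the matrix layout, response amplitudes and repetition count are all illustrative, not intendiX's actual parameters.

```python
import numpy as np

# A 6x6 speller matrix like the one flashed on screen (layout illustrative).
MATRIX = [list("ABCDEF"), list("GHIJKL"), list("MNOPQR"),
          list("STUVWX"), list("YZ1234"), list("56789_")]

def simulate_flash_response(flashed_is_target, rng):
    """Attended flashes evoke a larger (P300-like) response."""
    return (1.0 if flashed_is_target else 0.1) + 0.05 * rng.standard_normal()

def spell_one_letter(target, n_repetitions=10, seed=0):
    """Simulate flashing every row/column and decode the attended letter."""
    rng = np.random.default_rng(seed)
    tr, tc = next((r, c) for r in range(6) for c in range(6)
                  if MATRIX[r][c] == target)
    row_scores = np.zeros(6)
    col_scores = np.zeros(6)
    # Flash every row and column several times, accumulating responses;
    # averaging over repetitions is what beats the noise.
    for _ in range(n_repetitions):
        for r in range(6):
            row_scores[r] += simulate_flash_response(r == tr, rng)
        for c in range(6):
            col_scores[c] += simulate_flash_response(c == tc, rng)
    # The attended row and column carry the biggest accumulated response.
    return MATRIX[int(row_scores.argmax())][int(col_scores.argmax())]

decoded = spell_one_letter("R")
```

The repetition count is the speed/accuracy dial: fewer flashes per letter means faster spelling but noisier row/column scores, which is consistent with the quoted rate of five to 10 characters a minute.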