Links for Keyword: Robotics



Links 1 - 20 of 189

By Dominic Basulto In last weekend’s Wall Street Journal, two leading brain researchers conjectured that as a result of rapid breakthroughs in fields such as molecular biology and neuroscience, one day “brain implants” will be just about as common as getting a bit of plastic surgery is today. In short, today’s tummy tucks are tomorrow’s brain tucks. Similar to what you’d expect from watching science fiction films such as “The Matrix,” these brain implants would enable you to learn foreign languages effortlessly, upgrade your memory capabilities, and, yes, help you to know Kung Fu. Vinton Cerf argues that today’s Internet (think Google) is already a form of cognitive implant, helping us to learn the answer to just about anything within seconds. If computing power continues to increase at the same rate as it has for the past 50 years, it is likely that a single computer will have the computing capacity of a human brain by 2023. By 2045, a single computer could have the processing capability of all human brains put together. Just think what you’d be able to use Google to do then. You wouldn’t even need to type in a search query; your brain would already know the answer. Of course, the ability to create these brain implants raises a number of philosophical, ethical and moral questions. If you’re a young student having a tough time in a boring class, why not just buy a brain module that does the often repetitive work of learning for you? If you’re a parent of a child looking to get into a top university, why not buy a brain implant as a way to gain an advantage over children from less privileged backgrounds, especially when it’s SAT time? Instead of the digital divide, we may be talking about the cognitive divide at some point in the next two decades. Some parents would be able to afford a 99th-percentile brain for their children, while others wouldn’t. © 1996-2014 The Washington Post

Related chapters from BP7e: Chapter 1: Biological Psychology: Scope and Outlook
Related chapters from MM:Chapter 1: An Introduction to Brain and Behavior
Link ID: 19391 - Posted: 03.21.2014

By Gary Marcus and Christof Koch What would you give for a retinal chip that let you see in the dark or for a next-generation cochlear implant that let you hear any conversation in a noisy restaurant, no matter how loud? Or for a memory chip, wired directly into your brain's hippocampus, that gave you perfect recall of everything you read? Or for an implanted interface with the Internet that automatically translated a clearly articulated silent thought ("the French sun king") into an online search that digested the relevant Wikipedia page and projected a summary directly into your brain? Science fiction? Perhaps not for very much longer. Brain implants today are where laser eye surgery was several decades ago. They are not risk-free and make sense only for a narrowly defined set of patients—but they are a sign of things to come. Unlike pacemakers, dental crowns or implantable insulin pumps, neuroprosthetics—devices that restore or supplement the mind's capacities with electronics inserted directly into the nervous system—change how we perceive the world and move through it. For better or worse, these devices become part of who we are. Neuroprosthetics aren't new. They have been around commercially for three decades, in the form of the cochlear implants used in the ears (the outer reaches of the nervous system) of more than 300,000 hearing-impaired people around the world. Last year, the Food and Drug Administration approved the first retinal implant, made by the company Second Sight. ©2014 Dow Jones & Company, Inc.

Related chapters from BP7e: Chapter 10: Vision: From Eye to Brain; Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 7: Vision: From Eye to Brain; Chapter 5: The Sensorimotor System
Link ID: 19371 - Posted: 03.17.2014

by Clare Wilson A monkey controlling the hand of its unconscious cage-mate with its thoughts may sound like animal voodoo, but it is a step towards returning movement to people with spinal cord injuries. The hope is that people who are paralysed could have electrodes implanted in their brains that pick up their intended movements. These electrical signals could then be sent to a prosthetic limb, or directly to the person's paralysed muscles, bypassing the injury in their spinal cord. Ziv Williams at Harvard Medical School in Boston wanted to see if sending these signals to nerves in the spinal cord would also work, as this might ultimately give a greater range of movement from each electrode. His team placed electrodes in a monkey's brain, connecting them via a computer to wires going into the spinal cord of an anaesthetised, unconscious monkey. The unconscious monkey's limbs served as the equivalent of paralysed limbs. A hand of the unconscious monkey was strapped to a joystick, controlling a cursor that the other monkey could see on a screen. Williams's team had previously had the conscious monkey practise the joystick task for itself and had recorded its brain activity to work out which signals corresponded to moving the joystick back and forth. Through trial and error, they deduced which nerves to stimulate in the spinal cord of the anaesthetised monkey to produce similar movements in that monkey's hand. When both parts were fed to the computer, the conscious monkey was able to move the "paralysed" monkey's hand to make the cursor hit a target. © Copyright Reed Business Information Ltd.
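The two-stage pipeline described above — decode the conscious monkey's intended movement from its recorded brain activity, then route that intent to the spinal-cord site found by trial and error to produce a matching hand movement — can be sketched as a pair of lookup steps. All names and both lookup tables here are hypothetical illustrations, not Williams's actual decoding software.

```python
# Hypothetical sketch of the brain-to-spine relay described above.
# The activity patterns, intents, and stimulation sites are invented
# labels; the real system works with continuous neural recordings.
DECODER = {            # recorded brain-activity pattern -> intended movement
    "pattern_A": "push_forward",
    "pattern_B": "pull_back",
}
STIMULATION_MAP = {    # intended movement -> spinal nerve site to stimulate
    "push_forward": "site_1",
    "pull_back": "site_2",
}

def brain_to_spine(activity_pattern):
    """Full pipeline: decoded intent from the conscious monkey becomes a
    stimulation command for the anaesthetised monkey's spinal cord."""
    intent = DECODER.get(activity_pattern)
    if intent is None:
        return None     # unrecognised activity: stimulate nothing
    return STIMULATION_MAP[intent]
```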

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 19266 - Posted: 02.19.2014

by Erika Engelhaupt If you had to have a prosthetic hand, would you want it to look like a real hand? Or would you prefer a gleaming metallic number, something that doesn’t even try to look human? A new study looks at one of the issues that prosthetic designers and wearers face in making this decision: the creepy factor. People tend to get creeped out by robots or prosthetic devices that look almost, but not quite, human. So Ellen Poliakoff and colleagues at the University of Manchester in England had people rate the eeriness of various prosthetic hands. Forty-three volunteers looked at photographs of prosthetic and real hands. They rated both how humanlike (realistic) the hands were and how eerie they were, defined as “mysterious, strange, or unexpected as to send a chill up the spine.” Real human hands were rated both the most humanlike and the least eerie (a good thing for humans). Metal hands that were clearly mechanical were rated the least humanlike, but less eerie overall than prosthetic hands made to look like real hands, the team reports in the latest issue of Perception. The realistic prosthetics, like the rubber hand shown above, fell into what's known as the uncanny valley. That term, coined by roboticist Masahiro Mori in 1970, describes how robots become unnerving as they come to look more humanlike. The superrealistic Geminoid DK robot and the animated characters in the movie The Polar Express suffer from this problem. They look almost human, but not quite, and this mismatch between expectation and reality is one of the proposed explanations for the uncanny valley. In particular, if something looks like a human but doesn’t quite move like one, it’s often considered eerie. © Society for Science & the Public 2000 - 2013

Related chapters from BP7e: Chapter 8: General Principles of Sensory Processing, Touch, and Pain; Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System; Chapter 5: The Sensorimotor System
Link ID: 18966 - Posted: 11.25.2013

M. Mitchell Waldrop Kwabena Boahen got his first computer in 1982, when he was a teenager living in Accra. “It was a really cool device,” he recalls. He just had to connect up a cassette player for storage and a television set for a monitor, and he could start writing programs. But Boahen wasn't so impressed when he found out how the guts of his computer worked. “I learned how the central processing unit is constantly shuffling data back and forth. And I thought to myself, 'Man! It really has to work like crazy!'” He instinctively felt that computers needed a little more 'Africa' in their design, “something more distributed, more fluid and less rigid”. Today, as a bioengineer at Stanford University in California, Boahen is among a small band of researchers trying to create this kind of computing by reverse-engineering the brain. The brain is remarkably energy efficient and can carry out computations that challenge the world's largest supercomputers, even though it relies on decidedly imperfect components: neurons that are a slow, variable, organic mess. Comprehending language, conducting abstract reasoning, controlling movement — the brain does all this and more in a package that is smaller than a shoebox, consumes less power than a household light bulb, and contains nothing remotely like a central processor. To achieve similar feats in silicon, researchers are building systems of non-digital chips that function as much as possible like networks of real neurons. Just a few years ago, Boahen completed a device called Neurogrid that emulates a million neurons — about as many as there are in a honeybee's brain. And now, after a quarter-century of development, applications for 'neuromorphic technology' are finally in sight. © 2013 Nature Publishing Group

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Link ID: 18891 - Posted: 11.08.2013

Most of us don’t think twice when we extend our arms to hug a friend or push a shopping cart—our limbs work together seamlessly to follow our mental commands. For researchers designing brain-controlled prosthetic limbs for people, however, this coordinated arm movement is a daunting technical challenge. A new study showing that monkeys can move two virtual limbs with only their brain activity is a major step toward achieving that goal, scientists say. The brain controls movement by sending electrical signals to our muscles through nerve cells. When limb-connecting nerve cells are damaged or a limb is amputated, the brain is still able to produce those motion-inducing signals, but the limb can't receive them or simply doesn’t exist. In recent years, scientists have worked to create devices called brain-machine interfaces (BMIs) that can pick up these interrupted electrical signals and control the movements of a computer cursor or a real or virtual prosthetic. So far, the success of BMIs in humans has been largely limited to moving single body parts, such as a hand or an arm. Last year, for example, a woman paralyzed from the neck down for 10 years commanded a robotic arm to pick up and lift a piece of chocolate to her mouth just by thinking about it. But, "no device will ever work for people unless it restores bimanual behaviors,” says neuroscientist Miguel Nicolelis at Duke University in Durham, North Carolina, senior author of the paper. "You need to use both arms and hands for the simplest tasks.” In 2011, Nicolelis made waves by announcing on The Daily Show that he is developing a robotic, thought-controlled "exoskeleton" that will allow paralyzed people to walk again. Further raising the stakes, he pledged that the robotic body suit will enable a paralyzed person to kick a soccer ball during the opening ceremony of the 2014 Brazil World Cup. (Nicolelis is Brazilian and his research is partly funded by the nation’s government.) 
© 2013 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 18888 - Posted: 11.07.2013

by NPR Staff Soon you'll be able to direct the path of a cockroach with a smartphone and the swipe of your finger. Greg Gage and his colleagues at Backyard Brains have developed a device called the RoboRoach that lets you control the path of an insect. It may make you squirm, but Gage says the device could inspire a new generation of neuroscientists. "The sharpest kids amongst us are probably going into other fields right now. And so we're kind of in the dark ages when it comes to neuroscience," he tells NPR's Arun Rath. He wants to get kids interested in neuroscience early enough to guide them toward that career path. And a cyborg cockroach might be the inspiration. "The neurons in the insects are very, very similar to the neurons inside the human brain," Gage says. "It's a beautiful way to just really understand what's happening inside your brain by looking at these little insects." The idea was spawned by a device the Backyard Brain-iacs developed called the SpikerBox, which is capable of amplifying the signals of real living neurons. Insert a small wire into a cockroach's antennae, and you can hear the sound of actual neurons. "Lining the inside of the cockroach are these neurons that are picking up touch or vibration sensing, chemical sensing," Gage says. "They use it like a nose or a large tongue, their antennas, and they use it to sort of navigate the world. "So when you put a small wire inside of there, you can actually pick up the information as it's being encoded and being sent to the brain." With the RoboRoach device and smartphone app, you can interact with the antennae to influence the insect's behavior. ©2013 NPR

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity; Chapter 1: Biological Psychology: Scope and Outlook
Related chapters from MM:Chapter 5: The Sensorimotor System; Chapter 1: An Introduction to Brain and Behavior
Link ID: 18819 - Posted: 10.22.2013

Erika Check Hayden The power of thought alone is not enough to move inanimate objects — unless the object is a robotic leg wired to your brain, that is. A 32-year-old man whose knee and lower leg were amputated in 2009 after a motorcycle accident is apparently the first person with a missing lower limb to control a robotic leg with his mind. A team led by biomedical engineer Levi Hargrove at the Rehabilitation Institute of Chicago in Illinois reported the breakthrough last week in the New England Journal of Medicine, including a video that shows the man using the bionic leg to walk up stairs and down a ramp, and to kick a football. The major advance is that the man does not have to use a remote-control switch or exaggerated muscle movements to tell the robotic leg to switch between types of movements, and he does not have to reposition the leg with his hands when seated, Hargrove says. “To our knowledge, this is the first time that neural signals have been used to control both a motorized knee and ankle prosthesis,” he says. Scientists had previously shown that paralysed people could move robotic arms using their thoughts and that able-bodied people can walk using robotic legs controlled by their brains (see, for example, go.nature.com/dgtykw). The latest work goes a step further by using muscle signals to amplify messages sent by the brain when the person intends to move. © 2013 Nature Publishing Group

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 18725 - Posted: 10.01.2013

by Colin Barras A man missing his lower leg has gained precise control over a prosthetic limb, just by thinking about moving it – all because his unused nerves were preserved during the amputation and rerouted to his thigh where they can be used to communicate with a robotic leg. The man can now seamlessly switch from walking on level ground to climbing stairs and can even kick a football around. During a traditional limb amputation, the main sensory nerves are severed and lose their function. In 2006, Todd Kuiken and his colleagues at the Rehabilitation Institute of Chicago in Illinois realised they could preserve some of that functionality by carefully rerouting sensory nerves during an amputation and attaching them to another part of the body. They could then use the rerouted nerve signals to control a robotic limb, allowing a person to control their prosthesis with the same nerves they originally used to control their real limb. Kuiken's team first attempted the procedure – which is called targeted muscle reinnervation (TMR) – on people who were having their arm amputated. Now, Kuiken's team has performed TMR for the first time on a man with a leg amputation. First, the team rerouted the two main branches of the man's sciatic nerve to muscles in the thigh above the amputation. One branch controls the calf and some foot muscles, the other controls the muscle running down the outside leg and some more foot muscles. © Copyright Reed Business Information Ltd

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 18707 - Posted: 09.26.2013

By Ingrid Wickelgren An attractive blonde in a bright red blouse sits in a wheelchair before the assembled scientists, doctors, writers and members of the community. We are in a conference room at the Aspen Meadows Resort, the site of the 2013 Aspen Brain Forum. Amanda Boxtel recalls what life was like for her at 24. She had been a skier, a runner and a ballet dancer, she tells us. She liked to hike in the wilderness. Pictures of a beautiful young woman appear on a screen. In the photos, she’s standing. Then one day on a slope, the tips of Boxtel’s skis crossed. She did a somersault and shattered four vertebrae. “I also shattered illusions of my immortality. I was paralyzed from here”—she holds her hands at her hips—“down. No movement and no sensation.” That life changed radically for her right then is difficult to dispute. But Boxtel eventually embraced a road to recovery. “It took time to turn wounds into wisdom. It took guts. This is a cruel injury. It is so much more than not being able to walk,” she tells us. With the aid of adaptive technology, she got back on her skis. She took up waterskiing, rock climbing, kayaking and hang gliding. But still, she couldn’t bear weight on her legs or walk. Walking seems easy to most of us, because the action is built-in; it is automatic. In reality, however, walking is a highly complex motion involving many different muscles that must contract in a precisely timed sequence. Once the spinal cord can no longer orchestrate this motion, it is exceedingly hard to replicate. Walking, for Boxtel, was arguably a pipe dream. And so she sat for 21 years. © 2013 Scientific American

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 18699 - Posted: 09.25.2013

By Athena Andreadis Recently, two studies surfaced almost simultaneously that led to exclamations of “Vulcan mind meld!”, “Zombie armies!” and “Brains in jars!” One is the announcement by Rajesh Rao and Andrea Stocco of the University of Washington that they “achieved the first human-to-human brain interface”. The other is the Nature paper by Madeline Lancaster et al. about stem-cell-derived “organoids” that mimic early developmental aspects of the human cortex. My condensed evaluation: the latter is far more interesting and promising than the former, which doesn’t quite do what people (want to) think it’s doing. The purported result of brain interfacing hit many hot buttons that have been staples of science fiction and Stephen King novels: primarily telepathy, with its fictional potential for non-consensual control. Essentially, the sender’s EEG (electroencephalogram) output was linked to the receiver’s TMS (transcranial magnetic stimulation) input. What the experiment actually did is not send a thought but induce a muscle twitch; nothing novel, given the known properties of the two technologies. The conditions were severely constrained to produce the desired result and I suspect the outcome was independent of the stimulus details: the EEG simply recorded that a signal had been produced and the TMS apparatus was positioned so that a signal would elicit a movement of the right hand. Since both sender and receiver were poised over a keyboard operating a video game, the twitch was sufficient to press the space bar, programmed by the game to fire a cannon. © 2013 Scientific American

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Related chapters from MM:Chapter 5: The Sensorimotor System; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Link ID: 18623 - Posted: 09.10.2013

American researchers say they’ve performed what they believe is the first ever human-to-human brain interface, where one person was able to send a brain signal to trigger the hand motions of another person. “It was both exciting and eerie to watch an imagined action from my brain get translated into actual action by another brain,” said Rajesh Rao, a professor of computer science and engineering at the University of Washington, in a statement. Previous studies have done brain-to-brain transmissions between rats and one was done between a human and a rat. Rao was able to send a brain signal through the internet – utilizing electrical brain recordings and a form of magnetic stimulation – to the other side of the university campus to his colleague Andrea Stocco, an assistant professor of psychology, triggering Stocco’s finger to move on a keyboard. “The internet was a way to connect computers, and now it can be a way to connect brains,” said Stocco. “We want to take the knowledge of a brain and transmit it directly from brain to brain.” On Aug. 12, Rao sat in his lab with a cap on his head. The cap had electrodes hooked up to an electroencephalography machine, which reads the brain’s electrical activity. Meanwhile, Stocco was at his lab across campus, wearing a similar cap which had a transcranial magnetic stimulation coil placed over his left motor cortex – the part of the brain that controls hand movement. Rao looked at a computer and in his mind, he played a video game. When he was supposed to fire a cannon at a target, he imagined moving his right hand, which stayed motionless. Stocco, almost instantaneously, moved his right index finger to push the space bar on the keyboard in front of him. Only simple brain signals, not thoughts, were transmitted. “This was basically a one-way flow of information from my brain to his,” said Rao. © CBC 2013
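The one-way relay described above — an EEG classifier on the sender's side emitting a command over the internet, and a TMS pulse on the receiver's side — can be sketched as a simple threshold-and-message loop. The function names, message format, and confidence threshold here are illustrative assumptions, not the actual University of Washington software.

```python
# Hypothetical sketch of the one-way EEG-to-TMS relay described above.
import json

MOTOR_IMAGERY_THRESHOLD = 0.8  # assumed classifier confidence cutoff

def encode_intent(eeg_confidence):
    """Sender side: if the EEG classifier is confident enough that the
    sender imagined moving his right hand, emit a message to send over
    the internet; otherwise emit nothing."""
    if eeg_confidence >= MOTOR_IMAGERY_THRESHOLD:
        return json.dumps({"cmd": "fire"}).encode()
    return None

def decode_intent(message):
    """Receiver side: a 'fire' command would pulse the TMS coil over
    the motor cortex, twitching the receiver's finger; here it merely
    names the action instead of driving hardware."""
    if json.loads(message).get("cmd") == "fire":
        return "pulse_tms_coil"
    return "idle"
```

Note how little information actually crosses the link: one bit (fire or not), which matches the article's caveat that only simple brain signals, not thoughts, were transmitted.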

Related chapters from BP7e: Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior; Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 2: Cells and Structures: The Anatomy of the Nervous System; Chapter 5: The Sensorimotor System
Link ID: 18583 - Posted: 08.29.2013

by Douglas Heaven It's a cognitive leap forward. IBM can now program the experimental chip it unveiled two years ago. The chips, designed to mimic how our brains work, are set to power computers that handle many streams of input data at once – much like the sensory input we deal with all the time. IBM's TrueNorth computer chips contain memory, processors and communication channels wired up like the synapses, neurons and axons of a brain. A key idea is that the chips can be hooked up into vast grids with many thousands working together in parallel. For certain types of task, such as quickly responding to large amounts of input data from sensors, they are much faster and less power-hungry than standard chips. They could one day replace human reflexes in self-driving cars or power the sensory systems of a robot, for example. But because the chips rewrite the rulebook for how computers are normally put together, they are not easy to program. Dharmendra Modha and his colleagues at IBM Research in San Jose, California, learned this the hard way. The team's first attempts were full of errors: "The programs were very unintuitive and extremely difficult to debug," says Modha. "Things looked hopeless." So they designed a new way of programming. This involves telling the computer how to yoke together the many individual chips in play at once. The IBM team came up with a way to package the functionality of each chip inside blocks of code they call "corelets". © Copyright Reed Business Information Ltd.
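The "corelet" idea described above — wrapping each chip's functionality in a composable block that can be wired to other blocks — can be illustrated with a toy sketch. The class and method names are assumptions made for illustration; IBM's actual corelet language is a proprietary system, not this Python.

```python
# Toy illustration of corelet-style composition: each block hides its
# internals and exposes only inputs and outputs for wiring.
class Corelet:
    """A block of functionality with an output that can be wired into
    other blocks, hiding the underlying chip details."""
    def __init__(self, name, fn):
        self.name = name
        self.fn = fn          # the computation this block encapsulates
        self.downstream = []  # corelets receiving this block's output

    def connect(self, other):
        """Wire this block's output into another block's input."""
        self.downstream.append(other)
        return other          # allows chained wiring

    def feed(self, value):
        """Run the block; pass the result along any downstream wiring."""
        out = self.fn(value)
        results = [c.feed(out) for c in self.downstream]
        return results[0] if results else out

# Wire two corelets into a tiny pipeline: scale a sensor reading, then
# threshold it into a spike/no-spike decision.
scale = Corelet("scale", lambda x: x * 2)
spike = Corelet("spike", lambda x: 1 if x > 10 else 0)
scale.connect(spike)
```

The point of the design is that `spike` never needs to know how `scale` works internally, which is what lets thousands of chips be yoked together without hand-debugging each connection.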

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 18490 - Posted: 08.12.2013

By NICK BILTON Scientists haven’t yet found a way to mend a broken heart, but they’re edging closer to manipulating memory and downloading instructions from a computer right into a brain. Researchers from the Riken-M.I.T. Center for Neural Circuit Genetics at the Massachusetts Institute of Technology took us closer to this science-fiction world of brain tweaking last week when they said they were able to create a false memory in a mouse. The scientists reported in the journal Science that they caused mice to remember receiving an electrical shock in one location, when in reality they were zapped in a completely different place. The researchers weren’t able to create entirely new thoughts, but they applied good or bad feelings to memories that already existed. “It wasn’t so much writing a memory from scratch, it was basically connecting two different types of memories. We took a neutral memory, and we artificially updated that to make it a negative memory,” said Steve Ramirez, one of the M.I.T. neuroscientists on the project. It may sound insignificant and perhaps not a nice way to treat mice, but it is not a dramatic leap to imagine that one day this research could lead to computer-manipulation of the mind for things like the treatment of post-traumatic stress disorder, Mr. Ramirez said. Technologists are already working on brain-computer interfaces, which will allow us to interact with our smartphones and computers simply by using our minds. And there are already gadgets that read our thoughts and allow us to do things like dodge virtual objects in a computer game or turn switches on and off with a thought. Copyright 2013 The New York Times Company

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 5: The Sensorimotor System; Chapter 13: Memory, Learning, and Development
Link ID: 18460 - Posted: 08.06.2013

by Alyssa Danigelis Next time you happen across an enormous cockroach, check to see whether it’s got a backpack on. Then look for the person controlling its movements with a phone. The RoboRoach has arrived. The RoboRoach is a system created by University of Michigan grads who have backgrounds in neuroscience, Greg Gage and Tim Marzullo. They came up with the cyborg roach idea as part of an effort to show students what real brain spiking activity looks like using off-the-shelf electronics. Essentially the RoboRoach involves taking a real live cockroach, putting it under anesthesia and placing wires in its antenna. Then the cockroach is outfitted with a special lightweight little backpack Gage and Marzullo developed that sends pulses to the antenna, causing the neurons to fire and the roach to think there’s a wall on one side. So it turns. The backpack connects to a phone via Bluetooth, enabling a human user to steer the cockroach through an app. Why? Why would anyone do this? “We want to create neural interfaces that the general public can use,” the scientists say in a video. “Typically, to understand how these hardware devices and biological interfaces work, you’d have to go to graduate school in a neuro-engineering lab.” They added that the product is a learning tool, not a toy, and through it they hope to start a neuro-revolution. Currently the duo’s Backyard Brains startup is raising money through a Kickstarter campaign to develop more fine-tuned prototypes, make them more affordable, and extend battery life. The startup says it will make the RoboRoach hardware by hand in an Ann Arbor hacker space. © 2013 Discovery Communications, LLC

Related chapters from BP7e: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 5: The Sensorimotor System
Link ID: 18264 - Posted: 06.12.2013

By Melissa Hogenboom Science reporter, BBC News Activity observed in the brain when using a "mind machine" is similar to how the brain learns new motor skills, scientists have found. Participants' neural activity was recorded by using sensors implanted in their brain, which were linked to a computer that translated electrical impulses into actions. The researchers believe people will be able to perform increasingly complex tasks just by thinking about them. The study is published in the journal PNAS. The subjects in the study moved from thinking about a task to automatically processing a task, in a similar way to how other motor movements are learnt - like playing the piano or learning to ride a bicycle. This was shown by the areas of neurons that were active in the brain, which changed as subjects became more adept at a mental task. Scientists analysed the results of a mind control task on a brain-computer interface (BCI) of seven participants with epilepsy. They were asked to play a computer game where they had to manipulate a ball to move across a screen - using only their mind. Recent studies using BCIs have shown that our minds can control various objects, like a robotic arm, "but there is still a lot of mystery in the way we learn to control them", said Jeremiah Wander from the University of Washington in Seattle, US, who led the study. BBC © 2013

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 18255 - Posted: 06.11.2013

Devin Powell A model helicopter can now be steered through an obstacle course by thought alone, researchers report today in the Journal of Neural Engineering. The aircraft's pilot operates it remotely using a cap of electrodes to detect brainwaves that are translated into commands. Ultimately, the developers of the mind-controlled copter hope to adapt their technology for directing artificial robotic limbs and other medical devices. Today's best neural prosthetics require electrodes to be implanted in the body and are thus reserved for quadriplegics and others with disabilities severe enough to justify invasive surgery. "We want to develop something non-invasive that can benefit lots of people, not just a limited number of patients," says Bin He, a biomedical engineer at the University of Minnesota in Minneapolis, whose new results build on his previous work with a virtual thought-controlled helicopter. But He's mechanical whirlybird isn't the first vehicle to be flown by the brain. In 2010 a team at the University of Illinois at Urbana-Champaign reported an unmanned aircraft that flies at a fixed altitude but adjusts its heading to the left or right in response to a user's thoughts. The new chopper goes a step further. It can be guided up and down, as well as left or right, and it offers more precise control. To move it in a particular direction, a user imagines clenching his or her hands — the left one to go left, for instance, or both to go up. That mental image alters brain activity in the motor cortex. Changes in the strength and frequency of signals recorded by electrodes on the scalp using electroencephalography (EEG), and deciphered by a computer program, reveal the pilot's intent. © 2013 Nature Publishing Group
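The final mapping described above — imagined hand clenches, decoded from motor-cortex EEG, translated into flight commands — can be sketched as a lookup from decoded imagery to command. The label names are assumptions for illustration; the real system decodes continuous sensorimotor-rhythm power rather than discrete labels.

```python
# Hypothetical sketch of the imagery-to-command mapping in the article:
# left hand -> left, right hand -> right, both hands -> up.
def imagery_to_command(decoded_imagery):
    """Translate a decoded motor-imagery label into a flight command;
    anything unrecognised means hold the current course."""
    mapping = {
        "left_hand": "steer_left",    # imagine clenching the left hand
        "right_hand": "steer_right",  # imagine clenching the right hand
        "both_hands": "ascend",       # imagine clenching both hands
        "rest": "hold",               # no imagery: hold course
    }
    return mapping.get(decoded_imagery, "hold")
```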

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 18231 - Posted: 06.05.2013

by Helen Thomson TWO years ago, Antonio Melillo was in a car crash that completely severed his spinal cord. He has not been able to move or feel his legs since. And yet here I am, in a lab at the Santa Lucia Foundation hospital in Rome, Italy, watching him walk. Melillo is one of the first people with lower limb paralysis to try out MindWalker – the world's first exoskeleton that aims to enable paralysed and locked-in people to walk using only their mind. Five people have been involved in the clinical trial of MindWalker over the past eight weeks. The trial culminates this week with a review by the European Commission, which funded the work. It's the end of a three-year development period for the project, which has three main elements. There is the exoskeleton itself, a contraption that holds a person's body weight and moves their legs when instructed. People learn how to use it in the second element: a virtual-reality environment. And then there's the mind-reading component. Over in the corner of the lab, Thomas Hoellinger of the Free University of Brussels (ULB) in Belgium is wearing an EEG cap, which measures electrical activity at various points across his scalp. There are several ways he can use it to control the exoskeleton through thought alone – at the moment, the most promising involves wearing a pair of glasses with flickering diodes attached to each lens. © Copyright Reed Business Information Ltd.
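The flickering-diode glasses described above work on the steady-state visual evoked potential principle: each diode flickers at its own frequency, and attending to one makes the EEG over the visual cortex oscillate at that frequency, which the system detects to select a walking command. A minimal sketch of that frequency detection, using plain sine/cosine correlation (the candidate frequencies and sample rate are invented for illustration, not taken from the MindWalker system):

```python
import math

# Score each candidate flicker frequency by correlating the EEG signal
# with sine and cosine references at that frequency; the attended diode's
# frequency should dominate.

def power_at(signal, freq, fs):
    """Power of `signal` (a list of samples) at `freq` Hz, sampled at `fs` Hz."""
    s = sum(x * math.sin(2 * math.pi * freq * n / fs) for n, x in enumerate(signal))
    c = sum(x * math.cos(2 * math.pi * freq * n / fs) for n, x in enumerate(signal))
    return s * s + c * c

def detect_target(signal, candidate_freqs, fs=256):
    """Return the candidate flicker frequency with the strongest response."""
    return max(candidate_freqs, key=lambda f: power_at(signal, f, fs))
```

A practical system would filter and average over repeated windows to fight noise, but the core selection step is this argmax over candidate frequencies.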

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 18227 - Posted: 06.04.2013

By Meghan Rosen Save the clunky tricorders for Star Trek. One day, tiny biological computers with DNA-based circuitry could diagnose diseases. Using snippets of DNA and DNA-clipping chemicals, researchers have created one key component of a computer’s brain: the transistor, a switch that helps electronics perform logic. The biological switch, dubbed a transcriptor, could be plugged together with other biological devices to boost the power of DNA-based computers, researchers report March 28 in Science. With these switches, researchers might be able to program probiotic bacteria — the kind found in yogurt — to detect signs of colon cancer and then spit out warning signals, says study coauthor Jerome Bonnet of Stanford University. “The bacteria could actually travel through your gut and make a color in your poop,” he says. Inside every smartphone, television and iPod, a computer chip holds circuits loaded with millions of transistors. By flipping on or off, the tiny switches direct electrical current to different parts of the chip. But inside cells, even just a few linked-up switches could be powerful, says synthetic biologist Timothy Lu of MIT. The simple circuits “probably wouldn’t be able to compute square roots,” he says, “but you don’t need to put a MacBook chip inside a cell to get some really interesting functions.” And genetic computers can go places conventional electronics can’t. Instead of controlling the flow of electrons across metal circuit wires, the biological switches control the flow of a protein along a “wire” of DNA in living bacteria. As the protein chugs along the wire, it sends out messages telling the cell to make specific molecules — molecules that color a person’s poop green, for example. © Society for Science & the Public 2000 - 2013
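The transcriptor described above gates whether a protein can travel along a DNA "wire", so combinations of transcriptors implement Boolean logic inside a cell. The toy model below shows the logical behaviour only; it is not the paper's actual integrase-based gate design, and the "green poop" reporter output is borrowed from the article's own example.

```python
# Toy Boolean model of transcriptor logic: a gate lets RNA polymerase
# flow along the DNA wire only when the right combination of input
# signals has flipped its control elements.

def and_gate(signal_a, signal_b):
    """Polymerase flows only if BOTH input signals are present."""
    return signal_a and signal_b

def or_gate(signal_a, signal_b):
    """Polymerase flows if EITHER input signal is present."""
    return signal_a or signal_b

def report(flow):
    """The cell reports the gate's output, e.g. by making a pigment."""
    return "green" if flow else "none"
```

Chaining such gates is what lets a few transcriptors compute simple decisions - for example, "if cancer marker A AND marker B are present, make the reporter pigment".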

Related chapters from BP7e: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Related chapters from MM:Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Link ID: 17986 - Posted: 04.03.2013

A compact, self-contained sensor recorded and transmitted brain activity data wirelessly for more than a year in early stage animal tests, according to a study funded by the National Institutes of Health. In addition to allowing for more natural studies of brain activity in moving subjects, this implantable device represents a potential major step toward cord-free control of advanced prosthetics that move with the power of thought. The report is in the April 2013 issue of the Journal of Neural Engineering. “For people who have sustained paralysis or limb amputation, rehabilitation can be slow and frustrating because they have to learn a new way of doing things that the rest of us do without actively thinking about it,” said Grace Peng, Ph.D., who oversees the Rehabilitation Engineering Program of the National Institute of Biomedical Imaging and Bioengineering (NIBIB), part of NIH. “Brain-computer interfaces harness existing brain circuitry, which may offer a more intuitive rehab experience, and ultimately, a better quality of life for people who have already faced serious challenges.” Recent advances in brain-computer interfaces (BCI) have shown that it is possible for a person to control a robotic arm through implanted brain sensors linked to powerful external computers. However, such devices have relied on wired connections, which pose infection risks and restrict movement, or were wireless but had very limited computing power. Building on this line of research, David Borton, Ph.D., and Ming Yin, Ph.D., of Brown University, Providence, R.I., and colleagues surmounted several major barriers in developing their sensor. To be fully implantable within the brain, the device needed to be very small and completely sealed off to protect the delicate machinery inside the device and the even more delicate tissue surrounding it. At the same time, it had to be powerful enough to convert the brain’s subtle electrical activity into digital signals that could be used by a computer, and then boost those signals to a level that could be detected by a wireless receiver located some distance outside the body. Like all cordless machines, the device had to be rechargeable, but in the case of an implanted brain sensor, recharging must also be done wirelessly.
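The conversion step described above - turning the brain's subtle analog voltages into digital samples a radio can transmit - can be sketched as amplify, clamp, quantize. The gain, voltage range, and bit depth below are invented for illustration; the Brown device's actual signal chain involves far more conditioning.

```python
# Sketch of neural-signal digitization: amplify a microvolt-scale analog
# sample, clamp it to the ADC's input range, and quantize it to an
# n-bit digital code suitable for wireless transmission.

def digitize(analog_volts, gain=1000.0, full_scale=1.0, bits=12):
    """Return the n-bit ADC code for one amplified analog sample."""
    amplified = analog_volts * gain
    # Clamp to [-full_scale, +full_scale], the ADC's input range.
    clamped = max(-full_scale, min(full_scale, amplified))
    # Map [-full_scale, +full_scale] onto integer codes [0, 2^bits - 1].
    levels = (1 << bits) - 1
    return round((clamped + full_scale) / (2 * full_scale) * levels)
```

The design tension the article describes is visible even here: more bits and higher sampling rates mean better signal fidelity but more data to push through the wireless link, and more power drawn from a battery that itself must be recharged through the skin.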

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 17923 - Posted: 03.20.2013