Links for Keyword: Robotics




By Scicurious I heard the rumblings on Twitter, and then on the blogs. It was telepathy. No, it wasn’t telepathy, but it was close. It was like the Borg. No it wasn’t. It was a mind meld! Ok, maybe. So what was it? It was one rat learning to do something, while electrodes recorded his every move. In the meantime, on another continent, another rat received the signals into his own brain…and changed his behavior. Telepathy? No. A good solid proof of concept? I’m not sure. An interesting idea? Absolutely. So I wanted to look at this paper in depth. We know already that some other experts weren’t really thrilled with the results. But I’m going to look at WHY, and what a more convincing experiment might look like. So what actually happened here? Each experiment involved two sets of rats. First, you have your “encoder rats”. These rats were water-deprived (not terribly, just thirsty), and trained to press a lever for a water reward (water deprivation is one training technique for lever pressing, and is one of the fastest. But you can also food-deprive and train for food, or simply train the animal with something tasty, like Crisco or sweetened milk). The rats were trained until they were 95% accurate at the task. They were then implanted with electrodes in the motor cortex that recorded the firing of the neurons as the rats pressed the left or right lever. © 2013 Scientific American.
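The core decoding step described above, classifying a trial from the encoder rat's motor-cortex firing rates before relaying a signal, can be sketched roughly as follows. This is a minimal illustration, not the paper's actual method (which used a Z-score comparison of cortical activity); the function name, templates, and similarity measure here are hypothetical choices for the sketch.

```python
import numpy as np

def decode_lever_choice(trial_rates, left_template, right_template):
    """Classify an encoder-rat trial as 'left' or 'right' by comparing
    its firing-rate vector (spikes/s per recorded neuron) against stored
    templates for each lever press, using correlation as similarity."""
    sim_left = np.corrcoef(trial_rates, left_template)[0, 1]
    sim_right = np.corrcoef(trial_rates, right_template)[0, 1]
    return "left" if sim_left > sim_right else "right"

# Toy example: 4 recorded neurons, templates averaged from training trials.
left_template = np.array([12.0, 3.0, 8.0, 1.0])
right_template = np.array([2.0, 10.0, 1.0, 9.0])
trial = np.array([11.0, 4.0, 7.5, 2.0])   # resembles the left template
print(decode_lever_choice(trial, left_template, right_template))  # left
```

In the experiment, the decoded choice would then be converted into a stimulation pattern delivered to the second rat's cortex, which is where the hard part, and much of the criticism, lies.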

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 17874 - Posted: 03.07.2013

But critics are sceptical about predicted organic computer. Ed Yong The brains of two rats on different continents have been made to act in tandem. When the first, in Brazil, uses its whiskers to choose between two stimuli, an implant records its brain activity and signals to a similar device in the brain of a rat in the United States. The US rat then usually makes the same choice on the same task. Miguel Nicolelis, a neuroscientist at Duke University in Durham, North Carolina, says that this system allows one rat to use the senses of another, incorporating information from its far-away partner into its own representation of the world. “It’s not telepathy. It’s not the Borg,” he says. “But we created a new central nervous system made of two brains.” Nicolelis says that the work, published today in Scientific Reports, is the first step towards constructing an organic computer that uses networks of linked animal brains to solve tasks. But other scientists who work on neural implants are sceptical. Lee Miller, a physiologist at Northwestern University in Evanston, Illinois, says that Nicolelis’s team has made many important contributions to neural interfaces, but the current paper could be mistaken for a “poor Hollywood science-fiction script”. He adds, “It is not clear to what end the effort is really being made.” In earlier work, Nicolelis’s team developed implants that can send and receive signals from the brain, allowing monkeys to control robotic or virtual arms and get a sense of touch in return. This time, Nicolelis wanted to see whether he could use these implants to couple the brains of two separate animals. © 2013 Nature Publishing Group

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 17862 - Posted: 03.02.2013

Alison Abbott & Quirin Schiermeier Two of the biggest awards ever made for research have gone to boosting studies of the wonder material graphene and an elaborate simulation of the brain. The winners of the European Commission’s two-year Future and Emerging Technologies ‘flagship’ competition, announced on 28 January, will receive €500 million (US$670 million) each for their planned work, which the commission hopes will help to improve the lives, health and prosperity of millions of Europeans. The Human Brain Project, a supercomputer simulation of the human brain conceived and led by neuroscientist Henry Markram at the Swiss Federal Institute of Technology in Lausanne, scooped one of the prizes. The other winning team, led by Jari Kinaret at Chalmers University of Technology in Gothenburg, Sweden, hopes to develop the potential of graphene — an ultrathin, flexible, electrically conducting form of carbon — in applications such as personal-communication technologies, energy storage and sensors. The size of the awards — matching funds raised by the participants are expected to bring each project’s budget up to €1 billion over ten years — has some researchers worrying that the flagship programme may draw resources from other research. And both winners have already faced criticism. Many neuroscientists have argued, for example, that the Human Brain Project’s approach to modelling the brain is too cumbersome to succeed (see Nature 482, 456–458; 2012). Markram is unfazed. He explains that the project will have three main thrusts. One will be to study the structure of the mouse brain, from the molecular to the cellular scale and up. Another will generate similar human data. A third will try to identify the brain wiring associated with particular behaviours. The long-term goals, Markram says, include improved diagnosis and treatment of brain diseases, and brain-inspired technology. © 2013 Nature Publishing Group

Related chapters from BN: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 2: Functional Neuroanatomy: The Cells and Structure of the Nervous System
Related chapters from MM: Chapter 4: Development of the Brain; Chapter 1: Cells and Structures: The Anatomy of the Nervous System
Link ID: 17734 - Posted: 01.30.2013

By Alexandra Witze Quietly, on the top floor of a nondescript commercial building overlooking Boston Harbor, the future is being born. Rows of young scientists tap intently in front of computer monitors, their concentration unbroken even as the occasional plane from Logan Airport buzzes by. State-of-the-art lab equipment hums away in the background. This office, in Boston’s Marine Industrial Park, is what California’s Silicon Valley was four decades ago — the vanguard of an industry that will change your life. Just as researchers from Stanford provided the brains behind the semiconductor revolution, so are MIT and Harvard fueling the next big transformation. Students and faculty cross the Charles River not to build computer chips, but to re-engineer life itself. Take Reshma Shetty, one of the young minds at work in the eighth-floor biological production facility. After receiving her doctorate at MIT in 2008, she, like many new graduates, decided she wanted to make her mark on the world. She got together with four colleagues, including her Ph.D. adviser Tom Knight, to establish a company that aims “to make biology easy to engineer.” Place an order with Ginkgo BioWorks and its researchers will make an organism to do whatever you want. Need to suck carbon dioxide out of the atmosphere? They can engineer the insides of a bacterium to do just that. Want clean, biologically based fuels to replace petroleum taken from the ground? Company scientists will design a microbe to poop those out. © Society for Science & the Public 2000 - 2013

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 17650 - Posted: 01.05.2013

By Rachel Ehrenberg Outfitted with a bionic eye, arm, legs and fantastic ’70s hair, Steve Austin was a cyborg whose implants allowed him to recover stolen atomic weapons, fight aliens and protect cryptographers in distress. Finally, real life is starting to catch up with the Six Million Dollar Man. In one of this year’s bionic breakthroughs, a paralyzed woman carried out her own superhuman feat: Using an implanted brain chip, she controlled a robotic arm with her mind (SN: 6/16/12, p. 5). She used the arm to grasp a cuppa joe and take a long, satisfying sip of coffee through a straw, an act she hadn’t done on her own for nearly 15 years. “We’re entering a really exciting area where we can develop all sorts of very complicated technologies that can actually have biomedical applications and improve the quality of life for people,” says bioengineer Grégoire Courtine of the Swiss Federal Institute of Technology in Lausanne. “It’s a revolution.” After her groundbreaking sip, Cathy Hutchinson, who had been paralyzed years earlier by a stroke, smiled and then laughed. A roomful of scientists burst into applause. This was a big year for prosthetic parts, both in and out of the lab. Athletes in London for the Paralympics and the Olympics sprinted on high-tech carbon blades and hurled javelins while balancing on the microprocessor-controlled C-Leg. People in wheelchairs used battery-powered robotic suits to keep their lower limbs in shape. A young man who lost his right leg in a motorcycle accident climbed the 103 flights of stairs in Chicago’s Willis Tower with a thought-controlled limb. That technology is still in development. But some bionic add-ons are starting to come out of the lab and into the clinic for the first time, though costs remain prohibitive for many potential users. © Society for Science & the Public 2000 - 2012

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 17634 - Posted: 12.27.2012

By James Gallagher Health and science reporter, BBC News Unrivalled control of a robotic arm has been achieved using a paralysed woman's thoughts, a US study says. Jan Scheuermann, who is 53 and paralysed from the neck down, was able to deftly grasp and move a variety of objects, much as she would with a normal arm. Brain implants were used to control the robotic arm, in the study reported in the Lancet medical journal. Experts in the field said it was an "unprecedented performance" and a "remarkable achievement". Jan was diagnosed with spinocerebellar degeneration 13 years ago and progressively lost control of her body. She is now unable to move her arms or legs. She was implanted with two sensors - each four millimetres by four millimetres - in the motor cortex of her brain. A hundred tiny needles on each sensor pick up the electrical activity from about 200 individual brain cells. "The way that neurons communicate with each other is by how fast they fire pulses, it's a little bit akin to listening to a Geiger counter click, and it's that property that we lock onto," said Professor Andrew Schwartz from the University of Pittsburgh. The pulses of electricity in the brain are then translated into commands to move the arm, which bends at the elbow and wrist and can grasp objects. BBC © 2012
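The idea of "locking onto" firing rates that Schwartz describes can be illustrated with population-vector decoding, a classic approach from his group in which each cell votes along a preferred movement direction, weighted by how far its rate sits above or below baseline. This is only a minimal sketch of the principle, with made-up numbers; the implant study's actual decoder is considerably more elaborate.

```python
import numpy as np

def population_vector(rates, baselines, preferred_dirs):
    """Estimate an intended 2-D movement direction from motor-cortex
    firing rates: each cell contributes its preferred-direction vector,
    scaled by its firing rate relative to baseline."""
    weights = rates - baselines
    return weights @ preferred_dirs   # sum of weighted direction vectors

# Toy example: 3 cells with preferred directions along +x, +y and -x.
dirs = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0]])
baselines = np.array([10.0, 10.0, 10.0])
rates = np.array([18.0, 12.0, 4.0])    # cell 1 fires well above baseline
v = population_vector(rates, baselines, dirs)
print(v)   # direction ≈ [14., 2.], i.e. mostly along +x
```

Run continuously on a few hundred cells, estimates like this can be turned into velocity commands for the robotic arm's joints.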

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 17611 - Posted: 12.17.2012

By Laura Sanders A new computer simulation of the brain can count, remember and gamble. And the system, called Spaun, performs these tasks in a way that’s eerily similar to how people do. Short for Semantic Pointer Architecture Unified Network, Spaun is a crude approximation of the human brain. But scientists hope that the program and efforts like it could be a proving ground to test ideas about the brain. Several groups of scientists have been racing to construct a realistic model of the human brain, or at least parts of it. What distinguishes Spaun from other attempts is that the model actually does something, says computational neuroscientist Christian Machens of the Champalimaud Centre for the Unknown in Lisbon, Portugal. At the end of an intense computational session, Spaun spits out instructions for a behavior, such as how to reproduce a number it’s been shown. “And of course, that’s why the brain is interesting,” Machens says. “That’s what makes it different from a plant.” Like a digital Frankenstein’s monster, Spaun was cobbled together from bits and pieces of knowledge gleaned from years of basic brain research. The behavior of 2.5 million nerve cells in parts of the brain important for vision, memory, reasoning and other tasks forms the basis of the new system, says Chris Eliasmith of the University of Waterloo in Canada, coauthor of the study, which appears in the Nov. 30 Science. © Society for Science & the Public 2000 - 2012

Related chapters from BN: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 2: Functional Neuroanatomy: The Cells and Structure of the Nervous System
Related chapters from MM: Chapter 2: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 1: Cells and Structures: The Anatomy of the Nervous System
Link ID: 17557 - Posted: 12.01.2012

David Perlman With the ultimate goal of helping paralyzed patients achieve a degree of independence, Stanford brain researchers report they have taken a promising step forward in efforts to link nerve centers in the human brain with computers controlled by a person's thoughts alone. In their latest development, the Stanford scientists have successfully enabled a pair of rhesus monkeys to move a virtual cursor across a computer screen merely by thinking about their response to human commands. The monkeys' ability to manipulate a cursor without using a mouse is based on a powerful new algorithm, a mathematical computing program devised by Vikash Gilja, a Stanford electrical engineer and computer scientist. Four years ago, neurosurgeons at Brown University and Massachusetts General Hospital had demonstrated a simpler version of an algorithm that enabled completely paralyzed humans with implanted sensors in their brains to command a cursor to move erratically toward targets on a computer screen. But with Gilja's algorithm, called ReFIT, the monkeys showed they could aim their virtual cursor, a moving dot of light, at another bright light on a computer screen, and hold it steadily there for 15 seconds - far more precisely than the humans four years ago. With the new algorithm, they were able to perform their thinking tasks faster and more accurately as they sat comfortably in a chair facing the computer. The development is "a big step toward clinically useful brain-machine technology that has faster, smoother, and more natural movements" than anything before it, said James Gnadt of the National Institute of Neurological Disorders and Stroke. © 2012 Hearst Communications Inc.

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 17537 - Posted: 11.26.2012

By David Pogue Okay, great: we can control our phones with speech recognition and our television sets with gesture recognition. But those technologies don't work in all situations for all people. So I say, forget about those crude beginnings; what we really want is thought recognition. As I found out during research for a recent NOVA episode, it appears that brain-computer interface (BCI) technology mostly has not advanced very far just yet. For example, I tried to make a toy helicopter fly by thinking “up” as I wore a $300 commercial EEG headset. It barely worked. Such “mind-reading” caps are quick to put on and noninvasive. They listen, through your scalp, for the incredibly weak remnants of electrical signals from your brain activity. But they're lousy at figuring out where in your brain they originated. Furthermore, the headset software didn't even know that I was thinking “up.” I could just as easily have thought “goofy” or “shoelace” or “pickle”—whatever I had thought about during the 15-second training session. There are other noninvasive brain scanners—magnetoencephalography, positron-emission tomography and near-infrared spectroscopy among them—but each also has its trade-offs. Of course, you can implant sensors inside someone's skull for the best readings of all; immobilized patients have successfully manipulated computer cursors and robotic arms using this approach. Still, when it comes to controlling everyday electronics, brain surgery might be a tough sell. © 2012 Scientific American.

Related chapters from BN: Chapter 11: Motor Control and Plasticity; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Related chapters from MM: Chapter 5: The Sensorimotor System; Chapter 2: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Link ID: 17518 - Posted: 11.21.2012

By Meghan Rosen Michael McAlpine’s shiny circuit doesn’t look like something you would stick in your mouth. It’s dashed with gold, has a coiled antenna and is glued to a stiff rectangle. But the antenna flexes, and the rectangle is actually silk, its stiffness melting away under water. And if you paste the device on your tooth, it could keep you healthy. The electronic gizmo is designed to detect dangerous bacteria and send out warning signals, alerting its bearer to microbes slipping past the lips. Recently, McAlpine, of Princeton University, and his colleagues spotted a single E. coli bacterium skittering across the surface of the gadget’s sensor. The sensor also picked out ulcer-causing H. pylori amid the molecular medley of human saliva, the team reported earlier this year in Nature Communications. At about the size of a standard postage stamp, the dental device is still too big to fit comfortably in a human mouth. “We had to use a cow tooth,” McAlpine says, describing test experiments. But his team plans to shrink the gadget so it can nestle against human enamel. McAlpine is convinced that one day, perhaps five to 10 years from now, everyone will wear some sort of electronic device. “It’s not just teeth,” he says. “People are going to be bionic.” McAlpine belongs to a growing pack of tech-savvy scientists figuring out how to merge the rigid, brittle materials of conventional electronics with the soft, curving surfaces of human tissues. Their goal: To create products that have the high performance of silicon wafers — the crystalline material used in computer chips — while still moving with the body. © Society for Science & the Public 2000 - 2012

Related chapters from BN: Chapter 11: Motor Control and Plasticity; Chapter 19: Language and Lateralization
Related chapters from MM: Chapter 5: The Sensorimotor System; Chapter 15: Language and Lateralization
Link ID: 17455 - Posted: 11.05.2012

By Miguel A. L. Nicolelis In 2014 billions of viewers worldwide may remember the opening game of the World Cup in Brazil for more than just the goals scored by the Brazilian national team and the red cards given to its adversary. On that day my laboratory at Duke University, which specializes in developing technologies that allow electrical signals from the brain to control robotic limbs, plans to mark a milestone in overcoming paralysis. If we succeed in meeting still formidable challenges, the first ceremonial kick of the World Cup game may be made by a paralyzed teenager, who, flanked by the two contending soccer teams, will saunter onto the pitch clad in a robotic body suit. This suit—or exoskeleton, as we call it—will envelop the teenager's legs. His or her first steps onto the field will be controlled by motor signals originating in the kicker's brain and transmitted wirelessly to a computer unit the size of a laptop in a backpack carried by our patient. This computer will be responsible for translating electrical brain signals into digital motor commands so that the exoskeleton can first stabilize the kicker's body weight and then induce the robotic legs to begin the back-and-forth coordinated movements of a walk over the manicured grass. Then, on approaching the ball, the kicker will visualize placing a foot in contact with it. Three hundred milliseconds later brain signals will instruct the exoskeleton's robotic foot to hook under the leather sphere, Brazilian style, and boot it aloft. This scientific demonstration of a radically new technology, undertaken with collaborators in Europe and Brazil, will convey to a global audience of billions that brain control of machines has moved from lab demos and futuristic speculation to a new era in which tools capable of bringing mobility to patients incapacitated by injury or disease may become a reality. © 2012 Scientific American

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 17220 - Posted: 08.30.2012

Helen Shen Automated assistance may soon be available to neuroscientists tackling the brain’s complex circuitry, according to research presented last week at the Aspen Brain Forum in Colorado. Robots that can find and simultaneously record the activity of dozens of neurons in live animals could help researchers to reveal how connected cells interpret signals from one another and transmit information across brain areas — a task that would be impossible using single-neuron studies. The robots are designed to perform whole-cell patch-clamping, a difficult but powerful method that allows neuroscientists to access neurons' internal electrical workings, says Edward Boyden of the Massachusetts Institute of Technology in Cambridge, who is leading the work. Manually performing the method on live animals requires extensive training to perfect and, as a result, only a handful of neurophysiologists use the technique, says Boyden, who presented at the conference. He is developing the automated tool with Craig Forest at the Georgia Institute of Technology in Atlanta and others. “We think that it helps democratize procedures that require a lot of skill,” he says. In May, the group described how a basic version of the robot can record electrical currents in single neurons in the brains of anaesthetized mice. The robot finds its target on the basis of characteristic changes in the electrical environment near neurons. Then, the device nicks the cell’s membrane and seals itself around the tiny hole to access the neuron's contents. On 24 August, Boyden presented results showing that a more advanced version of the robot could be used to identify and probe four neurons at once — and he says he wants to push the design further, perhaps to tap as many as 100 neurons at a time. © 2012 Nature Publishing Group

Related chapters from BN: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 2: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 5: The Sensorimotor System
Link ID: 17215 - Posted: 08.29.2012

Analysis by Jesse Emspak The phrase, "use your brainpower" may soon become literal. Engineers at MIT have developed a tiny prototype fuel cell that generates electricity from the body's natural sugars. The fuel cell could be used to power brain implants for treating epilepsy, Parkinson's disease and paralysis. Currently, devices implanted in the body are typically powered by lithium-ion batteries, but they have a limited lifetime and need to be replaced. Opening up the body to replace a battery is not something doctors like to do, and doing it in the brain is even less desirable. The researchers, led by Rahul Sarpeshkar, an associate professor of electrical engineering and computer science, built the fuel cell using a platinum catalyst at one end and a layer of carbon nanotubes at the other. It rests on a silicon chip, allowing it to be connected to electronics that would be used in brain implants. As glucose passes over the platinum, electrons and hydrogen ions are stripped off as it is oxidized. That's what makes the current. At the other end of the cell, oxygen mixes with the hydrogen to make water when it hits the layer of single-walled carbon nanotubes. The cell produces up to 180 microwatts, enough to power a brain implant that might send signals to bypass a damaged region, or stimulate part of the brain (a treatment used in disorders such as Parkinson's). © 2012 Discovery Communications, LLC.

Related chapters from BN: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM: Chapter 2: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 16982 - Posted: 06.28.2012

By Rachel Ehrenberg Directing a robotic arm with her thoughts, a paralyzed woman named Cathy can pick up a bottle of coffee and sip it through a straw, a simple task that she hasn’t done on her own for nearly 15 years. The technology that brought about the feat is a brain-computer interface system: A computer decodes signals from a tiny chip implanted in the woman’s brain, translating her thoughts into actions that are carried out by the robot arm. The seemingly mundane task of bringing a drink to one’s mouth is the first published demonstration that severely paralyzed people can conduct directed movements in three-dimensional space using a brain-controlled robotic device. This latest application of the system, called BrainGate, is described in the May 17 Nature. “Much has been demonstrated in terms of laboratory work and monkeys, but this is the first time showing something that’s going to be useful for patients,” says neuroscientist Andrew Jackson, of Newcastle University in England. A commentary by Jackson on the new developments appears in the same issue of Nature. There’s still a lot of work to do before BrainGate can be used outside a lab. In the current design, the tiny sensor that sits in the patient’s brain is attached to a mini fridge–sized computer via ungainly wires. So making the system wireless is one goal. The researchers hope that within a decade the BrainGate system will be available and affordable for people who are paralyzed or have prosthetic limbs. Eventually, similar technology might restore function to a natural limb that no longer works. © Society for Science & the Public 2000 - 2012

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16807 - Posted: 05.17.2012

by Greg Miller Spinal cord injuries cause paralysis because they sever crucial communication links between the brain and the muscles that move limbs. A new study with monkeys demonstrates a way to re-establish those connections. By implanting electrodes in a movement control center in the brain and wiring them up to electrodes attached to muscles in the arm, researchers restored movement to monkeys with a temporarily paralyzed hand. The work is the latest promising development in the burgeoning field of neuroprosthetics. In recent years, scientists have taken many steps toward creating prosthetics to help paralyzed people interact more with the world around them. They've developed methods to decode signals from electrodes implanted in the brain so that a paralyzed person can control a cursor on a computer screen or manipulate a robotic arm with their thoughts alone. Such brain implants are still experimental, and only a handful of people have received them. Several hundred patients have received a different kind of neural prosthetic that uses residual shoulder movement or nerve activity to stimulate arm muscles, allowing them to grasp objects with their hands. The new study combines these two approaches. Neuroscientist Lee Miller of the Northwestern University Feinberg School of Medicine in Chicago, Illinois, and colleagues implanted electrode grids into the primary motor cortex of two monkeys. This brain region issues commands that move muscles throughout the body, and the researchers positioned the electrodes in the part of the primary motor cortex that controls the hand, enabling them to record the electrical activity of about 100 neurons there. © 2010 American Association for the Advancement of Science.

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16675 - Posted: 04.19.2012

By DAVID EWING DUNCAN SAN DIEGO — Already surrounded by machines that allow him, painstakingly, to communicate, the physicist Stephen Hawking last summer donned what looked like a rakish black headband that held a feather-light device the size of a small matchbox. Called the iBrain, this simple-looking contraption is part of an experiment that aims to allow Dr. Hawking — long paralyzed by amyotrophic lateral sclerosis, or Lou Gehrig’s disease — to communicate by merely thinking. The iBrain is part of a new generation of portable neural devices and algorithms intended to monitor and diagnose conditions like sleep apnea, depression and autism. Invented by a team led by Philip Low, a 32-year-old neuroscientist who is chief executive of NeuroVigil, a company based in San Diego, the iBrain is gaining attention as a possible alternative to expensive sleep labs that use rubber and plastic caps riddled with dozens of electrodes and usually require a patient to stay overnight. “The iBrain can collect data in real time in a person’s own bed, or when they’re watching TV, or doing just about anything,” Dr. Low said. The device uses a single channel to pick up waves of electrical brain signals, which change with different activities and thoughts, or with the pathologies that accompany brain disorders. © 2012 The New York Times Company

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16600 - Posted: 04.04.2012

David T. Blake After training, animals and humans can make their thoughts interact directly with computers. A study provides evidence that the corticostriatal system of the brain is essential for this learning process. Brain–machine interfaces have a rich history in the sci-fi genre: in The Matrix films, human brains are plugged into a computer-based simulation that then becomes their 'reality'. But using our thoughts to directly control computers or other devices is not just in the realm of fantasy. Monkeys can learn to use visual cues to instruct a brain–machine interface to move a robotic arm or a computer cursor. And electrode arrays were implanted into the brain of a paralysed man in 2006, enabling him to control an artificial arm, to move a cursor on a computer screen and even to open e-mail. Over time, an individual learns to improve their control over the brain–machine interface by modifying the activity of their brain, but how this happens is not well understood. In an article published on Nature's website today, Koralek et al. report that the corticostriatal system of the brain is involved in learning mental actions and skills that do not involve physical movement, such as those required for control of brain–machine interfaces. The corticostriatal system has a unique pattern of connectivity that enables sensory inputs to be associated with appropriate motor or cognitive responses. It consists of a cortical component, the primary motor cortex, that exerts control over muscles, and a striatal component, the basal ganglia, that receives direct inputs from the motor cortex. The basal ganglia are involved in a wide range of learning conditions and are crucial to the motor deficits observed in Parkinson's and Huntington's diseases. Both corticostriatal components have a role in the learning and execution of physical skills requiring movement. © 2012 Nature Publishing Group

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 16469 - Posted: 03.05.2012

M. Mitchell Waldrop It wasn't quite the lynching that Henry Markram had expected. But the barrage of sceptical comments from his fellow neuroscientists — “It's crap,” said one — definitely made the day feel like a tribunal. Officially, the Swiss Academy of Sciences meeting in Bern on 20 January was an overview of large-scale computer modelling in neuroscience. Unofficially, it was neuroscientists' first real chance to get answers about Markram's controversial proposal for the Human Brain Project (HBP) — an effort to build a supercomputer simulation that integrates everything known about the human brain, from the structures of ion channels in neural cell membranes up to mechanisms behind conscious decision-making. Markram, a South-African-born brain electrophysiologist who joined the Swiss Federal Institute of Technology in Lausanne (EPFL) a decade ago, may soon see his ambition fulfilled. The project is one of six finalists vying to win €1 billion (US$1.3 billion) as one of the European Union's two new decade-long Flagship initiatives. “Brain researchers are generating 60,000 papers per year,” said Markram as he explained the concept in Bern. “They're all beautiful, fantastic studies — but all focused on their one little corner: this molecule, this brain region, this function, this map.” The HBP would integrate these discoveries, he said, and create models to explore how neural circuits are organized, and how they give rise to behaviour and cognition — among the deepest mysteries in neuroscience. Ultimately, said Markram, the HBP would even help researchers to grapple with disorders such as Alzheimer's disease. “If we don't have an integrated view, we won't understand these diseases,” he declared. © 2012 Nature Publishing Group

Related chapters from BN: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 2: Functional Neuroanatomy: The Cells and Structure of the Nervous System
Related chapters from MM:Chapter 2: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 1: Cells and Structures: The Anatomy of the Nervous System
Link ID: 16416 - Posted: 02.23.2012

by Anil Ananthaswamy GOVERNMENT spooks want cyborg insects to snoop on their enemies. Biologists want to tap into the nervous systems of insects to understand how they fly. A probe that can be implanted into moths to control their flight could help satisfy both parties. One day, it could even help rehabilitate people who have had strokes. The US Defense Advanced Research Projects Agency (DARPA) has been running a programme to develop machine-insect interfaces for years, but the electrodes implanted to stimulate the brains or wing muscles of insects were not precise enough. Now Joel Voldman of the Massachusetts Institute of Technology and colleagues have designed a unique, flexible neural probe that can be attached directly to an insect's ventral nerve cord (VNC), which, along with the brain, makes up the central nervous system in insects. Another reason previous attempts have not been entirely successful was that the impedance of the electrodes did not match that of the insect's tissue. This probe is made of a polyimide polymer coated with gold and carbon nanotubes, and its impedance is much closer to that of nerve tissue. One end of the probe is a ring that clamps around the VNC. The inside of the ring has five electrodes which stimulate distinct nerve bundles within the VNC. Attached to the probe is a wireless stimulator, which contains a radio receiver, as well as a battery and a device to generate electrical pulses. The team implanted the device in the abdomen of a tobacco hawkmoth (Manduca sexta). As it weighs less than half a gram, it is easy for the moth to carry. "Their wingspan is the width of your hand," says Voldman. "These are big guys." © Copyright Reed Business Information Ltd

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 16359 - Posted: 02.09.2012

By Amber Dance A fighter pilot heads back to base after a long mission, feeling spent. A warning light flashes on the control panel. Has she noticed? If so, is she focused enough to fix the problem? Thanks to current advances in electroencephalographic (EEG) brain-wave detection technology, military commanders may not have to guess the answers to these questions much longer. They could soon be monitoring her mental state via helmet sensors, looking for signs she is concentrating on her flying and reacting to the warning light. This is possible because of two key advances that made EEG technology wireless and mobile, says Scott Makeig, director of the University of California, San Diego's Swartz Center for Computational Neuroscience (SCCN) in La Jolla, Calif. EEG used to require users to sit motionless, weighted down by heavy wires. Movement interfered with the signals, so that even an eyebrow twitch could garble the brain impulses. Modern technology lightened the load and wirelessly linked the sensors and the computers that collect the data. In addition, Makeig and others developed better algorithms—in particular, independent component analysis. By reading signals from several electrodes, they can infer where, within the skull, a particular impulse originated. This is akin to listening to a single speaker's voice in a crowded room. In so doing, they are also able to filter out movements—not just eyebrow twitches, but also the muscle flexing needed to walk, talk or fly a plane. © 2012 Scientific American,
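The unmixing idea behind independent component analysis can be sketched in a few lines: several "electrodes" each record a different blend of underlying sources, and ICA recovers the statistically independent sources so artifacts can be discarded. The signals and mixing weights below are toy assumptions, not real EEG data, and scikit-learn's FastICA stands in for the SCCN's own algorithms.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Two hypothetical sources: a brain rhythm and a muscle-like artifact.
t = np.linspace(0, 8, 2000)
brain = np.sin(2 * np.pi * 10 * t)                # 10 Hz alpha-like rhythm
artifact = np.sign(np.sin(2 * np.pi * 1.3 * t))   # blocky EMG-like artifact
sources = np.c_[brain, artifact]

# Each "electrode" records a different mixture of the two sources.
mixing = np.array([[1.0, 0.5],
                   [0.4, 1.2],
                   [0.8, 0.3]])
recordings = sources @ mixing.T + 0.02 * rng.normal(size=(2000, 3))

# ICA unmixes the recordings into independent components; the artifact
# component can then be dropped before further analysis.
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(recordings)

# Recovered components match the originals up to sign, scale and order.
corr = np.abs(np.corrcoef(sources.T, recovered.T))[:2, 2:]
print(corr.max(axis=1))  # each source maps strongly onto one component
```

This is the "voice in a crowded room" analogy made concrete: the electrodes are listeners, the sources are speakers, and ICA separates them without knowing the mixing in advance.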

Related chapters from BN: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 16303 - Posted: 01.28.2012