Links for Keyword: Robotics



Links 21 - 40 of 198

American researchers say they’ve performed what they believe is the first ever human-to-human brain interface, where one person was able to send a brain signal to trigger the hand motions of another person. “It was both exciting and eerie to watch an imagined action from my brain get translated into actual action by another brain,” said Rajesh Rao, a professor of computer science and engineering at the University of Washington, in a statement. Previous studies have demonstrated brain-to-brain transmission between rats, and one between a human and a rat. Rao was able to send a brain signal through the internet – utilizing electrical brain recordings and a form of magnetic stimulation – to the other side of the university campus to his colleague Andrea Stocco, an assistant professor of psychology, triggering Stocco’s finger to move on a keyboard. “The internet was a way to connect computers, and now it can be a way to connect brains,” said Stocco. “We want to take the knowledge of a brain and transmit it directly from brain to brain.” On Aug. 12, Rao sat in his lab with a cap on his head. The cap had electrodes hooked up to an electroencephalography machine, which reads the brain’s electrical activity. Meanwhile, Stocco was at his lab across campus, wearing a similar cap which had a transcranial magnetic stimulation coil placed over his left motor cortex – the part of the brain that controls hand movement. Rao looked at a computer and, in his mind, played a video game. When he was supposed to fire a cannon at a target, he imagined moving his right hand, which stayed motionless. Stocco, almost instantaneously, moved his right index finger to push the space bar on the keyboard in front of him. Only simple brain signals, not thoughts, were transmitted. “This was basically a one-way flow of information from my brain to his,” said Rao. © CBC 2013
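
The pipeline described above has three pieces: detect the imagined movement in the EEG, carry a message across the internet, and trigger a stimulation pulse at the far end. Below is a minimal Python sketch of the middle hop only, with the EEG decoder and the TMS hardware stubbed out; the FIRE/IDLE message format and the use of a local socket pair to stand in for the campus network are assumptions for illustration, not details from the study.

```python
# Sketch of the one-way "brain signal over the internet" hop only. The EEG
# decoding and the TMS hardware are stubbed out; a connected socket pair
# stands in for the campus network link. The message format is an assumption.
import socket

def sender(sock, imagery_detected: bool):
    """Runs beside the EEG rig: forward a trigger when imagery is detected."""
    sock.sendall(b"FIRE\n" if imagery_detected else b"IDLE\n")

def receiver(sock):
    """Runs beside the TMS coil: fire a pulse only on a FIRE message."""
    msg = sock.recv(16).strip()
    if msg == b"FIRE":
        print("TMS: pulse delivered to motor cortex -> finger moves")
    else:
        print("TMS: no pulse")

if __name__ == "__main__":
    rao_side, stocco_side = socket.socketpair()   # stand-in for the internet
    sender(rao_side, imagery_detected=True)
    receiver(stocco_side)
    sender(rao_side, imagery_detected=False)
    receiver(stocco_side)
```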

Related chapters from BP7e: Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior; Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 2: Cells and Structures: The Anatomy of the Nervous System; Chapter 5: The Sensorimotor System
Link ID: 18583 - Posted: 08.29.2013

by Douglas Heaven It's a cognitive leap forward. IBM can now program an experimental chip they unveiled two years ago. The chips, designed to mimic how our brains work, are set to power computers that handle many streams of input data at once – much like the sensory input we deal with all the time. IBM's TrueNorth computer chips contain memory, processors and communication channels wired up like the synapses, neurons and axons of a brain. A key idea is that the chips can be hooked up into vast grids with many thousands working together in parallel. For certain types of task, such as quickly responding to large amounts of input data from sensors, they are much faster and less power-hungry than standard chips. They could one day replace human reflexes in self-driving cars or power the sensory systems of a robot, for example. But because the chips rewrite the rulebook for how computers are normally put together, they are not easy to program. Dharmendra Modha and his colleagues at IBM Research in San Jose, California, learned this the hard way. The team's first attempts were full of errors: "The programs were very unintuitive and extremely difficult to debug," says Modha. "Things looked hopeless." So they designed a new way of programming. This involves telling the computer how to yoke together the many individual chips in play at once. The IBM team came up with a way to package the functionality of each chip inside blocks of code they call "corelets". © Copyright Reed Business Information Ltd.
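
The article's key idea, packaging each chip's behaviour into composable "corelets", can be illustrated with a toy sketch in plain Python. This is not IBM's corelet language; the Block and Pipeline names, and the spike-as-integer representation, are invented here purely to show the compose-without-editing style of programming the team describes.

```python
# Toy illustration of the "compose small blocks into a bigger system" idea the
# article attributes to corelets. This is plain Python, not IBM's corelet
# language; the Block/Pipeline names and the spike-event representation are
# invented for the sketch.
from dataclasses import dataclass, field
from typing import Callable, List

Spikes = List[int]   # a spike event is just a neuron index here

@dataclass
class Block:
    """One sealed unit of functionality that maps input spikes to output spikes."""
    name: str
    fn: Callable[[Spikes], Spikes]
    def __call__(self, spikes: Spikes) -> Spikes:
        return self.fn(spikes)

@dataclass
class Pipeline:
    """A larger system built only by wiring blocks together, never by editing them."""
    blocks: List[Block] = field(default_factory=list)
    def __call__(self, spikes: Spikes) -> Spikes:
        for block in self.blocks:
            spikes = block(spikes)
        return spikes

# Two tiny blocks: keep only even-numbered input neurons, then fan each event
# out to two downstream neurons.
keep_even = Block("keep_even", lambda s: [n for n in s if n % 2 == 0])
fan_out   = Block("fan_out",   lambda s: [m for n in s for m in (2 * n, 2 * n + 1)])

if __name__ == "__main__":
    system = Pipeline([keep_even, fan_out])
    print(system([1, 2, 3, 4]))   # -> [4, 5, 8, 9]
```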

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 18490 - Posted: 08.12.2013

By NICK BILTON Scientists haven’t yet found a way to mend a broken heart, but they’re edging closer to manipulating memory and downloading instructions from a computer right into a brain. Researchers from the Riken-M.I.T. Center for Neural Circuit Genetics at the Massachusetts Institute of Technology took us closer to this science-fiction world of brain tweaking last week when they said they were able to create a false memory in a mouse. The scientists reported in the journal Science that they caused mice to remember receiving an electrical shock in one location, when in reality they were zapped in a completely different place. The researchers weren’t able to create entirely new thoughts, but they applied good or bad feelings to memories that already existed. “It wasn’t so much writing a memory from scratch, it was basically connecting two different types of memories. We took a neutral memory, and we artificially updated that to make it a negative memory,” said Steve Ramirez, one of the M.I.T. neuroscientists on the project. It may sound insignificant and perhaps not a nice way to treat mice, but it is not a dramatic leap to imagine that one day this research could lead to computer-manipulation of the mind for things like the treatment of post-traumatic stress disorder, Mr. Ramirez said. Technologists are already working on brain-computer interfaces, which will allow us to interact with our smartphones and computers simply by using our minds. And there are already gadgets that read our thoughts and allow us to do things like dodge virtual objects in a computer game or turn switches on and off with a thought. Copyright 2013 The New York Times Company

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 5: The Sensorimotor System; Chapter 13: Memory, Learning, and Development
Link ID: 18460 - Posted: 08.06.2013

by Alyssa Danigelis Next time you happen across an enormous cockroach, check to see whether it’s got a backpack on. Then look for the person controlling its movements with a phone. The RoboRoach has arrived. The RoboRoach is a system created by University of Michigan grads who have backgrounds in neuroscience, Greg Gage and Tim Marzullo. They came up with the cyborg roach idea as part of an effort to show students what real brain spiking activity looks like using off-the-shelf electronics. Essentially the RoboRoach involves taking a real live cockroach, putting it under anesthesia and placing wires in its antenna. Then the cockroach is outfitted with a special lightweight little backpack Gage and Marzullo developed that sends pulses to the antenna, causing the neurons to fire and the roach to think there’s a wall on one side. So it turns. The backpack connects to a phone via Bluetooth, enabling a human user to steer the cockroach through an app. Why? Why would anyone do this? ”We want to create neural interfaces that the general public can use,” the scientists say in a video. “Typically, to understand how these hardware devices and biological interfaces work, you’d have to go to graduate school in a neuro-engineering lab.” They added that the product is a learning tool, not a toy, and through it they hope to start a neuro-revolution. Currently the duo’s Backyard Brains startup is raising money through a Kickstarter campaign to develop more fine-tuned prototypes, make them more affordable, and extend battery life. The startup says it will make the RoboRoach hardware by hand in an Ann Arbor hacker space. © 2013 Discovery Communications, LLC
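
A hedged sketch of the steering step described above: turning a left or right swipe in the app into a burst of stimulation pulses for one antenna. The pulse frequency, burst length, and one-byte command code are invented for illustration; the article gives no details of the real RoboRoach firmware or app protocol.

```python
# Sketch of the steering side only: turn a "left"/"right" swipe into a train of
# stimulation pulse timestamps for the corresponding antenna. Frequency, pulse
# count, and the one-byte wire format are invented; the real RoboRoach firmware
# and app are not described in enough detail here to copy.
def pulse_train(frequency_hz: float = 55.0, duration_s: float = 0.5):
    """Return pulse onset times (seconds) for one stimulation burst."""
    period = 1.0 / frequency_hz
    times, t = [], 0.0
    while t < duration_s:
        times.append(round(t, 4))
        t += period
    return times

def steer(direction: str) -> bytes:
    """Encode a swipe as a hypothetical one-byte command for the backpack."""
    codes = {"left": b"L", "right": b"R"}
    if direction not in codes:
        raise ValueError("direction must be 'left' or 'right'")
    return codes[direction]

if __name__ == "__main__":
    cmd = steer("left")
    burst = pulse_train()
    print(f"send {cmd!r} over Bluetooth, then drive {len(burst)} pulses "
          f"into the left antenna starting at t={burst[:3]}...")
```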

Related chapters from BP7e: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 5: The Sensorimotor System
Link ID: 18264 - Posted: 06.12.2013

By Melissa Hogenboom Science reporter, BBC News Activity observed in the brain when using a "mind machine" is similar to how the brain learns new motor skills, scientists have found. Participants' neural activity was recorded by using sensors implanted in their brain, which were linked to a computer that translated electrical impulses into actions. The researchers believe people will be able to perform increasingly complex tasks just by thinking them. The study is published in the journal PNAS. The subjects in the study moved from thinking about a task to automatically processing a task, in a similar way to how other motor movements are learnt - like playing the piano or learning to ride a bicycle. This was shown by the areas of neurons that were active in the brain, which changed as subjects became more adept at a mental task. Scientists analysed the results of a mind-control task performed by seven participants with epilepsy using a brain-computer interface (BCI). They were asked to play a computer game where they had to manipulate a ball to move across a screen - using only their mind. Recent studies using BCIs have shown that our minds can control various objects, like a robotic arm, "but there is still a lot of mystery in the way we learn to control them", said Jeremiah Wander from the University of Washington in Seattle, US, who led the study. BBC © 2013

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 18255 - Posted: 06.11.2013

Devin Powell A model helicopter can now be steered through an obstacle course by thought alone, researchers report today in the Journal of Neural Engineering. The aircraft's pilot operates it remotely using a cap of electrodes to detect brainwaves that are translated into commands.1 Ultimately, the developers of the mind-controlled copter hope to adapt their technology for directing artificial robotic limbs and other medical devices. Today's best neural prosthetics require electrodes to be implanted in the body and are thus reserved for quadriplegics and others with disabilities severe enough to justify invasive surgery. "We want to develop something non-invasive that can benefit lots of people, not just a limited number of patients," says Bin He, a biomedical engineer at the University of Minnesota in Minneapolis, whose new results build on his previous work with a virtual thought-controlled helicopter.2 But He's mechanical whirlybird isn't the first vehicle to be flown by the brain. In 2010 a team at the University of Illinois at Urbana-Champaign reported an unmanned aircraft that flies at a fixed altitude but adjusts its heading to the left or right in response to a user's thoughts.3 The new chopper goes a step further. It can be guided up and down, as well as left or right, and it offers more precise control. To move it in a particular direction, a user imagines clenching his or her hands — the left one to go left, for instance, or both to go up. That mental image alters brain activity in the motor cortex. Changes in the strength and frequency of signals recorded by electrodes on the scalp using electroencephalography (EEG), and deciphered by a computer program, reveal the pilot's intent. © 2013 Nature Publishing Group
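
A minimal sketch of the decoding rule described in the last two sentences, assuming one EEG channel over each motor cortex (C3 and C4), an 8 to 12 Hz mu band, and a hand-picked power threshold; none of those numbers comes from the paper, and the real system is considerably more sophisticated.

```python
# Minimal sketch of the left/right/up mapping described above, assuming two
# EEG channels (C3 over the left motor cortex, C4 over the right), a fixed
# 8-12 Hz band, and a hand-tuned suppression threshold. These values only
# illustrate the decoding idea; they are not taken from the study.
import numpy as np

FS = 256
BAND = (8, 12)
THRESHOLD = 500.0   # band power below this counts as "suppressed" (assumed)

def band_power(x, fs=FS, band=BAND):
    p = np.abs(np.fft.rfft(x)) ** 2
    f = np.fft.rfftfreq(len(x), 1.0 / fs)
    return p[(f >= band[0]) & (f <= band[1])].mean()

def decode_command(c3, c4):
    """Map mu-rhythm suppression to a flight command.
    Right-hand imagery suppresses C3, left-hand imagery suppresses C4."""
    right_hand = band_power(c3) < THRESHOLD
    left_hand = band_power(c4) < THRESHOLD
    if left_hand and right_hand:
        return "up"
    if left_hand:
        return "left"
    if right_hand:
        return "right"
    return "hover"

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t = np.arange(FS) / FS
    idle = lambda: 3 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, FS)
    busy = lambda: rng.normal(0, 0.5, FS)          # mu rhythm suppressed
    print(decode_command(idle(), busy()))   # left hand imagined -> "left"
    print(decode_command(busy(), busy()))   # both hands imagined -> "up"
```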

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 18231 - Posted: 06.05.2013

by Helen Thomson TWO years ago, Antonio Melillo was in a car crash that completely severed his spinal cord. He has not been able to move or feel his legs since. And yet here I am, in a lab at the Santa Lucia Foundation hospital in Rome, Italy, watching him walk. Melillo is one of the first people with lower limb paralysis to try out MindWalker – the world's first exoskeleton that aims to enable paralysed and locked-in people to walk using only their mind. Five people have been involved in the clinical trial of MindWalker over the past eight weeks. The trial culminates this week with a review by the European Commission, which funded the work. It's the end of a three-year development period for the project, which has three main elements. There is the exoskeleton itself, a contraption that holds a person's body weight and moves their legs when instructed. People learn how to use it in the second element: a virtual-reality environment. And then there's the mind-reading component. Over in the corner of the lab, Thomas Hoellinger of the Free University of Brussels (ULB) in Belgium is wearing an EEG cap, which measures electrical activity at various points across his scalp. There are several ways he can use it to control the exoskeleton through thought alone – at the moment, the most promising involves wearing a pair of glasses with flickering diodes attached to each lens. © Copyright Reed Business Information Ltd.
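
Flickering-diode control of this kind typically relies on the visual cortex following the flicker rate of whichever diode the wearer attends to (a steady-state visual evoked potential). A rough sketch of that detection step, assuming two flicker frequencies and a simple biggest-FFT-peak rule; the actual frequencies and classifier used in MindWalker are not given in the article.

```python
# Hedged sketch of flicker-frequency detection: assume the left lens flickers
# at 13 Hz and the right at 17 Hz, and that attending to one of them makes the
# occipital EEG oscillate at that rate. Frequencies, channel count, and the
# "biggest FFT peak wins" rule are assumptions, not details from the trial.
import numpy as np

FS = 256
FLICKER = {"step left": 13.0, "step right": 17.0}

def attended_command(eeg, fs=FS):
    spectrum = np.abs(np.fft.rfft(eeg))
    freqs = np.fft.rfftfreq(len(eeg), 1.0 / fs)
    scores = {}
    for command, f0 in FLICKER.items():
        scores[command] = spectrum[np.argmin(np.abs(freqs - f0))]
    return max(scores, key=scores.get)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    t = np.arange(2 * FS) / FS                       # two seconds of data
    looking_right = np.sin(2 * np.pi * 17 * t) + rng.normal(0, 0.3, t.size)
    print(attended_command(looking_right))           # -> "step right"
```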

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 18227 - Posted: 06.04.2013

By Meghan Rosen Save the clunky tricorders for Star Trek. One day, tiny biological computers with DNA-based circuitry could diagnose diseases. Using snippets of DNA and DNA-clipping chemicals, researchers have created one key component of a computer’s brain: the transistor, a switch that helps electronics perform logic. The biological switch, dubbed a transcriptor, could be plugged together with other biological devices to boost the power of DNA-based computers, researchers report March 28 in Science. With these switches, researchers might be able to program probiotic bacteria — the kind found in yogurt — to detect signs of colon cancer and then spit out warning signals, says study coauthor Jerome Bonnet of Stanford University. “The bacteria could actually travel through your gut and make a color in your poop,” he says. Inside every smartphone, television and iPod, a computer chip holds circuits loaded with millions of transistors. By flipping on or off, the tiny switches direct electrical current to different parts of the chip. But inside cells, even just a few linked-up switches could be powerful, says synthetic biologist Timothy Lu of MIT. The simple circuits “probably wouldn’t be able to compute square roots,” he says, “but you don’t need to put a MacBook chip inside a cell to get some really interesting functions.” And genetic computers can go places conventional electronics can’t. Instead of controlling the flow of electrons across metal circuit wires, the biological switches control the flow of a protein along a “wire” of DNA in living bacteria. As the protein chugs along the wire, it sends out messages telling the cell to make specific molecules — molecules that color a person’s poop green, for example. © Society for Science & the Public 2000 - 2013
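
The transcriptor is described as a switch that lets a protein flow along a DNA "wire" or blocks it, and a few such switches wired in series or in parallel can compute simple logic. The sketch below is only a truth-table illustration of that idea in Python, not a model of the underlying biochemistry.

```python
# Abstract illustration only: treat each transcriptor as a switch that either
# lets "polymerase flow" through a DNA segment or blocks it, and wire a couple
# of switches into AND/OR behaviour. This is a truth-table sketch of the idea,
# not a model of the actual genetic parts in the Science paper.
def transcriptor(control_signal: bool, blocks_when: bool) -> bool:
    """Return True if polymerase can pass this segment.
    `blocks_when` says which control state leaves the element blocking."""
    return control_signal != blocks_when

def and_gate(a: bool, b: bool) -> bool:
    # Two blocking elements in series: both inputs must open their element.
    return transcriptor(a, False) and transcriptor(b, False)

def or_gate(a: bool, b: bool) -> bool:
    # Two parallel paths: either input opening its element lets flow through.
    return transcriptor(a, False) or transcriptor(b, False)

if __name__ == "__main__":
    for a in (False, True):
        for b in (False, True):
            print(f"a={a!s:5} b={b!s:5}  AND={and_gate(a, b)!s:5}  OR={or_gate(a, b)}")
```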

Related chapters from BP7e: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Related chapters from MM:Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Link ID: 17986 - Posted: 04.03.2013

A compact, self-contained sensor recorded and transmitted brain activity data wirelessly for more than a year in early stage animal tests, according to a study funded by the National Institutes of Health. In addition to allowing for more natural studies of brain activity in moving subjects, this implantable device represents a potential major step toward cord-free control of advanced prosthetics that move with the power of thought. The report is in the April 2013 issue of the Journal of Neural Engineering. “For people who have sustained paralysis or limb amputation, rehabilitation can be slow and frustrating because they have to learn a new way of doing things that the rest of us do without actively thinking about it,” said Grace Peng, Ph.D., who oversees the Rehabilitation Engineering Program of the National Institute of Biomedical Imaging and Bioengineering (NIBIB), part of NIH. “Brain-computer interfaces harness existing brain circuitry, which may offer a more intuitive rehab experience, and ultimately, a better quality of life for people who have already faced serious challenges.” Recent advances in brain-computer interfaces (BCI) have shown that it is possible for a person to control a robotic arm through implanted brain sensors linked to powerful external computers. However, such devices have relied on wired connections, which pose infection risks and restrict movement, or were wireless but had very limited computing power. Building on this line of research, David Borton, Ph.D., and Ming Yin, Ph.D., of Brown University, Providence, R.I., and colleagues surmounted several major barriers in developing their sensor. To be fully implantable within the brain, the device needed to be very small and completely sealed off to protect the delicate machinery inside the device and the even more delicate tissue surrounding it. At the same time, it had to be powerful enough to convert the brain’s subtle electrical activity into digital signals that could be used by a computer, and then boost those signals to a level that could be detected by a wireless receiver located some distance outside the body. Like all cordless machines, the device had to be rechargeable, but in the case of an implanted brain sensor, recharging must also be done wirelessly.

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 17923 - Posted: 03.20.2013

By Rachel Ehrenberg Surgeons have replaced 75 percent of a man’s skull with a custom-designed polymer cranium constructed with a 3-D printer. The surgery took place on March 4 and is the first U.S. case following the FDA’s approval of the implants last month. The patient’s reason for needing such extensive replacement surgery has not been revealed. Similar surgeries may follow in other cases where sections of the skull are removed because the brain has swollen during a surgery or after an accident, says Scott DeFelice, president of Connecticut-based Oxford Performance Materials, the company that created the prosthetic. Technicians used CT scans to get images of the part of the skull that needed replacing. Then, with computer software and input from surgeons, engineers designed the replacement part. A machine that uses lasers to fuse granules of material built the prosthetic layer by layer out of a special plastic called PEKK. While inert like titanium, PEKK is riddled on its surface with pocks and ridges that promote bone cell growth, DeFelice says. Such implants have value as a brain-protecting material, says Jeremy Mao, a biomedical engineer and codirector of Columbia University’s center for craniofacial regeneration. But doctors will need to keep an eye out for long-term problems: the skull isn’t just a box for the brain but a complicated piece of anatomy linked to connective and soft tissues. © Society for Science & the Public 2000 - 2013

Related chapters from BP7e: Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM:Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 17891 - Posted: 03.12.2013

By Scicurious I heard the rumblings on Twitter, and then on the blogs. It was telepathy. No, it wasn’t telepathy, but it was close. It was like the Borg. No it wasn’t. It was a mind meld! Ok, maybe. So what was it? It was one rat learning to do something, while electrodes recorded his every move. In the meantime, on another continent, another rat received the signals into his own brain…and changed his behavior. Telepathy? No. A good solid proof of concept? I’m not sure. An interesting idea? Absolutely. So I wanted to look at this paper in depth. We know already that some other experts weren’t really thrilled with the results. But I’m going to look at WHY, and what a more convincing experiment might look like. So what actually happened here? Each experiment involved two sets of rats. First, you have your “encoder rats”. These rats were water-deprived (not terribly, just thirsty), and trained to press a lever for a water reward (water deprivation is one training technique for lever pressing, and is one of the fastest. But you can also food-deprive and train for food or just train the animal to something tasty, like Crisco or sweetened milk). The rats were trained until they were 95% accurate at the task. They were then implanted with electrodes in the motor cortex, that recorded the firing of the neurons as the rats pressed the left or right lever. © 2013 Scientific American,

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 17874 - Posted: 03.07.2013

Ed Yong The brains of two rats on different continents have been made to act in tandem, though critics are sceptical about the predicted organic computer. When the first, in Brazil, uses its whiskers to choose between two stimuli, an implant records its brain activity and signals to a similar device in the brain of a rat in the United States. The US rat then usually makes the same choice on the same task. Miguel Nicolelis, a neuroscientist at Duke University in Durham, North Carolina, says that this system allows one rat to use the senses of another, incorporating information from its far-away partner into its own representation of the world. “It’s not telepathy. It’s not the Borg,” he says. “But we created a new central nervous system made of two brains.” Nicolelis says that the work, published today in Scientific Reports1, is the first step towards constructing an organic computer that uses networks of linked animal brains to solve tasks. But other scientists who work on neural implants are sceptical. Lee Miller, a physiologist at Northwestern University in Evanston, Illinois, says that Nicolelis’s team has made many important contributions to neural interfaces, but the current paper could be mistaken for a “poor Hollywood science-fiction script”. He adds, “It is not clear to what end the effort is really being made.” In earlier work2, Nicolelis’s team developed implants that can send and receive signals from the brain, allowing monkeys to control robotic or virtual arms and get a sense of touch in return. This time, Nicolelis wanted to see whether he could use these implants to couple the brains of two separate animals. © 2013 Nature Publishing Group
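
The coupling described above amounts to decoding the encoder rat's choice from its recorded activity and delivering a matching stimulation pattern to the decoder rat. A sketch of that step with invented numbers; the pool sizes, firing rates, and pulse counts are all assumptions, not values from the paper.

```python
# Sketch of the decode-and-forward step, with invented numbers: the encoder
# rat's choice is decoded by comparing the summed firing rate of a "left" pool
# of neurons against a "right" pool, and the decoder rat's implant receives a
# fixed number of stimulation pulses for each decoded choice.
import numpy as np

STIM_PATTERN = {"left": 10, "right": 40}   # pulses per trial (assumed)

def decode_choice(left_pool_rates, right_pool_rates):
    """Population readout: whichever pool fires more wins."""
    return "left" if np.sum(left_pool_rates) > np.sum(right_pool_rates) else "right"

def stimulate(choice):
    """Return the pulse count the decoder rat's cortex would receive."""
    return STIM_PATTERN[choice]

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    # Encoder rat presses the right lever: its "right" pool fires harder.
    left_pool = rng.poisson(5, size=20)    # 20 neurons, ~5 Hz each
    right_pool = rng.poisson(12, size=20)  # 20 neurons, ~12 Hz each
    choice = decode_choice(left_pool, right_pool)
    print(f"decoded '{choice}' from encoder rat; "
          f"sending {stimulate(choice)} pulses to decoder rat")
```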

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 17862 - Posted: 03.02.2013

Alison Abbott & Quirin Schiermeier Two of the biggest awards ever made for research have gone to boosting studies of the wonder material graphene and an elaborate simulation of the brain. The winners of the European Commission’s two-year Future and Emerging Technologies ‘flagship’ competition, announced on 28 January, will receive €500 million (US$670 million) each for their planned work, which the commission hopes will help to improve the lives, health and prosperity of millions of Europeans. The Human Brain Project, a supercomputer simulation of the human brain conceived and led by neuroscientist Henry Markram at the Swiss Federal Institute of Technology in Lausanne, scooped one of the prizes. The other winning team, led by Jari Kinaret at Chalmers University of Technology in Gothenburg, Sweden, hopes to develop the potential of graphene — an ultrathin, flexible, electrically conducting form of carbon — in applications such as personal-communication technologies, energy storage and sensors. The size of the awards — matching funds raised by the participants are expected to bring each project’s budget up to €1 billion over ten years — has some researchers worrying that the flagship programme may draw resources from other research. And both winners have already faced criticism. Many neuroscientists have argued, for example, that the Human Brain Project’s approach to modelling the brain is too cumbersome to succeed (see Nature 482, 456–458; 2012). Markram is unfazed. He explains that the project will have three main thrusts. One will be to study the structure of the mouse brain, from the molecular to the cellular scale and up. Another will generate similar human data. A third will try to identify the brain wiring associated with particular behaviours. The long-term goals, Markram says, include improved diagnosis and treatment of brain diseases, and brain-inspired technology. © 2013 Nature Publishing Group

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 17734 - Posted: 01.30.2013

By Alexandra Witze Quietly, on the top floor of a nondescript commercial building overlooking Boston Harbor, the future is being born. Rows of young scientists tap intently in front of computer monitors, their concentration unbroken even as the occasional plane from Logan Airport buzzes by. State-of-the-art lab equipment hums away in the background. This office, in Boston’s Marine Industrial Park, is what California’s Silicon Valley was four decades ago — the vanguard of an industry that will change your life. Just as researchers from Stanford provided the brains behind the semiconductor revolution, so are MIT and Harvard fueling the next big transformation. Students and faculty cross the Charles River not to build computer chips, but to re-engineer life itself. Take Reshma Shetty, one of the young minds at work in the eighth-floor biological production facility. After receiving her doctorate at MIT in 2008, she, like many new graduates, decided she wanted to make her mark on the world. She got together with four colleagues, including her Ph.D. adviser Tom Knight, to establish a company that aims “to make biology easy to engineer.” Place an order with Ginkgo BioWorks and its researchers will make an organism to do whatever you want. Need to suck carbon dioxide out of the atmosphere? They can engineer the insides of a bacterium to do just that. Want clean, biologically based fuels to replace petroleum taken from the ground? Company scientists will design a microbe to poop those out. © Society for Science & the Public 2000 - 2013

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 17650 - Posted: 01.05.2013

By Rachel Ehrenberg Outfitted with a bionic eye, arm, legs and fantastic ’70s hair, Steve Austin was a cyborg whose implants allowed him to recover stolen atomic weapons, fight aliens and protect cryptographers in distress. Finally, real life is starting to catch up with the Six Million Dollar Man. In one of this year’s bionic breakthroughs, a paralyzed woman carried out her own superhuman feat: Using an implanted brain chip, she controlled a robotic arm with her mind (SN: 6/16/12, p. 5). She used the arm to grasp a cuppa joe and take a long, satisfying sip of coffee through a straw, an act she hadn’t done on her own for nearly 15 years. “We’re entering a really exciting area where we can develop all sorts of very complicated technologies that can actually have biomedical applications and improve the quality of life for people,” says bioengineer Grégoire Courtine of the Swiss Federal Institute of Technology in Lausanne. “It’s a revolution.” After her groundbreaking sip, Cathy Hutchinson, who had been paralyzed years earlier by a stroke, smiled and then laughed. A roomful of scientists burst into applause. This was a big year for prosthetic parts, both in and out of the lab. Athletes in London for the Paralympics and the Olympics sprinted on high-tech carbon blades and hurled javelins while balancing on the microprocessor-controlled C-Leg. People in wheelchairs used battery-powered robotic suits to keep their lower limbs in shape. A young man who lost his right leg in a motorcycle accident climbed the 103 flights of stairs in Chicago’s Willis Tower with a thought-controlled limb. That technology is still in development. But some bionic add-ons are starting to come out of the lab and into the clinic for the first time, though costs remain prohibitive for many potential users. © Society for Science & the Public 2000 - 2012

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 17634 - Posted: 12.27.2012

By James Gallagher Health and science reporter, BBC News Unrivalled control of a robotic arm has been achieved using a paralysed woman's thoughts, a US study says. Jan Scheuermann, who is 53 and paralysed from the neck down, was able to deftly grasp and move a variety of objects, much as she would with a normal arm. Brain implants were used to control the robotic arm, in the study reported in the Lancet medical journal. Experts in the field said it was an "unprecedented performance" and a "remarkable achievement". Jan was diagnosed with spinocerebellar degeneration 13 years ago and progressively lost control of her body. She is now unable to move her arms or legs. She was implanted with two sensors - each four millimetres by four millimetres - in the motor cortex of her brain. A hundred tiny needles on each sensor pick up the electrical activity from about 200 individual brain cells. "The way that neurons communicate with each other is by how fast they fire pulses, it's a little bit akin to listening to a Geiger counter click, and it's that property that we lock onto," said Professor Andrew Schwartz from the University of Pittsburgh. The pulses of electricity in the brain are then translated into commands to move the arm, which bends at the elbow and wrist and can grab an object. BBC © 2012
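
Schwartz's Geiger-counter analogy points at the decoding principle: what matters is how fast each recorded unit fires. A minimal sketch of a rate-based linear decoder follows, with a random weight matrix standing in for the calibration the real system learns; it illustrates the generic idea, not the Pittsburgh group's actual algorithm.

```python
# Minimal sketch of rate-based decoding, assuming ~200 recorded units and a
# linear map from firing rates to 3-D arm velocity. The weight matrix here is
# random, standing in for the calibration step a real system performs; it is
# not the decoder used in the study.
import numpy as np

N_UNITS = 200
rng = np.random.default_rng(4)
W = rng.normal(0, 0.05, size=(3, N_UNITS))   # placeholder decoding weights
baseline = np.full(N_UNITS, 8.0)             # assumed resting rates (Hz)

def decode_velocity(firing_rates):
    """Map one time bin of firing rates (Hz) to an (x, y, z) velocity."""
    return W @ (firing_rates - baseline)

if __name__ == "__main__":
    # One 100-ms bin in which some units fire above their baseline.
    rates = baseline + rng.normal(0, 2.0, N_UNITS)
    vx, vy, vz = decode_velocity(rates)
    print(f"commanded velocity: ({vx:+.2f}, {vy:+.2f}, {vz:+.2f})")
```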

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 17611 - Posted: 12.17.2012

By Laura Sanders A new computer simulation of the brain can count, remember and gamble. And the system, called Spaun, performs these tasks in a way that’s eerily similar to how people do. Short for Semantic Pointer Architecture Unified Network, Spaun is a crude approximation of the human brain. But scientists hope that the program and efforts like it could be a proving ground to test ideas about the brain. Several groups of scientists have been racing to construct a realistic model of the human brain, or at least parts of it. What distinguishes Spaun from other attempts is that the model actually does something, says computational neuroscientist Christian Machens of the Champalimaud Centre for the Unknown in Lisbon, Portugal. At the end of an intense computational session, Spaun spits out instructions for a behavior, such as how to reproduce a number it’s been shown. “And of course, that’s why the brain is interesting,” Machens says. “That’s what makes it different from a plant.” Like a digital Frankenstein’s monster, Spaun was cobbled together from bits and pieces of knowledge gleaned from years of basic brain research. The behavior of 2.5 million nerve cells in parts of the brain important for vision, memory, reasoning and other tasks forms the basis of the new system, says Chris Eliasmith of the University of Waterloo in Canada, coauthor of the study, which appears in the Nov. 30 Science. © Society for Science & the Public 2000 - 2012
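
Spaun itself is far beyond a short listing, but the kind of unit such models are assembled from, a population of simple spiking neurons, is easy to sketch. Below is a single leaky integrate-and-fire neuron with generic textbook parameters; it is a building block in the same spirit, not a piece of Spaun.

```python
# Not Spaun itself, just the kind of unit such large models are assembled
# from: a single leaky integrate-and-fire neuron driven by a constant input
# current. Parameter values are generic textbook choices.
def simulate_lif(current=1.5, tau=0.02, v_thresh=1.0, v_reset=0.0,
                 dt=0.001, duration=0.2):
    """Return spike times (s) for a leaky integrate-and-fire neuron."""
    v, spikes = 0.0, []
    for step in range(int(duration / dt)):
        dv = (-v + current) * (dt / tau)        # leak toward the input current
        v += dv
        if v >= v_thresh:                       # threshold crossed: spike, reset
            spikes.append(round(step * dt, 3))
            v = v_reset
    return spikes

if __name__ == "__main__":
    spike_times = simulate_lif()
    print(f"{len(spike_times)} spikes in 200 ms:", spike_times[:5], "...")
```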

Related chapters from BP7e: Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM:Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 17557 - Posted: 12.01.2012

David Perlman With the ultimate goal of helping paralyzed patients achieve a degree of independence, Stanford brain researchers report they have taken a promising step forward in efforts to link nerve centers in the human brain with computers controlled only by a person's thoughts. In their latest development, the Stanford scientists have successfully enabled a pair of rhesus monkeys to move a virtual cursor across a computer screen merely by thinking about their response to human commands. The monkeys' ability to manipulate a cursor without using a mouse is based on a powerful new algorithm, a mathematical computing program devised by Vikash Gilja, a Stanford electrical engineer and computer scientist. Four years ago, neurosurgeons at Brown University and Massachusetts General Hospital had demonstrated a simpler version of an algorithm that enabled completely paralyzed humans with implanted sensors in their brains to command a cursor to move erratically toward targets on a computer screen. But with Gilja's algorithm, called ReFit, the monkeys showed they could aim their virtual cursor, a moving dot of light, at another bright light on a computer screen, and hold it steadily there for 15 seconds - far more precisely than the humans four years ago. With the new algorithm, they were able to perform their thinking tasks faster and more accurately as they sat comfortably in a chair facing the computer. The development is "a big step toward clinically useful brain-machine technology that has faster, smoother, and more natural movements" than anything before it, said James Gnadt of the National Institute of Neurological Disorders and Stroke. © 2012 Hearst Communications Inc.
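
The ReFit work is generally described as a Kalman-filter velocity decoder that is recalibrated under the assumption that the subject always intends to move the cursor straight at the target. The sketch below shows only that intention-correction step in 2-D, with made-up data; the full decoder is not reproduced, and the details here are an approximation of how the method is usually summarised.

```python
# Sketch of the intention-correction idea usually credited to ReFit-style
# decoder training: during recalibration, each recorded cursor velocity is
# rotated to point at the current target (keeping its speed), and the decoder
# is then refit on these corrected velocities. Shown here in 2-D with made-up
# numbers; the Kalman-filter decoder itself is not reproduced.
import numpy as np

def rotate_toward_target(velocity, cursor_pos, target_pos):
    """Keep the speed, replace the direction with the cursor-to-target direction."""
    speed = np.linalg.norm(velocity)
    direction = target_pos - cursor_pos
    norm = np.linalg.norm(direction)
    if norm == 0 or speed == 0:
        return np.zeros_like(velocity)
    return speed * direction / norm

if __name__ == "__main__":
    cursor = np.array([0.0, 0.0])
    target = np.array([10.0, 0.0])
    decoded_velocity = np.array([3.0, 4.0])     # speed 5, aimed off-target
    corrected = rotate_toward_target(decoded_velocity, cursor, target)
    print("corrected velocity:", corrected)     # -> [5. 0.], same speed, on target
```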

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 5: The Sensorimotor System
Link ID: 17537 - Posted: 11.26.2012

By David Pogue Okay, great: we can control our phones with speech recognition and our television sets with gesture recognition. But those technologies don't work in all situations for all people. So I say, forget about those crude beginnings; what we really want is thought recognition. As I found out during research for a recent NOVA episode, it appears that brain-computer interface (BCI) technology has not advanced very far just yet. For example, I tried to make a toy helicopter fly by thinking “up” as I wore a $300 commercial EEG headset. It barely worked. Such “mind-reading” caps are quick to put on and noninvasive. They listen, through your scalp, for the incredibly weak remnants of electrical signals from your brain activity. But they're lousy at figuring out where in your brain they originated. Furthermore, the headset software didn't even know that I was thinking “up.” I could just as easily have thought “goofy” or “shoelace” or “pickle”—whatever I had thought about during the 15-second training session. There are other noninvasive brain scanners—magnetoencephalography, positron-emission tomography and near-infrared spectroscopy, and so on—but each also has its trade-offs. Of course, you can implant sensors inside someone's skull for the best readings of all; immobilized patients have successfully manipulated computer cursors and robotic arms using this approach. Still, when it comes to controlling everyday electronics, brain surgery might be a tough sell. © 2012 Scientific American,

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Related chapters from MM:Chapter 5: The Sensorimotor System; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Link ID: 17518 - Posted: 11.21.2012

By Meghan Rosen Michael McAlpine’s shiny circuit doesn’t look like something you would stick in your mouth. It’s dashed with gold, has a coiled antenna and is glued to a stiff rectangle. But the antenna flexes, and the rectangle is actually silk, its stiffness melting away under water. And if you paste the device on your tooth, it could keep you healthy. The electronic gizmo is designed to detect dangerous bacteria and send out warning signals, alerting its bearer to microbes slipping past the lips. Recently, McAlpine, of Princeton University, and his colleagues spotted a single E. coli bacterium skittering across the surface of the gadget’s sensor. The sensor also picked out ulcer-causing H. pylori amid the molecular medley of human saliva, the team reported earlier this year in Nature Communications. At about the size of a standard postage stamp, the dental device is still too big to fit comfortably in a human mouth. “We had to use a cow tooth,” McAlpine says, describing test experiments. But his team plans to shrink the gadget so it can nestle against human enamel. McAlpine is convinced that one day, perhaps five to 10 years from now, everyone will wear some sort of electronic device. “It’s not just teeth,” he says. “People are going to be bionic.” McAlpine belongs to a growing pack of tech-savvy scientists figuring out how to merge the rigid, brittle materials of conventional electronics with the soft, curving surfaces of human tissues. Their goal: To create products that have the high performance of silicon wafers — the crystalline material used in computer chips — while still moving with the body. © Society for Science & the Public 2000 - 2012

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM:Chapter 5: The Sensorimotor System; Chapter 15: Language and Our Divided Brain
Link ID: 17455 - Posted: 11.05.2012