Mind-Controlled
Linking brain and computer may soon lead to practical prosthetics for daily life
By Susan Gaidos
Video games can be mesmerizing, even for a rhesus monkey. Which may explain, in part, why 6-year-old Jasper has been sitting transfixed at a computer screen in a Washington University lab for nearly an hour, his gaze trained on a small red ball. A more interesting reason for Jasper’s quiet demeanor is that he is hurling the ball at a moving target using just his thoughts.
Jasper is not the only monkey to control objects with his mind. At the University of Pittsburgh, a pair of macaques manipulated a thought-controlled synthetic arm to grab and eat marshmallows. The monkeys then worked the arm to turn a doorknob — no muscle power required. In another case, a monkey in North Carolina transmitted its thoughts halfway around the world to set a Japanese robot in motion.
Now it’s time to let humans give it a serious try. In a series of clinical trials, scientists are preparing to take thought-controlled technologies, known as brain-computer interfaces, to those who might benefit most. The trials are a major step toward what many scientists say is an ambitious but fully attainable goal: restoring mobility and independence to people who have lost the use of their muscles through brain or spinal cord injury.
Over the next few years, paralyzed patients will attempt to learn how to maneuver virtual hands and robotic arms to reach, push, grasp or eat. As the trials progress, researchers hope to train users to perform increasingly complex movements.
“Ultimately, we’re going for something that patients could use to carry out daily tasks: pulling zippers, buttoning buttons, tying shoes and things like that,” says neurobiologist Andrew Schwartz of the University of Pittsburgh.
Key to pursuing this achievement is the fact that brain cells emit tiny electrical signals just before the body performs an action. Over the last two decades, scientists have figured out how to use a small electrode, in the form of a chip implanted in the brain, to pick up patterns in these signals and match them with specific movements. When sent to a computer programmed to translate them, the same signals that would ordinarily dictate movement of a living limb can be harnessed to control a computer cursor or a robotic arm.
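That translation step is, at heart, a decoding problem. As a rough illustration of the flavor of one common scheme, the hypothetical Python sketch below applies a linear decoder that turns the firing rates of recorded neurons into a two-dimensional cursor velocity. Every number in it, the weights, the baseline rates, the bin width, is an invented stand-in; real interfaces fit these values to each patient during calibration sessions.

```python
import numpy as np

# Toy illustration of the decoding step described above: a linear map from
# the firing rates of recorded neurons to a 2-D cursor velocity. The weights,
# baseline rates and bin width are invented stand-ins, not parameters from
# any real system.

rng = np.random.default_rng(0)

n_neurons = 96                             # e.g., one 96-channel array
W = rng.normal(size=(2, n_neurons))        # decoder weights: rates -> (vx, vy)
baseline = rng.uniform(5, 15, n_neurons)   # each neuron's resting rate, spikes/s

def decode_velocity(firing_rates):
    """Turn a vector of firing rates into a cursor velocity (vx, vy)."""
    return W @ (firing_rates - baseline)

# One decoding step: spike counts in a 100-millisecond bin become rates,
# and the rates become a velocity command for the cursor.
bin_width = 0.1                            # seconds
spike_counts = rng.poisson(baseline * bin_width)
rates = spike_counts / bin_width
vx, vy = decode_velocity(rates)
print(f"cursor velocity: ({vx:.2f}, {vy:.2f})")
```

In a working interface this step repeats many times per second, and the user, watching the cursor, learns to correct small decoding errors on the fly.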
Already, people have completed some simple brain-controlled tasks, spelling out words on a computer screen, turning on a TV or opening e-mail. In a few cases, patients have used their minds to perform basic reaching movements with a robotic arm or open and close a disembodied hand.
But the techniques have been clunky, with equipment too cumbersome and complicated to operate at home without assistance. And today’s devices are often painfully slow and require long periods of training. In upcoming trials, researchers will test two different approaches to plugging into the human brainpower needed to better control external devices. Some teams are relying on the firings of single neurons to make movement more precise. Others are taking a newer approach that records from the brain’s surface, to get around the fact that signals picked up from within the brain sometimes weaken over time. And while these trials play out, some groups are thinking about sending signals from the outside world back to the brain.
Heady stuff
Efforts to develop machines that can be controlled by the human mind began in the 1960s when scientists first put single electrodes into the brains of monkeys to record neural activity. To the researchers’ surprise, they found that some cells in areas that control movement start firing before an animal actually moves. Scientists later discovered that these areas are active because the brain plans movements well before it carries them out.
People whose spinal cords have been damaged so that they can no longer deliver signals to the limbs are still able to produce the necessary planning signals in the brain. It is these signals that researchers aim to capture and decode, turning a science-fiction vision into reality.
Most brain-computer interfaces gather information from specific neurons in the motor cortex, where movements are initiated and carried out. By implanting arrays of hair-thin electrodes directly into the brain, scientists can record clear, strong signals. This approach has some downsides: The method requires that the electrodes be surgically implanted deep into the brain, carrying the risk of infection and creating an immune response that can cause scarring around the electrodes and degrade the signal. But the technique is the only way to get clear signals from single neurons, so some scientists think it’s the way to go.
To date, five human patients in the United States have been fitted with fully implanted electrode arrays. The patients were part of a clinical trial investigating a device called BrainGate, developed by Cyberkinetics Neurotechnology Systems Inc., a company cofounded by Brown University neuroscientist John Donoghue. The implanted arrays send neural signals through tiny wires to a small pedestal that protrudes from the patient’s scalp. During lab tests, the pedestal can be connected via cables to a computer that decodes the brain’s signals into meaningful information.
One patient, a woman whose brain stem stroke left her immobilized from the neck down and unable to speak, has used her implant in a lab setting for nearly five years. In the April Journal of Neural Engineering, Donoghue and his team document how, after nearly three years of use, the device continued to work with little signal degradation.
“If she was using this system in everyday life, it would be reliable to a certain extent,” Donoghue says.
Still, researchers are working hard to make implantable devices that do more. BrainGate’s robotic arm could reach and grasp an object, but it didn’t have the maneuverability of a typical arm. A human arm uses dozens of independent muscles to move up-down or left-right and control the positions of the shoulder, elbow, forearm and wrist. Hands also require many independent muscle movements, or “degrees of freedom,” to pinch, grasp, hold and squeeze.
At the University of Pittsburgh, Schwartz is preparing to test in people a thought-controlled arm with 17 degrees of freedom. The arm will have a full range of motion in the shoulder, elbow and wrist, with a hand capable of curling around a coffee mug or picking up a small item such as a pencil.
“This will allow us to start trying to do dexterous tasks, things that have never been attempted before,” Schwartz says. Already, monkeys have used a version of this remote arm, as Schwartz reported in February at the annual meeting of the American Association for the Advancement of Science.
To get the brain signals to do all of this, Schwartz’s group will record firings from twice as many neurons as in the BrainGate studies. Three patients will have two Tic Tac–sized arrays implanted into their brains. Each array will contain 100 microelectrodes, making it possible to record from about 200 neurons at the same time. The implants will remain in the patients for one year.
Ultimately, scientists hope to implant patients with wireless devices that can beam brain signals out to control a prosthetic without the need for wires or cables. The system would be on all the time, available to patients when they want it. Such wireless systems could someday help amputees in addition to paralyzed patients, says Stanford University engineer Krishna Shenoy.
Shenoy and colleagues have been building wireless systems that can transmit signals from single neurons to nearby receivers. Researchers have used the devices to monitor the brains of monkeys moving around their cages or walking on a treadmill. In April, Shenoy’s team presented details on the studies in Cancún, Mexico, at the International IEEE EMBS Conference on Neural Engineering.
Further work is needed to make such feats practical for people, Shenoy says. Scientists know how to extract the necessary signals from the brains of paralyzed patients, but haven’t yet worked out the details of how to pick out particular signals from the brains of amputees, which might be busy directing other movements.
Scratching the surface
In recent years, researchers have found ways to capture electrical signals from the brain without having to poke anything into the brain tissue.
Daniel Moran of Washington University in St. Louis is among the scientists tapping into these signals. The approach is based on electrocorticography, or ECoG, a method used by doctors to detect electrical activity in the brain. Making an incision in the scalp and removing a portion of the skull are still required; surgeons then place the electrode grids directly on the surface of the dura mater, a thin leatherlike membrane covering the brain.
From this location, about two centimeters below the skull, the electrodes can’t record from single neurons. They can, however, pick up the electrical activity of groups of neurons. These neural assemblies, thousands of neurons per group, have synchronized activity that produces what are called local field potentials, broadcasting what the brain is doing.
With training, the neural groups can adjust themselves to signal for specific movements. For example, patients can be taught to move a cursor on a screen in a specific direction as they think about wiggling their fingers. As the brain adapts, subjects no longer have to imagine wiggling digits; they simply think “cursor right” and the neural group connected to their fingers will automatically signal its intention.
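To make the idea concrete, here is a hedged Python sketch of what such a control signal might look like: the power of a single ECoG channel in one frequency band is compared with its resting level, and the difference nudges a cursor. The sampling rate, band edges and gain are assumptions chosen for illustration, not parameters from Moran’s system.

```python
import numpy as np

# Toy illustration of ECoG-based control: the power of one electrode's signal
# in a single frequency band drives a one-dimensional cursor command. The
# sampling rate, band edges and gain are assumptions made for illustration.

FS = 1000          # sampling rate, Hz (assumed)
BAND = (70, 110)   # a high-frequency band of the field potential, Hz (assumed)

def band_power(signal, fs=FS, band=BAND):
    """Mean spectral power of `signal` inside the chosen frequency band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[in_band].mean()

def cursor_step(window, resting_power, gain=1e-4):
    """Nudge the cursor right when band power rises above the resting level."""
    return gain * (band_power(window) - resting_power)

# Fake 250-millisecond windows of one ECoG channel: noise at rest, plus an
# 80 Hz burst while the subject imagines wiggling a finger.
t = np.arange(0, 0.25, 1.0 / FS)
rest = np.random.default_rng(1).normal(size=t.size)
imagining = rest + 0.8 * np.sin(2 * np.pi * 80 * t)

print("cursor moves by:", cursor_step(imagining, band_power(rest)))
```

Because the command depends only on the band power of a group signal, the scheme can work even though no single neuron can be resolved from outside the brain tissue.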
Moran first tested this approach for extracting signals from the motor cortex in 2004 on a handful of patients being monitored for epileptic seizures. Doctors had placed the ECoG grids on the patients’ brains to figure out which areas were causing the seizures. After connecting the sensors to a computer, the scientists picked up the signals and taught the patients to use them to move cursors and play computer games.
Since these early experiments, Moran’s group has found ways to space the electrodes on a grid to optimize the signals from the neurons for more precise movement. Together with Justin Williams at the University of Wisconsin–Madison, Moran built a small electrode array to fit over the brain’s sensorimotor cortex, a region concerned both with movement and the perception of outside stimuli. Jasper, one of three monkeys in Moran’s lab, is now using the new array to play video games and reach for and grasp virtual objects on a computer screen, all without moving a muscle.
This summer, the researchers will get their first look at how the device performs in human patients who need it. A thin, flexible grid will be implanted under the skull of a paralyzed patient at the University of Pittsburgh. Researchers will train the patient to use mind control to carry out movements on a computer screen. Over the next three years, as improvements are made to the device, future patients may be able to perform more complicated tasks and control a simple robotic arm. Moran says his goal is to develop an implantable device that lasts for years, up to a decade, making the surgery a practical choice.
“What we need is a type of implant that will be 95 to 99 percent effective and that is going to last for a decade,” he says.
While some scientists doubt that ECoG signals can provide enough information for fine movements, such as turning a key in a lock, others are working to attain more detailed information from the signals. Last year, biomedical engineer Soumyadipta Acharya of Johns Hopkins University in Baltimore and his team decoded signals for predicting the movement of individual fingers as they flexed and extended. The findings, published in the August 2010 Journal of Neural Engineering, show that ECoG, with some refinements, can probably provide the dexterity needed to operate a switch or turn a doorknob, Acharya says.
A feel for the future
As paralyzed patients learn to use robotic arms to reach for their morning coffee, the question becomes, how exactly do they hold onto the cup? While a Styrofoam cup will crumble under a clenching grip, a cup of any kind will slip from a loose one.
“For prosthetics, the better we get at moving arms out to things, the more we need to work on the sensors to allow us to feel those things,” Shenoy says.
Feeling requires the ability to turn the system around and put signals back into the brain. Some investigators have tried putting small amounts of electric current into the system. Shenoy says the problem with that approach is that sending electricity into the brain activates many cells at once, rather than a target cell.
“Putting electric current into the brain is like going into a classroom where each student in the classroom is a different neuron, and shouting loudly when you wanted to speak to only one student,” he says.
Working with Karl Deisseroth of Stanford, Shenoy is using an approach called optogenetics to put light-sensitive proteins into target neurons in monkeys (SN: 1/30/10, p. 18). When sensors at the end of a prosthetic hand make contact with a coffee mug, a signal would cause light sources to shine on those neurons. Though the light bathes many neurons, only the neurons that have been tagged would respond.
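In outline, the proposed feedback loop is simple: a touch sensor gates a light source. The Python sketch below is purely hypothetical; `read_pressure` and `set_light` stand in for hardware the researchers have not described, and the threshold is arbitrary.

```python
import time

# Purely hypothetical sketch of the feedback loop described above.
# `read_pressure` and `set_light` are placeholders for hardware drivers.

CONTACT_THRESHOLD = 0.2  # arbitrary pressure level that counts as "touching"

def read_pressure():
    """Placeholder for a fingertip pressure sensor on the prosthetic hand."""
    return 0.0  # a real driver would return the current sensor reading

def set_light(on):
    """Placeholder for the light source aimed at the tagged neurons."""
    # Only neurons carrying the light-sensitive protein would respond,
    # even though the light bathes many cells at once.
    pass

def feedback_loop():
    # Poll the sensor and pulse the light whenever the hand touches something,
    # writing "touch" back into the brain as it happens.
    while True:
        set_light(read_pressure() > CONTACT_THRESHOLD)
        time.sleep(0.001)  # optogenetic stimulation can switch on and off fast
```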
The approach, outlined in the March issue of Nature Neuroscience, could be the “holy grail” for writing information back into the brain, Shenoy says, because it provides a way to speak to specific cells and can be turned on and off very rapidly.
Shenoy’s group is not alone in developing ways to put signals back into the brain. Duke University neurobiologist Miguel Nicolelis is working on a way to send signals about the texture of an object seen on a computer screen to the part of the brain where sensory information is processed. Such signals would allow users of thought-controlled arms or legs to touch and feel things as they interact with the world.
Nicolelis’ lab is also creating a robotic “exoskeleton” that will be worn like a suit so that people who have lost control of all their limbs can become mobile again. Having a system that sends information back to the brain will allow patients to use an exoskeleton to step onto the ground and sense its firmness, feedback that’s needed for an ordinary walking experience.
As the technology becomes safer and smaller, it may someday be as commonplace as wearing a Bluetooth earpiece, Moran says. And when that time comes, even the nondisabled will latch onto brain interfaces to gain mental control over their computers, iPads or other communication and entertainment devices. Already, one company in Japan has designed “cat ears” said to display a person’s emotions by reading brain signals from the surface of the scalp. Household devices might be next.
“At some point, you’ll be able to walk into your house and turn on the lights without flicking a switch,” Moran says. “All you have to do is think ‘lights on’ and technology will do the rest.”