Researchers have connected the 58-year-old woman’s brain to a computer that runs a robotic arm. As Hutchinson sits at a table staring at a bottled drink and imagining the robot grabbing the bottle and bringing it to her mouth, the robot arm begins to move.
The robot is running on signals detected by sensors implanted in the part of Hutchinson’s brain that would normally control the movements of her right arm. The sensors pick up the sparking of nerve cells and send the signals to the computer, which then translates them into commands for the robotic arm. Suddenly Hutchinson is able to do something she could only dream of before: As she thinks about getting herself a drink, the arm reaches over to the bottle and brings it to her lips, where she is able to sip the drink through a straw.
It’s the first time Hutchinson’s been able to do anything for herself since the stroke.
Hutchinson’s experiences, along with those of another quadriplegic patient, were described in a groundbreaking paper published Wednesday in Nature. Both patients are part of an ongoing government-funded trial testing the new brain-translation technology, BrainGate, which may one day free “locked-in” patients like Hutchinson and give functional limbs to amputees.
It will be years before BrainGate could be available to the general public. But Hutchinson’s happy to enjoy the future today. After realizing she could control the robot arm, she said she was “ecstatic.” Though Hutchinson cannot speak, she can type her thoughts through a device that takes its cues from her eye movements.
She’s optimistic about what the research might one day bring. “I would love to have robotic leg support,” she says.

What’s amazing is how researchers have “taught” their computer to essentially read Hutchinson’s thoughts.
The baby-aspirin-sized sensor implanted in Hutchinson’s brain contains 96 hair-thin electrodes that record the sparking of neurons in the brain’s movement control center, the motor cortex.
The first step in the learning process is for the computer to “see” which neurons spark, and in what pattern, when a person picks up a bottle and brings it to her lips, explains the study’s lead researcher, Dr. Leigh R. Hochberg, a professor of engineering at Brown University, a researcher at the Providence VA Medical Center, a critical care neurologist at Massachusetts General Hospital and Brigham and Women’s Hospital in Boston, and a visiting associate professor of neurology at Harvard Medical School.
Fortuitously, it doesn’t matter whether the person actually moves their limb or whether they’re merely imagining themselves doing it. So, for several trials, Hochberg and his colleagues had the computer observe the sparking patterns of neurons in Hutchinson’s brain as she watched the robot arm pick up the bottle and bring it to her lips.
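The calibration step Hochberg describes is, at heart, a pattern-matching problem: the computer sees neural firing rates alongside the movement the person is imagining, and learns a mapping from one to the other. The toy sketch below illustrates the idea with synthetic data and a plain least-squares decoder; this is an assumption for illustration only, not the trial’s actual software, which uses far more sophisticated signal processing.

```python
# Illustrative sketch only: learning a linear decoder that maps
# neural firing rates to intended arm velocities (synthetic data).
import numpy as np

rng = np.random.default_rng(0)

n_channels, n_steps = 96, 500  # 96 electrodes, 500 calibration samples

# Hidden ground-truth "tuning": how intended 2-D velocity drives each neuron.
true_tuning = rng.normal(size=(n_channels, 2))

# Calibration session: the person imagines movements while activity is recorded.
intended_velocity = rng.normal(size=(n_steps, 2))
firing_rates = (intended_velocity @ true_tuning.T
                + 0.1 * rng.normal(size=(n_steps, n_channels)))  # noisy recordings

# Training: least-squares fit of a decoder D so firing_rates @ D ≈ velocity.
decoder, *_ = np.linalg.lstsq(firing_rates, intended_velocity, rcond=None)

# Use: decode a new imagined movement from neural activity alone.
new_intent = np.array([[1.0, -0.5]])
new_rates = new_intent @ true_tuning.T
decoded = new_rates @ decoder  # should approximate new_intent
```

Because the decoder only ever sees firing rates, it makes no difference whether the limb actually moves or the movement is merely imagined, which is exactly the property the researchers exploit.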