Towards Open Source programming software for more sophisticated myoelectric prosthetic controls

It is important to find ways to intuitively tackle complex programming tasks. Teaching a complex prosthesis complex motion patterns is such a task, and current developments appear to be under way [1].

The Revolutionizing Prosthetics (RP) 2009 project is spread over 30 research institutions worldwide and led by the Johns Hopkins University Applied Physics Laboratory (APL) in Laurel, Md. The project is sponsored by the U.S. Defense Advanced Research Projects Agency (DARPA) and attempts to develop a mechanical arm that closely mimics the properties of a real limb. The project is reported to have achieved three milestones so far: two prototype mechanical arms were developed, and a new type of nerve surgery to control the limbs was pioneered.

The surgery attempts to reroute intact nerve bundles that are still present after limb loss. These are attached to surgically created muscle packs placed in convenient locations under the skin. Electrodes are then placed there to pick up electromyographic (EMG) signals and control the mechanical arm. The more muscle packs a surgeon can create from the patient's own tissues, the more complex the control opportunities will optimally turn out to be. As a result, far more sophisticated controls are attempted. So far, the surgery has enabled remote control of an arm with six degrees of freedom; a further goal that has not yet been reached appears to be the control of individual finger movements - even though I personally believe that even with healthy hands, most people only very rarely employ individual finger movements for daily tasks.

In order to use such a prosthesis through these newly established muscle pack triggers, the patient will have to learn to control it by activating the right type of switches. His nerves activate the muscle packs, which sit under the skin; electrodes placed on the skin above them pick up the trigger signals, and software processes these signals to activate the prosthesis.
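That last step - turning a raw electrode signal into a trigger - can be sketched in a few lines. This is purely my own illustration of the general principle, not APL's software; the smoothing window and threshold values are made-up assumptions:

```python
# Hypothetical sketch: detecting a muscle-pack "switch" activation from a
# raw EMG trace by rectification, moving-average smoothing and thresholding.
# Function names and parameter values are illustrative assumptions.

def smoothed_envelope(samples, window=50):
    """Full-wave rectify the EMG samples and smooth with a moving average."""
    rectified = [abs(s) for s in samples]
    return [
        sum(rectified[max(0, i - window + 1):i + 1]) / min(window, i + 1)
        for i in range(len(rectified))
    ]

def trigger_active(samples, threshold=0.3):
    """Return True if the smoothed EMG envelope crosses the threshold."""
    return any(v > threshold for v in smoothed_envelope(samples))

# Example: a quiet baseline versus a burst of muscle activity.
quiet = [0.01, -0.02, 0.015, -0.01] * 25
burst = quiet[:50] + [0.8, -0.9, 0.85, -0.7] * 12
print(trigger_active(quiet))  # -> False (no activation)
print(trigger_active(burst))  # -> True (contraction detected)
```

Real systems would of course filter noise far more carefully, but the principle - rectify, smooth, threshold - is the classic way a single EMG channel becomes an on/off switch.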

The tricky part is that the software has to be trained. Training sessions of that type are very important but very boring. The software contains a component that translates activation patterns (of the patient's body) into motion patterns (of the prosthesis).

The training software is called the Virtual Integration Environment (VIE). There, the patient is offered a virtual reality setting in which a prosthetic model is animated on-screen. Repetitive training exercises ("open fist", "close fist", ...) are hard and boring - yet essential for the software to be able to extract distinct signal patterns from the electrode signals. Even highly motivated volunteers become tired.
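Why do the repetitions matter? Roughly speaking, pattern-recognition software averages many labeled trials of the same motion into a distinct reference pattern, and later matches new signals against those references. Here is a toy sketch of that idea - my own illustration, not the VIE; the features, labels, and numbers are all assumptions:

```python
# Toy sketch (not the actual VIE): learning distinct signal patterns from
# repeated, labeled training trials ("open", "close", ...) using a
# nearest-centroid classifier over per-channel mean absolute value (MAV).

def mav_features(trial):
    """Mean absolute value per electrode channel for one trial."""
    return [sum(abs(s) for s in channel) / len(channel) for channel in trial]

def train(labelled_trials):
    """Average the feature vectors of all trials sharing a label."""
    sums, counts = {}, {}
    for label, trial in labelled_trials:
        feats = mav_features(trial)
        acc = sums.setdefault(label, [0.0] * len(feats))
        for i, f in enumerate(feats):
            acc[i] += f
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def classify(centroids, trial):
    """Pick the label whose centroid is closest to the trial's features."""
    feats = mav_features(trial)
    return min(centroids, key=lambda label: sum(
        (f - c) ** 2 for f, c in zip(feats, centroids[label])))

# Two channels; "open" activates channel 0, "close" activates channel 1.
training = [
    ("open",  [[0.9, -0.8, 0.85], [0.05, -0.04, 0.03]]),
    ("open",  [[0.8, -0.9, 0.80], [0.02, -0.05, 0.04]]),
    ("close", [[0.04, -0.03, 0.05], [0.9, -0.85, 0.8]]),
    ("close", [[0.05, -0.02, 0.03], [0.8, -0.9, 0.85]]),
]
centroids = train(training)
print(classify(centroids, [[0.85, -0.9, 0.8], [0.03, -0.05, 0.04]]))  # -> open
```

The more (and the more consistent) the labeled repetitions, the sharper the centroids separate - which is exactly why the tedious "open fist, close fist" drills are unavoidable.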

Now the engineers have started to use a computer game. Apparently, playing a game is a far more intuitive way to get volunteers to keep up the pace than the cut-and-dried training cycles of technical software. In other words, the engineers use game software to get more input and longer training cycles out of their volunteer patients.

The trend to use games (rather than boring exercises) has been dubbed Wii-hab. The engineers in charge of the software, Armiger and Vogelstein, used Guitar Hero: they modified the controller with a soldering iron so that it could be controlled by the VIE. Button clicks were then substituted with muscle contraction signals as picked up by the electrodes.
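The substitution itself can be pictured as turning threshold crossings of the smoothed EMG signal into button press and release events. The following is a guess at the general idea, not the actual controller modification; event names and the threshold are assumptions:

```python
# Illustrative sketch of the idea behind the controller modification:
# replacing a game button press with an EMG threshold crossing.

def emg_to_button_events(envelope, threshold=0.5):
    """Emit (time, 'press'/'release') events when the smoothed EMG
    envelope crosses the threshold, mimicking a fret button."""
    events, pressed = [], False
    for t, level in enumerate(envelope):
        if level > threshold and not pressed:
            events.append((t, "press"))
            pressed = True
        elif level <= threshold and pressed:
            events.append((t, "release"))
            pressed = False
    return events

# A contraction ramps up, holds, and relaxes:
envelope = [0.1, 0.2, 0.6, 0.8, 0.7, 0.3, 0.1]
print(emg_to_button_events(envelope))  # -> [(2, 'press'), (5, 'release')]
```

From the game's point of view nothing changes - it just sees button events - which is what makes this kind of hardware/software substitution so attractive for training.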

Kuniholm (who is behind OpenProsthetics) was involved in this. Armiger and Kuniholm plan to continue with this type of software development under an Open Source agreement. DARPA apparently plans to release the APL-based video-game-interface open source so everyone can start "hacking away", possibly as early as January 2009.

For their next game-interface modification, the APL team is eyeing Wii Tennis so that people with more radical amputations can also benefit from the motivating powers of the Wii. “We want Jesse to be able to play, too,” says Vogelstein.

Their ultimate aim, according to Kuniholm, is to distribute the software and interface documentation into as many hands as possible. In that sense, the Air Guitar Hero part of the project reflects the whole Revolutionizing Prosthetics project. The long-term perspective is that DARPA and Johns Hopkins want to make both the new prosthetic arm's hardware and the interface software, including the VIE, open source, so that prosthetic-arm research innovation can evolve organically.

[1] R. S. Armiger, F. V. Tenore, W. E. Bishop, J. D. Beaty, M. M. Bridges, J. M. Burck, J. R. Vogelstein, and S. D. Harshbarger, "A Real-Time Virtual Integration Environment for Neuroprosthetics and Rehabilitation," Johns Hopkins APL Technical Digest, vol. 30, no. 3, p. 198, 2011.

Cite this article:
Wolf Schweitzer: Technical Below Elbow Amputee Issues - Towards Open Source programming software for more sophisticated myoelectric prosthetic controls; published 26/11/2008, 12:22; URL:
