In recent months, I have used the Myo armband in various performance-related projects. Although it is not yet as magical (powerful and reliable) as the official promotion video suggests, the armband is nevertheless a very promising sensing technology for body gestures and movement. In this essay, I would like to share my experience of using Myo in technology-enhanced performances.[1]
(The version of the Myo SDK I refer to in the main part of this essay is Beta 7. The newest version as I write is 0.8.0, which introduces the ability to acquire raw EMG data.)
The Myo SDK exposes its functionality through a C++ API.[2] C++ is also the native language of major creative coding libraries such as openFrameworks and Cinder, so it is easy to employ the armband in creative coding applications.
More than one armband can be used simultaneously, so an application can make use of data from multiple people or from both arms of one person. However, the SDK does not provide a unique identifier for the program to recognize individual armbands across different runs. Therefore, if more than one armband is used in the same application, the role of each armband is assigned on the fly, and the user has to work out which armband plays which role.
By overriding the callback functions of the API, the programmer decides how the application reacts to each type of captured data. Two major types of data are gathered—spatial data and gesture data—which will be described in detail later. The data is refreshed at a rate high enough to capture body movements in most situations.[3] The armband communicates with the computer via Bluetooth; however, a dedicated Bluetooth receiver must be plugged into the computer so that the daemon program (Myo Connect) can find the armband. In order for the armband to fully sense all types of data, it must be worn on the bare skin of the forearm, and a pairing gesture must be performed and recognized every time it is put on.
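As a minimal sketch of this callback-driven style (the listener class and application identifier below are my own, and the code assumes the Myo SDK headers are installed, so it is not runnable standalone; treat the exact signatures as approximate):

```cpp
#include <myo/myo.hpp>

// A minimal listener; override only the callbacks you need.
class PerformanceListener : public myo::DeviceListener {
public:
    void onOrientationData(myo::Myo* myo, uint64_t timestamp,
                           const myo::Quaternion<float>& rotation) override {
        // spatial data arrives here on every update
    }

    void onPose(myo::Myo* myo, uint64_t timestamp, myo::Pose pose) override {
        // gesture data arrives here when the recognized pose changes
    }
};

int main() {
    myo::Hub hub("com.example.performance");  // identifier is hypothetical
    myo::Myo* myo = hub.waitForMyo(10000);    // wait up to 10 s for an armband
    PerformanceListener listener;
    hub.addListener(&listener);
    while (true) hub.run(1000 / 50);          // pump events in ~20 ms slices
}
```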
The overall stability of the armband and the SDK is good. With careful programming around the connection procedure, the armband can serve as a reliable data source on stage. However, when worn on a thin arm, the armband tends to slide during arm movements, and after every slide the SDK requires the pairing gesture to be performed again before it provides gesture data, which is unacceptable in performance. Moreover, the armband's battery depletes within days even when idle, and there is no exact indicator of the battery level. Careful preparation must be done before a performance to make sure these known issues cause no problems.
The Myo armband contains a three-axis accelerometer and a three-axis gyroscope for the capture of spatial data, and eight EMG sensors for gesture recognition. For spatial data, the SDK provides:
- orientation data in the form of a quaternion and Euler angles
- raw accelerometer readings in the form of a 3-dimensional vector
- raw gyroscope readings in the form of a 3-dimensional vector
The quaternion and the Euler angles are different representations of the same arm orientation, the latter being easier for humans to interpret. The three components of the Euler angles correspond to the arm's
- pitch (vertical angle)
- yaw (horizontal angle)
- roll (rotational angle)
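For reference, the Euler angles can be recovered from the quaternion with the standard conversion (formulas of the kind used in the SDK's sample code; the function below is my own sketch):

```cpp
#include <algorithm>
#include <cmath>

struct Euler { double roll, pitch, yaw; };

// Convert a unit quaternion (w, x, y, z) to roll/pitch/yaw in radians.
Euler quaternionToEuler(double w, double x, double y, double z) {
    Euler e;
    e.roll  = std::atan2(2.0 * (w * x + y * z),
                         1.0 - 2.0 * (x * x + y * y));
    // clamp guards against numerical drift pushing asin out of domain
    e.pitch = std::asin(std::max(-1.0, std::min(1.0,
                         2.0 * (w * y - z * x))));
    e.yaw   = std::atan2(2.0 * (w * z + x * y),
                         1.0 - 2.0 * (y * y + z * z));
    return e;
}
```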
The pitch data is very reliable. It always uses the horizontal plane as its origin and is unrelated to the horizontal direction of the arm. Therefore, the performer is free to turn around during the performance without the pitch data drifting out of range. The data ranges from -π/2 (arm towards the ground) to π/2 (arm towards the sky). In my experience, this is probably the most expressive kind of data the armband provides. People do not raise their arms for no reason, so the vertical direction of the arms is a very good indicator of the performer's emotional state. By making use of the absolute pitch, or of the relative pitch over time, simple but effective mechanisms can be devised to respond to that state.
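As an illustration of the absolute-pitch approach, the reading can be mapped onto a unit control parameter (a sketch of my own; the function name and the clamping are my choices, not part of the SDK):

```cpp
#include <algorithm>
#include <cmath>

// Map the pitch reading, which ranges from -pi/2 (arm towards the
// ground) to pi/2 (arm towards the sky), onto [0, 1].
double pitchToIntensity(double pitch) {
    double t = (pitch + M_PI / 2.0) / M_PI;
    return std::max(0.0, std::min(1.0, t));  // clamp against sensor noise
}
```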
The yaw and roll data use their initial state as origin. That is, once the armband is initialized, the yaw data represents the arm's horizontal direction relative to that fixed origin rather than to the current frontal direction of the performer's body. Consequently, when the performer turns her body, the reading shifts. Since we have no way to capture the performer's body direction, the yaw data is useless in most cases, unless the performer never turns her body during the whole performance. One possible use of the yaw data is to capture it from both arms and calculate the difference, to measure the openness of the arms. Another issue with the yaw and roll data is that the reference coordinate tends to drift over time, which makes the data even less reliable. Both range from -π to π (representing a whole circle).
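The two-arm comparison can be sketched as follows; wrapping the difference keeps the discontinuity at ±π from producing spurious jumps (the function name is mine):

```cpp
#include <cmath>

// Signed angular difference a - b, wrapped into [-pi, pi).
// Applied to the yaw readings of two armbands, its magnitude gives
// a rough measure of how far apart the two arms are pointing.
double yawDifference(double a, double b) {
    double d = std::fmod(a - b + M_PI, 2.0 * M_PI);
    if (d < 0) d += 2.0 * M_PI;  // fmod keeps the sign of its argument
    return d - M_PI;
}
```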
The raw data from the accelerometer and gyroscope can also be accessed. In fact, these are the sources from which the SDK calculates the orientation data. Beyond that usage, the data has its own significance—it measures the linear acceleration and angular velocity of the armband, in units of g (standard gravity) and °/s (degrees per second), respectively. Viewed separately, each component of this data might not be of great use in most performance scenarios. However, if we calculate the SRSS (square root of the sum of the squares) of the components of either the accelerometer or the gyroscope reading, we get the magnitude of the arm's linear acceleration or angular velocity, which is a very effective indicator of the intensity of the arm movement, and in turn carries emotional and rhythmic information about the performance.
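The SRSS computation itself is a one-liner (a sketch; note that the accelerometer magnitude includes gravity, so an arm at rest reads about 1 g, while the gyroscope reads near 0 °/s at rest):

```cpp
#include <cmath>

// Magnitude (square root of the sum of squares) of a 3-component
// sensor reading, e.g. accelerometer (g) or gyroscope (deg/s).
double magnitude(double x, double y, double z) {
    return std::sqrt(x * x + y * y + z * z);
}
```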
After proper pairing, the armband also provides gesture data of the hand, indicating which of the following gestures the hand is making:
- waving in
- waving out
- fingers spread
- thumb to pinky
However, this data is not as useful as it may seem at first sight. The hand gesture is inferred from EMG data measured on the skin of the forearm, which is a side effect of muscle movement, so the calculated gesture may not faithfully reflect the actual gesture of the hand. What's more, when external forces are applied to the muscles, the accuracy of the measurement degrades considerably. In fact, when the performer wears tight clothes on her upper arm, the hand gesture data tends to be nearly unusable.
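One common mitigation, not part of the SDK but a sketch of my own, is to debounce the recognized pose: act on it only after it has been reported for several consecutive updates, so brief misfires are filtered out.

```cpp
#include <string>

// Accept a reported pose only after it has been seen for
// `threshold` consecutive updates.
class PoseDebouncer {
public:
    explicit PoseDebouncer(int threshold) : threshold_(threshold) {}

    // Feed each new reading; returns true once when a pose becomes stable.
    bool update(const std::string& pose) {
        if (pose == candidate_) {
            ++count_;
        } else {
            candidate_ = pose;
            count_ = 1;
        }
        if (count_ == threshold_ && pose != stable_) {
            stable_ = pose;
            return true;
        }
        return false;
    }

    const std::string& stablePose() const { return stable_; }

private:
    std::string candidate_, stable_ = "rest";
    int threshold_, count_ = 0;
};
```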
While it was widely hoped that the armband could recognize custom hand gestures, this feature is still missing from the Myo SDK, which has disappointed many developers. That said, starting from version 0.8.0 of the SDK, the eight streams of raw EMG readings can now be accessed. This not only means that custom gesture recognition becomes possible (though it might require great effort from the developer), but also opens up other possibilities. I suppose the EMG data might be employed by new media artists in ways similar to how EEG data from the brain is used.
[1] A prior essay focusing on the application of Myo in my project The Humanistic Movement can be accessed via http://shi-weili.com/the-humanistic-movement-bodily-data-gathering-and-cross-application-interoperability/. ↩
[2] Scripting is also supported by the Myo SDK, in the form of Myo Scripts. ↩
[3] According to https://www.thalmic.com/blog/raw-uncut-drops-today/, the refresh rate of the armband data seems to be 200 Hz. ↩
(Title image credit: Thalmic Labs)