166 — PiaF: A Tool for Augmented Piano Performance Using Gesture Variation Following

Van Zandt-Escobar, Caramiaux, & Tanaka (http://www.nime.org/proceedings/2014/nime2014_511.pdf)

Read on 02 February 2018
#piano  #music  #augmentation  #machine-learning  #gesture-recognition  #gvf 

I read a review yesterday about robotic musical cooperation. One technology mentioned in that paper was PiaF, a tool that lets pianists augment their performance with gestures. The system uses a Kinect to track the performer's movements, which first serve as training examples for a gesture-recognition model; during performance, gestures are then recognized automatically and translated into audio output.
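To fix the two-phase flow (train, then recognize) in my own head, here's a minimal Python sketch. The real system uses the Gesture Variation Follower (more on that below); this nearest-template matcher, and every name in it, is an invented stand-in just to show the shape of the pipeline.

```python
import math

class GestureRecognizer:
    """Invented stand-in: stores Kinect frame sequences as templates during
    a training phase, then labels a performed gesture by crude distance."""

    def __init__(self):
        self.templates = {}  # gesture name -> list of (x, y, z) points

    def train(self, name, frames):
        self.templates[name] = list(frames)

    def recognize(self, frames):
        # Compare frame-by-frame against each template (truncated to the
        # shorter sequence) and return the closest match on average.
        def cost(template):
            n = min(len(template), len(frames))
            return sum(math.dist(template[i], frames[i]) for i in range(n)) / n
        return min(self.templates, key=lambda name: cost(self.templates[name]))

rec = GestureRecognizer()
rec.train("raise", [(0.0, y / 10, 0.0) for y in range(10)])  # hand moving up
rec.train("sweep", [(x / 10, 0.0, 0.0) for x in range(10)])  # hand moving right
print(rec.recognize([(0.01, 0.11, 0.0), (0.0, 0.19, 0.0), (0.02, 0.32, 0.0)]))
# -> "raise"
```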

The authors group the gestures a musician makes during performance into two main categories: expressive gestures, such as tapping a foot, which are ancillary and do not create music; and musical gestures, such as pressing a key, which are directly tied to musical production.

The PiaF system itself reads gestures with a Kinect and interprets them using a Gesture Variation Follower (GVF), a system that tracks gestures in real time. The GVF's results are fed into an audio processor, which constructs parameters for musical synthesis. In particular, PiaF maps gestures to playback of prerecorded MIDI, taking care to scrub through the file in step with the timing of the gesture.
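The scrubbing part is the piece I found most interesting, so here's a sketch of just that mapping stage, assuming a GVF-like tracker already reports which template is being followed and a phase (progress through it, 0 to 1) on each frame; GVF does estimate phase and speed, but `GVFEstimate` and `scrub_time` are hypothetical names of mine, not PiaF's API.

```python
from dataclasses import dataclass

@dataclass
class GVFEstimate:
    template: str   # which trained gesture is being followed
    phase: float    # estimated progress through the template, in [0, 1]
    speed: float    # estimated speed relative to the recorded template

def scrub_time(estimate: GVFEstimate, clip_length_s: float) -> float:
    """Map gesture progress onto a position in the prerecorded MIDI clip,
    so playback advances (or lingers) in step with the gesture."""
    return max(0.0, min(1.0, estimate.phase)) * clip_length_s

clip_length = 12.0  # seconds of prerecorded MIDI tied to this gesture
for est in [GVFEstimate("sweep", 0.25, 1.1),
            GVFEstimate("sweep", 0.50, 0.9),
            GVFEstimate("sweep", 0.80, 1.0)]:
    print(f"phase {est.phase:.2f} -> seek MIDI to {scrub_time(est, clip_length):.2f}s")
```

Because the playback position is driven by phase rather than wall-clock time, slowing a gesture down slows the MIDI down with it, which is exactly the behavior the paper describes.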

Using the system still looks rather cumbersome: a pedal has to mark when a gesture is being performed, in order to separate discrete gestures from one another. But it shows a lot of promise for future robotic accompaniment, especially considering the paper is a good deal older (2014) than I realized when I began reading.
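The pedal gating itself is simple to picture: the follower only consumes Kinect frames while the pedal is held, which is what separates one gesture from the next. A toy sketch, with an invented event format and follower interface:

```python
class StubFollower:
    """Invented placeholder for the gesture follower."""
    def reset(self):
        print("pedal down: start tracking a new gesture")
    def update(self, frame):
        print("tracking frame", frame)

def run(events, follower):
    active = False
    for kind, payload in events:
        if kind == "pedal_down":
            active = True
            follower.reset()        # a fresh gesture begins
        elif kind == "pedal_up":
            active = False          # gesture over; frames are ignored again
        elif kind == "frame" and active:
            follower.update(payload)

run([("pedal_down", None), ("frame", (0.1, 0.2, 0.0)),
     ("frame", (0.2, 0.2, 0.0)), ("pedal_up", None),
     ("frame", (0.9, 0.9, 0.0))], StubFollower())  # last frame ignored
```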