In this video, I control my computer by jumping, flexing my biceps, and leaning. Sci-fi!
My Kinect setup is mostly explained in my previous article about Kinect + speech recognition. Here’s what’s different from the previous video:
The previous video dealt with speech recognition and tracking my body along the Z axis. This time, everything was triggered by specific gestures: I set FAAST to output individual keystrokes, which I assigned to clips in Ableton Live. Here’s what I used the various software for:
FAAST – Sends a user-specified output (mouse, keyboard, etc.) whenever certain gesture criteria are met (limb position, angle, velocity, etc.). I primarily chose to output individual keystrokes and mapped them to clips with Ableton Live’s key mapping mode (CTRL+K). The interface is quite straightforward, and there are lots of gesture options!
Bome’s MIDI Translator Pro – Converts individual keystrokes into key sequences. I could only get FAAST to output single keystrokes, but I needed more for the jump: one keystroke is converted to alt+f+down+down+right+enter, which opens the most recent project in Ableton Live.
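To give a feel for the FAAST side, its gesture rules live in a plain-text file where each line pairs a gesture test with an output action. The sketch below is from memory and the gesture names, thresholds, and action keywords are my best recollection rather than verified syntax, so check the FAAST documentation before copying it:

```
# gesture      threshold   action
jump           8           key_press 1    # fires the clip key-mapped to "1" in Live
lean_left      15          key_press 2
lean_right     15          key_press 3
left_arm_up    0           key_press 4    # the biceps flex
```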
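The Bome step is conceptually just a lookup table from one incoming keystroke to an outgoing sequence. Here’s a minimal Python sketch of that idea (the `F1` trigger key and the `translate` helper are hypothetical, not Bome’s actual rule format):

```python
# One incoming keystroke expands to a whole key sequence,
# mirroring what Bome's MIDI Translator Pro does for the jump gesture.
TRANSLATIONS = {
    # hypothetical trigger key -> sequence that opens Live's most recent project
    "F1": ["alt+f", "down", "down", "right", "enter"],
}

def translate(key):
    """Return the expanded key sequence, or the key unchanged if unmapped."""
    return TRANSLATIONS.get(key, [key])
```

In the real setup, Bome listens for the keystroke FAAST emits and types the sequence into whichever window has focus, which is why Ableton Live needs to be the active application when the jump fires.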