OK, OK. So that first video didn’t really illustrate what was going on. This one is better, I promise.
I’ve added a sensing region around the board so that I can effectively mute the camera when I move a hand anywhere near the board. This way, sweeping hand motions are largely ignored. Also, to clarify the relationship between the steps and the drum samples, I’ve added some distinctive animations to correspond to the different samples.
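The sensing-region mute could be sketched roughly like this — a minimal, hypothetical version where the board's bounding box is grown by a margin, and any tracked hand point inside that padded region mutes the camera input for the frame (the `Rect` struct and function names here are my own illustration, not the project's actual code):

```cpp
// Hypothetical sketch of the sensing-region idea: grow the board's
// bounding box by `margin` on every side, and treat any hand point
// inside that padded region as "too close" -- i.e. mute camera-driven
// triggering for this frame so sweeping hand motions are ignored.
struct Rect { float x, y, w, h; };

bool inSensingRegion(const Rect& board, float margin, float px, float py) {
    return px >= board.x - margin && px <= board.x + board.w + margin &&
           py >= board.y - margin && py <= board.y + board.h + margin;
}
```

In use, each detected hand position would be tested per frame, and step detection skipped whenever any point lands in the region.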
I designed the system to play well with alternative step layouts. Hexagonal steps would be pretty simple to implement. I’ve done some experimentation with pentagons, you know — just to be difficult:
The fun thing about the pentagonal layout is that I can group the steps into different overlapping groups of four to mix things up a bit. You may notice that the highlighted steps are each the top-center of a group of four: Group A in the image below. They are also the bottom-center of an alternative grouping: Group B below. So instead of stepping across a row or column of steps as with a square layout, I can step around these different groups of four. (The left-most such grouping is actually missing a step. Whoops.)
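The stepping-around-a-group idea could be sketched like this — a toy version under my own assumptions (step indices and the `Group`/`nextStep` names are made up; the real layout's geometry isn't shown here). The point is that a group is just four step indices, the sequencer advances within whichever grouping is active, and the same step index can appear in both a Group A and a Group B:

```cpp
#include <array>

// A "group" is four step indices in play order. The same physical step
// can belong to one group as its top-center and to another as its
// bottom-center, which is what lets the groupings overlap.
using Group = std::array<int, 4>;

// Advance to the next step within one group of four, wrapping around.
int nextStep(const Group& g, int current) {
    for (int i = 0; i < 4; ++i)
        if (g[i] == current) return g[(i + 1) % 4];
    return g[0]; // current step isn't in this group: restart at its first step
}
```

Switching from Group A to Group B mid-pattern would just mean handing `nextStep` a different `Group`, which is one way the groupings could "mix things up."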
The downside is that as a viewer it’s hard to predict what’s going to trigger when and on what track/sample.
By the way, I have a new trick to ease the pain of capturing the output of an inherently real-time app. Doing screen captures in real time is much more feasible with the Windows Media Foundation. It even supports audio. This is a far better solution than trying to store a frame sequence as I’d been doing. I captured the video above at 640×480 at a very smooth 30fps. (I’m pretty sure I could have done 60fps, since right now I’m just skipping the capture on every other frame and my display performance looks fine.) Storing JPG or BMP sequences in real time (as I had been) takes considerably more than 16ms per frame, so it screws up the inter-app communication and performance. Anyway, the WMF class I wrote to work with OF will ONLY work on Windows 7. Which is too bad.
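The every-other-frame gate is about as simple as it sounds — here's a minimal sketch (the counter and function names are mine, not from the actual WMF wrapper class). The arithmetic behind the 16ms figure: a 60 Hz draw loop leaves roughly 1000/60 ≈ 16.7ms per frame, so any per-frame write that takes longer than that stalls the loop, while grabbing only even frames halves the capture rate to 30fps:

```cpp
// Sketch of an alternating-frame capture gate: at a 60 Hz draw loop,
// grabbing on even frames only yields a 30 fps capture, leaving the
// odd frames' ~16.7 ms budget free for drawing and inter-app messaging.
bool shouldCapture(unsigned long frameCounter) {
    return frameCounter % 2 == 0; // capture on even frames only
}
```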
And for now, I still had to record the Ableton Live output into a fresh track, transcode my capture to QuickTime, sync it back up in a new Ableton track, and re-encode. Again. So there’s still room for improvement.
This demo was made with openFrameworks and Ableton Live.