How It Works:
The Synplode system consists of two pieces of software: an openFrameworks (OF) application and Ableton Live running in Set mode.
All real-time control is programmed through Live. Simple commands are sent to the OF application encoded as MIDI Note On pitch values. This makes it easy to send synchronized commands by sequencing notes in the MIDI Note Editor.
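As a rough illustration, the command decoding on the OF side might look like the sketch below. The pitch ranges here are placeholders, not the actual Synplode assignments: I'm assuming pitches 0-15 select wave Steps and a separate range selects color schemes.

```cpp
#include <cassert>

// A minimal sketch of decoding control commands from incoming Note On
// messages. The ranges are assumptions for illustration only:
// pitches 0-15 = wave Steps, pitches 16-20 = color-scheme slots.
struct Command {
    enum Kind { Step, ColorScheme, Unknown } kind;
    int index; // step number or color-scheme slot
};

Command decodeNoteOn(int pitch) {
    if (pitch >= 0 && pitch <= 15)  return {Command::Step, pitch};
    if (pitch >= 16 && pitch <= 20) return {Command::ColorScheme, pitch - 16};
    return {Command::Unknown, -1};
}
```

Because every command is just a note, sequencing a command is no different from sequencing a melody, which is what makes the synchronization trivial.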
One track outputs Note On messages to control the wave step-by-step as it passes across the grid. Often, it is sufficient to trigger these Steps on every beat or eighth note, but by providing this control in Live, one can also synchronize the flashing wave with accents in the music, in this case the bass line around 0:32 in the video:
(There are 16 Steps listed because the higher steps (8-15) represent diagonal sets of trigger regions that might be used instead of the vertical columns. This is not illustrated in the video.)
Another track is used to control the graphics. Color schemes in the system each comprise 5 colors, and the scheme can be changed here to coincide with changes in the music. There are options to select between the grid and radial patterns, and to choose between the glowing Synplosion animation and the segmented circular expansion that you can see around 1:18 in the video. (I call that the “‘Splode Mode.”) The subtle colored background of the projection is made up of rotating gradient segments, the direction of which can be controlled here to synchronize with the music.
The Step and Color tracks each contain a MIDI Effect Rack that is used simply to provide the labels in the note editors and map the notes to the appropriate pitches for the OF application to recognize.
(Now the “Colors” pitches will be mapped to C-2 through B-2, but we won’t be bothered by such details in the MIDI Note Editor.)
The OF application uses an IR camera with basic background subtraction to determine where participants are present. Warping is applied to the camera and projection separately, so the system can be mapped to a physical grid on the ground, or perhaps a checkerboard. Camera analysis is all handled in its own thread in order to keep it from interfering with the smoothness of the graphics.
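The background-subtraction step can be sketched without the openFrameworks/OpenCV wrappers the real application presumably uses: compare each pixel of the live IR frame against a stored background frame and mark it active when the difference exceeds a threshold.

```cpp
#include <cassert>
#include <cstdlib>
#include <vector>

// Minimal frame-differencing background subtraction on an 8-bit
// grayscale image. A pixel counts as "active" (participant present)
// when it differs from the stored background by more than `threshold`.
std::vector<bool> subtractBackground(const std::vector<unsigned char>& frame,
                                     const std::vector<unsigned char>& background,
                                     int threshold) {
    std::vector<bool> active(frame.size());
    for (size_t i = 0; i < frame.size(); ++i)
        active[i] = std::abs(int(frame[i]) - int(background[i])) > threshold;
    return active;
}
```

Running this per frame is cheap, but doing it on the camera thread rather than the draw thread is what keeps the graphics smooth.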
The OF application receives the wave/step messages, checks for collisions between the flashing wave and participant-activated trigger regions, and triggers the appropriate graphical Synplosion while returning the appropriate MIDI Note On message back to Live. The 8 rows of the projected grid represent 8 MIDI pitch values: 0 to 7. Similarly, the circular patterns map the rings to 5 pitches: 0 to 4.
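The collision pass can be sketched as a scan of one column of the grid: when the wave reaches column `step`, every occupied cell in that column fires, and its row index is returned as the MIDI pitch. The boolean `occupied` grid below stands in for the camera-derived trigger regions.

```cpp
#include <cassert>
#include <vector>

// Sketch of the collision check on the 8x8 grid map. When the flashing
// wave reaches column `step`, each occupied cell in that column fires a
// Synplosion, and its row (0-7) is sent back to Live as a Note On pitch.
std::vector<int> collideStep(const std::vector<std::vector<bool>>& occupied,
                             int step) {
    std::vector<int> pitches;
    for (int row = 0; row < (int)occupied.size(); ++row)
        if (occupied[row][step])
            pitches.push_back(row); // rows map directly to pitches 0-7
    return pitches;
}
```

The radial maps work the same way, just with rings in place of rows and only 5 pitches.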
Back in Live, these pitch values run through a few different MIDI Effect Racks and Pitch devices to map them to musical values. First, the ModeSelector sets the mode/scale/chord. For each mode, a MIDI Effect Rack contains 8 Pitch devices to transpose each incoming MIDI pitch (0 to 7) up by the appropriate amount. Finally, one last Pitch device transposes the mode into whatever key is appropriate for the music.
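The two-stage mapping (degree-to-interval, then transpose into key) is easy to model in code. The major-scale offsets below are an illustrative choice; the actual modes in the Live set are whatever the ModeSelector rack contains.

```cpp
#include <cassert>
#include <vector>

// Sketch of the mapping Live performs: each incoming degree (0-7) is
// shifted up by the semitone offset its Pitch device adds for the
// current mode, then the result is transposed into the target key.
int mapDegreeToNote(int degree, const std::vector<int>& modeOffsets, int key) {
    // modeOffsets[d] = semitones added to degree d (root through octave)
    return key + modeOffsets[degree];
}

// One example mode: a major scale, root through the octave.
const std::vector<int> majorMode = {0, 2, 4, 5, 7, 9, 11, 12};
```

Swapping the mode is just swapping the offset table, which is exactly what selecting a different chain in the MIDI Effect Rack does.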
Once the notes from OF are mapped to musical values, I can select an Instrument device from an Instrument rack. Naturally, the appropriate mode, key, and instrument may change as the music changes.
Discoveries Along The Way:
Often, two adjacent trigger regions will Synplode simultaneously. From an interactivity standpoint, this seems perfectly reasonable. Musically it can lead to problems, especially where the adjacent pitches are a half step apart. For this reason, I don’t actually use the basic one-octave scale for mapping the Synploded note pitch values. Fifths (by which I mean the I and V in every octave) sound great, but aren’t terribly interesting. Smaller intervals between adjacent steps lead to greater perceived control on the part of the participants, but open the door for greater musical unpleasantness. At one extreme, there is the ultimate control of chromatic mapping. At the other extreme there is zero control in simply triggering the root pitch every octave. I have yet to determine the sweet spot between the two, but for now I find that changing the mode from time to time works to keep things interesting.
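The trade-off can be made concrete by looking at the smallest interval any two adjacent regions can produce together. The offset tables below are examples of the two extremes: chromatic mapping risks half-step clashes when neighbors fire at once, while the roots-and-fifths mapping never produces anything tighter than a fourth.

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Smallest interval (in semitones) between any two adjacent steps in a
// mapping. This is the worst-case clash when neighboring trigger
// regions Synplode simultaneously.
int smallestAdjacentInterval(const std::vector<int>& offsets) {
    int smallest = 127;
    for (size_t i = 1; i < offsets.size(); ++i)
        smallest = std::min(smallest, offsets[i] - offsets[i - 1]);
    return smallest;
}

const std::vector<int> chromatic   = {0, 1, 2, 3, 4, 5, 6, 7};
const std::vector<int> rootsFifths = {0, 7, 12, 19, 24, 31, 36, 43}; // I and V per octave
```

The sweet spot I'm still hunting for is a mapping whose smallest adjacent interval is big enough to avoid clashes but whose pitch set stays varied enough to feel expressive.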
Bonus Awesome Things About This System Design:
Participants are given control of the system based on their location, and one’s location can only change so quickly. Effectively, the OF application can know that a Synplosion will occur well before it actually does. Using negative track delay, the step and color control tracks in Live can be nudged a bit earlier in time in order to ensure perfect synchronization between graphics and audio without adversely affecting the interactive experience.
The OF application was designed to be very flexible in its creation of region maps. Rather than checking the camera input for activity within a particular rectangle or segment, the application essentially performs a comparison of the camera frame against a rendered region map. This means that if I can draw it, the camera can detect it without requiring any complicated math. The application also provides a graphical editing interface for selecting which regions should be assigned to which pitch and step, opening the door to alternative patterns within a single map. (Note the difference in the flashing wave pattern between 1:13 and 1:53 in the video. The first wave moves across almost linearly like it does for the grid map while the second moves around in a circle. These two patterns coincide with different assignments of regions to pitches.)
The grid and radial maps are clear in the video, but already I have code to handle a pentagonal map and development of a hexagonal map would be very simple. Future work will involve thinking outside the box about new kinds of maps that are innovative but intuitive.
The next steps for Synplode involve writing or importing more music for it and getting it installed for people to use at parties or events. It will be very interesting to see how people actually react to it when they don’t understand the inner workings.