Vol. 08: MIDI Controller Sock Monkey

A/V monkeyshines with flex sensors and a MIDIsense board.



+ Downloads & Extras:

The full article in Make issue 8 uses a sock monkey packed with flex sensors to show how you can turn real-world inputs (like bending a stuffed animal) into music and visuals. It's just one example, but the same technique applies to a wide variety of other projects. Here, I'll cover some of the behind-the-scenes software details that didn't fit in the magazine.

More Resources

For more information on some of the topics mentioned in the article:

Primer on MIDI control, including an excerpt from my book Real World Digital Audio, as a supplement to the article in Make 07: http://makezine.com/07/primer/

MIDIsense project page, which includes a good resource on how to acquire sensors: http://www.ladyada.net/make/midisense/

Arduino (also a physical computing board; uses USB in place of MIDI), complete with extensive documentation and tutorials: http://www.arduino.cc/

Tutorial on adding MIDI to the Arduino (also applicable to similar boards) from New York University's Interactive Telecommunications Program: http://itp.nyu.edu/physcomp/Labs/MIDIOutput

Also from NYU ITP, a comprehensive Wiki related to sensors: http://itp.nyu.edu/physcomp/sensors/Main/HomePage

Physical computing resources on my site, Create Digital Music: http://createdigitalmusic.com/tag/physical-computing/

More designs and craftworks from Anne Kirn, who designed the monkey plushie: http://www.shineblitzon.com/exitrance/

Calibrating MIDI Sensors

To use the MIDIsense, you’ll first need to configure the board to transmit the MIDI data type you want, and to calibrate the data coming from the sensors to MIDI’s 0-127 range. Limor has written software for both Windows and Mac that lets you set up the MIDIsense board. The interfaces are nearly identical, so here, I'll demonstrate the procedure using the Windows version:
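Conceptually, the calibration is just a linear mapping from each sensor's raw range onto MIDI's 0-127. The MIDIsense utility does this for you, but here's a minimal Python sketch of the idea (the 10-bit ADC readings of 180 for a straight limb and 620 for a fully bent one are hypothetical values, not measurements from the actual board):

```python
def scale_to_midi(raw, raw_min, raw_max):
    """Map a raw sensor reading onto MIDI's 0-127 range."""
    raw = max(raw_min, min(raw, raw_max))  # clamp to the calibrated range
    return round((raw - raw_min) * 127 / (raw_max - raw_min))

# Hypothetical flex-sensor readings: 180 = limb straight, 620 = fully bent
print(scale_to_midi(180, 180, 620))  # 0   (straight)
print(scale_to_midi(620, 180, 620))  # 127 (fully bent)
print(scale_to_midi(400, 180, 620))  # 64  (about halfway)
```

This is why setting the maximum and minimum carefully matters: if the calibrated range is wider than the limb can actually bend, you'll never reach the extremes of the 0-127 scale.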

Demo Video

Quartz Composer Patch Explained

Part of the advantage of MIDI is that, because it's a standard protocol, you can route it into whatever software you want. I wanted a quick visualization to hook up to my MIDI flex sensor sock monkey, so I built one in Apple's development tool Quartz Composer. Quartz Composer ships free with Mac OS X 10.4 and later; all you have to do is install the developer tools, and you’ll find it at [root] > Developer > Applications > Graphics Tools > Quartz Composer.

Quartz Composer is Mac only, but if you don’t have a Mac, there are other free options. On Windows, I’d suggest vvvv (http://vvvv.org), a similar, free “visual programming” tool built on Microsoft’s DirectX. Linux users (and users of other platforms) can use Pure Data (Pd, http://puredata.info) with an add-on graphics library like GEM (available via the Pd site). These two options are actually more powerful than Quartz Composer in many ways; I chose QC for its relative simplicity, and you can easily apply the basic concept of this patch in the other programs.

For those of you who do have a Mac, you can download the source patches:

Source Patches

Note that you will need to adjust the patch to make it work, because of the custom MIDI assignments needed. Quartz Composer, like vvvv, Pd, and software like Max/MSP/Jitter, uses a visual patching metaphor for programming. Instead of lines of code, you connect interactive objects to one another using virtual patch cords (hence, “patching”).

Normally, people associate MIDI with sound and music, but this patch demonstrates that MIDI can be used for visuals, as well. Here’s how to make it work:


1. Calibrate the sensors for MIDI: Using the MIDIsense configuration utility, make sure each sensor is assigned to a MIDI parameter. It doesn’t really matter which parameters you choose, but you will need to match the Control Change numbers here to the input messages in the Quartz Composer patch. Configure the maximum and minimum values to the bend range of each of the monkey’s limbs, torso, and tail. (Naturally, you don’t have to use the monkey; you’ll need to do the same with any other sensors attached to the MIDIsense for your own project, or your own MIDI controller.)

My assignments:

Left leg: CC 16 (General Purpose 1)
Right leg: CC 17 (General Purpose 2)
Left arm: CC 18 (GP 3)
Right arm: CC 19 (GP 4)
Torso: CC 80 (GP 5)
Tail: CC 81 (GP 6)
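On the wire, each of these assignments becomes a three-byte Control Change message: a status byte (0xB0 plus the channel number), the controller number, and the 0-127 value. Here's a short Python sketch of how those bytes fit together, using the assignments above (channel 1, i.e. channel byte 0, is my assumption for illustration):

```python
# CC numbers matching the assignments in the MIDIsense utility
ASSIGNMENTS = {
    "left_leg": 16, "right_leg": 17,
    "left_arm": 18, "right_arm": 19,
    "torso": 80, "tail": 81,
}

def control_change(limb, value, channel=0):
    """Build the raw 3-byte MIDI Control Change message for a limb."""
    if not 0 <= value <= 127:
        raise ValueError("MIDI CC values must be 0-127")
    return bytes([0xB0 | channel, ASSIGNMENTS[limb], value])

# Bending the left leg about halfway:
print(control_change("left_leg", 64).hex())  # b01040
```

You don't need to build these bytes yourself (the MIDIsense board does it), but seeing the message layout makes it clear why the CC numbers in the board configuration and in the Quartz Composer patch have to match exactly.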

2. Try out the patch with the mouse: To test out this patch, I created a separate version called monkey_mousetest.qtz with mouse input in place of MIDI, so you can see what the end result is supposed to look like. You’ll see I divide the mouse Y coordinate using a Math object; this puts its data in a range that’s more like what you’ll get from the MIDI sensor. Move your mouse up the Y axis, and you should get a stream of bananas. (For the test, all of them are connected at once. That could slow down some computers, so if your machine bogs down, try disconnecting some of the patch cords from Math > Resulting Value to the various Mouse coord inputs.)


3. Navigate the monkey: Open the monkey_MIDI.qtz patch to switch to the MIDI-enabled version. You’ll see the overall patch is grouped into sections for the limbs, tail, and torso of the monkey. These are called “macro patches”, and they’re a way of encapsulating patches within patches in a hierarchy. To navigate into a macro, double-click it; to return to the previous, higher level, click the Edit Parent button. (You can also use the Hierarchy Browser, which you toggle from the toolbar, to view a Finder-style, multi-column view.)


4. Adjust MIDI assignments: For each macro patch, select the correct MIDI input. The label will depend on which interface you’re using, but make sure you select the correct MIDI interface and port. Click the MIDI Controllers object in each macro, open the Inspector, and choose Settings from the dropdown at the top of the window. Select your input under “Observed MIDI Sources.” (Ignore IAC: it’s the Inter-Application Communication bus, for routing MIDI between Mac apps – a cool feature, but here you want the external MIDI interface.) Also, if you’ve deviated from my assignments listed above, you’ll need to match the “Observed MIDI Controllers” to the MIDI controller you assigned to the sensor in the MIDIsense utility. (Make sure the CC numbers match up, in other words.)

How it works: The root level of the patch creates the background; the “Clear” object provides the background fill, and the “Billboard” displays a plane textured with my drawing of the monkey, imported from Illustrator. The real magic happens inside the macro patches. The bananas are mapped as a texture in a particle system, which is a fancy way of saying that Quartz Composer randomly generates lots of bananas and assigns them different velocities and physics so they spurt realistically out of the monkey. (Or, anyway, as realistically as a monkey can excrete fruit.) The MIDI input data feeds the Max Size of the particle system. When the data is zero, the bananas are invisible. They get larger as the MIDI data increases. The Math object is necessary in order to scale the data so the bananas don’t get too large. (Data scaling is something you’ll do a lot of when working with MIDI, whether for visuals or music.)
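The scaling the Math object performs is just a linear rescale in the other direction: 0-127 from the sensor squeezed down into a range that makes sense for the particle system's Max Size. A hedged Python sketch of the idea (the 0.0-0.5 size range is an illustrative guess, not the value in the actual patch; tune it until the bananas stay on screen):

```python
def midi_to_size(cc_value, size_min=0.0, size_max=0.5):
    """Rescale a 0-127 MIDI CC value into a particle Max Size range.

    size_min/size_max are illustrative defaults, not the patch's values.
    """
    return size_min + (cc_value / 127.0) * (size_max - size_min)

print(midi_to_size(0))    # 0.0 -> bananas invisible
print(midi_to_size(127))  # 0.5 -> bananas at full size
```

The same one-line rescale pattern comes up constantly with MIDI, whether you're driving a synth parameter or a visual one.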
