Using OSCulator as a MIDI Bridge – OpenCV, Processing, OSCulator, and Ableton Live. Really.

Over the past few nights I have been working on bringing audio content from Processing into Ableton Live. While I generally consider myself a proficient programmer, I have spent too many hours looking for ways to get Python and other languages to speak MIDI. My technical approach to connecting and encapsulating the software components of the Interactive Performance and Technology project is to create network messages that work as interfaces. This creates reusable components for future projects.

OSCulator supports communicating with Wiimotes and MIDI using OSC, a UDP-based message protocol. At this point in the project, complex messages are a low priority, which leaves room for hands-on, empirical practice (hacking).
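Under the hood, an OSC message is just a small binary packet sent over UDP: an address pattern, a type tag string, and the arguments, each padded to four-byte boundaries. Here is a minimal Python sketch that builds such a packet by hand (the /note address and port 8000 are placeholders; they have to match whatever OSCulator is listening for):

import socket
import struct

def osc_pad(data: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a multiple of 4 bytes.
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, *args: int) -> bytes:
    # Address pattern, then a type tag string (one 'i' per int32 argument),
    # then the arguments themselves as big-endian 32-bit integers.
    packet = osc_pad(address.encode())
    packet += osc_pad(("," + "i" * len(args)).encode())
    for value in args:
        packet += struct.pack(">i", value)
    return packet

# /note with pitch, velocity, on/off -- address and port are assumptions.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/note", 60, 100, 1), ("127.0.0.1", 8000))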

Combining thoughts from a discussion forum and a blog post about sending OSC messages, I made a Python-based example and a Java/Processing example for sending an OSC message that contains pitch, velocity, and on/off. I mapped Fu Hao's camera sensing, via Processing, to send MIDI messages to Ableton Live. For the project, this allows us to connect the vibration of the Wiimote to the visuals on the screen to give the performer feedback. It could also be used for additional sonification, but I'm not sure what that should sound like.
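The Python example boils down to very little code. Here is a rough sketch of the idea using the python-osc library (the /note address, port 8000, and the send_note helper are illustrative rather than the project's exact code; OSCulator then routes the incoming message to a MIDI note event for Ableton Live):

# pip install python-osc  (one of several OSC libraries for Python)
from pythonosc import udp_client

# OSCulator listens for OSC over UDP; 127.0.0.1:8000 is an assumed port --
# match it to whatever is shown in OSCulator's window.
client = udp_client.SimpleUDPClient("127.0.0.1", 8000)

def send_note(pitch: int, velocity: int, on: int) -> None:
    """Send one OSC message carrying pitch, velocity, and note on/off."""
    client.send_message("/note", [pitch, velocity, on])

send_note(60, 100, 1)  # note on: middle C, velocity 100
send_note(60, 0, 0)    # note off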

Instead, here is a video demo of sonification control using this bridge.

What you can't hear me explain in this video is that horizontal position is mapped to pitch, from low to high, while vertical position is mapped to tempo, from slow at the bottom to fast at the top. The demo is surprisingly expressive, which the video does not convey very well. If you want more technical details about how this works, leave a comment. This code will be available soon at https://github.com/rhema/art .
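For the curious, the mapping itself is straightforward. Here is a rough sketch with made-up ranges (the frame size, pitch range, and note rate are all hypothetical): horizontal position scales linearly to a MIDI pitch, and vertical position scales to how quickly notes are retriggered:

# Hypothetical ranges -- the real numbers depend on the camera and the patch in Live.
FRAME_W, FRAME_H = 640, 480      # camera frame size
PITCH_LOW, PITCH_HIGH = 36, 84   # MIDI pitch range
RATE_SLOW, RATE_FAST = 0.5, 8.0  # notes per second

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linear interpolation, like Processing's map()."""
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

def position_to_note(x: float, y: float):
    """Horizontal position -> pitch; vertical position -> how fast notes fire."""
    pitch = int(scale(x, 0, FRAME_W, PITCH_LOW, PITCH_HIGH))
    # Screen y grows downward, so flip it: bottom of the frame is slow, top is fast.
    rate = scale(FRAME_H - y, 0, FRAME_H, RATE_SLOW, RATE_FAST)
    return pitch, 1.0 / rate  # pitch, plus the delay (seconds) before the next note

pitch, delay = position_to_note(320, 120)  # e.g. near the top center of the frame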

 
