
Monday, August 15, 2011

MIDI Polyphony (Part 1)

While working on the 3i controller, my experience with the iConnectMIDI has led me down an interesting path.

One of the things the 3i needs to be able to do is detect "movement nuance" and to transmit information to an associated sampler or synth about what to do with that movement nuance.

I define "movement nuance" to be a continuously variable parameter that is associated with a given playing note.  MIDI Aftertouch is an example of this - typically used to determine how much "pressure" is being used to hold down a key after a note is sounded, i.e., once the piano key is down I can push "harder" to generate a different Aftertouch value.

MIDI Pitchbend is similar, except that it affects all the notes on a channel, not just the one you are currently touching.
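To make these message types concrete, here is a minimal sketch of the raw MIDI bytes involved. The helper functions are my own illustration, not from any particular library:

    def channel_pressure(channel, pressure):
        # Channel (mono) Aftertouch: one pressure value for the whole channel.
        return bytes([0xD0 | channel, pressure & 0x7F])

    def poly_aftertouch(channel, note, pressure):
        # Polyphonic Aftertouch: a pressure value tied to one sounding note.
        return bytes([0xA0 | channel, note & 0x7F, pressure & 0x7F])

    def pitch_bend(channel, value):
        # Pitchbend: a 14-bit value (0..16383, center 8192) that moves every
        # note on the channel, not just the one being touched.
        return bytes([0xE0 | channel, value & 0x7F, (value >> 7) & 0x7F])

Note that Pitchbend carries no note number at all - which is exactly why it cannot express a per-note bend.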

Normal MIDI keyboards are "note polyphonic" which means that you can play a chord.  But the "effects" that MIDI allows in terms of "movement nuance" are limited.

Traditionally MIDI is used with keyboards and to a degree with wind instruments, where its limited ability to handle "movement nuance" is not really an issue, or where something like Aftertouch is sufficient to deal with whatever the musician is trying to do.

But if you think about other kinds of instruments, for example string instruments, then there is a lot more nuance associated with playing the note.  Some examples of this include "hammer-ons" and "pull-offs", "bending" as on a guitar or violin, sliding "into a note" on a violin, the notion that some other element such as a bow is sounding a note at one pitch and the pitch now needs to change, and so on.

Then there is the effect after releasing the note, e.g., "plucking" a string or sounding a piano note with the damper up.  The note sounds in response to the initial note event; the key is then released (MIDI Note Off) but the note still sounds.

While this is fine on a piano it's much different on a stringed instrument.  On something like a violin you may be holding down a string which is plucked.  After the pluck you may wish to slide the finger that's holding the note to change pitch.

Here MIDI does not offer significant help.

The notion of a note sounding beyond your interaction with a keyboard is also very non-MIDI.  MIDI is designed around a basic "note on"/"note off" model where everything else revolves around that.

Certainly it's possible to try and "extend" MIDI with a notion of CC values or other tricks in order to get more information per note - but MIDI wasn't really designed for that.  (Some MIDI guitars run one note per channel - but that's about all the innovation they can muster.)
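For reference, the "one note per channel" trick amounts to something like the following sketch (channel numbers and names are purely illustrative):

    FREE = list(range(6))   # six channels, one per string (0-based on the wire)
    active = {}             # note -> channel

    def note_on(note, velocity):
        ch = FREE.pop(0)                     # claim a channel for this note
        active[note] = ch
        return bytes([0x90 | ch, note, velocity])

    def note_off(note):
        ch = active.pop(note)
        FREE.append(ch)                      # freed at Note Off, even though
        return bytes([0x80 | ch, note, 0])   # the sound (and nuance) may go on

The channel gets per-note Pitchbend while the key is down, but it is recycled at Note Off - which is precisely where the model below parts ways with it.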

From my perspective with the 3i, the only thing left is to reconsider the whole notion of what a MIDI keyboard really is.

So my idea is that, in a very general sense, when you touch a playing surface you create a sort of MIDI "virtual channel" that is associated with whatever that touch (and corresponding sound) is supposed to do.  That "virtual channel" lives on through the sounding of the note, through whatever "note on" activities are required, through the "note off", and on through any sort of "movement nuance" that continues after the "note off" event, e.g., plucking a violin note on E and sliding your finger up to E#.

This model gives the controller a full MIDI channel to interact with a synth or sampler for the entire duration (lifetime) of the "note event" - which I define as lasting from the initial note sounding, through note off, to the "final silence".

So instead of having a rack of virtual instruments for an entire "keyboard" you now have the idea of a full MIDI channel of virtual instruments for a single "touch" that lasts the lifetime of a note.  The virtual rack is conjured up at the initiation of the sound and "run" until "final silence" - at which point it is recycled for another note.

Each "virtual rack" associated with a note has to last as long as the note is sounding.

This allows a controller to manipulate MIDI data on its "virtual channel" as the life of the note progresses.
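Here is a rough sketch of how such a "virtual channel" pool might behave under my reading of the model (class and method names are hypothetical):

    class VirtualChannelPool:
        """Claims a channel per touch; recycles it only at 'final silence'."""

        def __init__(self, channels):
            self.free = list(channels)   # channels available for new touches
            self.in_use = {}             # touch_id -> claimed channel

        def touch_begin(self, touch_id):
            ch = self.free.pop(0)        # claimed for the note's whole lifetime
            self.in_use[touch_id] = ch
            return ch

        def note_off(self, touch_id):
            # Deliberately NOT freed here: the string may still be ringing,
            # and "movement nuance" (slides, bends) can still arrive.
            return self.in_use[touch_id]

        def final_silence(self, touch_id):
            # Only now is the channel recycled for another touch.
            self.free.append(self.in_use.pop(touch_id))

The key difference from the MIDI guitar sketch earlier is note_off: as far as the pool is concerned, it frees nothing.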

Now, since a note can last a long time (seconds) and the player can continue to pluck or play other notes, you can quickly run out of channels on a standard 16-channel MIDI setup, because a MIDI channel can be tied up with a single note for quite a while (also seconds) - particularly when you factor in having two different instruments (say bass and guitar) set up as a split, or when you have layers.

This is effectively MIDI polyphony in the sense that you need to figure out the maximum number of channels that could be active at a given time.  On "hardwired" synths or samplers this is typically something like 64 or 128 or more. (Of course they are managing simultaneous sound samples, not MIDI channels.  But in my case we need MIDI via the 3i to do the job of the hardware sampler so we need MIDI channels instead of (or to act as) "voices".)
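Some back-of-the-envelope arithmetic (my numbers, purely illustrative) shows how quickly the channels go:

    notes_per_second = 6     # a modest plucking rate
    note_lifetime_s = 4.0    # ring-out time until "final silence"
    layers = 2               # e.g. a bass/guitar split doubles the count

    channels_needed = notes_per_second * note_lifetime_s * layers
    print(channels_needed)   # 48.0 - three times the 16 that MIDI offers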

Which leaves us with the conclusion that 16 MIDI channels are simply not enough.

And this means that using a MIDI device to move data from the 3i to the synth or sampler is inherently limited because it can support at most 16 channels.

This is further compounded by the fact that once the MIDI reaches outside the host OS it must be throttled down to 31.25K baud in order to be compliant with the MIDI standard (see this PDF).
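The arithmetic on that wire speed is sobering.  Assuming the standard 10 bits on the wire per byte (start + 8 data + stop):

    baud = 31250                 # standard MIDI wire speed
    bytes_per_sec = baud / 10    # 3125 bytes per second
    msg_bytes = 3                # a typical Note On or Pitchbend message
    msgs_per_sec = bytes_per_sec / msg_bytes
    print(msgs_per_sec)          # ~1041 messages/s, shared by every
                                 # note's "movement nuance" stream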

I believe this is what I am seeing with the iConnectMIDI - as fast as the device is, it cannot move data off of an iOS device faster than 31.25K baud because iOS is throttling it.

So the only way to get data off the device is via a high-speed USB cable (WiFi being no good for reasons mentioned before).
