Thursday, July 28, 2011

iConnectMIDI - Product and Analysis

For the last several months I have been awaiting a product called iConnectMIDI.

The reason for this is that I have been doing work on iOS and would very much like there to be a supported way to communicate with the corresponding devices (iPhone, iPad) at speeds suitable for live gigging.  Since, as I wrote in previous posts, I don't believe WiFi offers that performance, I have been on a sort of quest to get the best technology available to help me.

Until iOS 4.2 there was no MIDI support at all in iOS, save for the Line 6 MIDI Mobilizer, which I believe has its own special software.  With 4.2 the basic Mac OS X MIDI interface API (CoreMIDI) became available on iOS.  Along with this, on the iPad at least, came the Camera Connection Kit's 30-pin-to-USB adapter, which allows some USB MIDI devices to be connected directly to the iPad.  (The Camera unit does not work on the iPhone - I suspect because the iOS device's battery drives the MIDI unit, and on an iPhone the battery would die too quickly to suit Apple.)
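
For the curious, the API in question is plain C.  Here is a minimal sketch (my own throwaway names, not Apple sample code) that creates a CoreMIDI client and lists whatever MIDI destinations the OS can see:

    // Minimal CoreMIDI sketch: create a client, list the destinations.
    // Build on Mac OS X with:
    //   cc midilist.c -framework CoreMIDI -framework CoreFoundation
    #include <CoreMIDI/CoreMIDI.h>
    #include <stdio.h>

    int main(void) {
        MIDIClientRef client;
        MIDIClientCreate(CFSTR("SketchClient"), NULL, NULL, &client);

        ItemCount n = MIDIGetNumberOfDestinations();
        printf("%lu MIDI destination(s):\n", (unsigned long)n);
        for (ItemCount i = 0; i < n; i++) {
            MIDIEndpointRef dest = MIDIGetDestination(i);
            CFStringRef name = NULL;
            char buf[128] = "(unnamed)";
            if (MIDIObjectGetStringProperty(dest, kMIDIPropertyName, &name) == noErr && name) {
                CFStringGetCString(name, buf, sizeof(buf), kCFStringEncodingUTF8);
                CFRelease(name);
            }
            printf("  %lu: %s\n", (unsigned long)i, buf);
        }
        return 0;
    }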

Not wanting limitations on my development effort, including performance limitations in getting data from the iPad to elsewhere, either as MIDI or otherwise, I found that I had to create my own "Gold Standard" solution.  My solution, which so far outperforms all other USB/MIDI apps available for iOS, allows me to play MIDI (or do virtually anything else I can do over a wire) at full USB speeds.  It's Mac OS X software (though it would work on Windows or Linux as well) that allows me to simply connect my iOS devices to my Mac via USB and talk to them at full USB speed.

So far the best benchmark of performance has been this: generating continuously variable MIDI CC messages on an iOS device in a tight loop, i.e., while a finger is dragging.  Virtually all devices, save my "Gold Standard" USB solution, cannot keep up.
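
On the Mac side the benchmark boils down to something like the sketch below - a stand-in for my actual harness, which I'm not publishing - that floods CC messages at the first destination as fast as the stack will take them:

    // Benchmark sketch: flood MIDI CC messages at the first destination
    // (roughly what a finger-drag generates, minus the UI).  Build with:
    //   cc ccflood.c -framework CoreMIDI -framework CoreFoundation
    #include <CoreMIDI/CoreMIDI.h>
    #include <stdio.h>

    int main(void) {
        MIDIClientRef client;
        MIDIPortRef outPort;
        MIDIClientCreate(CFSTR("CCFlood"), NULL, NULL, &client);
        MIDIOutputPortCreate(client, CFSTR("Out"), &outPort);

        if (MIDIGetNumberOfDestinations() == 0) {
            fprintf(stderr, "no MIDI destinations\n");
            return 1;
        }
        MIDIEndpointRef dest = MIDIGetDestination(0);

        Byte buffer[256];
        for (int i = 0; i < 100000; i++) {
            MIDIPacketList *list = (MIDIPacketList *)buffer;
            MIDIPacket *pkt = MIDIPacketListInit(list);
            // CC #7 (volume) on channel 1, value sweeping 0..127
            Byte cc[3] = { 0xB0, 7, (Byte)(i & 0x7F) };
            MIDIPacketListAdd(list, sizeof(buffer), pkt, 0, 3, cc);
            MIDISend(outPort, dest, list);   // timestamp 0 = "now"
        }
        return 0;
    }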

Which brings me to the iConnectMIDI.  Mine arrived a few days ago from Sweetwater.

The first thing I did was tear it apart to see what makes it tick.  Here it is out of the box:



It has three USB ports - two minis (D1 and D2) and one regular "A" USB - along with a power light and an activity light for each of the different connector types.

There are two full MIDI ports on the back.  It uses a non-standard 5.2V 800 mA power supply (musicians prefer 9V), also on the back:


Inside the guts, the iConnectMIDI uses a TI Stellaris (LM3S3749) ARM programmable controller and a set of three SiLabs F321 USB controllers (link), one for each of its three USB ports (one regular, two mini).


Mine is serial #0x1B (27 if you don't speak hex).



The web site talks about the huge performance boost this device has over standard MIDI (which runs at 31.25K baud, or about 3 KB/second).  The 31.25K baud speed is based on ancient UART technology used in modems - save that MIDI runs a 0..5V signal.  The web site claims 12 Mb/second of transfer speed (see this) - significantly faster.
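
The arithmetic behind those numbers is worth a few lines (raw rates only - real USB protocol overhead eats into the 12 Mb figure):

    // A classic MIDI DIN link runs at 31,250 baud with 10 bits on the
    // wire per byte (start + 8 data + stop).
    #include <stdio.h>

    int main(void) {
        double din_baud  = 31250.0;           // classic MIDI wire rate
        double din_bytes = din_baud / 10.0;   // 3,125 bytes/sec
        double din_msgs  = din_bytes / 3.0;   // ~1,041 three-byte CC messages/sec

        double usb_bits  = 12e6;              // full-speed USB, as claimed
        double usb_bytes = usb_bits / 8.0;    // raw, ignoring protocol overhead

        printf("DIN MIDI: %.0f bytes/s, ~%.0f CC messages/s\n", din_bytes, din_msgs);
        printf("USB full-speed (raw): %.0f bytes/s, ~%.0fx faster\n",
               usb_bytes, usb_bytes / din_bytes);
        return 0;
    }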

So I've played around with this for a day or so trying to see what it will do and not do.

Here are my results so far.

The USB "A" connector does not meet the USB MIDI specs needed for the iPad to talk directly to the iConnectMIDI the way it can with something like the E-MU Xmidi 2x2 (see this site for a full set of compatibility issues and devices).  This means that the iPad, via the Camera Connection Kit, does not recognize this device as a MIDI device.

(It would be nice if the iConnectMIDI did present its USB ports as iPad compatible.  This has to do with a standard for USB MIDI devices and whether a particular device follows that standard or not.  The key phrase here is "class compliant" USB MIDI.)

The iConnectMIDI (iCM) does, however, come with its own special Apple 30-pin to USB mini cable that magically works when plugged between the iConnectMIDI and the iPad.  (I think this cable is where all the real magic is happening, but I am not sure how at this point.)

I have been using Native Instruments Kontakt 4.0 quite a bit for testing the 3i (for reasons which will become clear in later posts).

Connecting various MIDI-producing apps to the iCM via USB does not drive Kontakt (iPad -> iCM D2 mini-USB, iCM D1 mini-USB -> Mac, Kontakt set to use the iCM D1 port, which appears in the Kontakt options area).

However, renaming the iCM D1 port on the Mac with MIDI Patchbay to something else, e.g., "iConnectMIDI ICM USB D1 -> Frank", does work if Kontakt is set to use "Frank" as a source of MIDI.
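
Since renaming fixes it, the port name itself is suspect.  Here is a quick, generic way (nothing iCM-specific; names mine) to dump both the raw name and the display name that apps like Kontakt are handed for every MIDI source:

    // Print raw and display names of every MIDI source, since a picky
    // app may be choking on one of them.  Build with:
    //   cc namedump.c -framework CoreMIDI -framework CoreFoundation
    #include <CoreMIDI/CoreMIDI.h>
    #include <stdio.h>

    static void print_prop(MIDIEndpointRef ep, CFStringRef prop, const char *label) {
        CFStringRef s = NULL;
        char buf[256] = "(none)";
        if (MIDIObjectGetStringProperty(ep, prop, &s) == noErr && s) {
            CFStringGetCString(s, buf, sizeof(buf), kCFStringEncodingUTF8);
            CFRelease(s);
        }
        printf("  %s: %s\n", label, buf);
    }

    int main(void) {
        MIDIClientRef client;
        MIDIClientCreate(CFSTR("NameDump"), NULL, NULL, &client);
        ItemCount n = MIDIGetNumberOfSources();
        for (ItemCount i = 0; i < n; i++) {
            MIDIEndpointRef src = MIDIGetSource(i);
            printf("source %lu\n", (unsigned long)i);
            print_prop(src, kMIDIPropertyName, "name");
            print_prop(src, kMIDIPropertyDisplayName, "display name");
        }
        return 0;
    }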

Most interestingly, if I change the iPad over to use the Apple Camera Connection Kit and something like the E-MU, which is known to support iPad USB MIDI, then it does work.  This implies that when the MIDI commands go through two SiLabs F321 chips, something is getting lost or munged along the way.  If the MIDI goes through only one - by sending it through a MIDI connector (I do not think the F321s are involved in that signal chain) - it does work.

(iCM - if you are listening this is a relatively serious issue...)

The same connection that does not work with Kontakt 4.0 does indeed drive Apple Logic correctly, as well as MIDI Monitor.  This tells me that it's probably at least in part a Kontakt issue, but I don't see them fixing it any time soon.

The 3i names its MIDI ports on the Mac similarly to the iCM, so I am not sure what this issue is - it does not appear to me to be name related, though name length could be involved.

A more interesting test was to hook things up using full USB and then blast away with MIDI CC messages at full iPad bore.

As I said above, my USB solution works sending these messages at full blast.  However, the iCM, from which I expected the same results, did not perform nearly as well.  It seemed like the iPad overran the iCM, causing it to store a few seconds of CC commands and flow them out at a much slower pace.

However, after some thought, I suspect this is actually an iPad MIDI issue and not an issue with the iCM.  I think what's really happening is that the iCM is tapping into raw iOS MIDI on the 30-pin port with its magical connector, and the iPad MIDI driver is what's actually throttling the MIDI CCs via some sort of internal mechanism.

The iCM (and I base this on observing the lights) is just getting them at this much slower rate.

(MIDI has no flow control, so I suppose this makes sense.)

From my perspective this is very unfortunate, because it means that the iPad is limiting the speed with which it can send MIDI to around classic MIDI speed (31.25K baud) - no doubt to conserve battery life.

Why Apple chooses to inject these interesting little limitations into these devices is beyond me.  It would be nice if the MIDI were really fast - it would be more flexible that way.  Even nicer if this were an API-controllable setting.

But this limitation really means that even if the iCM has the 12 Mb/sec of bandwidth the marketing literature talks about, you're just not going to see it in the signal chain from your iPad.  So if I were iCM I would be bitching up a storm at Apple, because clearly they are far enough inside Apple to know the magic of the 30-pin connector, yet the iPad and iPhone (though I haven't tested the latter) are throttled!

(Again, I understand that older equipment has limitations but this is just ugly.  I guess too that the iCM probably does not have much memory and hence is happy that the iPad ensures that there never will be a buffer overrun.)

Imagine, for example, that I want a volume slider on the iPad and I want all the volume messages to go through because I am using it like a whammy bar, or something else where the results need to be relatively continuous.  Certainly this prevents that - and yes, I know I can quantize my volume messages so there are fewer, but that's not the point.
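
For the record, this is the kind of quantizing I mean - forward a CC only when it has moved some minimum amount.  It slashes the message count, and it also turns a smooth gesture into a staircase, which is exactly why it's not the answer:

    // CC thinning: forward a CC only when it has moved at least `step`
    // from the last value we sent.  Fewer messages, stair-stepped control.
    #include <stdio.h>
    #include <stdlib.h>

    typedef struct { int last_sent; int step; } cc_thinner;

    // Returns 1 if the value should be sent, 0 if it should be dropped.
    int cc_thin(cc_thinner *t, int value) {
        if (t->last_sent < 0 || abs(value - t->last_sent) >= t->step) {
            t->last_sent = value;
            return 1;
        }
        return 0;
    }

    int main(void) {
        cc_thinner t = { -1, 8 };        // send only every ~8 steps of travel
        int sent = 0;
        for (int v = 0; v <= 127; v++)   // simulate a full slider sweep
            if (cc_thin(&t, v)) sent++;
        printf("128 raw values -> %d sent\n", sent);   // 16 with step = 8
        return 0;
    }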

Why create an iPad that's clearly very fast computationally and supports USB 2.0 very well, and then nail on a performance-limiting MIDI fifth wheel made out of old, rotting timber?  Sure, the MIDI standard limits speed to 31.25K baud - but don't hard code it, because some devices might work faster - like your own!

For the money this kind of makes the iCM a pricey fifth wheel, especially when I can buy things like this USB MIDI interface at Amazon for $5.00 USD - for ten bucks I get as much bandwidth.

I was really hoping for that 12Mb/sec of MIDI throughput to include the iOS devices...

Foiled again!

I guess it's back to the Gold Standard....

Tuesday, July 26, 2011

Synthodeon, MIDI and the Future... (Part 2)

So far I have not said anything remotely interesting, at least technically, but hopefully that will change now... 

There are in fact products just like I described in my last post: ipMIDI, a C# MIDI code set, a Cycling 74 patch, QmidiNet, and many others.  All of these products, though, treat MIDI as a simple end-point-to-end-point topology that does nothing beyond the basic MIDI capabilities.

But my interest here goes beyond this basic ability.

First off, when you are working with software samplers, e.g., Kontakt, you have the ability to organize multiple samplers on the same MIDI channel or duplicate samplers on a set of MIDI channels.  This ability allows you to have much finer control over what is played for a given set of MIDI values as well as allows you to do things that you otherwise could not.

In terms of something like, say, guitar samples, it's often the case that there are a lot of articulations available for a given sound, e.g., strumming up versus strumming down, starting on one string, ending on another, mutes, harmonics, and so on.  Many software samplers today cram these sounds into chord sequences, e.g., I hold down "C" in one section of keys and I use "A#" in another to strum a C chord.  Some samplers allow you to control what chords are being sounded (on the fly or with setups) and others do not.

What happens is that the complexity of what is being (or needs to be) played quickly and exponentially gets out of hand - there are too many ways (inversions, where on the "guitar neck" your hand is, and so on) to play a "C" chord, which form of "C" is appropriate for the piece I am playing, and so on.

At a low level MIDI is getting the "note up & down" events to the sampler but the "sound" the musician is asking for is somewhat beyond this basic ability.

While things like MIDI CC values could be used to address this issue, they generally are not, because picking your own choices for specific CC values may be "non-standard" and would render any software or hardware that did so incompatible with a lot of existing MIDI equipment.

Even something somewhat standard, like breath control (velocity) MIDI values, is not always handled correctly.  For example, I've played an EWI 4000s live for several years using either Mr. Sax (from samplemodelling.com) or Roland XV-5050s with the Patchman sound banks (BTW, I cannot say enough good things about these sound banks).  Some out-of-the-box 5050 patches work with a breath controller, some do not (Patchman ones always do, even on the XV-5050, as they were designed for this purpose).

Mr. Sax uses a variety of CC controls for special things while playing - which is inconvenient if you do not have a way to generate them.  This is particularly true live - I have enough to haul around as it is without yet more knobs and pedals to munge CC values.

For the more sophisticated guitar functionality you would like a controller to be able to do "more" for the player in terms of triggering samples in response to the player's hand movements.  There are many cases where one MIDI channel is simply not enough to do this.

Take, for example, note bends.  On a synth keyboard you typically bend the whole keyboard with the pitch wheel, i.e., all the notes change in response to the wheel.  On a faux sax or clarinet with the EWI you bend a single note (by biting the mouthpiece).  On a guitar you can bend multiple notes differently and simultaneously.  MIDI does not help you in this regard.  (Not to beat up MIDI - it was developed before the things I am talking about were even possible.)

Now throw into the mix your typical keyboard layering and/or splits.  Now I really need multiple MIDI channels per split to do anything beyond basic notes.

Then there are devices like the iOS ones from Apple (and, well, possibly Android as well).  Playing them wirelessly in the context of a guitar or keyboard, where exact millisecond timing is required, is a show stopper - it's simply not reliable enough.

(Yes you can create a private network on your Mac, etc. but I've not had satisfactory results much less taken it into a paid live gig alone or with other musicians.)

You really need a wire to get this done.

(At least Macs work live - I've played a variety of samplers and Logic live without problems.  I guess the same is true for PCs but I personally would not rely on one...  BTW, I am no PC bigot and what I will be posting going forward will certainly apply to PCs as well.)

So the bottom line here is that we need more than just a single MIDI channel between a controller and modern samplers if we want to get anything done that gives us much closer control than we get from a single keyboard, etc.

My solution to this is very, very simple: combine collections of MIDI channels into (sub-)groups, which I will refer to simply as groups, and sets of MIDI virtual ports into what I will call Multis.

Now MIDI makes this easy on a computer because ports have names, e.g., Kontakt Virtual Output.  These names are derived from the software or, in the case of hardware, from the device itself as part of the USB standard for modern MIDI devices.

Devices are often, even in the wild, named sequentially with simple numbering, e.g., E-MU Xmidi 2x2 Midi Out 1, E-MU Xmidi 2x2 Midi Out 2, ..., so it makes sense to think of sequentially numbered devices as a Multi.

But fortunately we don't have to worry about actual devices for this abstract model.  If we have software that simply creates groups of ports through the local OS we can create and name them any way we like, e.g., Port 1, Port 2, Port 3.
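
On Mac OS X that's only a few lines of CoreMIDI.  Here is a sketch (names mine) that publishes "Port 1" through "Port 3" as virtual sources other apps can see:

    // Publish three virtual MIDI sources through the OS; other apps
    // then see them like any hardware port.  Build with:
    //   cc vports.c -framework CoreMIDI -framework CoreFoundation
    #include <CoreMIDI/CoreMIDI.h>
    #include <CoreFoundation/CoreFoundation.h>
    #include <stdio.h>

    int main(void) {
        MIDIClientRef client;
        MIDIClientCreate(CFSTR("MultiDemo"), NULL, NULL, &client);

        MIDIEndpointRef ports[3];
        for (int i = 0; i < 3; i++) {
            CFStringRef name = CFStringCreateWithFormat(NULL, NULL,
                                                        CFSTR("Port %d"), i + 1);
            MIDISourceCreate(client, name, &ports[i]);
            CFRelease(name);
        }
        printf("Published 3 virtual sources; ^C to tear them down.\n");
        CFRunLoopRun();   // stay alive so other apps can connect
        return 0;
    }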

Secondly, because in the world I describe we can support any number of MIDI ports, plus have a fast, reliable means to talk between them, we can have a relatively large number of MIDI ports in a Multi - say four - and still achieve very high performance.

So, for example, in the diagram below we have "A MIDI 1", a (single) logical OS-based MIDI port we've created on Node 1 that's linked to Node 2 (or to other random Nodes).  We also have a set of MIDI ports "B MIDI 1", "B MIDI 2", and "B MIDI 3" that we want to think about as a Multi.  We name this Multi by dropping off the space and number to get "B MIDI".  Any port that follows the template "B MIDI x", where x is a sequential number, is included.



We divide "A MIDI 1" up into groups of, for example, four MIDI channels - groups being named 1, 2, 3, and 4.  We also divide the Multi "B MIDI" into twelve groups of four MIDI channels.  Groups are numbered from one on the "first" MIDI port and consume units of n channels per group until you run out of channels (n is always less than 16 - at least for now - though I suppose there is no reason n couldn't be 32 or 18 or something else like that).

So what we have done is create logical groups of MIDI channels into a simple grouping and naming system.  There's no requirement as to how many or few channels fit into a group, or how many ports into a Multi - those are decisions based on the application.
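
To make the rules concrete, here is a sketch of the two mechanical bits in plain C (nothing OS-specific; all names are mine): parsing a port name like "B MIDI 3" into its Multi, and mapping a group number to a (port, channel) position:

    #include <stdio.h>
    #include <string.h>
    #include <stdlib.h>
    #include <ctype.h>

    // If `port_name` ends in " <number>", copy the stem into `multi`
    // and return the number; otherwise return -1 (not part of a Multi).
    int parse_multi(const char *port_name, char *multi, size_t multi_len) {
        const char *sp = strrchr(port_name, ' ');
        if (!sp || !isdigit((unsigned char)sp[1])) return -1;
        size_t stem = (size_t)(sp - port_name);
        if (stem >= multi_len) return -1;
        memcpy(multi, port_name, stem);
        multi[stem] = '\0';
        return atoi(sp + 1);
    }

    // Group g (1-based), n channels per group, 16 channels per port.
    void group_to_port_channel(int g, int n, int *port, int *channel) {
        int first = (g - 1) * n;        // first channel, counted across the Multi
        *port    = first / 16 + 1;      // 1-based port within the Multi
        *channel = first % 16 + 1;      // 1-based MIDI channel on that port
    }

    int main(void) {
        char multi[64];
        int idx = parse_multi("B MIDI 3", multi, sizeof(multi));
        printf("\"B MIDI 3\" -> Multi \"%s\", port %d\n", multi, idx);

        int port, ch;
        group_to_port_channel(12, 4, &port, &ch);   // 12 groups of 4 over 3 ports
        printf("group 12 (n=4) -> port %d, channel %d\n", port, ch);
        return 0;
    }

Running it, group 12 with n = 4 lands on port 3, channel 13 - i.e., the last four channels of a three-port Multi, as you'd expect.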

So what?

You might love MIDI as it is and hate me for what I am saying - so be it.

But face it, as I said in the last post, MIDI is like a platypus - it's here, it works, we love it, but it's not the best we can do and it's certainly not the latest design...

For one thing, when we talk to a sampler it makes more sense to talk about a sound using a group rather than a channel.  (Certainly one sound can be one group which is one channel - which is what MIDI is today.)  But if I want to create a more complex controller that works like some of the guitar-based sampler keyboard inputs, I can (no pun intended) re-group into a set of, say, four or eight MIDI channels, and use the extra bandwidth to do interesting things in the controller to diddle the sampler in interesting ways.

For another, we can send metadata along a given Multi between apps that speak the same MIDI protocol at the Multi level.  Because I have good hard bandwidth between, say, Node 1 and Node 2, I can encode things into MIDI and use it as a transport layer - either in a group or as a Multi.  (There are other, more serious reasons to want to do this which will be revealed as the '3i' gets closer to debut.)  This gives me a reason not to invent yet another MIDI over UDP/TCP protocol - though I may release tools that make this simple to do.

After all, this is basically kind of like what's going on with MIDI-controlled DAW control surfaces, so why not buy into the same concept for actual MIDI controllers?

In the reality of the 3i, what this all really means is that the 3i thinks about the world as groups of MIDI channels - not just one.  Which makes effects beyond what one channel can do possible, if not downright easy.  The 3i sort of kind of only cares about output groups, and its setup involves knowledge of the target sampler to some degree in order to take advantage of what the sampler can really do.

Synthodeon, MIDI and the Future... (Part 1)

As a developer of software for the last 35 odd years and a gigging musician for the last 10 or so years I have come to know MIDI quite well.

Recently Synthodeon has been working on something called the '3i'.  This is, among other things, a MIDI controller that will allow new types of playing that are currently impossible due to the fundamental limitations of MIDI.  While I cannot yet talk about exactly what the 3i is and does, I can say a few things about what it needs to do MIDI-wise.

First of all, MIDI has been tied to the notion of a MIDI cable and, to some extent, a "keyboard" - after all, this is what it was originally designed for.  There are sixteen channels on the cable and each channel supports things like pitch wheels, notes and all the standard MIDI stuff.  Most of today's keyboard controllers support other things like CC messages for knobs and sliders.

But on the whole the "concept" is still very simple: Channels are tied to sound controllers (maybe with splits using extra MIDI channels to support two different sounds or to overlay sounds).  Keys pressed operate within a given channel using Note On/Off messages, and so on.

So what don't I like about MIDI?

First off, I don't like the basic model because it's so very limited - particularly when you look at what can be done with things like Kontakt or Logic or any other MIDI-able DAW-type software.  These apps have the ability to tie different samplers or functions to specific MIDI channels, i.e., I can have the same sampler on three consecutive MIDI channels.  But, for example, most keyboards can only access a single sampler.

Secondly, certain types of MIDI software in Kontakt, e.g., guitars from Orange Tree or Vir2, have funky multi-key controller options for their samples, e.g., you have keys to select a chord or note and separate "strum" or other note-sounding keys, reminiscent of an AutoHarp.  This is somewhat awkward and pushes the notion of MIDI away from keyboards - but there is no alternative place to go currently.

Third, the MIDI bend (and in general the CC) model is very limited - I can only perform functions at the channel level.  What happens if I want to bend a single note?  Controllers I am aware of don't handle this at all.
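
The closest workaround I know of is the old guitar-synth trick: rotate each sounding note onto its own MIDI channel so that each note gets a private pitch wheel.  Here is a sketch of just the channel allocator - note that it burns a whole channel per note, which is exactly the kind of thing that pushes you toward more channels, not fewer:

    // Per-note bends via channel rotation: every sounding note gets its
    // own MIDI channel, so a pitch-bend on that channel bends only it.
    #include <stdio.h>

    #define NUM_CH 8                 // channels reserved for note rotation

    static int note_on_ch[NUM_CH];   // note each channel is sounding, -1 = free

    // Returns a 0-based channel for the new note, or -1 if all are busy.
    int alloc_channel(int note) {
        for (int c = 0; c < NUM_CH; c++)
            if (note_on_ch[c] < 0) { note_on_ch[c] = note; return c; }
        return -1;   // voice stealing left out of the sketch
    }

    void release_note(int note) {
        for (int c = 0; c < NUM_CH; c++)
            if (note_on_ch[c] == note) note_on_ch[c] = -1;
    }

    int main(void) {
        for (int c = 0; c < NUM_CH; c++) note_on_ch[c] = -1;

        int chE = alloc_channel(64);   // "E" of a bent double-stop
        int chG = alloc_channel(67);   // "G", bent independently
        printf("note 64 -> ch %d, note 67 -> ch %d\n", chE + 1, chG + 1);
        // A pitch-bend status byte (0xE0 | chE) now bends only note 64.
        release_note(64);
        release_note(67);
        return 0;
    }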

Fourth, with the advent of network software (wireless, wired, USB, etc.) MIDI can run over a virtual channel from point A to point B.  However there is no standard model for this.  There is stuff like OSC and others but they are not standardized.

Fifth, things like iOS devices have limited asymmetric MIDI functionality.  What I mean by this is that on an iPad, for example, you can plug in a MIDI device but only through a "Camera Connector" - and then your device must meet a variety of vague requirements (USB and power) in order to work.

Sixth, MIDI is just not reliable enough for serious live music over WiFi regardless of what you do - even on local, private WiFi networks with Macs.

Seventh, I have a lot of devices - computers, iOS devices, keyboards, wind instruments.  I want a MIDI model that represents each of these "places" on some sort of virtual "MIDI backbone" so that I can plug something into one device on the backbone and have that connection available at other points on the backbone.

Now I am sure there are a lot of arguments about why I am wrong, or why such-and-such a specific tool can overcome one of the limitations I listed above, but the bottom line is MIDI is twenty-some years old and really hasn't kept up with modern times.  I want something that works in a modern way.

(Can you think of anything else that old that you use on a daily basis, for computing or music, save something like a physical guitar?)

I can't.

The oldest stuff I have that I use all the time are things like studio monitors and vintage synths - and most of that is not twenty years old.

So I've been looking around and found this site (OpenMuse) which talks somewhat about this in general.  There are links to some standards efforts which mostly appear to no longer exist.  There are also some patent links to US Patents claiming some form of jurisdiction over MIDI over networks.

Irrespective of any "standards" efforts there are a couple of patents to note.  First there is U.S. Patent 5,983,280 "System Using Standard Ethernet Frame Format for Communicating MIDI Information Over an Ethernet Network" and U.S. Patent 6,353,169 "Universal Audio Communications and Control System and Method."

The first patent ('280) has a UK assignee.  The patent discusses MIDI using Ethernet broadcast of various forms, multicast, and so on and it appears to be protecting some proprietary hardware and protocols.  (See this for a reasonable discussion with which I would concur.)

The second patent, owned by Gibson, uses MIDI to control audio.  Again I do not see this as relevant here.

So neither of these patents, from my reading, addresses what I will be describing here (though I am not an attorney and my opinions are worth exactly what you have paid to read this article).

Are there more patents in this area?  It's hard to say (and this is a reason the US Patent system is flawed - there is no good way to discover what else is out there for a given topic).

The fact that there are numerous software apps (various OSC things, etc.) out there doing this sort of thing in one fashion or another leads me to believe that if there are, they are probably not valid - or everyone out there would be getting stomped on.

All that said, what I want to see as a MIDI software topology, at least at an abstract level, is something like this:


Here each "Node" (Node 1, Node 2, etc.) is a physical device (an iOS device, a computer, etc.).  The MIDI ports listed on each device are virtual, i.e., not physical.  If I need to link things to the outside world on a given device I can use some sort of MIDI port connector or bridge software to do that.

I want fast, clean delivery of MIDI messages - plain and simple.  No other fancy nonsense to get in the way, no wizards, no "autoconnect, please press okay."  (Who wants their shit to go dead during their epic solo?  Worse, who wants to thrash through menus and crap trying to fix it afterward while the audience and your band mates stare at you with hate-filled grimaces?)

Basically I want to create MIDI endpoints between devices easily, reliably, deterministically and cleanly.

I want a physical wire (green arcs) between the Nodes (USB speed or better) because I want sub-millisecond response (WiFi is okay for testing and noodling around but not for live).  To play live you need this along with the headroom it offers (for really fast pickers).

(I measured my response via MIDI on a moderately fast piano piece - I can hear down to about 2.5ms - 3ms between chord strikes.  I am not a pro and I cannot play really fast, but my ears are pretty good.  Sub-millisecond leaves a factor of about 2x, which should be enough.)

I don't want automatic or automagic connectivity via something like Bonjour because for live gigs I don't want software "helping me" set things up.  I don't really want to (accidentally) be on the venue WiFi (where some clown will start downloading porn videos to his iPhone just as I hit the first note in my solo).

Instead I want hard-coded/wired links via fixed, device-based TCP/IP settings over deterministic, private connections that stay the way I set them up and come back up spot-on.  (Who hasn't had the circuit breaker or generator go out in the middle of a gig at a festival?)
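
To be concrete about what "hard-coded/wired" means to me: raw MIDI bytes over a plain TCP socket to a fixed, private address - no discovery, no wizards.  A sender-side sketch (the address and port below are made up):

    // Raw MIDI bytes over a plain TCP socket to a fixed peer.
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <netinet/tcp.h>
    #include <sys/socket.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>

    int main(void) {
        int fd = socket(AF_INET, SOCK_STREAM, 0);

        // Disable Nagle so 3-byte MIDI messages go out immediately.
        int one = 1;
        setsockopt(fd, IPPROTO_TCP, TCP_NODELAY, &one, sizeof(one));

        struct sockaddr_in addr;
        memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_port   = htons(5004);                      // arbitrary fixed port
        inet_pton(AF_INET, "192.168.10.2", &addr.sin_addr); // fixed private peer

        if (connect(fd, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
            perror("connect");
            return 1;
        }

        unsigned char note_on[3]  = { 0x90, 60, 100 };      // middle C, vel 100
        unsigned char note_off[3] = { 0x80, 60, 0 };
        write(fd, note_on, sizeof(note_on));
        usleep(250000);                                     // hold the note 250 ms
        write(fd, note_off, sizeof(note_off));
        close(fd);
        return 0;
    }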

I don't know if a local BlueTooth network is fast enough for this - especially outside of a "computer device", e.g., on iOS.  I have done a lot of testing and so far all I can say is "I am not sure."  I am also concerned about BlueTooth interference (audience members with lots of active BlueTooth gear on).

From a device perspective, Mac OS X works reliably on stage - at least in terms of no OS, audio or MIDI glitches and no battery failures - but networking- and iOS-wise this all remains to be seen.  I do not yet know about Android for this type of application, though from what I have been reading it's still immature on the developer side.