MIDI Jitter on iOS 2: The Reckoning

Following my article last week on MIDI Jitter in iOS apps, a lot of healthy conversation cropped up here and on the original KVR thread. Several developers have been part of the discussion, including regular contributor Fessaboy, with consensus forming around the idea that the jitter is related to the buffer lengths used by problematic apps. This makes sense, and fits with Kevin's observations in Animoog for iPhone. Earlier iPhones were not nearly as beefy as iPads, so developers tried to support them as best they could with crazy high buffers that are easier on the CPU, but slower to react.
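
To put rough numbers on that idea: if an app only picks up incoming MIDI once per audio buffer, a note can land anywhere within a buffer period, so the worst-case timing error grows with the buffer length. Here is a quick back-of-the-envelope sketch in Python (the buffer sizes are just the common values that keep coming up in this discussion, not measurements from any particular app):

    # Worst-case MIDI timing error if events are only handled once per
    # audio buffer (illustrative values only).
    SAMPLE_RATE = 44100  # Hz, the usual iOS audio rate

    for buffer_size in (256, 512, 1024, 2048):
        period_ms = buffer_size / SAMPLE_RATE * 1000
        print(f"{buffer_size:>4} samples -> ~{period_ms:.1f} ms per buffer")

    #  256 samples ->  ~5.8 ms
    #  512 samples -> ~11.6 ms
    # 1024 samples -> ~23.2 ms
    # 2048 samples -> ~46.4 ms

At 160 BPM a 16th note only lasts about 94 ms, so a 1024-sample buffer can smear a note by roughly a quarter of its length.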

This all raised some great questions about the stability of iOS as a professional platform for musicians, so I've been doing some additional research on the subject. In the original article I mentioned several apps that did not appear to be affected by any jitter, so I picked one of those for further testing. Magellan was my go-to; given its recent popularity, I figured it would be one a lot of people are curious about.

Kevin had asked for some additional details on my tests at high BPM, so I'm including here my results with Magellan in both a wired and wireless MIDI configuration. In all instances I was sending 16th notes (with a 32nd note length, so there are gaps), with the first note in each bar being a few semitones higher so I'd know when a new bar starts.
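
If you'd like to recreate the test clip yourself, here is a minimal sketch using the Python mido library, which is just an approximation of my Ableton clip rather than the actual project. The rhythm matches the description above, and the pitches follow the same idea (a bar of D#s with a G on the downbeat); the octave doesn't matter, and the 480-ticks-per-beat resolution is simply mido's default:

    import mido  # pip install mido

    TICKS_PER_BEAT = 480                  # mido's default resolution
    SIXTEENTH = TICKS_PER_BEAT // 4       # 120 ticks between note starts
    THIRTY_SECOND = TICKS_PER_BEAT // 8   # 60-tick note length, leaving a gap
    BASE_NOTE = 63                        # D#4
    BAR_MARKER = 67                       # G4, a few semitones up to flag a new bar

    mid = mido.MidiFile(ticks_per_beat=TICKS_PER_BEAT)
    track = mido.MidiTrack()
    mid.tracks.append(track)

    gap = 0  # delta time before the very first note_on
    for bar in range(4):
        for step in range(16):  # sixteen 16th notes per 4/4 bar
            note = BAR_MARKER if step == 0 else BASE_NOTE
            track.append(mido.Message('note_on', note=note, velocity=100, time=gap))
            track.append(mido.Message('note_off', note=note, velocity=0,
                                      time=THIRTY_SECOND))
            gap = SIXTEENTH - THIRTY_SECOND  # rest out the remainder of the 16th

    mid.save('jitter_test.mid')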

Testing Setups:

Wired Setup: Ableton Live 8 going out over USB to my Akai MPK61 keyboard, out through the MPK61's secondary MIDI DIN ports, into my M-Audio Uno USB-MIDI adapter, and finally into the Camera Connection Kit on my iPad 3.

Wireless Setup: I don't have a Mac, so I'm using rtpMIDI, which uses the Bonjour service to connect from Ableton Live 8, through my wireless router, and from there to the iPad 3. This is not an ideal solution, and it is generally recommended that you use an Ad-Hoc network for wireless MIDI, but I think this is a more typical "real world" setup.

I'm focusing on the second and third bars in the recording, because (as you'll see later) that gives it enough time to screw up!

First up is a wired connection at 120 BPM. No problems at all, everything is lining up nicely.

I cranked it up to 160 BPM. Still looking good! There is some movement in there, but this is to be expected and nothing you'd notice.

Alright, full throttle! 240 BPM, BABY!

There is slightly more wandering here, but still not enough that you'd notice. Take a listen for yourself.

Considering how much of a mess my MIDI chain is, I think it is safe to say that a wired MIDI connection is reliable on iOS. Any faults come down to software that isn't designed to cope, or possibly an even worse MIDI setup than mine, though I honestly can't imagine what that would look like; I bought the Uno because it was the cheapest USB-MIDI interface I could buy locally!

We can't rule out the possibility of a faulty MIDI connection though. Rather than buying a bunch of MIDI interfaces I wouldn't need, I decided to test the idea with a MIDI setup that is clearly "doing it wrong", by running wireless MIDI through a non-Ad Hoc WiFi network.

At 120 BPM through the wireless setup, Magellan is only marginally off.

I'd say that falls somewhere between the 120 and 160 BPM wired results. Once we get to 160 BPM the cracks begin to show.

Now we're getting into sloppy mess territory, which you can also hear for yourself.

I hope this puts to rest the notion that MIDI Jitter "prevents iOS from being a serious platform for software instruments."

If you're having problems, try changing apps, changing MIDI interfaces, or setting up a proper Ad-Hoc WiFi connection!

Reader Comments (14)

I'm curious how a wireless test with a Mac would look. Wireless MIDI on Windows is a bag of hurt compared to OS X's Core MIDI.

October 16, 2012 | Unregistered CommenterDC

Yea, it feels pretty flimsy on Windows... though rtpMIDI is much more reliable than it used to be. DSMIDIWiFi actually works better on Windows and iOS than it ever did with Nintendo DS homebrew!

October 16, 2012 | Registered CommenterTim Webb

Send me your project and I'll try it on my machine. I have an eight core 2009 Mac Pro with a wireless adapter I installed specifically to use for iPad music applications. I have Ableton 8.3.4 and Magellan. My Mac is still running Lion, version 10.7.5.

October 16, 2012 | Unregistered CommenterDC

There isn't much to the project... it is just a bunch of D#s and a G on the first note. The rest is just routing everything all over the place for my own setup here. Just make sure the notes themselves are separated by 32nd note sized gaps.

October 16, 2012 | Registered CommenterTim Webb

I have done extensive testing to deal with latency and jitter in apps. Audio latency (i.e. how many samples are in a buffer of 44.1kHz data? I can't deal with more than 256, but most apps are 512 or higher) has always been an issue. But I have never had MIDI latency be high enough to worry about when using the Camera Connection Kit. (MIDI latency over WiFi is comically bad, and isn't even worth measuring.) Geo Synthesizer seemed to get better MIDI latency than audio latency. Because it's an immediate-response app (i.e. no sequencing), there is no buffering at all; I send out MIDI as soon as there is enough data to make a packet. But none of what I am referring to (touch-response latency) has anything to do with what you deal with in drum machines (sending out an even clock signal that tracks real time properly) - it's totally different, because you can buffer more MIDI data and put timestamps on it to avoid deadline misses.

October 16, 2012 | Unregistered CommenterRob Fielding
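
Rob's last point, about buffering MIDI and timestamping it to avoid deadline misses, is worth illustrating. Below is a toy Python simulation (not code from any of the apps discussed) comparing a synth that can only start notes at the next audio buffer boundary with one that honours a timestamp plus a fixed buffer of headroom:

    SR = 44100
    BUFFER = 1024                    # samples per audio callback
    buffer_ms = BUFFER / SR * 1000   # ~23.2 ms per buffer

    # Ideal 16th-note onsets at 120 BPM: one every 125 ms
    ideal = [i * 125.0 for i in range(16)]

    for strategy in ("per-buffer", "timestamped"):
        errors = []
        for t in ideal:
            if strategy == "per-buffer":
                # The note can only start at the next buffer boundary after its ideal time
                rendered = (int(t // buffer_ms) + 1) * buffer_ms
            else:
                # The note is scheduled sample-accurately, one buffer of headroom later
                rendered = t + buffer_ms
            errors.append(rendered - t)
        spread = max(errors) - min(errors)
        print(f"{strategy:12s}: delay {min(errors):4.1f}-{max(errors):4.1f} ms, "
              f"jitter (spread) {spread:4.1f} ms")

The per-buffer strategy drifts by up to a full buffer period from note to note, while the timestamped one is uniformly late, which your ears forgive far more readily.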

New video: http://www.youtube.com/watch?v=DUQNN36DWp0

Thanks for posting again with more results, Tim! Everything looks good at 1/32 resolution, but how does it look at 1/64 or 1/128 resolution in Ableton? Are you getting better results than I do in the new video? If so, I will shell out for the CCK and a new interface.

I know you had really good intentions with the 240 BPM 16th notes but I would actually be more interested in hearing the 120/160 notes. Reason being, notes fall in and out of place at intervals in my tests, and at a high BPM like that it might sound like a pattern because it's a flurry of notes instead of a clearly delineated rhythm. Although, the fact that Magellan plays all the notes to begin with is great!

If you and I have reached a consensus here and it's a matter of taste or opinion, I'd be interested to know. Thanks again!

October 17, 2012 | Unregistered CommenterKevin McCoy

I really think you're being too dismissive, even in your latest video. Trying one bit of third-party kit does not invalidate a whole computing platform. I didn't want to mention this, because I give them enough of a hard time here, but I think your problem is that piece of shit from IK. They are a software company and their hardware has never been warmly regarded. I regularly see people complaining about their first iRig guitar interface, but this is the first time I've seen anyone with an iRig MIDI having so many problems.

I just tried a similar test with Sunrizer and did not experience the problems you have, so either the iRig is causing the problems or something else in your chain is doing it. Here are fresh samples for you in 120 and 160, from a simple single Pulse (50 Width) Osc patch.

120 BPM Sunrizer
160 BPM Sunrizer

I will admit that there are some minor problems with the 160 one, but nothing game breaking, especially when that 120 is as good as one could ever hope for from an external source.

October 17, 2012 | Registered CommenterTim Webb

Thank you again Tim. You've invested a lot of time in demonstrating this and I appreciate it. My intention is certainly not to be dismissive! Rather, I have spent a lot of money on that piece of shit iRig, synth apps, etc., always hoping that one will be better than the others or solve the problem as in the case of Midibridge. Like you, I don't want to spend more money on another MIDI interface that I don't need if you and I were getting identical results. That is what is preventing me from going out and spending even more money on a new interface and the CCK. Other readers have reported issues using the CCK+MIDI as well.

My personal take listening to them is that 120 BPM *might* be usable, but 160 BPM is all over the place. This is a judgement call for each producer though and that's certainly not a verdict on my end.

If you wouldn't mind making the two files downloadable (just flipping the download option on Soundcloud should be fine), I can import them to Ableton, A/B our results, and compare them with other external sources. If your results are indeed better than mine I will concede this one to you and go with the CCK. Consumers should also know that the iRig is a piece of shit and not to buy it IF that's what's causing the problem, right ;)

Remember, results are still the same or worse with inter-app MIDI as well, where I have also made investments of time and money. When I buy something promising that it "blurs the boundaries between iOS and hardware synthesizers" (Sunrizer), has "powerful MIDI capabilities" (Magellan), or is a "professional polyphonic synthesizer designed exclusively for the iPad" (Animoog), I expect it to be able to professionally handle 16th notes at 120-160 BPM. We should probably forgive these developers for advertising like this because they are just trying to make a living too, and the prices of these things are great compared to HW, so it's not a gargantuan investment... but it's still hype in my opinion, given the applications' underwhelming performance when interfaced with other apps and the external world.

I'm not a software synth hater at all, which is why I tried to show what a normal VST synth can handle with the KORG Polysix plugin. By the way, when I up the latency in Ableton to 2048, no change in results. Solid stream of notes that just comes 2048 samples later. I'd like to see iOS apps respond in the same way even if they need more latency to make awesome and complex sounds. That would be truly professional and blur the boundaries between hardware and iOS.

October 17, 2012 | Unregistered CommenterKevin McCoy
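
Kevin's distinction between a fixed offset and a wandering one is really the heart of this, and it's easy to quantify if you can export the recorded note start times: the average deviation from the grid is latency, and the spread around that average is jitter. A minimal sketch (the measured values below are made up purely for illustration):

    import statistics

    def grade_timing(expected_ms, measured_ms):
        """Mean deviation from the grid = latency; spread around it = jitter."""
        deviations = [m - e for e, m in zip(expected_ms, measured_ms)]
        return statistics.mean(deviations), statistics.pstdev(deviations)

    # 16th notes at 120 BPM land every 125 ms
    expected = [i * 125.0 for i in range(8)]

    # Hypothetical capture: a steady 46.4 ms late (like Ableton at 2048 samples)
    steady = [e + 46.4 for e in expected]
    # Hypothetical capture: wandering anywhere up to ~23 ms late
    wandering = [e + d for e, d in zip(expected, [3, 21, 9, 17, 1, 14, 22, 6])]

    print("steady   : latency %.1f ms, jitter %.1f ms" % grade_timing(expected, steady))
    print("wandering: latency %.1f ms, jitter %.1f ms" % grade_timing(expected, wandering))

The steady capture reports a large latency but zero jitter; the wandering one reports a smaller average delay but a jitter figure you can hear.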

They are now downloadable! Please report back with your results!

I suppose in the interests of full disclosure I should mention I rarely go above 100 BPM in my own music, so this has never been something that even crossed my mind before.

October 17, 2012 | Registered CommenterTim Webb

I just tested with the exact same patch in Sunrizer as you, and guess what... we have identical results! Almost down to the sample.

See KVR post for more: http://bit.ly/PDKDuV

So I don't see a conceivable difference between iRig and your CCK setup - they are both CoreMIDI anyway.

You're right, 100 BPM with Sunrizer or Magellan probably won't be noticeable, but the fundamental issue is still there. We should expect more from a softsynth - do they need to come with a disclaimer "not suitable for music above 120 BPM"? "Caution: Can be hazardous to 16th notes"? :D

October 18, 2012 | Unregistered CommenterKevin McCoy

Tim: where this issue comes into play is when you play complex phrases on the instrument. You only need a little bit of legato playing, arp sweeping, and scale runs here and there for this phenomenon to be a major annoyance.

I got outrage fatigue over this issue about 2 years ago, though there has been some improvement. When Mugician came out, 1024-sample buffers were the norm (and if you set it lower, the OS would put it back up to 1024 when you unplugged the headphone jack and plugged it back in, and almost no apps could cope with that). The buffer size issue was the reason why Geo came out incompatible with Sunrizer, and Animoog (ouch!), because anything below 512 was pretty much unheard of. But when it's a guitar-like interface that's designed to play faster than a real guitar (a comparison to Chapman Stick playing is more fair), with note overlapping rules that let you trigger on every finger down (hammer-on) and finger up (hammer-off), the latency matters more than anything else for a realistic feel under the fingers.

October 18, 2012 | Unregistered CommenterRob Fielding

Yea, a lot of early apps can be super problematic because of the huge buffers. I first encountered this issue myself on Korg's iMS-20 when trying to send it an acid bassline of 16th notes, all while screwing around with about 4-6 MIDI CCs. To date that was the only time I got to a secondary tier within Korg while reporting the bug... but nothing came of it.

I'm optimistic about the future though. As processors get faster, older devices become obsolete, and things like Virtual MIDI and Audiobus will push the "default" buffer length lower!

October 19, 2012 | Registered CommenterTim Webb

Let's hope so. I am really looking forward to seeing what Audiobus can do for music production on iOS! In the meantime, I will try to find a developer willing to talk about whether it is possible to implement more solid MIDI and note generation features in synth apps. Did you ever hear back from Amos at Moog?

October 19, 2012 | Unregistered CommenterKevin McCoy

Just a couple of footnotes... not sure whether they'll be specifically applicable, but FWIW:
Here, paragraph one mentions jitter, but since it doesn't seem to apply to hardware-only situations, the specific app developers would have to chime in:
http://en.wikipedia.org/wiki/MIDI_timecode
Here, a different type of audio is discussed, but maybe it's an adaptable technique. At the very least the developer is knowledgeable, if not hopefully very busy on other stuff!
http://atastypixel.com/blog/a-simple-fast-circular-buffer-implementation-for-audio-processing/

October 19, 2012 | Unregistered Commentera1