Following my article in March, Meteor Multitrack Recorder Not Actually Multitrack Recording, 4Pockets have been hard at work to rectify the situation! Real multitrack recording is in today's update! Thanks to readers ZenLizard and Bianca for this news!
What's new in Meteor Multitrack Recorder v1.60:
- Meteor now supports simultaneous multiple channel audio recording.
- Selectable Inputs for both Audio Sources and AudioBus.
- Selectable Outputs for each track.
- The Video plugin now allows you to export a movie with soundtrack to a file or the camera roll.
- Stacked Recordings (Loop Mode in Single Track Mode).
- Improved CPU usage and improved Pitch Shifting quality.
- Improved Time Stretching quality.
- Added a Basic Auto Tune for monophonic vocal recordings to the wave editor.
- Added an Invert Phase option to the wave editor.
- Added Markers for easier project navigation.
- Added MIDI Track Event Inspector.
There is also a demo video showing some of the new features.
Live looping with analog keyboards, USB controllers, iOS devices, and Ableton Live. All loops are recorded live, nothing's pre-recorded. iPads running Lemur and Animoog are mirrored on the wall monitors, and Animoog is also wirelessly receiving MIDI from a KMI QuNexus. Lemur is wirelessly controlling Ableton Live's filters, reverbs, and delays in the template (my apologies for blocking the Lemur monitor with my head for most of the jam... I'll move the camera to a different spot for the next jams).
Download or stream the audio at SoundCloud: http://snd.sc/11GbMko
- DJ TechTools MIDI Fighter 3D in MF3D Launcher mode
- iPads running Lemur and Animoog
- iPod touch running Sunrizer w/ iConnectMIDI
- Akai APC20
- DSI MoPho Keyboard
- M-Audio Oxygen 25
- KMI QuNeo, QuNexus, and 12-Step
- Novation Launchpad
- Moog Prodigy
- Fender Rhodes w/ MXR Wylde Overdrive
- PreSonus 1818VSL connected to a Macbook Pro
- 5x Chauvet Slim Par 56 LED
- 2x American DJ Mega 50 RGB
iFretless Guitar got the one effect all guitar apps need: Overdrive!
What's new in iFretless Guitar v2.7:
- New overdrive effect.
- 2 new completely clean guitar sounds, great for use with effects-processor apps (via Audiobus) or with iFretless's built-in effects.
- 3 new pad sounds.
- Supports MIDI input from external keyboard devices.
- Various other improvements and bug fixes.
Here is a new demo video for this version.
Although the initial slide revealing Inter-app Audio in iOS 7 was easy to dismiss, since Apple promised the same feature for iOS 6 and it never materialized, this time around it seems like the real deal. Several developers, frustrated with Apple's draconian NDA, are flat-out breaking their non-disclosure agreements and speaking somewhat openly about the contents of the SDK.
In an effort to get ahead of the rumors that we saw last year, and before anyone announces the death of Audiobus, JACK or even Auria, I want to compile what we actually know.
The official public word:
This doesn't really tell us a lot, and the use cases here sound confusing. What in the hell is the difference between "publish audio streams" and "uses the combination of those streams to compose a song"? Fortunately that isn't all we have to go on!
Almost immediately after the release of the SDK an anonymous poster commented here with the official private word. This has been confirmed by three other sources, so I feel I have even better journalistic integrity posting this than CNN.
The official private word:
From the dev member center:
The Audio Unit framework (AudioUnit.framework) adds support for Inter-App Audio, which enables the ability to send MIDI commands and stream audio between apps on the same device. For example, you might use this feature to record music from an app acting as an instrument or use it to send audio to another app for processing. To vend your app’s audio data, publish an AURemoteIO instance as an audio component that is visible to other processes. To use audio features from another app, use the audio component discovery interfaces in iOS 7.
This Audio Unit blurb gives a good indication of what Apple has in mind for this technology. It is important to keep in mind that this is a technology, not a product like Audiobus and JACK. Audio Units (AU) are the plug-in format used by Logic, Apple's professional DAW, similar to RTAS, VST, AAX, etc.; it is simply Apple's proprietary plug-in format. AUs didn't kill any of the other formats.
The inclusion of MIDI commands, and the ability to launch other apps, makes this sound like a tool for offline rendering. For instance, you could sequence your DAW app to load up an effect app, send it some audio and perhaps some parameters for how that audio should be processed, and then collect the results. This is just an example, but it seems quite different from what we're doing with Audiobus and JACK today.
It is unclear at this time how this will actually be implemented or what sort of advantages it can offer. There are persistent rumors of a Logic for iPad, but the way that private blurb is phrased, it sounds like they are opening this up for other DAW apps to use. Without getting any developers in trouble, I will just say that there is excitement about the possibilities here. This isn't a replacement for Audiobus; this is something else altogether. The only thing we know for sure is that Apple are trying to keep their lead over other mobile OSes by continuing to advance the platform well ahead of them all.
Update: After this article went up I was contacted by a dev I know and trust. He provided the following on the condition of anonymity.
Hi Tim, just read your blog post. I see this more as an opportunity for developers to implement a DAW hosting AU instruments via inter-app audio. If I were Steinberg, that would be my focus for Cubasis. And probably the GB version will be ready for iOS 7.
There are example projects in the SDK which give working code for this: a little host, an instrument "plugin" app, and an effect "plugin" app. Both can be instantiated from the host app, just as you would in Cubase, Logic, or whatever. Apple would not distribute this example if they didn't want to see this scenario.
The important point here is that this solves the sandbox problem. The sandbox concept of iOS prohibits loadable plugins that are not contained within the app, as Auria's are. With the idea of inter-app audio this is solved: the instrument app publishes its own AU to any app it wants, and the host checks which AUs are available and remotely instantiates that app.
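The publish-and-discover flow described above can be sketched against the Inter-App Audio additions to the AudioToolbox C API in the iOS 7 SDK. This is only an illustration of the pattern, not production code: the four-character subtype and manufacturer codes, the "Demo Synth" name, and the function wrappers are hypothetical placeholders, error handling is elided, and it will only build against the iOS 7 SDK on Apple's toolchain.

```c
// Sketch of the Inter-App Audio publish/discover pattern (iOS 7 SDK).
#include <AudioToolbox/AudioToolbox.h>

// --- Node ("plugin") app side: publish your RemoteIO unit so hosts can find it.
static void publish_instrument(AudioUnit remoteIOUnit) {
    AudioComponentDescription desc = {
        .componentType         = kAudioUnitType_RemoteInstrument, // or RemoteEffect / RemoteGenerator
        .componentSubType      = 'iasp',   // hypothetical four-char code
        .componentManufacturer = 'Demo',   // hypothetical four-char code
        .componentFlags        = 0,
        .componentFlagsMask    = 0,
    };
    // Registers this app's output unit as an audio component visible to other processes.
    AudioOutputUnitPublish(&desc, CFSTR("Demo Synth"), 1, remoteIOUnit);
}

// --- Host app side: discover published instruments and instantiate one.
static AudioUnit find_and_open_instrument(void) {
    // Zeroed fields act as wildcards; match every remote instrument on the device.
    AudioComponentDescription search = { .componentType = kAudioUnitType_RemoteInstrument };
    AudioComponent comp = NULL;
    AudioUnit instrument = NULL;
    while ((comp = AudioComponentFindNext(comp, &search)) != NULL) {
        CFStringRef name = NULL;
        AudioComponentCopyName(comp, &name); // display name for the host's node picker
        if (name) CFRelease(name);
        // Instantiating the component launches the node app if needed and hands
        // the host an AudioUnit it can wire into its render graph.
        AudioComponentInstanceNew(comp, &instrument);
        break;
    }
    return instrument;
}
```

This mirrors the quoted dev-center text: the node app "vends" its AURemoteIO as an audio component, and the host uses the ordinary audio component discovery interfaces to find and open it, which is exactly how a DAW would enumerate remote "plugins" without breaking the sandbox.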
The MIDI connection and rendering is of course realtime. The only unclear point is whether there is also support for plugin parameters, preset loading, and persistence. But this could easily be added.
So, this IS much more powerful than JACK and AudioBus, I'm sorry to say. AudioBus could of course base its own engine on the Apple technology, but it gains nothing. In fact, AudioBus would then be competing more with Cubasis, i.e. AudioBus becomes just another host, and in that respect it is weak compared to Cubasis, GB, etc. AudioBus/JACK lose the technology advantage of inter-app audio streaming.
Excellent insights! Note that although this does mean Audiobus will lose its monopoly, it does not lose its usefulness.