Audiobus Is About To Make Audio On iOS A Lot More Interesting

Developers Audanika and A Tasty Pixel have announced a new project, Audiobus, that promises to make audio on iOS a lot more interesting.

One of the key limitations for iOS has been the difficulty of working with audio from multiple music apps ‘in the box’. Audiobus, described as a ‘virtual mixing board and FX processor for iOS’, could help address that.

Here’s what the developers have to say about Audiobus:

Audiobus is a local (and network) communications channel for sending live audio, like the audible output of a synthesizer app, or the output of a guitar amp effects app, for example.

More than that, Audiobus is also a multi-track recorder and content creation platform, allowing you to record and mix the output of several apps all at once, and then publish the result.

At this point, Audiobus is under development. But it’s being spearheaded by the same people who helped drive the rapid development of MIDI collaboration on iOS – so this could get interesting fast.

Audiobus is ‘coming soon’. See the Audiob.us site for details.

via idesignsound

9 thoughts on “Audiobus Is About To Make Audio On iOS A Lot More Interesting”

  1. How will this work?

    Will apps all send their audio only to Audiobus, which then mixes and outputs it?

    And where do effects fit in?

    Will Audiobus be midi controllable?

  2. You’ve raised some great questions, Iman. I too was wondering about it being MIDI controllable, which would be nice. What I’m hoping is that it will function in a similar fashion to ReWire. It would be awesome to have GarageBand or Music Studio open and be able to control Sunrizer, NLog, etc. via MIDI and record them simultaneously into the DAW. This would really open up the platform. I’ve been making tracks on my iPhone 4, primarily in NanoStudio, making loops with Sunrizer, NLog and Animoog, then using copy/paste to get them into NanoStudio and assigning them to the TRG pads, which is a huge pain in the ass, especially when I want to send them through Filtatron first. While on one level I’m amazed that I can do this all from my phone, it’s a cumbersome process, and Audiobus sounds like the answer. MIDI would be super key.

  3. As far as I know, it’s smuggling audio over MIDI SysEx messages, because MIDI is available as a fast-enough channel between apps. There was a lot of talk about creating channels between apps this way a few months ago, and crude prototypes were announced back then. I really hope this is the death knell for AudioCopyPaste, which I refused to implement as too much effort for too little result. You really need an (unbounded) real-time audio pipe to do things like: 1) I write a touch-controller app that makes basic expressive sounds; 2) somebody else writes a guitar effects app; 3) somebody else writes a DAW to record the output of the effects app.

    The real beauty of this may end up being that you can think of it as audio data that, rather than being just raw waveforms, is semantically *marked up* with MIDI messaging describing the audio. If you mingle the audio with a MIDI transcription of it, you might be able to build effects processors that would not even be possible otherwise. It’s kind of like the situation with MIDI guitar, where you aren’t guessing what notes are played by analyzing waveforms; they are handed to you from the original gestures that signal the intent of what you are doing.
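    To make the smuggling idea concrete, here is a rough C sketch of the kind of 7-bit packing such a channel would need, since SysEx data bytes can’t have the high bit set. This is pure speculation on my part, not Audiobus’s actual protocol; the 0x7D framing byte is just the non-commercial manufacturer ID used for examples.

        /*
         * Purely illustrative: pack a block of 16-bit PCM samples into one
         * MIDI System Exclusive message.  SysEx payload bytes must stay
         * below 0x80, so each sample is split into three 7-bit groups.
         * This is NOT Audiobus's real format, just the flavor of encoding
         * that "audio over SysEx" implies.
         */
        #include <stdint.h>
        #include <stdio.h>

        /* Encode `count` samples into `out`, which must hold 3 + 3*count
         * bytes.  Returns the number of bytes written. */
        static size_t pack_pcm_sysex(const int16_t *samples, size_t count,
                                     uint8_t *out)
        {
            size_t n = 0;
            out[n++] = 0xF0;                    /* start of SysEx           */
            out[n++] = 0x7D;                    /* non-commercial ID (demo) */
            for (size_t i = 0; i < count; i++) {
                uint16_t s = (uint16_t)samples[i];
                out[n++] = (s >> 14) & 0x03;    /* top 2 bits               */
                out[n++] = (s >> 7)  & 0x7F;    /* middle 7 bits            */
                out[n++] =  s        & 0x7F;    /* bottom 7 bits            */
            }
            out[n++] = 0xF7;                    /* end of SysEx             */
            return n;
        }

        int main(void)
        {
            int16_t block[4] = { 0, 16384, -16384, 32767 };  /* toy "audio" */
            uint8_t msg[3 + 3 * 4];
            size_t len = pack_pcm_sysex(block, 4, msg);
            for (size_t i = 0; i < len; i++)
                printf("%02X ", (unsigned)msg[i]);
            printf("\n");
            return 0;
        }

    Even this simple framing turns every 2-byte sample into 3 bytes, so a mono 44.1 kHz stream needs on the order of 130 KB/s; a scheme like this is only plausible because virtual MIDI between iOS apps is effectively an in-memory pipe rather than a 31.25 kbit/s cable.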

  4. This is all very well, but it’s still like taking a step back in time workflow-wise (load host – manually load synth – select preset – manually load effect – select preset – repeat). What’s the chance of Apple actually developing a proper audio plugin architecture on iOS?

  5. i’ve been studying the discussions on different blogs today pertaining to this app, and it’s interesting to note there are two parties here:

    1 the excited ones who see a new way of doing things

    2 the sceptics who feel this app may create more work and slow down their workflow

    i guess i fall under category one

    i was relieved when acp was introduced and made life somewhat easier, but i think virtual midi sucks, to be honest (you can’t record, only sequence, on the likes of genome, and even then you still have to acp)

    what audiobus may well do is actually encourage more creativity, and the developer told me today that this app can RECORD, YES RECORD.

    this brings a brand new dimension to ios, one that some thought would never be possible: the ability to record live jamming on several apps at once, not just mere sequencing

    if audiobus can indeed do that, that’s a huge leap forward, and the announcement of this app is causing a well-deserved stir. it has been a slow period after xmas, and now, with ipad3 imminent as well as the Auria app, things look very bright for app musicians

    so what if workflow is still slow? so what? ios device owners know that things will have to be done differently, and many relish that. for those that don’t like it, you can always stick to pc stuff, no bother

    we would do well to remember brian eno’s words: process is more important than result

    this is a remarkable thing the developers are doing, and i am personally urging all serious app developers to grab the sdk for this

    overall, audiobus will bring the live element into ios music performance methinks and that is effing amazing!

  6. ultimately this app will help revolutionise the music industry, especially the live aspect it has the potential to facilitate

    it’s all about WHAT it will do, not HOW

    i think Apple have left room for others to develop things and make some money; dwell on this point a while...
