Apple’s iOS 4.2 update, due in November, includes some interesting new features that didn’t make Steve Jobs’s official announcement.
According to a report in AppleInsider, beta testers have found that iOS 4.2 adds support for the CoreMIDI framework to the operating system’s application programming interfaces.
CoreMIDI could be huge for iOS music apps, because it provides standardized, system-level MIDI support: a built-in framework that handles MIDI devices and how they communicate with applications.
MIDI support means that the iPad could be a touchscreen control surface for any MIDI device; developers will be able to create multi-touch sequencers that control MIDI hardware; and you should be able to plug your MIDI keyboard into an iPhone, iPod touch or iPad and play your iOS software synths.
CoreMIDI on iOS 4.2
CoreMIDI support should help make MIDI more plug-and-play on iOS devices.
Here’s what Apple has to say about CoreMIDI:
In Mac OS X, Apple provides a new set of system services, so that applications and MIDI hardware can communicate in a single unified way, using a single API.
MIDI services, which are low level, provide high-performance access to MIDI hardware devices. There is a driver model in the MIDI world that “talks” directly to IOKit, so your application has a direct path from the MIDI services API to the hardware.
Using this driver model, third-party manufacturers can create driver plugins that talk to IOKit. Those can then be loaded and managed by a server, which applications talk to through the Core MIDI framework.
The CoreMIDI framework provides the client-side API that applications use to interface to MIDI devices.
The primary goal of MIDI services in Mac OS X is to have interoperability between applications and hardware, so that everyone is working to the same standard. Other goals include providing MIDI I/O with highly accurate timing, as required by professional applications. From a musical point of view, this means being able to get a MIDI event into and out of the computer within one millisecond, i.e., to keep latency under one millisecond, and also to keep jitter (the variation in I/O timing) under 200 microseconds.
Another goal is to provide a single system-wide configuration, i.e., knowing what devices are present, and being able to assign names to those devices, manufacturer names, and what MIDI channels they’re receiving on and so on.
The MIDI services are designed as an extensible system. Toward that end, a device can have any number of properties attached to it, and a device manufacturer can publish the particular properties of their device.
Developers should be able to code apps against a single MIDI API, rather than having to support individual interfaces one by one. CoreMIDI could also open the door to using the Apple iPad Camera Connection Kit to connect existing USB MIDI devices.
Developers are already starting to support MIDI hardware on iOS, with devices like the Line 6 MIDI Mobilizer interface. This trend is likely to explode, though, once CoreMIDI support arrives.
It’s not clear yet how “baked” MIDI support will be in iOS 4.2. If you’re an iOS developer or if you’re testing the iOS 4.2 beta, let us know if you’ve done any MIDI testing yet.
iPhone, iPad & iPod touch Getting MIDI Support
With this news, iOS 4.2 just got a lot more interesting for musicians.
What do you think of the promise of standardized MIDI support on the iPhone, iPad and iPod touch? And, if you’ve got one of these devices, what MIDI gear are you interested in connecting?