MIDI 2.0 Promises Auto-Configuration, Extended Resolution, Tighter Timing & Backward Compatibility

The MIDI Manufacturers Association (MMA) and AMEI (the Japanese MIDI association) today announced plans for MIDI 2.0, a major update to the ubiquitous Musical Instrument Digital Interface standard.

MIDI 2.0 promises to bring MIDI auto-configuration, new DAW/web integrations, extended resolution, increased expressiveness and tighter timing.

Member companies of the organizations are currently working together to develop prototypes for MIDI 2.0 devices and software, based on a jointly developed, feature-complete, draft specification.

Here’s what they have to say about their plans for MIDI 2.0:

A members-only plugfest to test compatibility between some early MIDI 2.0 prototypes is planned for Winter NAMM 2019. Participating companies include Ableton/Cycling ’74, Art+Logic, Bome Software, Google, imitone, Native Instruments, Roland, ROLI, Steinberg, TouchKeys, and Yamaha.

As with MIDI 1.0, AMEI and the MMA are working closely together and sharing code to streamline the prototype development process. Prototyping is planned to continue during 2019 as the associations work together on MIDI 2.0 launch plans, including exploring the development of a MIDI 2.0 logo and self-certification program for MMA and AMEI member companies.

During the prototyping phase, the proposed MIDI 2.0 specification is available only to MMA and AMEI members, because the prototyping process may trigger minor enhancements to the specification. Once a final specification is adopted, it will join the current MIDI specifications as a free download on www.midi.org.

The MIDI 2.0 initiative updates MIDI with auto-configuration, new DAW/web integrations, extended resolution, increased expressiveness, and tighter timing — all while maintaining a high priority on backward compatibility. This major update of MIDI paves the way for a new generation of advanced interconnected MIDI devices, while still preserving interoperability with the millions of existing MIDI 1.0 devices. One of the core goals of the MIDI 2.0 initiative is to also enhance the MIDI 1.0 feature set whenever possible.

All companies that develop MIDI products are encouraged to join the MMA to participate in the future development of the specification, and to keep abreast of other developments in MIDI technology.

51 thoughts on “MIDI 2.0 Promises Auto-Configuration, Extended Resolution, Tighter Timing & Backward Compatibility”

    1. Depending on what it means for you – this is of almost revolutionary proportions for electronic music. I, for one, am super stoked about this!

  1. Just as long as it’s fully backwards compatible and they can somehow still remember that aftertouch is a thing, and that it can also be polyphonic, because that seems to have been forgotten if you look at the trend from manufacturers over the last 20 years.

    1. Poly AT works great from my ’80s Kurzweil MIDIboard. I assume it will use the standard DIN-5 connector if it’s backwards compatible? CAT5 would make more sense to me.

  2. Wild. I remember writing one of those backpage editorials in Sound on Sound saying this was overdue back in 2003 or something. But I’ve learned my lesson. The minute this comes in the kids will be all about that original MIDI stutter beat.

  3. Having Google and NI on a standards committee is about the worst possible idea imaginable, since these are both companies well known for introducing new products and features and then dropping them like it’s nothing a short time later. I guarantee MIDI 2.0 will get five years of life at most before a new standard is “necessary”…
    Can’t wait to haul all my MIDI 5.0/4.0/3.0/2.0 converters on stage.

    1. I don’t see why you would ever need a converter – MIDI is the same, devices will just recognize it differently – e.g. new devices, DAWs, new firmware – it isn’t going to physically change the DIN jacks you already have, and honestly it would be great if there was something like a USB port on every machine that could host further functionality. I mean, it would be different if it was Apple up there, because they would want you to throw away and buy a new synth every year.

      1. the problem is that the standard hasn’t actually been published, so we can’t say anything for certain beyond what they’ve promised… “backwards compatible” – I haven’t seen anything that says they intend to keep the same DIN5 connector going forward – nor should they. But that’s like saying USB-C is compatible with USB 1.1 – yes, it’s “compatible” – by using dongles. Are musicians going to carry around bags of MIDI 2-to-MIDI 1 adapters? Soon after that, MIDI 1.0 plugs will become “legacy” connectors, and as time goes on we’ll have fewer and fewer of them as newer manufacturers consider them obsolete – making it necessary to purchase converters to use your gear with new systems.
        Second, NI and Google make profit off of stuff like this. You cite Apple as an example of this; I cite Komplete upgrades as evidence these companies will deliberately try to cripple the standard somehow so they can profit again when everyone has to go to 3.0 prematurely.

    2. NI released Maschine MK1 just about ten years ago, and not only have they consistently supported the original hardware, they have added features, developed optional products around it, and expanded the platform. The only thing I can remember them dropping was Kore – which admittedly was a disaster. But that was one thing.

  4. Great !!

    First, we are in 2019 – the obsolete 7-bit resolution should be goneeeee by now, so please replace it with a float 0.0 to 1.0 value type.

    Arbitrary numbers assigned to arbitrary predefined choices should be gone too.
    It should be a custom name string, dot, function name string (something like [channel_name/number].[CC_name] is needed).

    Example:
    Ch01.volume = 0.2
    Ch02.pan = 0.8
    Piano_Yamaha.modulation = 0.0
    Piano_Yamaha.pan = 0.5
    Piano_Yamaha.sustain = 0.9
    Guitar_solo.tremolo = 0.4
    Guitar_solo.volume = 1.0
    MySynthPlugin.portamento = 0.51 (true)
    … unlimited

    No more 0-127 preassigned limitations, no more 0 to 127 dumb values.
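
    In code terms, the kind of address-plus-value model I mean – purely conceptual, not anything from the actual draft spec, and the names are just the ones from my examples above (Python here only to make the idea concrete):

    # Conceptual only – not MIDI 2.0 syntax, just the named-address idea above.
    from typing import Dict, Tuple

    def parse_assignment(line: str) -> Tuple[Tuple[str, str], float]:
        """Turn 'Target.parameter = value' into (('Target', 'parameter'), float)."""
        address, value = (part.strip() for part in line.split("="))
        target, parameter = address.split(".")
        return (target, parameter), float(value)

    state: Dict[Tuple[str, str], float] = {}
    for line in ("Ch01.volume = 0.2", "Piano_Yamaha.pan = 0.5", "Guitar_solo.volume = 1.0"):
        key, value = parse_assignment(line)
        state[key] = value          # every parameter is a normalized 0.0-1.0 float

    print(state[("Piano_Yamaha", "pan")])   # 0.5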

      1. Oh, thanks for making me discover OSC. That rocks!! Why isn’t it the standard?
        Guys, just make OSC the MIDI 2.0 then. No need to work. The better standard is already here…

    1. SysEx to the rescue: F0 + a MIDI 2.0 opcode + one or two currently unused CC codes + 4 bytes + F7 = a 32-bit float. With current USB 2.0/3.0 speeds that should not be an issue; we just need to agree on it. But it’s probably not going to be that simple, because MMA members may want to make easy things difficult and monetize it.
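
      Roughly, something like this sketch – the manufacturer ID and opcode here are made up, and the four float bytes get re-split into 7-bit chunks because SysEx data bytes must stay below 0x80, so it is really five payload bytes on the wire:

      import struct

      # Hypothetical values for illustration only; a real message would use an
      # assigned (or universal) ID and an opcode everyone has agreed on.
      SYSEX_START, SYSEX_END = 0xF0, 0xF7
      TEST_MANUFACTURER_ID = 0x7D       # the "non-commercial / test" SysEx ID
      FAKE_FLOAT_CC_OPCODE = 0x01       # made-up "32-bit float CC" opcode

      def encode_float_cc(cc_number: int, value: float) -> bytes:
          """Pack a 32-bit float controller value into a SysEx frame."""
          as_int = int.from_bytes(struct.pack(">f", value), "big")
          chunks = [(as_int >> shift) & 0x7F for shift in (28, 21, 14, 7, 0)]
          return bytes([SYSEX_START, TEST_MANUFACTURER_ID, FAKE_FLOAT_CC_OPCODE,
                        cc_number & 0x7F, *chunks, SYSEX_END])

      def decode_float_cc(frame: bytes) -> tuple:
          """Inverse of encode_float_cc; returns (cc_number, float value)."""
          as_int = 0
          for chunk in frame[4:9]:
              as_int = (as_int << 7) | chunk
          return frame[3], struct.unpack(">f", as_int.to_bytes(4, "big"))[0]

      # Round trip: CC 20 carrying the value 0.8
      frame = encode_float_cc(20, 0.8)
      print(frame.hex(" "), decode_float_cc(frame))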

  5. I hope that some of the MIDI 1.0 features still work … I’m mostly hoping that bank/patch change specs stay the same … I’d really hate to have to re-program all of that stuff

  6. Please watch the ADC18 YouTube videos and you will know what the scope of this testing is: the shell, not the heart of the new MIDI. In one of the talks it was mentioned that it will take several years before we will see the new MIDI arise. Do you hate SysEx? Well, MIDI 2.0 is for the most part SysEx (read the spec). I am afraid the spec will be so “open” that you can’t get your hands on it without paying hundreds of dollars per year as a subscription. It might be like MIDI 1.0, where it took 30+ years before the spec became officially available online for free. Yes, for users it’s great; for small developers it’s likely the biggest nightmare when pushing out products that support it.
    Not sure how to reduce latency and increase speed when the 3-4 bytes we have today, which cover almost everything needed, still run on top of USB 1/2/3 and IP… (note that ROLI’s products are USB 1.1)

  7. Midi 1’s binary digits have a special warmth that Midi 2.0’s 1s and 0s just can’t match. With good monitors you can hear it.

  8. Hi,

    A couple of quick replies to various comments.

    OSC- OSC is a great protocol and very flexible, but there is no interoperability between OSC devices and that is what MIDI is all about.

    Regarding SysEx and compatibility, most people who use MIDI will never have to worry about these things because all that will be handled in the operating system by APIs and new class compliant drivers, in DAW hosts and other larger parts of the MIDI environment. Manufacturers and developers will handle translation.

    There are no plans to change any physical connectors for MIDI.

    “During the prototyping phase, the proposed MIDI 2.0 specification is available only to MMA and AMEI members, because the prototyping process may trigger minor enhancements to the specification. Once a final specification is adopted, it will join the current MIDI specifications as a free download on http://www.midi.org.”

    (1) Will it be an open specification?
    The MIDI 2.0™ specification is not open to be modified (just as the MIDI 1.0 specifications are not). That can only be done by the MMA and AMEI standards bodies, to ensure interoperability, but it will be open and free for the public to use.

    (2) Will it be patent encumbered?
    No.

    (3) Will it require a license fee?
    No.

    (4) Under what license will the specification be available to read?
    It will be available for free download on https://www.midi.org. There will be no license requirement.

    (5) Under what license will the specification be available to deploy?
    Just like MIDI 1.0, there will be no license.

    Simply put, MIDI 2.0™ will be just like MIDI 1.0.

    MIDI has been and always should be free for people to use to play and create music and music products.

    By the way, we just updated the MIDI-CI article on the site with some more details of MIDI 2.0™ and how some of the new features (like timestamps) can also be used with MIDI 1.0.

    https://www.midi.org/articles-old/midi-manufacturers-association-mma-adopts-midi-capability-inquiry-midi-ci-specification

  9. Self-description would be nice … it’s something I’ve wanted in OSC for years.
    OSC is the real solution, of course: send high-level musical events that get translated into sound, either directly or passed down to somewhat less capable MIDI devices.
    Originally, all MIDI devices were supposed to publish their interpretation of the MIDI specs, and have robust SysEx so external programs could do librarian work, and sometimes generate patches as SysEx. This needs to be easier!
    There was a really good next-gen “MIDI”-like protocol called ZIPI that you folks should look into.

  10. “There are no plans to change any physical connectors for MIDI.”

    This is confusing… isn’t the main problem with MIDI the fact that it’s serial in nature, and that typically MIDI is used by “musicians” to create/play “music” or sounds, and a large quantity of this “music” consists of inherently parallel actions (chords) and synchronization between actors or machines?

    Isn’t the baud rate too low, and doesn’t this lead to undesirable skewing/jitter, etc., that human beings are especially sensitive to and which ultimately has the effect of dampening musical impact?

    “AI” entities will be even more sensitive to this and they will have to hear temporally drifting “music” made by humans. This might become a source of frustration and conflict between humans and synthetic beings, who could potentially interpret this as disturbing/intolerable noise pollution.

    It seems too many unhelpful assumptions about “Music” were made during the genesis phase of MIDI 1.0 that did not adequately predict future developments, and it’s on the verge of becoming a hindrance to free expression… if it’s not already.

    The protocol should not have anything to do with notions of tuning or “musicality”, be they “eastern” or “western”; this is a deeply flawed concept with real-world ramifications. It goes without saying that there are no “notes”, just as there are no actual lines separating landmasses on this planet. So-called “western” notions of “music” should not be projected onto the frequency spectrum. It’s fine to support static “pitches” and “scales”, but please don’t impose them so readily… not everyone agrees that there are 12 tones per octave or four beats to a measure, and we need more flexibility here, especially in regard to dynamism.

    Backwards compatibility is great, except when it’s built upon a fundamentally flawed foundation with unworkable restrictions on resolution, fidelity and expressivity at the interface level… similarly, making a pipe larger on one end does not increase throughput or flow rates.

    Please be careful what you do, because these decisions will have far-reaching ramifications for many decades to come!

    Respectfully,
    Ryan Dean

    1. Hi,

      “Isn’t the baud rate too low and this leads to undesirable skewing/jitter, etc”

      This is a very common misconception about MIDI. The MIDI protocol (the set of messages) is not tied to any single transport, and MIDI’s speed depends on the transport it runs on. Yes, 5-pin DIN runs at 31.25 kbaud, but MIDI over USB, Ethernet or other transports is much faster.
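
      For a sense of scale on the classic 5-pin DIN transport only, here is a rough back-of-the-envelope sketch, assuming the usual 10 bits on the wire per byte (start + 8 data + stop) and no running status:

      # Rough timing math for classic 5-pin DIN MIDI (does not apply to USB/Ethernet).
      BAUD = 31_250          # bits per second on a DIN MIDI cable
      BITS_PER_BYTE = 10     # start bit + 8 data bits + stop bit

      def message_time_ms(num_bytes: int) -> float:
          """Wire time for one MIDI message of num_bytes bytes, in milliseconds."""
          return num_bytes * BITS_PER_BYTE / BAUD * 1000

      note_on = message_time_ms(3)                     # status + note + velocity
      print(f"one Note On:  {note_on:.2f} ms")         # ~0.96 ms
      print(f"4-note chord: {4 * note_on:.2f} ms")     # ~3.84 ms worst-case skew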

      Also part of the new MIDI 2.0 initiative are timestamps, specifically designed to reduce jitter.

      “So-called “western” notions of “music” should not be projected onto the frequency spectrum.”

      We recognized some limitations in MIDI 1.0 so there are many new mechanisms for direct pitch control in MIDI 2.0 (including per note pitch bend).

      When MIDI 1.0 was released no one envisioned DAWs, soft synths, Ableton Push controllers, DJ mixers or iPhones and iPads. But somehow MIDI has survived for 36 years by being flexible and adaptable.

      MIDI 2.0 is not only about the specific new features now, but also about all the new possibilities for future expansion.

  11. First World Problem: I wish I could afford enough gear for this to become an issue. Pros like Jeff Rona will probably be glad of certain advances to the spec, but I’m an in-the-box C.H.U.D. with only 2 MIDI hardware synths at this point. I’ll never feel ’em. It seems doubtful that there could be a need for a MIDI 3.0. This version is more than musically comprehensive enough for the real world.

  12. With MIDI 1.0, the biggest limitations were always with manufacturers/developers not implementing existing features within that spec.

    MIDI controllers often lack true 127-step velocity resolution (due to slow scan rates), or lack polyphonic aftertouch (or workarounds for it), or even release velocity, or 14-bit RPNs and NRPNs. And synth makers usually fail to implement any ability to interpret those features. It’s a catch-22, with controller makers saying “most synths don’t read them” and synth makers saying “most controllers don’t transmit them”.

    In some cases (like poly AT), the omission can be forgiven because of the cost to implement it. But as the cost of some of this technology has come down, we’ve seen little advancement in those features. We’re only now seeing Novation touting scan rates high enough to give velocity sensing a proper range and resolution. For synth developers, the lack of documentation/support for high-resolution (14-bit) CCs, poly AT and release velocity is very disappointing.

    I AM excited about the new spec. If it invigorates a wave of new capabilities for controllers and synths, in terms of features and enhanced resolution, I hope some additional effort is also made to give MIDI 1.0 some degree of forward compatibility. For example, if the new spec provides higher resolution, let 1.0 devices access that higher resolution through conversion of RPNs/NRPNs into the newer CC framework.
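
    For reference, the 14-bit CC mechanism already exists in MIDI 1.0 – a controller numbered 0-31 pairs with the same number plus 32 as its LSB – it is just rarely implemented. A minimal sketch, where the channel and controller numbers are only example values:

    def cc14_messages(channel: int, cc_msb: int, value: int) -> list:
        """Split a 14-bit value (0-16383) into the MIDI 1.0 MSB/LSB CC pair.

        cc_msb must be 0-31; the matching LSB controller is cc_msb + 32.
        """
        assert 0 <= value <= 0x3FFF and 0 <= cc_msb <= 31
        status = 0xB0 | (channel & 0x0F)            # Control Change on this channel
        msb, lsb = (value >> 7) & 0x7F, value & 0x7F
        return [bytes([status, cc_msb, msb]),       # coarse 7 bits
                bytes([status, cc_msb + 32, lsb])]  # fine 7 bits

    # Example: breath controller (CC 2 / CC 34) at ~75% of full scale on channel 1
    for msg in cc14_messages(channel=0, cc_msb=2, value=12288):
        print(msg.hex(" "))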

  13. Good stuff. This sort of thing is _hard_. I wish them all the luck.

    Would like to see the primary maintainers of Synthtopia, Matrix Synth, SonicState, CDM, and maybe one of the DJ-oriented synth/music/tech blogs serve together as a single representative committee member. Industry did well with MIDI 1.0 and they’ll probably do fine with 2.0, but I would like to see a ‘first class’ set of end-user voices. It wouldn’t need to be the actual blog maintainers – Jesse or Peter or Nick… could nominate a person to take their place.

    Industry-wise, I’m also not seeing anyone on the list who really represents the modular or iOS worlds. And not much from outside the music gear industry, like lighting or whatever.

    All that said, great to see that Google is on board. They have *a lot* of experience with spec development and probably the best single repository of engineering talent available today. Plus, who isn’t excited to see how they track our purchase histories via MIDI 2.0. :/

  14. To The MIDI Association

    ”OSC- OSC is a great protocol and very flexible, but there is no interoperability between OSC devices and that is what MIDI is all about.”

    Can you be more clear about that? To me that’s a very vague statement to brush off the OSC comparison.

    OSC is clearly better designed than MIDI in every respect. It just never had the chance to get strong marketing and support.

    What if you simply implemented an OSC-like standard that adds this interoperability, then?

    I hope it’s not another ego battle that leads to a disappointing new standard, in any case…

  15. I think MIDI 2.0 should only be backward compatible in this way: a MIDI 1.0 instrument cable can plug into a 2.0 device (in, out and thru), but a MIDI 2.0 cable will not plug into a 1.0 device (every 2.0 device would still have a 1.0 DIN jack). This would allow a new cable that houses a new transport. If the same MIDI 1.0 5-pin-DIN-compatible cable must be used, then add a 6th pin in the center of the 5-pin connector, carrying a small Cat 6 or multi-conductor connection, that folds in or sinks into the cable when the 2.0 cable is connected to a 1.0 device.

  16. Not sure if it is a MIDI 1.0 limitation or a hardware tradition, but BPM lower than 30 is a must. Being able to go below 30 BPM, even as far down as 1 BPM, would be very useful in various scenarios.
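
    For what it’s worth, the clock format itself doesn’t seem to impose a floor: MIDI 1.0 clock is just 24 ticks (0xF8 bytes) per quarter note, and the tempo is whatever spacing the sender uses, so the 30 BPM cutoff looks more like a firmware/UI choice. A quick sketch of the arithmetic:

    # MIDI 1.0 sends 24 timing-clock bytes (0xF8) per quarter note; the tempo is
    # simply how often they arrive, so arbitrarily low BPMs are representable.
    MIDI_CLOCKS_PER_QUARTER = 24

    def clock_interval_seconds(bpm: float) -> float:
        """Seconds between successive 0xF8 clock messages at a given tempo."""
        return 60.0 / (bpm * MIDI_CLOCKS_PER_QUARTER)

    for bpm in (120, 30, 1):
        print(f"{bpm:>3} BPM -> one clock every {clock_interval_seconds(bpm):.4f} s")
    # 120 BPM -> 0.0208 s, 30 BPM -> 0.0833 s, 1 BPM -> 2.5 s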

  17. A lack of sensors in keyboard actions meant MIDI 1.0 only had to worry about which note, how hard, and later how much pressure. Most keyboard action makers still have not evolved to offer pressure per note, or fast piano-style note repetition using the half-down sensor. Notwithstanding, the potential of string-like response from multi-sensor note actions is profound – for certain types of solo playing. Ensoniq tried this, giving a MIDI channel per note, in their VFX-SD model. It was called Mono Mode. It was seldom used, though. MPE seems quite similar.

    It is a shame that “channel per note” is a paradigm that will be incompatible with most legacy hardware. A better solution would be note-specific controller messages, and it can’t be that hard to achieve that through SysEx messaging in MIDI 1.0 – see the sketch after the quote below. These could be safely ignored by old gear, and chords would still be possible on one channel, so layering stays possible.

    Quote from the VFX-SD manual here:
    https://twitter.com/All97Notes/status/993097087729487873
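
    Purely as an illustration of that idea – nothing from any actual spec, and the manufacturer ID, opcode and byte layout below are all made up – a hypothetical per-note controller wrapped in SysEx:

    SYSEX_START, SYSEX_END = 0xF0, 0xF7
    TEST_MANUFACTURER_ID = 0x7D      # "non-commercial / test" SysEx ID, for illustration
    HYPOTHETICAL_PER_NOTE_CC = 0x02  # made-up opcode: "controller applied to one note"

    def per_note_cc(channel: int, note: int, cc: int, value: int) -> bytes:
        """Hypothetical SysEx message: apply a controller value to a single held note.

        Old gear that does not recognize the manufacturer ID simply ignores the
        whole frame, so chords played on one channel are unaffected.
        """
        payload = [channel & 0x0F, note & 0x7F, cc & 0x7F, value & 0x7F]
        return bytes([SYSEX_START, TEST_MANUFACTURER_ID,
                      HYPOTHETICAL_PER_NOTE_CC, *payload, SYSEX_END])

    # Example: add vibrato (CC 1) only to the E4 in a C major chord on channel 1
    print(per_note_cc(channel=0, note=64, cc=1, value=96).hex(" "))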
