MIDI 2.0

Well shucks, I had no idea this was even in the works. Are there more technical details available somewhere? It seems kind of lame the whole spec is being developed in private among these large member companies, but I suppose that’s the only way to push something like this through.

Very curious to see how little or much this deviates from the current spec! (Especially in regards to recent discussion here RE the usefulness of something like OSC for dealing with events as param clusters, streams, or in other non-note-centric ways…)


Oops, there is more detail about the specifics in this post:

Most interesting to me up front is that it’s now bi-directional… o_O


It’s been in the works for decades already and there has always been talk about it. Good to see there are finally some results. Extended resolution and tighter timing are a godsend!


This talk clarified a whole lot for me. The basic idea (MIDI-CI) that Mike Kent came up with 2 years ago, which makes all the new extensions possible, is pretty clever & straightforward – the new bi-directionality in MIDI-CI is like a handshake devices perform to decide whether all these new features can be used; otherwise they just fall back to MIDI 1.0…
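The fallback logic is roughly sketchable in a few lines. This is only an illustration, not real library code: `send`/`recv` are hypothetical transport callbacks, and actual MIDI-CI Discovery messages also carry MUIDs, a CI version number, and capability bits that are omitted here.

```python
# Hedged sketch of the MIDI-CI handshake/fallback idea.
CI_HEADER = bytes([0xF0, 0x7E, 0x7F, 0x0D])  # Universal SysEx, MIDI-CI sub-ID#1
DISCOVERY, DISCOVERY_REPLY = 0x70, 0x71      # MIDI-CI sub-ID#2 values

def negotiate(send, recv):
    """Return 'midi2' if the far end answers Discovery, else fall back."""
    send(CI_HEADER + bytes([DISCOVERY]))     # real messages add MUIDs etc.
    reply = recv()                           # assume recv() returns b'' on timeout
    if reply.startswith(CI_HEADER) and len(reply) > 4 and reply[4] == DISCOVERY_REPLY:
        return "midi2"                       # both ends speak MIDI-CI
    return "midi1"                           # silent partner: stay on MIDI 1.0
```

The nice property is exactly what the talk describes: a MIDI 1.0 device simply never replies, and nothing breaks.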

The dude himself:

JSON over the property exchange API is pretty crazy exciting!

…protocol negotiation sounds like it could support even something crazy like OSC over MIDI…

This is super cool.


Thanks for the video. This could potentially be a new era for electronic music gear. They are talking about 32-bit resolution (!!!) and much higher bandwidth and clocking rates too. This means MIDI could probably even approach audio rates, and the old argument about the resolution of MIDI vs. continuous CV voltages would also be obsolete. Also, tight sync would never be a problem again.

The downside, of course, is that the simplicity and elegance of the MIDI protocol would also be gone. All this talk about protocol negotiation kind of makes my head spin.


I think the protocol negotiation part will be easily encapsulated into a library or other set of functions, and the ability to debug/diagnose the protocol will still be fairly simple. No, we won’t be reading the actual bytes on the wire as easily (but if it stays more-or-less human-readable ASCII in JSON format, that’s a plus), but since they still translate to direct messages it won’t be hard to chart/plot/observe them discretely. It’s certainly a lot better than trying to decode each manufacturer’s custom sysex for a lot of the extended parameters!

I would like them to decide on a physical protocol though - ideally Ethernet or USB - because I see that becoming the next big issue once we need bidirectionality and higher bandwidth.


Mike Kent and co. at ADC '19 last month: https://www.youtube.com/watch?v=K2dAIvrI8zg

Resurrecting this…

I read the entire MIDI 2.0 spec set last night…

It is a testament to MIDI that it supports a whole world of use cases that the likes of us have no connection with… But it’s sad that 90% of the spec is about those use cases.

Prediction: None of the profile and property exchange parts of the spec (which are the bulk of it) will have any impact on electronic musicians - we’ll never use any of it. Sort of the same way that General MIDI has little impact on us.

The only part that will get used is the Universal MIDI Packet (UMP) format, the MIDI 2.0 voice messages (higher resolution), and the lovely 8-bit clean SysEx message.

As an implementer, the spec is a pain, especially for small devices: it defines the 4th and 5th encodings of MIDI messages, so implementations will now have to handle:

  1. MIDI 1.0 over serial (the original stream encoding)
  2. MIDI 1.0 over USB (which is a packet encoding)
  3. MIDI 1.0 over BLE (which is a stream encoding, but with timestamps and different running status rules)
  4. MIDI 1.0 over UMP (the new packet format defined by MIDI 2.0 – different from USB’s packet format, with optional timestamps that differ from BLE’s)
  5. MIDI 2.0 over UMP (finally, the actual new messages)
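For anyone who hasn’t read the UMP part yet: packets are built from 32-bit words, and the top nibble of the first word (the message type) tells you how many words follow. A minimal sketch of sizing a packet, covering only the defined types and assuming reserved types are rejected:

```python
# Word counts per UMP message type (top nibble of the first 32-bit word):
# utility / system / MIDI 1.0 channel voice = 1 word,
# 7-bit SysEx data and MIDI 2.0 channel voice = 2 words,
# 128-bit data (8-bit clean SysEx) = 4 words.
UMP_WORDS = {0x0: 1, 0x1: 1, 0x2: 1,
             0x3: 2, 0x4: 2,
             0x5: 4}

def ump_packet_words(first_word: int) -> int:
    mt = (first_word >> 28) & 0xF      # message type nibble
    if mt not in UMP_WORDS:
        raise ValueError(f"reserved/unknown UMP message type {mt:#x}")
    return UMP_WORDS[mt]
```

So e.g. a MIDI 2.0 note-on whose first word is `0x40913C00` (message type 4) is a 2-word packet, while the equivalent MIDI 1.0-over-UMP message is a single word.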

MIDI 2.0 messages will be a bit of a mixed bag: the extended resolution of data values (velocity, controller, pitch bend, etc.) in 2.0 will be very appreciated, as will the per-note control values. On the other hand, things like per-profile attributes and complex per-note management are likely to get glossed over in many implementations.
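One implementation detail worth knowing about the extended resolution: translating old 7-bit values up can’t just be a left shift, because a plain shift never reaches full scale. A sketch of the center-preserving bit-repeat scheme along the lines of what the MIDI 2.0 translation guidance describes (0 maps to 0, center to center, max to max):

```python
def scale_up(src: int, src_bits: int, dst_bits: int) -> int:
    """Widen a MIDI value, e.g. 7-bit velocity -> 16 bits."""
    scale_bits = dst_bits - src_bits
    shifted = src << scale_bits
    center = 1 << (src_bits - 1)
    if src <= center:
        return shifted                  # at/below center: simple shift
    # Above center: repeat the low bits downward so the top half
    # stretches all the way to full scale.
    repeat_bits = src_bits - 1
    repeat = src & ((1 << repeat_bits) - 1)
    if scale_bits > repeat_bits:
        repeat <<= scale_bits - repeat_bits
    else:
        repeat >>= repeat_bits - scale_bits
    while repeat:
        shifted |= repeat
        repeat >>= repeat_bits
    return shifted
```

With this, a 7-bit velocity of 127 becomes 0xFFFF rather than 0xFE00, and 64 stays exactly at mid-scale (0x8000) – which matters when devices on both sides of a translation are round-tripping values.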

Coming up with a uniform API for representing all this – so that applications, and musicians who work with things like MAX, Pd, SuperCollider, etc., can easily code against and manipulate it – is going to be quite a challenge.


tldr: Register for free as an individual MIDI Association member if you’re interested in updates on MIDI 2.0 and a potential Open Source SIG (Special Interest Group).

Longer version:

I attended a MIDI 2.0 online meeting with the MIDI Association (May 2021).

Many attendees asked how open source developers can get involved (given that MIDI Association corporate membership starts at $600/year).

I was very impressed that the meeting chair, Athan, directly addressed these comments. He said he welcomed the open source community being involved.

He suggested anyone wishing to be kept in the loop about forming an Open Source SIG should register for free as an individual MIDI Association member.

Apparently the MIDI Association is run by people volunteering their time. You can be a member of a SIG without being a corporate member.

He said that over the last few years the MIDI Association has initiated a process of opening up to wider communities. So if you have the time and energy, they would like to hear from you.

There’s an unofficial guide to MIDI 2.0 that MIDI Association SIG member Evan and his team at imitone have put together.


The MIDI Association sent me an email recommending that interested people register for free as individual MIDI Association members and, when signing up, indicate their primary MIDI interest as “I’m a developer of MIDI hardware or software”.

Then the MIDI Association can easily reach out later about an Open Source SIG.

The email also said the existing Intermedia Mapping and Scripting (IMS) Special Interest Group has a good deal of overlap with the monome community:

IMS Working Group Goals.pdf


I just finished putting together some thoughts that have been bouncing around in my head about MIDI & the state of music tech; I’d love to get some thoughts :blush: