Much of this could be done with another layer on top of OSC, one that mimics the functions of MIDI but with enhanced features. The question then becomes how far this layer should be standardized. Maybe there are several nested standards, or at least two: simple devices could support the basic functionality and simply not respond to other OSC messages, while more complex devices would implement OSC in full.
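To make the idea concrete, here is a minimal sketch of what such a MIDI-like layer might look like on the wire. The `/midi/...` address namespace is purely hypothetical (no such standard exists); only the byte layout follows the actual OSC 1.0 encoding rules (NUL-terminated strings padded to 4-byte boundaries, big-endian int32 arguments).

```python
import struct

def osc_message(address: str, *args: int) -> bytes:
    """Encode a simple OSC message with int32 arguments.

    Per OSC 1.0: strings are NUL-terminated and padded to a
    4-byte boundary; int32 arguments are big-endian.
    """
    def pad(b: bytes) -> bytes:
        # Pad with NULs to the next 4-byte boundary (at least one NUL).
        return b + b"\x00" * (4 - len(b) % 4)

    type_tags = "," + "i" * len(args)
    return (
        pad(address.encode("ascii"))
        + pad(type_tags.encode("ascii"))
        + b"".join(struct.pack(">i", a) for a in args)
    )

# Hypothetical MIDI-like address scheme: channel 1, middle C, velocity 100.
note_on = osc_message("/midi/1/noteon", 60, 100)
```

A simple device would only need to parse this small, fixed address space; anything outside it could be silently dropped, while a full implementation would route arbitrary OSC.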
Anyway, I’ve seen these proposals every 5-10 years since the 1990s; all the major players get involved, and it never really goes anywhere.
I also think about getting beyond MIDI more generally: what it would take to get off the rigid time grid, to have some kind of variable or even interlocking event-based notion of musical time, and to make this expressible in a standard. And how to wrap it up so it’s intuitive for users. I’m fine with music “as is”, but am perpetually curious about these things. All of this is already possible to some degree. But the issue isn’t what is possible technically; it’s what kinds of uses occur most often, and standards do have an influence there. The notion of musical time just seems very underdeveloped. Automatic tempo tracking is possible, so what do we do with it? How can we anchor events to other events? How do we make the time base malleable?
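One way to picture events anchored to other events over a malleable time base is a tiny sketch like the following. Everything here is invented for illustration (the `Event` class and beat/tempo mapping are not from any standard): each event stores only a beat offset from its anchor, and wall-clock time is derived last, so changing the tempo moves every event at once.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Event:
    """An event whose onset is defined relative to another event, in beats."""
    name: str
    offset_beats: float = 0.0          # beats after the anchor event
    anchor: Optional["Event"] = None   # None means anchored to the timeline origin

    def beat(self) -> float:
        # Resolve the chain of anchors down to the origin.
        base = self.anchor.beat() if self.anchor else 0.0
        return base + self.offset_beats

def beats_to_seconds(beat: float, tempo_bpm: float) -> float:
    """The malleable time base: change the tempo and every event shifts."""
    return beat * 60.0 / tempo_bpm

# A ghost note anchored to a snare, which is anchored to the downbeat.
downbeat = Event("downbeat")
snare = Event("snare", offset_beats=1.0, anchor=downbeat)
ghost = Event("ghost", offset_beats=0.25, anchor=snare)

for tempo in (120.0, 90.0):
    print(tempo, beats_to_seconds(ghost.beat(), tempo))
```

A real system would need to handle cycles, re-anchoring, and a tempo map rather than a single number, but the basic move, defining time relationally and resolving it late, is the same.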