I’m building an M4L plugin for my Eventide H9 (and yes, I’ll be delighted to share it with the world once I get it working). I’ve got the core functionality working in Max already - it can query the device, display the currently selected preset by name and module type, set its 10 knobs to the current values of the effect parameters, and send/follow the knobs as they are updated, either from the unit’s front panel or directly via the Max interface. So far so good.
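For concreteness, the "send" direction currently boils down to something like this [js] sketch - mapping a normalized dial value onto a CC for one knob. The CC numbers and channel are just placeholders for whatever you’ve mapped in the H9’s MIDI settings, not the real design:

```js
// knob_to_cc.js - rough sketch only, one instance per knob,
// e.g. [js knob_to_cc.js 3] drives knob 3.
// KNOB_CCS is an assumption: the H9 lets you assign each knob to a CC in its
// system/MIDI settings, so this table has to mirror that mapping.

inlets = 1;   // normalized knob value (0..1) from a dial
outlets = 1;  // [value, controller, channel] list for [ctlout]

var KNOB_CCS = [22, 23, 24, 25, 26, 27, 28, 29, 30, 31]; // placeholder CCs
var knob = (jsarguments.length > 1) ? jsarguments[1] : 0; // which knob (0-9)
var channel = 1;                                          // MIDI channel

function msg_float(v) {
    var cc = KNOB_CCS[Math.max(0, Math.min(9, knob))];
    var val = Math.round(Math.max(0, Math.min(1, v)) * 127);
    // [ctlout]'s inlets are value / controller / channel, so a list in
    // that order distributes across them.
    outlet(0, val, cc, channel);
}

function msg_int(v) { msg_float(v); }
```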
I’ll also make it clear that V1 of this plugin is intended to be a playback/automation tool only - in other words, I’m not presently designing it to figure out which algorithms you have loaded and control them independently of your preset list. It’ll just be able to observe the current algorithm and knob positions, switch to another preset (by preset number), and either set the knobs to positions or follow them as you change them on the front panel. The expression pedal and hotswitch settings will also be tracked and can be controlled.
What I’d like to do is turn this into an M4L plugin so that effect parameters can be automated. I’m not sure how best to do this: should loading the file slam the H9 to the preset it was last saved with, and set the knobs from there? Should it be an explicit “sync” action (either to or from the device)? How do I best integrate this with Live’s data storage? And how do I connect the Max pots and sliders to Live’s datastream/automation params?
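The part I can at least picture is the “follow” direction: if the parameters are live.dial objects (so Live stores and automates them), incoming CCs from the H9’s front panel could be written back into the device’s own parameters through the Live API. A rough sketch, with assumed CC numbers and parameter indices (parameter 0 of an M4L device is always Device On; the rest follow the order the dials were added):

```js
// follow_from_pedal.js - sketch of the "follow the front panel" direction.
// Expects a [controller, value] pair, e.g. [ctlin] outlets run through [pack].
// PARAM_FOR_CC is an assumption; adjust it to your own CC map and to the
// order of the live.dial parameters in the device.

inlets = 1;
outlets = 0;

var PARAM_FOR_CC = { 22: 1, 23: 2, 24: 3 }; // assumed CC -> parameter index

function list(cc, value) {
    var paramIndex = PARAM_FOR_CC[cc];
    if (paramIndex === undefined) return;
    var p = new LiveAPI("this_device parameters " + paramIndex);
    if (!p || p.id == 0) return; // path didn't resolve
    // Assumes the live.dial ranges are 0..127, so no rescaling is done here.
    p.set("value", value);
}
```

The obvious caveat is feedback: if the dial’s output also drives the CC-out path, writing the pedal’s value back into the dial will echo it straight back to the pedal, so there would need to be some kind of gate or “last writer wins” check.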
I’m wading through the M4L automation docs right now, but it all seems rather disconnected - there doesn’t seem to be a “best practice” guide for this situation, so I’m hoping some of you with more experience can weigh in on a good strategy.
My other question: should this be an audio plugin? A MIDI plugin? Neither? What’s the best “object” to encapsulate this?
I was hoping to do something like Live’s “External Audio Effect” device: audio in, sent to some output; MIDI control via a sideband MIDI setting (e.g. a drop-down in the plugin params choosing interface and channel); a handful of knobs, an expression slider, and the left, right, and centre buttons; then audio back in from some other external input and passed on through the plugin’s audio outputs to the next device in the chain. I’d expect automation on the knobs to be the main form of control, not a MIDI track or anything. Does this make sense?
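If it ends up as an audio effect, the audio passthrough side is just [plugin~] into [plugout~]; the sideband MIDI side is what I’d funnel through one place. Something like this, with the channel coming from the drop-down and raw bytes going to a dedicated [midiout] whose port the other drop-down selects (again, the CC numbers and the preset-number offset are assumptions to check against the pedal’s settings):

```js
// h9_midi_out.js - sketch of the sideband-MIDI funnel: everything the device
// wants to send to the H9 goes through here as raw MIDI bytes into a
// [midiout] whose port is picked from a umenu. CC numbers are placeholders.

inlets = 1;
outlets = 1; // raw MIDI bytes, one int at a time, into [midiout]

var channel = 1; // 1-16, set from the channel drop-down
var KNOB_CCS = [22, 23, 24, 25, 26, 27, 28, 29, 30, 31]; // assumed knob CCs
var EXP_CC = 11; // assumed expression-pedal CC

function setchannel(ch) { channel = Math.max(1, Math.min(16, ch)); }

// generic control change
function cc(controller, value) {
    outlet(0, 0xB0 | (channel - 1)); // CC status byte for this channel
    outlet(0, controller & 0x7F);
    outlet(0, value & 0x7F);
}

// knob <index 0-9> <0..1>
function knob(index, v) {
    cc(KNOB_CCS[index], Math.round(Math.max(0, Math.min(1, v)) * 127));
}

// expression <0..1>
function expression(v) {
    cc(EXP_CC, Math.round(Math.max(0, Math.min(1, v)) * 127));
}

// recall <preset number> - preset changes go out as program changes;
// whether preset 1 is PC 0 or PC 1 depends on the pedal's numbering,
// so the "- 1" here is something to verify on the unit.
function recall(preset) {
    outlet(0, 0xC0 | (channel - 1));
    outlet(0, (preset - 1) & 0x7F);
}
```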