Anyone ported code to work inside Live?

Before I was using Live I wrote my own percussion step sequencer: it uses a Launchpad as an interface and generates MIDI out. I’d like to rewrite this and bring it into Live. My aim is close integration; for example, rather than having its own pattern storage, it should just use clips.

What I need help with is which path to go down, as there seem to be many:

  • Pure M4L device. Given the others I see on the Max4Live site, I’m guessing this is doable, though it isn’t clear to me that there are objects that will let me read/write clip data. (Somehow I’m failing to find documentation of all the Live objects in Max… is there an online guide?) The downside is that for things like modal UIs, I’m guessing patches are going to be horribly complex… and I’m a programmer by day, so…
  • M4L device, with much of the logic written in Java? Or JavaScript? M4L seems to support both, but again, I haven’t found what Live interfaces exist from within each of those two languages. Is one better or more complete than the other?
  • M4L device, with external compiled code (in C, C++, or hey, even Haskell). I’m guessing there are no APIs into Live from code compiled and included like this, so this route would probably be quite painful, with every interaction with Live wrapped in some custom protocol to my compiled code.
  • Remote Script. This is perhaps not as flexible, and again, the APIs from Remote Scripts into Live don’t seem to be publicly documented, but I suppose I could reverse engineer them…

Can anyone help shed some light on which path might work, and where I can get a sense of the Live integration APIs available?



  • Pure M4L
  • M4L in JavaScript
  • Max & PD in C/C++


You can definitely use an M4L device to read and write MIDI data to and from clips, by using the Live API.

It’s been a while since I last played around with this stuff, but if I remember correctly you need to use the live.path and live.object objects within your M4L device. Here are the useful bits of the M4L documentation:

From looking at those links briefly, I think you need to set the Live API path by sending the “live_set tracks N clip_slots M clip” message (explained in the second of the two links) to a live.path object, then connect live.path’s output to a live.object (the process of using live.path and live.object to navigate and interact with Live’s API is covered in the first link), and finally send a “replace_selected_notes” message to that live.object (again, this messaging is covered in the second link).

Basically, you tell M4L: “Okay, find the designated path within Live’s API for this particular clip in that track… okay, now add this note at that time in the clip, with this velocity and that duration.”

Once you’ve got that worked out (it’s a bit fiddly but makes sense once you get your head round it), it’s dead simple to connect those messages to a step sequencer.
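For what it’s worth, the same sequence can also be driven from JavaScript inside a `js` object, using Max’s LiveAPI class rather than live.path/live.object patching. This is only a hedged sketch: the track/slot indices and the note values are placeholders, and the `select_all_notes` → `replace_selected_notes` → `notes` / `note` / `done` call sequence is how I remember the clip API working, so check it against the LOM docs. The message-building part is plain JavaScript; only `LiveAPI` itself is Max-only.

```javascript
// Build the call sequence a Live clip expects when replacing its notes:
// select_all_notes, replace_selected_notes, "notes <count>",
// one "note pitch time duration velocity muted" per note, then "done".
// (Plain JS: no Max dependencies, so it can be tested on its own.)
function buildNoteCalls(notes) {
  var calls = [
    ["select_all_notes"],
    ["replace_selected_notes"],
    ["notes", notes.length]
  ];
  for (var i = 0; i < notes.length; i++) {
    var n = notes[i];
    calls.push(["note", n.pitch, n.time, n.duration, n.velocity, 0]);
  }
  calls.push(["done"]);
  return calls;
}

// Max-only part: point a LiveAPI object at a clip (indices are
// placeholders) and fire the calls at it. Only works inside M4L.
function writeNotes(notes) {
  var clip = new LiveAPI("live_set tracks 0 clip_slots 0 clip");
  var calls = buildNoteCalls(notes);
  for (var i = 0; i < calls.length; i++) {
    clip.call.apply(clip, calls[i]);
  }
}
```

So a step sequencer would just regenerate its note list each time the pattern changes and hand it to something like `writeNotes`.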

@ehg is correct; you would do this using live.path and live.object. The LOM is a good reference, although I believe this one is more up to date.

Anyone have an opinion on whether doing this in pure Max or in a JavaScript object is the better approach? Is Java under Max no longer a thing?

You can use Java. I can’t speak to which you’d prefer (Java or JavaScript), but given your programming background I suspect you’ll appreciate having a text-based language to complement your patching.

Java does introduce a dependency on a Java runtime, which has caused problems for specific apps over time.